


Notes & Lines

Notes & Lines discusses the intersection of math, science, and technology with performing and visual arts. Topics include bizarre instruments, technically-minded musicians, and cross-pollination of science and art.

A Portable Pipe Organ: the International Touring Organ

Posted July 05, 2016 1:25 PM by BestInShow

Jonathan Fuller describes how the digital organ grew from a father's desire to give his son a decent instrument to play at home. A 21st-century organ virtuoso's desire for a virtual pipe organ – a personal instrument that would travel with him – led to the development of the International Touring Organ (ITO), perhaps the most sophisticated application of digital organ technology – and organ-building ingenuity – to reach concert venues yet. The ITO story, like that of the Allen digital organ, combines the vision of a performer with pioneering technology to produce a stupendous, if sometimes controversial, musical instrument.

Traditional pipe organ. Image credit: Wikimedia Commons.

The artist

Cameron Carpenter, a prodigiously talented, Juilliard-educated organist, is out to change the notion that pipe organs are just for sacred music (with exceptions for baseball parks and movie theaters). His repertoire ranges from Bach to Bernstein to Leonard Cohen. Carpenter began campaigning for a portable pipe organ ten years or so ago. "Unlike a violinist who plays at 28 the instrument they played at eight, I have to go every night and play an instrument I don't know," he said. "What one doesn't see is the countless hours of rehearsals which are spent, not on the music, they're spent on (the organ)." Pipe organs aren't portable, and they are intricately complex; no two are alike, and a performer typically needs hours to get acquainted with each instrument before a performance. Moreover, an instrument fixed in place limits concerts to venues that have organs. Carpenter understood that applying the digital technology powering smaller organs to instruments of grander scale would give him the portable instrument he craved.

The organ builders

Marshall and Ogletree Organ Builders (M and O) is the brainchild of two highly respected concert organists, Douglas Marshall and David Ogletree. Founded in 2002, the company's purpose is to build digital organs that match – or surpass – the sound of the greatest pipe organs in the world. Since its inception the company has built a dozen organs. Each organ's voice is built from sound samples of real pipe organs, processed with M and O's patented method, an outgrowth of the one described in Fuller's article. The company works in close cooperation with each instrument's commissioners. Marshall and Ogletree undertook a daring proof of concept: the company built an electronic organ based solely on the sound of one pipe organ, an 1897 George Hutchings organ at Boston's Basilica of Our Lady of Perpetual Help. "Blind" comparisons of the original organ with the M and O instrument demonstrated the new instrument's fidelity to the original. Follow the link to do your own comparison.

The collaboration

In 2013, Carpenter commissioned M and O to build his touring organ: an instrument equal in sound to the finest pipe organs in the world, yet transportable to concert venues. To create the sounds the artist wanted, M and O synthesized sounds from Carpenter's favorite organs worldwide, including at least one Wurlitzer and organs he knew from his childhood.

The complexity of this commission goes beyond creating the organ sound; M and O developed the sound system and all of the internal components that make this organ work. Eighteen cases of components fill the truck that accompanies Carpenter to his concerts:

  • The six-manual console
  • Ten cases of speakers
  • Eight cases of subwoofers
  • A supercomputer/amplifier system

Three supercomputers with extremely fast processors manage the conversation between the organ's manuals and the stored sounds. The flexible sound system can run on as few as two channels – enough for a set of headphones – and scale up to 48 channels to fill a large concert venue.

The ingenuity involved in making this whole system portable is just about as impressive as the engineering behind the digital organ sounds. The console breaks down into six pieces. The speakers, subwoofers, and console pieces fit into purpose-designed cases that protect them from the rigors of travel. According to M and O, "the entire organ, including console, racks and 48 channel audio system can be set up from dock to operational in about two hours, and broken down in about the same." This video shows a greatly sped-up version of organ setup.

The result

Carpenter debuted his new International Touring Organ on March 9, 2014, in New York City's Alice Tully Hall. The verdict of New York Times music critic Anthony Tommasini on the organ's sound: "quite terrific." Later in 2014, Mark Swed of the Los Angeles Times dubbed the ITO "… a genuine dream organ, a fantastically versatile electronic instrument with magnificent sounds…" A more recent Boston Classical Review piece (July 11, 2015) characterizes the ITO as "an orchestra unto itself, capable of shimmering wah-wah effects." Writing when the ITO made its debut, the artist himself pronounced that "… [The ITO] dares me to make good on my talent …"

Cameron Carpenter and the ITO. Image credit: Marshall and Ogletree

Perhaps the most significant technical breakthrough in M and O's organ technology is the firm's ability to build instruments finely tuned to the commissioner's liking, fitting both the venue(s) and the performer's repertoire. The ITO's portability, while authentic, is still limited to venues its truck can reach – and that can pump enough power into its circuits. I'm personally thankful that this somewhat lumpy portability brought Carpenter and the ITO to Tanglewood, the Boston Symphony Orchestra's summer home, last July. I can confirm that Carpenter and the ITO do indeed sound quite terrific.

References

http://www.cameroncarpenter.com/

http://www.marshallandogletree.com/

https://en.wikipedia.org/wiki/Cameron_Carpenter

http://www.theverge.com/2014/5/22/5741570/cameron-carpenter-international-touring-organ

http://www.metroweekly.com/2013/10/pipe-dreams/


Snow in June, Frost in July, Ice in August

Posted May 09, 2016 10:32 AM by Jonathan Fuller

This coming summer marks the 200th anniversary of one of the most severe weather anomalies in modern history. The Year Without a Summer, as it's now commonly known, wreaked havoc on much of the Northern Hemisphere. In upstate New York and New England, in the vicinity of CR4 headquarters, snow fell in June, frosts were common from May through August, and temperatures sometimes swung violently from normal summer highs of 90° F or more to near freezing in a matter of hours.

The climatic conditions of 1816 also brought unseasonably low temperatures and heavy rains as far away as China. In Europe famine was widespread, and riots, looting, arson, and demonstrations were common. Throughout the hemisphere farming became nearly impossible, and grain prices soared. In an age when subsistence farming was the norm and commoners worked their hands to the bone to feed their families, crop failures often meant the threat of starvation.

Contemporary observers were almost completely perplexed by the disappearance of the summer of 1816, but scientists now believe it was the result of a few interrelated factors. The most significant of these was the April 1815 eruption of Mount Tambora on the Indonesian island of Sumbawa. Tambora was likely the most powerful volcanic eruption in recorded history, with a column height of over 26 miles and a tephra volume of over 38 cubic miles. Over 70,000 Indonesians were killed following the blast. The enormous amount of volcanic ash spewed into the atmosphere reflected large quantities of sunlight and lowered Northern Hemisphere temperatures.

Compounding the effects of the ash, modern scientists believe that solar magnetic activity was at a historic low in 1816, the midpoint of a 25-year solar period known as the Dalton Minimum. By studying the presence of carbon-14 in tree rings, solar astronomers have concluded that sunspot activity was abnormally low, reducing the transmission of solar radiation to Earth. Ironically, the Tambora eruption caused a persistent dry fog to settle over the Northern Hemisphere, producing a reddened and dimmed Sun and making sunspots visible to the naked eye. With little knowledge of the eruption, 19th-century Americans and Europeans often blamed the red, spotty Sun alone for the abnormal weather, while in reality Tambora's ash played a much more significant role.

A third less-studied factor is the possibility of a solar inertial shift. These shifts, occurring every 180 years or so due to the gravitational pull of the largest planets in the Solar System, cause the Sun to wobble on its axis and possibly affect Earth's climate. Scientists point to three of these shifts--in 1632, 1811, and 1990--that correspond to major climatic events: the solar Maunder Minimum from 1645-1715, the Dalton Minimum discussed above, and the eruption of Mount Pinatubo with corresponding global cooling in 1991. This association remains largely hypothetical, however.

The Year Without a Summer produced some interesting and long-lasting cultural effects. Thousands left the American Northeast and settled in the Midwest to escape the frigid summer; Latter-day Saints founder Joseph Smith was forced to move from Vermont and settle in Western New York, the first in a series of events that culminated in his writing The Book of Mormon. German inventor Karl Drais may have invented the Laufmaschine, the predecessor of the bicycle, in 1818 in response to the shortage of horses caused by the 1816 crop failure.

That summer may have influenced contemporary art as well. The high concentrations of tephra in the atmosphere led to spectacular yellow and red sunsets, captured in J.M.W. Turner's paintings of the 1820s. (If you've ever wondered about the vivid red sky in the more widely known painting The Scream, some modern scholars believe Edvard Munch may have viewed a similarly vivid sunset, a result of the 1883 eruption of Krakatoa.) Trapped inside their Swiss villa by the excessive rains of June 1816, a group of English writers on holiday passed the time by seeing who could write the most frightening ghost story. Mary Shelley came up with the now-famous Frankenstein, which she would finish and publish in 1818, while Lord Byron's unfinished fragment The Burial inspired John William Polidori to write The Vampyre in 1819, effectively launching the still-healthy field of romantic vampire fiction.

The advancement of agricultural technology more or less ensures that we'll never face a subsistence crisis comparable to that of 1816, whatever severe weather anomalies lie ahead. Even so, it's chilling to examine that year's events, and attitudes toward them, as expressed in surviving journals and works of art.

Image credits: NOAA | Public domain

7 comments; last comment on 05/13/2016

Earth Day, And Problematic Predictions

Posted May 02, 2016 12:00 AM by Jonathan Fuller

There are few more polarizing issues than environmental ones: climate change, the feasibility of alternative energy, and, in recent news, Earth Day. My 2016 Earth Week was spent scrounging for recyclables for a school project in which my son built a robot statue out of milk jugs, cereal boxes, and empty yogurt cups. From where I sit, Earth Day is a good thing, a message of "don't be so anthropocentric that you think you can just throw your crap anywhere and let Mother Nature take care of it."

The original 1970 Earth Day, by contrast, was marked by apocalyptic predictions and fear-based thinking. Americans were surrounded by grim reminders of industrial pollution, such as the 1969 Cuyahoga River fire, the labeling of Lake Erie as a "gigantic cesspool," and heavy smog in urban areas. That first Earth Day saw predictions of mass starvation, worldwide famine, the extinction of 80% of all living animals, the reduction of ambient sunlight by 50%, 45-year lifespans, and the exhaustion of all crude oil, all by the year 2000.

None of these predictions came to pass, of course. The 1970 prognosticators were concerned about pollution and fossil fuels but were mostly anxious about overpopulation, a topic that arouses little fear 46 years later. If anything, our world is looking to be in better shape. We're seeing significantly higher crop yields from the same amount of land, lower staple food prices, and no more DDT; we have enough fossil fuel to last us perhaps another century; and population looks to level off within the next few decades. The Earth Day doomsters, particularly Paul Ehrlich and John Holdren, leaned on the faulty and outdated assumption that Negative Impact = (Population)(Affluence)(Technology). What they didn't account for is that technology can also boost positive outputs like food production and medicine, and that we'd learn effective strategies to reduce pollution along the way.
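
To see why that identity misleads, here's a minimal Python sketch of the I = PAT formula with made-up numbers, purely for illustration (not anyone's real data): if the technology term shrinks faster than population and affluence grow, net impact falls even as the world gets bigger and richer.

    # I = PAT: environmental impact as the product of population,
    # affluence (consumption per person), and technology (impact
    # per unit of consumption; smaller means cleaner).
    def impact(population, affluence, technology):
        return population * affluence * technology

    # Hypothetical scenario: population and affluence each grow 50%,
    # but cleaner technology cuts impact per unit of consumption by 2/3.
    before = impact(1.0, 1.0, 1.0)        # baseline: 1.0
    after = impact(1.5, 1.5, 1.0 / 3.0)   # 1.5 * 1.5 * 0.333... = 0.75
    print(after < before)                 # True: net impact falls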

The problem with predictions is that for every correct one, there are countless more that are dead wrong, even those based on rigorous data and scientific extrapolation. So-called futurists predict like it's their job, often assigning a target date to the year, and are typically wrong. Consider, for example, Arthur C. Clarke's predictions for the 21st century. The occurrence of a technological singularity is still a hot topic in AI communities, and most thinkers--building on the assumption that Moore's Law will continue unabated--agree that self-improving artificial general intelligence will occur within the next 50 years or so. At the 2012 Singularity Summit, Stuart Armstrong acknowledged the uncertainty in predicting advanced AI by stating that his "current 80% estimate [for the singularity] is something like 5 to 100 years." Now that's how to make a prediction...

This isn't to say that long-term thinking isn't valuable or honorable. In grad school I became intrigued by the Long Now Foundation, a non-profit working to foster slower/better thinking rather than the prevailing faster/cheaper activities of modern times. Aside from hosting seminars and advocating a five-digit date structure (i.e., 02016 for 2016) to anticipate the Year 10,000 problem, the group keeps a record of long-term predictions and bets made by its members for fun and accountability. They're also constructing a clock designed to run for 10,000 years with minimal maintenance using simple tools, and are working on a publicly accessible digital library of all known human languages for posterity.
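
As a throwaway illustration, that five-digit convention is trivial to adopt in code. This hypothetical Python helper is not an official Long Now tool; it just pads years the way the foundation writes them:

    def long_now_year(year: int) -> str:
        """Format a year with at least five digits, as in '02016'."""
        return f"{year:05d}"

    print(long_now_year(2016))   # 02016
    print(long_now_year(10000))  # 10000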

While the Long Now's activities may seem radical, most examples of true long-term thinking are, given our blindly rushing existence. Shortly before his death, Kurt Vonnegut proposed a presidentially appointed US Secretary of the Future, whose sole duty would be to determine an activity's impact on future generations. The Great Law of the Iroquois famously mandated that current generations make decisions that would benefit their descendants seven generations (about 140 years) into the future. The difficulty, of course, is satisfying ourselves in the now as well as in the future. As environmental skeptics point out, it's nearly impossible to plan our fossil fuel dependency and use for the next decade, let alone the next hundred years.

Perhaps our best bet is to view the future in terms of possibility. To paraphrase Clarke's first law of prediction: "When a distinguished but elderly scientist states that something is possible, he's almost certainly correct."

Image credit: futureatlas.com / CC BY 2.0

6 comments; last comment on 05/10/2016

The Evolution of Tech Grammar, or: Language is Goofy

Posted April 18, 2016 12:00 AM by Jonathan Fuller
Pathfinder Tags: ap style grammar words

From 2008 to 2009 I worked for HSBC Bank, one of the more interesting workplaces to be in during the Great Recession. A disgruntled non-customer, who I believe was teetering on the edge of financial oblivion like so many of us, once pointedly asked me what the hell HSBC stood for, anyway. I told him that it was just an acronym*...that the actual name of the business was HSBC Bank--nothing more, nothing less. "But you're like, a Chinese bank, right? Isn't the 'H' for Hong Kong?" I assured him it was not, making him even angrier at the situation.

Coy as I was in (playfully) screwing with this man, the truth is that HSBC is a British company; it was originally based in Hong Kong, and the acronym once stood for Hong Kong and Shanghai Banking Corporation. In 1991 it reorganized and has since legally existed as HSBC Holdings plc. I can't say I know why the company disassociated itself from its Far Eastern roots, but the point is that language and acronyms change for various reasons: falling in and out of common usage, avoiding certain stereotypes or associations, or simply becoming too verbose or antiquated.

[*As a CR4 editor pointed out to me after reading this post, HSBC might be more accurately termed an initialism before 1991, and a pseudo-initialism since. Acronyms, like NATO and JPEG, are pronounced as words, while initialisms are read as strings of initials.]

A few weeks ago the Associated Press made a major announcement: the 2016 AP Stylebook will lowercase both "internet" and "web." In line with past stylebook changes, it's safe to assume that the AP believes that these two terms are now generic enough to merit lowercased usage. Pro-lowercase activists look to the origin of the word to make their point: the "internet" of old was simply an internetwork of smaller networks using the same protocol. So when we speak about the modern Internet--the one I'm using to research this blog post and connect remotely to my office computer--we're referring to the largest and best-known example of an internet. Also, they say lowercasing is more efficient, saving thousands of Shift-key strokes, and that capitalized nouns are a strain on the eyes, introducing roadblocks into neatly flowing text.

The other camp, with which I sometimes side, takes issue with the word "the." Think about the star at the center of our solar system. A star at the center of some other distant solar system could be called its "sun," but we on Earth call our local, best-known example the Sun, capitalized and all, for clarity. I know of no other significant internets besides THE Internet--if you know of one, feel free to comment and enlighten me. And regarding the web: what if we're trying to describe researching spider webs online? Would we look up webs on the web? Isn't the Web clearer? Call me antiquated (my wife does on a daily basis, so I'm used to it), but I like my Internet and Web, even if I'm too lazy to hit Shift and actually capitalize them most of the time.

These technologically related style changes happen pretty frequently. For example, AP changed their usage of Web site to website in 2010, and e-mail to email in 2011. These make more sense as generic terms, in my opinion: we surely no longer think of email as "electronic mail." With the slow demise of postal mail, perhaps email will one day be referred to as just "mail," and postal mail will become oldmail or cismail, maybe.

The fluidity of technical terminology is also easily seen in anacronyms, or words that were formerly acronyms but have fallen into common usage. Lasers were originally "light amplification by stimulated emission of radiation," for example. Treating "laser" as a common noun allowed us to back-form the verb "to lase," meaning to produce laser light. Ironically for me as a technical writer and editor, even the verb "to edit" was back-formed from "editor," the original term.

The possibility for confusing variation and evolution in the English language is endless. Who knows? Maybe in 50 years our descendants will just switch on their computers and internet.

Image credit: Stinging Eyes / CC BY-SA 2.0

21 comments; last comment on 05/17/2016

Combining Optics and Audio to Save Historical Recordings

Posted March 31, 2016 12:00 AM by Jonathan Fuller
Pathfinder Tags: archive radio recordings

My sister-in-law works as an archivist, and from what I hear her daily work is pretty much what you'd expect of the job. She spends a lot of time in dark basements, has frequent attacks of dust-triggered sinusitis, sometimes wears white gloves, and most importantly preserves and catalogs old books and papers so they can be accessed by future researchers.

Preserving physically readable materials like books is relatively straightforward, but archivists have run into well-documented problems preserving system-dependent materials like computer files or sounds. In the case of the latter, the earliest examples of recorded sound are becoming more and more difficult to access and play back. Disc records are now generally limited to hi-fi enthusiasts, and maybe 0.5% of the population has ever seen a cylinder phonograph in person, so archivists have been concerned that early recordings may be lost forever.

The US Library of Congress is fighting that tide thanks in part to IRENE, a device developed at Berkeley Lab by researchers recycling particle physics methodologies. IRENE uses high-resolution optical technology to take millions of images of a grooved recording medium, then converts the imaged grooves into a sonic waveform. Using optical rather than audio technology has two primary advantages: it avoids further wear on century-old grooves by eliminating stylus contact, and it can reconstruct sound from broken or otherwise unplayable discs and cylinders.
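
As a toy illustration of the underlying idea (a bare sketch, not IRENE's actual pipeline, which is far more sophisticated), suppose an optical scan yields a series of measurements of the groove's lateral displacement over time. For a lateral-cut record the recorded signal tracks stylus velocity, so differentiating and normalizing the displacement recovers a rough audio waveform:

    import numpy as np

    def groove_to_audio(displacement, sample_rate):
        # Displacement -> velocity: the signal cut into a lateral groove
        # is proportional to how fast the groove wiggles, not how far.
        velocity = np.gradient(displacement, 1.0 / sample_rate)
        peak = np.max(np.abs(velocity))
        return velocity / peak if peak > 0 else velocity  # scale to [-1, 1]

    # Hypothetical input: a 1 kHz tone "engraved" as groove wiggle,
    # sampled at 44.1 kHz from the image series.
    t = np.arange(0, 0.01, 1.0 / 44100)
    audio = groove_to_audio(np.sin(2 * np.pi * 1000 * t), 44100)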

IRENE's name derives from the first audio extraction it performed, a Weavers recording of "Goodnight, Irene," but the name has since become a backronym for "Image, Reconstruct, Erase Noise, Etc." The machine made a splash in 2008 when it reconstructed audio from an 1860 phonautogram recording of the French folk song "Au Clair de la Lune." Prior to this discovery, researchers believed Edison recordings from the 1870s were the earliest surviving recorded sounds. (True to internet fashion, the entire experimental discography of Édouard-Léon Scott de Martinville, who invented the phonautograph, is on YouTube.)

IRENE has been successfully employed to extract audio from a wide variety of media since 2008, including Alexander Graham Bell's Volta Labs experiments. The beauty of the optical approach shows in one Volta Labs artifact, a wax disc still attached to a primitive recording machine: researchers simply positioned the scanner's beam over the disc and used an external drive to rotate the machine, preserving both the disc and the machine.

In a more recent sound preservation effort, the Library of Congress held a Radio Preservation Task Force symposium in late February, part of a larger collaborative effort to preserve early radio recordings. The conference was inspired by a 2013 LoC report that found that many important historical broadcasts were untraceable or had been destroyed entirely, and that, unlike in other archival areas, "little is known of what still exists, where it is stored, and in what condition." Because radio was once the dominant medium for real-time news broadcasts and discussion of niche topics, the rediscovery of a historic recording, rare as it is, is a big deal.

Archivists face perhaps even more pressing issues on the digital front. Although digital files take up far less physical space, they're prone to compatibility problems as computing systems and formats turn over ever faster. Whether the medium is wax cylinders, radio broadcasts, or digital files, sound archivists continue to dutifully perform important, and often thankless, preservation work.

Image credit: Library of Congress Blog

2 comments; last comment on 04/03/2016

