
JOHN CAGE TRIBUTE: Joel Chadabe


100 x John - a centennial tribute project: Joel Chadabe, Joseph Kubera, Richard Lainhart

_________________________________________________________________________________


Joel Chadabe on John Cage

In 1937, in a talk titled The Future of Music: Credo, John Cage said, "I believe that the use of noise to make music will continue and increase until we reach a music produced through the aid of electrical instruments which will make available for musical purposes any and all sounds that can be heard ..."


By "noise", Cage meant found sounds. And in fact, many of his compositions used found sounds, notably Birdcage, which used recordings of birds in aviaries, and Roaratorio, which used the sounds mentioned by Joyce in Finnegan's Wake. 
The use of found sounds also opened the door to different contexts for understanding sound, such as Acoustic Ecology in the 1970s and Ear to the Earth in 2006. Given current recording technology, we think of field recordings as a way of accessing the sounds of the world around us, and of using sound to learn about the world.


Ear to the Earth (E2E), a program of Electronic Music Foundation founded by Joel Chadabe, is a worldwide network that promotes environmental sound art and imagery as a powerful way of connecting with the environment and understanding the state of the world. E2E has launched "100 x John", an E2E Soundscape project of 100 compositions to be developed in homage to John Cage. Begun for the occasion of the 100th anniversary year of his birth, 100 x John honors his leadership in what Chadabe called "the great opening up of music to all sounds".


Information and call for participation details here: http://www.eartotheearth.org/events.html


Compositions:


100 x John 1 - Scenes from Staten Island

Joseph Kubera, longtime performer of John Cage's music, and Joel Chadabe, longtime collaborator with John Cage, recorded Staten Island sounds to compose 100 x John 1, based on Cage's compositional processes. Kubera's performance in the 2011 Ear to the Earth Festival concert was a salute to John Cage in the 100th anniversary year of his birth.

The places represented by the field recordings, some of them shown in the images below, are listed in approximate order of their appearance in the composition: the beach, the marina, two ethnic restaurants, the boardwalk, two sports fields, traffic throughout the island, a firetruck siren, the Staten Island train, bells from a church in the Coast Guard station, traffic on the Verrazano Bridge recorded from under the bridge, and different waterfronts with different activities and views.



100 x John 2 - To Manhattan

This composition is the single recorded sound of the Staten Island ferry traversing lower New York Harbor from Staten Island to Manhattan. From the Staten Island waiting room to the Manhattan entry hall, the trip took about 30 minutes. Recorded by Joseph Kubera and Joel Chadabe in October 2011.




100 x John 3 - by Richard Lainhart

A salute to John Cage on the 100th anniversary of his birth, composed for the Ear to the Earth New York Soundscape Project in 2011 as a follow-up to his 2008 commission, Richard Lainhart's 100 x John is based on field recordings of Manhattan sounds. The recordings he used were a Harlem soundwalk, 125th Street traffic, light rain in Morningside Park, the 116th Street Shabaaz Market, the 116th Street subway, store interiors, and a few more. His technique was to use Kyma, a major sound-production system, to cross-filter the field-recorded sounds with his guitar playing. This technique combines the field recordings with the guitar sounds in such a way that you hear the result of the combination rather than the sounds themselves. As Lainhart puts it, "What you hear is the interaction between me and my environment."
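
To make the technique concrete, here is a minimal sketch of one way to cross-filter two sounds, shaping a field recording's spectrum frame by frame with the spectrum of a guitar signal. It assumes two mono WAV files at the same sample rate (the file names are placeholders) and uses NumPy and SciPy; it illustrates the principle only and is not Lainhart's actual Kyma patch.

import numpy as np
from scipy.io import wavfile
from scipy.signal import stft, istft

def cross_filter(field, guitar, fs, nperseg=2048):
    """Shape the field recording's spectrum, frame by frame, with the guitar's."""
    _, _, F = stft(field, fs, nperseg=nperseg)    # complex spectrogram of the field sound
    _, _, G = stft(guitar, fs, nperseg=nperseg)   # complex spectrogram of the guitar
    n = min(F.shape[1], G.shape[1])               # trim to the shorter signal
    F, G = F[:, :n], G[:, :n]
    mask = np.abs(G) / (np.abs(G).max() + 1e-12)  # guitar magnitudes as a 0..1 mask
    _, y = istft(F * mask, fs, nperseg=nperseg)   # the field sound, filtered by the guitar
    return y / (np.max(np.abs(y)) + 1e-12)        # normalize to avoid clipping

fs, field = wavfile.read("field.wav")             # placeholder file names
_, guitar = wavfile.read("guitar.wav")
out = cross_filter(field.astype(float), guitar.astype(float), fs)
wavfile.write("cross_filtered.wav", fs, (out * 32767).astype(np.int16))

In this simple version, only those parts of the field recording that coincide with energy in the guitar signal come through, which is one sense in which the result is an interaction between the two.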



Threshold by Richard Lainhart was commissioned by EMF for the Ear to the Earth Festival 2008. "I recorded many ambient sounds in New York, among them the street traffic during a taxi ride from 53rd Street to 89th Street, the lobby of the Museum of Modern Art, the lobby of the Guggenheim Museum, a trash compactor around 55th Street, an air-conditioning compressor around 54th Street, an industrial document shredder on 87th Street, a floor buffer at MoMA, Olafur Eliasson's New York Waterfall #4 at Pier 35, and the traffic beneath the FDR Drive at South Street. I then played my guitar and convoluted all of the recorded ambient sounds with my guitar playing. The result is that the sounds are imposed on one another. You hear none of the sounds directly. What you hear is the interaction between me and my environment."
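
The convolution Lainhart describes can be sketched in a few lines, again assuming two mono WAV files at the same sample rate (the file names are placeholders rather than his actual sources):

import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

fs, ambient = wavfile.read("ambient.wav")         # placeholder file names
_, guitar = wavfile.read("guitar.wav")

# Convolving the two signals smears each onto the other: every event in one is
# colored by the whole of the other, so neither is heard directly on its own.
mix = fftconvolve(ambient.astype(float), guitar.astype(float), mode="full")
mix /= np.max(np.abs(mix)) + 1e-12                # normalize to avoid clipping

wavfile.write("convolved.wav", fs, (mix * 32767).astype(np.int16))

Convolving two long recordings outright gives a very long, wash-like result, so in practice the operation would be done in shorter segments or in real time; the point here is only the principle that neither source is heard by itself.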




Biography



Joel Chadabe, composer and author, is an internationally recognized pioneer in the development of interactive music systems. He has concertized widely since 1969, with Jan Williams, Bruno Spoerri, and other musicians, presenting his music at venues and festivals such as Klangprojektionen 4.4 (Vienna), Ear to the Earth (New York City), Computing Music IV (Cologne), HörZeit-SpielRaum 2005 (Berlin), ISCM Festival (Miami), NYU Interactive (NYC), New Mix (Palais de Tokyo, Paris), Chelsea Art Museum (New York), Expanded Instruments Festival (Engine 27, New York City), Centro Cultural Recoleta (Buenos Aires), Venice Biennale, Wellington Festival (New Zealand), Aarhus Festival (Denmark), De IJsbreker (Amsterdam), New Music America, Inventionen (Berlin), IRCAM (Paris), Stedelijk Museum (Amsterdam), Ars Electronica (Linz, Austria), Electronic Music Festival (Stockholm), and New Music New York. His music is recorded on the EMF Media, Deep Listening, CDCM, Centaur, Lovely Music, Opus One, CP2, and Folkways labels.


As president of Intelligent Music from 1983 to 1994, he was responsible for the development and publication of a wide range of innovative and historically important software, including M and Max, as well as the TouchSurface, an xyz touch-sensitive computer input device. In 1977, with Roger Meyers, he co-authored The PLAY Program, the first software sequencer. In 1967, while director of the Electronic Music Studio at State University of New York at Albany (1965-1998), he designed the CEMS (Coordinated Electronic Music Studio) System, an analog-programmable electronic music system, and commissioned Robert Moog to build it.

He was keynote speaker at the NIME (New Interfaces for Musical Expression) Conference in 2002 in Dublin, sponsored by the MIT Media Lab; at the International Computer Music Conference in Berlin in 2000; and at the International Music and Technology Conference (University of Melbourne, Australia, 1981), where he coined the term 'interactive composing'. He has presented papers at EMS05 (Montreal), Resonances (IRCAM, Paris), Intersens (Marseilles), ISEA98 (Liverpool), at several SEAMUS and ICMC conferences, and at many other conferences; participated in panels at WISP (Sydney), ICMC 05 (Barcelona), and at many other conferences and symposia; and presented lectures, workshops, and demonstrations at Florida International University, IRCAM, Zurich Conservatory, Brown University, Experience Music Project (Seattle), University of California at Santa Barbara, CCMIX (Paris), University of California at San Diego, and at many other universities and venues. He has received awards, fellowships, and grants from the National Endowment for the Arts, New York State Council on the Arts, Ford Foundation, Rockefeller Foundation, Fulbright Commission, SUNY Research Foundation, New York Foundation for the Arts, and other foundations. He won 2nd Prize in the Grosser Preis der Ars Electronica (Linz, Austria, 1982), and he is the recipient of the 2007 SEAMUS Lifetime Achievement Award.


His book 'Electric Sound: The Past and Promise of Electronic Music', published by Prentice Hall in November 1996, is the first comprehensive overview of the history of electronic music. His articles on electronic music have appeared in Organized Sound, Leonardo, Computer Music Journal, Contemporary Music Review, Journal of New Music Research, Leonardo Music Journal, Electronic Musician, Perspectives of New Music, Electronic Music Review, Melos, Musique en Jeu, and many other journals and magazines, and several of his articles have been anthologized in books by MIT Press, Routledge, Feltrinelli, and other publishers.


Mr. Chadabe has a B.A. degree from the University of North Carolina at Chapel Hill and an M.M. degree from Yale University, where he studied composition with Elliott Carter. He is currently Professor Emeritus at State University of New York at Albany; Director of the Computer Music Studio at Manhattan School of Music; Visiting Faculty at New York University; and Founder and President of Electronic Music Foundation, a not-for-profit organization that organizes concerts and other events and disseminates information and materials relating to the history and current practice of electronic music.


Joel Chadabe on composing

When people ask me what I do as a composer, I explain that I do not compose pieces, I compose activities. A 'piece', whatever its content, is a construction with a beginning and end that exists independent of its listeners and within its own boundaries of time. An 'activity' unfolds because of the way people perform; and consequently, an activity happens in the time of living; and art comes closer to life.


How does art function in the life of the performer who is participating in one of my musical activities? Well, for one thing, I view the performer as a human being and friend, and not as the executor of a construction; and so I try to design a role for the performer that is challenging, creative, and comfortable. By challenging, I mean that I put the performer in a situation where the demands on the performer's listening and reacting skills are not routine. By creative, I mean that the performer has to make compositional decisions, as against, in a more traditional role, executing the notes in a musical score. By comfortable, I mean that the performer participates in a way that the performer finds agreeable, both physically and musically.


The musical 'instrument' I use is defined by the software I design. It is the software that articulates the interface between instrument and performer, determines how the instrument will react to a performer's actions, and generates the sounds. The activity that I design, then, is defined by the interface, the way the instrument responds to a performer, and the nature of the sounds. In short, as against a musical score that is played on an instrument, in my music it is the instrument itself that is the work of art. The instrument is inseparable from the music it produces. As Yeats wrote, "How can we know the dancer from the dance?"
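
As a rough sketch of that division of labor, such an instrument can be thought of as three pieces of software wired together; the names below are illustrative only and do not come from Chadabe's own programs.

class SoftwareInstrument:
    def __init__(self, interface, respond, synthesize):
        self.interface = interface    # maps a performer's gesture to parameters
        self.respond = respond        # decides how the instrument reacts to them
        self.synthesize = synthesize  # turns that reaction into sound

    def play(self, gesture):
        params = self.interface(gesture)
        reaction = self.respond(params)
        return self.synthesize(reaction)

# A toy instrument that maps key velocity to loudness:
toy = SoftwareInstrument(
    interface=lambda gesture: {"velocity": gesture},
    respond=lambda p: p["velocity"] / 127,
    synthesize=lambda amp: f"play a tone at amplitude {amp:.2f}",
)
print(toy.play(96))  # -> play a tone at amplitude 0.76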

Here's a case in point from some recent work. I composed 'Many Times ...', where the ellipsis stands for the name of the performer, for a concert of my music at Engine 27, a sound gallery / performance space in New York City, in the spring of 2001. The underlying idea in my use of technology is that technology expands our capabilities. In 'Many Times ...', the instrument takes the sound produced by a performer and generates many different transformed instances of it throughout the performance space, multiplying the performer's actions so that the sound comes from loudspeakers on the left, on the right, above, behind, from here, there, everywhere.


Those transformations, generated by the software that animates the instrument, are essentially unpredictable. Because of their unpredictability, they provide the performer with something to react to. In other words, the performer is influencing the electronic system by performing, vocally or by playing an acoustic instrument, and the electronic system is influencing the performer by giving the performer something to react to. This is what I call 'interactivity', where the word interactive means 'mutually influential'. I find it a wonderful way to make music. It brings out the best in everyone. And considering that anyone can play the role of performer, it could bring out the best in anyone.
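
A minimal sketch of that behavior follows, assuming an eight-loudspeaker layout and simple random delays, transpositions, and level changes; these are illustrative assumptions, not the actual transformations in 'Many Times ...'.

import numpy as np

rng = np.random.default_rng()

def scatter(performer, fs, n_copies=12, n_speakers=8, max_delay_s=4.0):
    """Spread randomly transformed copies of a sound across several loudspeakers."""
    length = int(max_delay_s * fs) + 2 * len(performer)  # room for the slowest, latest copy
    out = np.zeros((length, n_speakers))
    for _ in range(n_copies):
        delay = int(rng.uniform(0, max_delay_s) * fs)    # each copy enters later...
        ratio = 2 ** rng.uniform(-1, 1)                  # ...up to an octave up or down
        idx = np.arange(0, len(performer) - 1, ratio)    # crude resampling = transposition
        copy = np.interp(idx, np.arange(len(performer)), performer)
        copy *= rng.uniform(0.2, 1.0)                    # at a random level
        speaker = rng.integers(n_speakers)               # from a random loudspeaker
        end = min(delay + len(copy), length)
        out[delay:end, speaker] += copy[:end - delay]
    return out / (np.max(np.abs(out)) + 1e-12)           # normalize to avoid clipping

# A one-second 220 Hz tone stands in here for a live performer's sound.
fs = 44100
tone = np.sin(2 * np.pi * 220 * np.arange(fs) / fs)
multichannel = scatter(tone, fs)                         # shape: (samples, 8)

In a live performance the same idea would run continuously on the incoming signal rather than on a fixed buffer, with the random choices providing the unpredictability the performer reacts to.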

-- Joel Chadabe


Note. This statement was written in December 2001 for Imadjinn, in New Delhi, India, producer of The IDEA, a CD-ROM gazette on media arts and artists. It has been slightly revised since then.

