Community Space Reactions

  • “FUTURE PHASES” showcases new frontiers in music technology and interactive performance

    Music technology took center stage at MIT during “FUTURE PHASES,” an evening of works for string orchestra and electronics, presented by the MIT Music Technology and Computation Graduate Program as part of the 2025 International Computer Music Conference (ICMC). The well-attended event was held last month in the Thomas Tull Concert Hall within the new Edward and Joyce Linde Music Building. Produced in collaboration with the MIT Media Lab’s Opera of the Future Group and Boston’s self-conducted chamber orchestra A Far Cry, “FUTURE PHASES” was the first event to be presented by the MIT Music Technology and Computation Graduate Program in MIT Music’s new space.

    “FUTURE PHASES” offerings included two new works by MIT composers: the world premiere of “EV6,” by MIT Music’s Kenan Sahin Distinguished Professor Evan Ziporyn and professor of the practice Eran Egozy; and the U.S. premiere of “FLOW Symphony,” by the MIT Media Lab’s Muriel R. Cooper Professor of Music and Media Tod Machover. Three additional works were selected by a jury from an open call for works: “The Wind Will Carry Us Away,” by Ali Balighi; “A Blank Page,” by Celeste Betancur Gutiérrez and Luna Valentin; and “Coastal Portrait: Cycles and Thresholds,” by Peter Lane. Each work was performed by Boston’s own multi-Grammy-nominated string orchestra, A Far Cry.

    “The ICMC is all about presenting the latest research, compositions, and performances in electronic music,” says Egozy, director of the new Music Technology and Computation Graduate Program at MIT. When approached to be a part of this year’s conference, “it seemed the perfect opportunity to showcase MIT’s commitment to music technology, and in particular the exciting new areas being developed right now: a new master’s program in music technology and computation, the new Edward and Joyce Linde Music Building with its enhanced music technology facilities, and new faculty arriving at MIT with joint appointments between MIT Music and Theater Arts (MTA) and the Department of Electrical Engineering and Computer Science (EECS).” These recently hired professors include Anna Huang, a keynote speaker for the conference and creator of the machine learning model Coconet that powered Google’s first AI Doodle, the Bach Doodle.

    Egozy emphasizes the uniqueness of this occasion: “You have to understand that this is a very special situation. Having a full 18-member string orchestra [A Far Cry] perform new works that include electronics does not happen very often. In most cases, ICMC performances consist either entirely of electronics and computer-generated music, or perhaps a small ensemble of two-to-four musicians. So the opportunity we could present to the larger community of music technology was particularly exciting.”

    To take advantage of this exciting opportunity, an open call was put out internationally to select the other pieces that would accompany Ziporyn and Egozy’s “EV6” and Machover’s “FLOW Symphony.” Three pieces were selected from a total of 46 entries to be a part of the evening’s program by a panel of judges that included Egozy, Machover, and other distinguished composers and technologists.

    “We received a huge variety of works from this call,” says Egozy. “We saw all kinds of musical styles and ways that electronics would be used. No two pieces were very similar to each other, and I think because of that, our audience got a sense of how varied and interesting a concert can be for this format. A Far Cry was really the unifying presence. They played all pieces with great passion and nuance. They have a way of really drawing audiences into the music. And, of course, with the Thomas Tull Concert Hall being in the round, the audience felt even more connected to the music.”

    Egozy continues, “We took advantage of the technology built into the Thomas Tull Concert Hall, which has 24 built-in speakers for surround sound, allowing us to broadcast unique, amplified sound to every seat in the house. Chances are that every person might have experienced the sound slightly differently, but there was always some sense of a multidimensional evolution of sound and music as the pieces unfolded.”

    The five works of the evening employed a range of technological components that included playing synthesized, prerecorded, or electronically manipulated sounds; attaching microphones to instruments for use in real-time signal processing algorithms; broadcasting custom-generated musical notation to the musicians; utilizing generative AI to process live sound and play it back in interesting and unpredictable ways; and audience participation, where spectators use their cellphones as musical instruments to become a part of the ensemble.

    Ziporyn and Egozy’s piece, “EV6,” took particular advantage of this last innovation: “Evan and I had previously collaborated on a system called Tutti, which means ‘together’ in Italian. Tutti gives an audience the ability to use their smartphones as musical instruments so that we can all play together.” Egozy developed the technology, which was first used in the MIT Campaign for a Better World in 2017. The original application involved a three-minute piece for cellphones only. “But for this concert,” Egozy explains, “Evan had the idea that we could use the same technology to write a new piece — this time, for audience phones and a live string orchestra as well.”

    To explain the piece’s title, Ziporyn says, “I drive an EV6; it’s my first electric car, and when I first got it, it felt like I was driving an iPhone. But of course it’s still just a car: it’s got wheels and an engine, and it gets me from one place to another. It seemed like a good metaphor for this piece, in which a lot of the sound is literally played on cellphones, but still has to work like any other piece of music. It’s also a bit of an homage to David Bowie’s song ‘TVC 15,’ which is about falling in love with a robot.”

    Egozy adds, “We wanted audience members to feel what it is like to play together in an orchestra. Through this technology, each audience member becomes a part of an orchestral section (winds, brass, strings, etc.). As they play together, they can hear their whole section playing similar music while also hearing other sections in different parts of the hall play different music. This allows an audience to feel a responsibility to their section, hear how music can move between different sections of an orchestra, and experience the thrill of live performance. In ‘EV6,’ this experience was even more electrifying because everyone in the audience got to play with a live string orchestra — perhaps for the first time in recorded history.”

    After the concert, guests were treated to six music technology demonstrations that showcased the research of undergraduate and graduate students from both the MIT Music program and the MIT Media Lab. These included a gamified interface for harnessing just intonation systems (Antonis Christou); insights from a human-AI co-created concert (Lancelot Blanchard and Perry Naseck); a system for analyzing piano playing data across campus (Ayyub Abdulrezak ’24, MEng ’25); capturing music features from audio using latent frequency-masked autoencoders (Mason Wang); a device that turns any surface into a drum machine (Matthew Caren ’25); and a play-along interface for learning traditional Senegalese rhythms (Mariano Salcedo ’25). This last example led to the creation of Senegroove, a drumming-based application specifically designed for an upcoming edX online course taught by ethnomusicologist and MIT associate professor in music Patricia Tang and world-renowned Senegalese drummer and MIT lecturer in music Lamine Touré, who provided performance videos of the foundational rhythms used in the system.

    Ultimately, Egozy muses, “‘FUTURE PHASES’ showed how having the right space — in this case, the new Edward and Joyce Linde Music Building — really can be a driving force for new ways of thinking, new projects, and new ways of collaborating. My hope is that everyone in the MIT community, the Boston area, and beyond soon discovers what a truly amazing place and space we have built, and are still building here, for music and music technology at MIT.”
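
    To make the audience-participation idea concrete, here is a purely illustrative Python sketch of how connecting phones might be grouped into orchestral sections, in the spirit of the description above. It is not based on the actual Tutti implementation; the section names echo the article, and every identifier and value here is an assumption made only for illustration.

    # Purely illustrative sketch: group connecting audience phones into orchestral
    # sections in round-robin order, so each section stays roughly the same size.
    # This is NOT the Tutti system described in the article.
    from itertools import cycle

    SECTIONS = ["winds", "brass", "strings"]  # section names mentioned in the article

    def assign_sections(device_ids):
        """Map each connecting device ID to an orchestral section."""
        section_cycle = cycle(SECTIONS)
        return {device_id: next(section_cycle) for device_id in device_ids}

    # Example: ten phones joining the performance.
    phones = [f"phone-{i}" for i in range(10)]
    print(assign_sections(phones))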

    "FUTURE PHASES," a groundbreaking concert held in the Edward and Joyce Linde Music Building at MIT, showcased new frontiers in music technology and interactive performance. The concert, featuring electronic and computer-generated music, was a part of the 2025 International Computer Music Conference.

  • Vocal comping: The 4 steps for getting perfect vocals
    Learn about what vocal comping is and explore key tips and tricks you can apply to get the best final results possible.

  • Why having a PRO isn’t enough to collect all publishing royalties
    One question we hear songwriters, producers, and artists ask all the time is, “Why do I need a music publisher if I’m a member of a PRO?” It’s a fair question.

  • Rob Grimaldi (BTS, BLACKPINK) shares his K-pop production tips
    Watch Rob Grimaldi (BTS, BLACKPINK) share his production tips and techniques as he walks through his Pro Tools session for a K-pop instrumental.

  • Should AI Artists Have Fans, Custom Music For Sync
    This week, Ari is joined by Jess Furman to discuss custom music for TV/film and the ethics of AI music artists.

  • The Torso S-4 sculpting sampler: A complete guide
    Here's everything you need to know to get started with the Torso S-4 sculpting sampler, from its main modules to more advanced features.

  • Sampling in hip hop: 4 key eras
    Explore four key eras and their defining sample touchpoints that have shaped the culture of sampling in hip hop.

  • What is trance music? History, artists, and subgenres
    Learn about the history, key artists, and subgenres behind trance music, in addition to tips for how to make your own music in the genre.

  • The 6 best reverb plugins for vocals with SIRMA (free presets included)
    Expert artist, songwriter, and music producer SIRMA showcases six reverb plugins for vocals that have become staples in her toolkit.

  • Win $1,000 and get your remix released on EMPIRE via our contest with Farrah Fawx
    Craft a club edit of Farrah Fawx's "Slippery" for the chance to win $1,000 and get your remix released with EMPIRE.

  • Southside (808 Mafia) roasts beats from Discord
    Southside sat down with us to rate and react to beats submitted by producers via Discord—and he's not holding back.

  • Suzy Shinn (Weezer, Fall Out Boy) makes an indie pop song in Pro Tools
    Suzy Shinn sat down with us at Sound Factory, where she reacted to the new Splice x Pro Tools integration and made an indie pop track with it.

  • Jack Harlow and Shaboozey’s Management Company has Range
    This week, Ari is joined by Matt Graham to discuss the evolution of artist management and the inner workings of Range Media Partners.

  • At MIT, musicians make new tools for new tunes

    The MIT Music Technology Program carves out a space to explore new sounds, tunes, and experiences. From the classroom to the community, students in music tech grapple with developing both creative technology and their creative selves.

    In the course 21M.080 (Intro to Music Technology), it dawned on Thelonious Cooper ’25 that he had the skills to create his own instruments. “I can literally make a new instrument. I don’t think most people consider that as an option. But it totally is,” Cooper says.

    Similar to how the development of photography contributed to a radical shift in the priorities of painting, Cooper identifies the potential of new music tools to “[pave] the way to find new forms of creative expression.” Cooper develops digital instruments and music software.

    For Matthew Caren ’25, his parallel interests in computer science, mathematics, and jazz performance found an intersection in design. Caren explains, “The process of creating music doesn’t actually start when you, for instance, sit at a piano. It really starts when someone goes out and designs that piano and lays out the parameters for how the creation process is going to go.” When it is the tool that defines the parameters for creating art, Caren reasons, “You can tell your story only as well as the technology allows you to.”

    What purposes can music technology serve? In holding both technical and artistic questions simultaneously, makers of music technology uncover new ways to approach engineering problems alongside human notions of community and beauty.

    Building the bridge between music and tech

    Taught by professor of the practice Eran Egozy, class 21M.385 (Interactive Music Systems, or IMS) focuses on the creation of musical experiences that include some element of human-computer interaction (HCI) through software or a hardware interface.

    In their first assignment, students program a digital synthesizer, a piece of software to generate and manipulate pitches with desired qualities. While building this foundation of the application of hard technical skills to music, students contemplate their budding aesthetic and creative interests.

    “How can you use it creatively? How can you make it make music in a way that’s not just a bunch of random sounds, but actually has some intention? Can you use the thing you just made to perform a little song?” prompts Egozy.

    In the spirit of MIT’s motto, “mens et manus” (“mind and hand”), students of IMS propose, design, implement, play-test, and present a creative musical system of their own during the last stretch of the semester. Students develop novel music games, tools, and instruments alongside an understanding of the principles of user interface, user experience (UI/UX), and HCI.

    Once students implement their ideas, they can evaluate their design. Egozy stresses it is important to develop a “working prototype” quickly. “As soon as it works, you can test it. As soon as you test it, you find out whether it’s working or not, then you can adjust your design and your implementation,” he explains.

    Although students receive feedback at multiple milestones, a day of play-testing is the “most focused and concentrated amount of learning [students] get in the entire class.” Students might find their design choices affirmed or their assumptions broken as peers test the limits of their creations. “It’s a very entertaining experience,” Egozy says.

    Immersed in music tech since his graduate studies at the MIT Media Lab and as co-founder of Harmonix, the original developers of popular music game titles “Guitar Hero” and “Rock Band,” Egozy aims to empower more people to engage with music more deeply by creating “delightful music experiences.”

    By the same token, developers of music technology deepen their understanding of music and technical skills. For Cooper, understanding the “causal factors” behind changes in sounds has helped him to “better curate and sculpt the sounds [he uses] when making music with much finer detail.”

    Designing for possibility

    Music technologies mark milestones in history — from the earliest acoustic instruments to the electrified realm of synthesizers and digital audio workstations, design decisions reverberate throughout the ages.

    “When we create the tools that we use to make art, we design into them our understanding and our ideas about the things that we’re interested to explore,” says Ian Hattwick, lecturer in music technology.

    Hattwick brings his experience as a professional musician and creative technologist as the instructor of Intro to Music Technology and class 21M.370 (Digital Instrument Design).

    For Hattwick, identifying creative interests, expressing those interests by creating a tool, using the tool to create art, and then developing a new creative understanding is a generative and powerful feedback loop for an artist. But even if a tool is carefully designed for one purpose, creative users can use it unexpectedly, generating new and cascading creative possibilities on a cultural scale.

    In the case of many important music hardware technologies, “the impact of the decisions didn’t play out for a decade or two,” says Hattwick. Over time, he notes, people shift their understanding of what is possible with the available instruments, pushing their expectations of technology and what music can sound like. One novel example is the relationship between drummers and drum machines — human drummers took inspiration from programmatic drum beats to learn unique, challenging rhythms.

    Although designers may feel an impulse for originality, Hattwick stresses that design happens “within a context of culture.” Designers extend, transform, and are influenced by existing ideas. On the flip side, if a design is too unfamiliar, the ideas expressed risk limited impact and propagation. The current understanding of what sounds are even considered musical is in tension with the ways new tools can manipulate and generate them.

    This tension leads Hattwick to put tools and the thoughtful choices of their human designers back in focus. He says, “When you use tools that other people have designed, you’re also adopting the way that they think about things. There’s nothing wrong with that. But you can make a different choice.”

    Grounding his interests in the physical hardware that has backed much of music history, electrical engineering and computer science undergraduate Evan Ingoldsby builds guitar pedals and audio circuits that manipulate signals through electronic components. “A lot of modern music tech is based off of taking hardware for other purposes, like signal filters and saturators and such, and putting music and sounds through them and seeing how [they] change,” says Ingoldsby.

    For Cooper, learning from history and the existing body of knowledge, both artistically and technically, unlocks more creativity. “Adding more tools to your toolbox should never stop you from building something that you want to. It can only make it easier,” he says.

    Ingoldsby finds the unexpected, emergent effects of pushing hardware tools such as modular synthesizers to their limits most inspiring. “It increases in complexity, but it also increases in freedom.”

    Collaboration and community

    Music has always been a collective endeavor, fostering connection, ritual, and communal experiences. Advancements in music technology can both expand creative possibilities for live performers and foster new ways for musicians to gather and create.

    Cooper makes a direct link between his research in high-performance, low-latency computing and his work developing real-time music tools. Many music tools can only function well “offline,” Cooper poses. “For example, you’ll record something into your digital audio workstation on your computer, and then you’ll hit a button, and it will change the way it sounds. That’s super cool. But I think it’s even cooler if you can make that real-time. Can you change what the sound is coming out as you’re playing?” asks Cooper.

    The problem of speeding up the processing of sound, such that the time difference between input and output — latency — is imperceptible to human hearing, is a technical one. Cooper takes an interest in real-time timbre transfer that could, for example, change the sound coming from a saxophone as if it were coming from a cello. The problem intersects with common techniques in artificial intelligence research, he notes. Cooper’s work to improve the speed and efficiency of music software tools could provide new effects for digital music performers to manipulate audio in a live setting.

    With the rise of personal computing in the 2010s, Hattwick recounts, an appeal for “laptop ensembles” emerged to contemplate new questions about live music performance in a digitizing era. “What does it mean to perform music with a laptop? Why is that fun? Is a laptop an instrument?” he poses.

    In the Fabulous MIT Laptop Ensemble (FaMLE), directed by Hattwick, MIT students pursue music performance in a “living laboratory.” Driven by the interests of its members, FaMLE explores digital music, web audio, and live coding, an improvisational practice exposing the process of writing code to generate music. A member of FaMLE, Ingoldsby has found a place to situate his practice of sound design in a broader context.

    When emerging digital technologies interface with art, challenging questions arise regarding human creativity. Communities made of multidisciplinary people allow for the exchange of ideas to generate novel approaches to complex problems. “Engineers have a lot to offer performers,” says Cooper. “As technology progresses, I think it’s important we use that to further develop our abilities for creative practice, instead of substituting it.”

    Hattwick emphasizes, “The best way to explore this is together.”
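
    As a loose illustration of the technical ideas above, here is a minimal Python sketch of the kind of digital synthesizer described as the first IMS assignment, together with the buffer-size arithmetic behind the latency discussion. The sample rate, buffer size, and function names are assumptions made for illustration, not material from the course or from Cooper’s research.

    # Illustrative only: generate a pitched sine tone with a short fade-in/fade-out
    # envelope, then compute the latency implied by a given audio buffer size.
    import numpy as np

    SAMPLE_RATE = 44100  # samples per second (an assumed, common value)

    def synthesize(frequency_hz, duration_s, amplitude=0.5):
        """Return a sine tone shaped by 10 ms linear attack and release ramps."""
        t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
        tone = amplitude * np.sin(2 * np.pi * frequency_hz * t)
        ramp = int(0.01 * SAMPLE_RATE)  # 10 ms of samples
        envelope = np.ones_like(tone)
        envelope[:ramp] = np.linspace(0.0, 1.0, ramp)
        envelope[-ramp:] = np.linspace(1.0, 0.0, ramp)
        return tone * envelope

    # Latency arithmetic from the real-time discussion: a 256-sample buffer at
    # 44.1 kHz holds 256 / 44100 of a second, i.e., roughly 5.8 ms of delay
    # before the processed sound is heard.
    buffer_size = 256
    latency_ms = 1000.0 * buffer_size / SAMPLE_RATE
    print(f"Buffer latency: {latency_ms:.1f} ms")

    # A one-second concert A (440 Hz):
    samples = synthesize(440.0, 1.0)

    The short attack and release ramps simply avoid audible clicks at the tone’s boundaries; in a real interactive system, buffers of this kind would be streamed to the audio device continuously, which is where the latency figure matters.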

    The MIT Music Technology Program brings together students from music, engineering, and computer science to explore digital instrument design, real-time performance tools, and creative expression through human-computer interaction and collaborative making.

  • Professor Emeritus Barry Vercoe, a pioneering force in computer music, dies at 87

    MIT Professor Emeritus Barry Lloyd Vercoe, a pioneering force in computer music, a founding faculty member of the MIT Media Lab, and a leader in the development of MIT’s Music and Theater Arts Section, passed away on June 15. He was 87.

    Vercoe’s life was a rich symphony of artistry, science, and innovation that led to profound enhancements of musical experience for expert musicians as well as for the general public — and especially young people.

    Born in Wellington, New Zealand, on July 24, 1937, Vercoe earned bachelor’s degrees in music (in 1959) and mathematics (in 1962) from the University of Auckland, followed by a doctor of musical arts in music composition from the University of Michigan in 1968.

    After completing postdoctoral research in digital audio processing at Princeton University and a visiting lectureship at Yale University, Vercoe joined MIT’s Department of Humanities (Music) in 1971, beginning a tenure in the department that lasted through 1984. During this period, he played a key role in advancing what would become MIT’s Music and Theater Arts (MTA) Section, helping to shape its forward-thinking curriculum and interdisciplinary philosophy. Vercoe championed the integration of musical creativity with scientific inquiry, laying the groundwork for MTA’s enduring emphasis on music technology and experimental composition.

    In 1973, Vercoe founded MIT’s Experimental Music Studio (EMS) — the Institute’s first dedicated computer music facility, and one of the first in the world. Operated under the auspices of the music program, EMS became a crucible for innovation in algorithmic composition, digital synthesis, and computer-assisted performance. His leadership not only positioned MIT as a hub for music technology, but also influenced how the Institute approached the intersection of the arts with engineering. This legacy is honored today by a commemorative plaque in the Kendall Square MBTA station.

    Violist, faculty founder of the MIT Chamber Music Society, and Institute Professor Marcus Thompson says: “Barry was first and foremost a fine musician, and composer for traditional instruments and ensembles. As a young professor, he taught our MIT undergraduates to write and sing Renaissance counterpoint as he envisioned how the act of traditional music-making offered a guide to potential artistic interaction between humans and computers. In 1976, he enlisted me to premiere what became his iconic, and my most-performed, work, ‘Synapse for Viola and Computer.’”

    During a Guggenheim Fellowship in 1982–83, Vercoe developed the Synthetic Performer, a groundbreaking real-time interactive accompaniment system, while working closely with flautist Larry Beauregard at the Institute for Research and Coordination in Acoustics/Music (IRCAM) in Paris.

    In 1984, Vercoe became a founding faculty member of the MIT Media Lab, where he launched the Music, Mind, and Machine group. His research spanned machine listening, music cognition, and real-time digital audio synthesis. His Csound language, created in 1985, is still widely used for music programming, and his contributions helped define the MPEG-4 Structured Audio standard. He also served as associate academic head of the Media Lab’s graduate program in Media Arts and Sciences (MAS).

    Vercoe mentored many future leaders in digital music and sound computation, including two of his MAS graduate students — Anna Huang SM ’08 and Paris Smaragdis PhD ’01 — who have recently joined MIT’s music faculty, as well as Miller Puckette, an emeritus faculty member at the University of California at San Diego, and Richard Boulanger, a professor of electronic production and design at the Berklee College of Music.

    “Barry Vercoe will be remembered by designers, developers, researchers, and composers for his greatest ‘composition,’ Csound, his free and open-source software synthesis language,” states Boulanger. “I know that, through Csound, Barry’s musical spirit will live on, not only in my teaching, my research, and my music, but in the apps, plugins, and musical compositions of generations to come.”

    Tod Machover, faculty director of the MIT Media Lab and Muriel R. Cooper Professor of Music and Media, reflects, “Barry Vercoe was a giant in the field of computer music whose innovations in software synthesis, interactive performance, and educational tools for young people influenced and inspired many, including myself. He was a superb mentor, always making sure that artistic sensibility drove music tech innovation, and that sophisticated expression was at the core of Media Lab — and MIT — culture.”

    Vercoe’s work earned numerous accolades. In addition to the Guggenheim Fellowship, he was honored with the 1992 Computerworld Smithsonian Award for innovation and the 2004 SEAMUS Lifetime Achievement Award.

    Beyond MIT, Vercoe consulted with Analog Devices and collaborated with international institutions like IRCAM under the direction of Pierre Boulez. His commitment to democratizing music technology was evident in his contributions to the One Laptop per Child initiative, which brought accessible digital sound tools to young people in underserved communities worldwide.

    He is survived by his former wives, Kathryn Veda Vaughn and Elizabeth Vercoe; their children, Andrea Vercoe and Scott Vercoe; and generations of students and collaborators who continue to build on his groundbreaking work. A memorial service for family will be held in New Zealand later this summer, and a special event in his honor will take place at MIT in the fall. The Media Lab will share details about the MIT gathering as they become available.

    Named professor emeritus at the MIT Media Lab upon his retirement in 2010, Vercoe leaves a legacy that embodies the lab’s — and MIT’s — vision of creative, ethical, interdisciplinary research at the convergence of art, science, and technology. His music, machines, and generously inventive spirit will continue to shape the way we listen, learn, and communicate.

    MIT Professor Emeritus Barry Vercoe, a pioneering force in computer music, a founding faculty member of the MIT Media Lab, and a leader in the development of MIT's Music and Theater Arts Section, died at 87. He created Csound, the Synthetic Performer, and other digital audio synthesis tools, with lasting impact through IRCAM, MPEG-4, One Laptop per Child, and more.