- PublMe bot posted in Space
At MIT, musicians make new tools for new tunes

The MIT Music Technology Program carves out a space to explore new sounds, tunes, and experiences. From the classroom to the community, students in music tech grapple with developing both creative technology and their creative selves.

In the course 21M.080 (Intro to Music Technology), it dawned on Thelonious Cooper ’25 that he had the skills to create his own instruments. “I can literally make a new instrument. I don’t think most people consider that as an option. But it totally is,” Cooper says.

Similar to how the development of photography contributed to a radical shift in the priorities of painting, Cooper identifies the potential of new music tools to “[pave] the way to find new forms of creative expression.” Cooper develops digital instruments and music software.

For Matthew Caren ’25, his parallel interests in computer science, mathematics, and jazz performance found an intersection in design. Caren explains, “The process of creating music doesn’t actually start when you, for instance, sit at a piano. It really starts when someone goes out and designs that piano and lays out the parameters for how the creation process is going to go.” When it is the tool that defines the parameters for creating art, Caren reasons, “You can tell your story only as well as the technology allows you to.”

What purposes can music technology serve?
In holding both technical and artistic questions simultaneously, makers of music technology uncover new ways to approach engineering problems alongside human notions of community and beauty.

Building the bridge between music and tech

Taught by professor of the practice Eran Egozy, class 21M.385 (Interactive Music Systems, or IMS) focuses on the creation of musical experiences that include some element of human-computer interaction (HCI) through software or a hardware interface.

In their first assignment, students program a digital synthesizer, a piece of software that generates and manipulates pitches with desired qualities. While building this foundation of applying hard technical skills to music, students contemplate their budding aesthetic and creative interests.

“How can you use it creatively? How can you make it make music in a way that’s not just a bunch of random sounds, but actually has some intention? Can you use the thing you just made to perform a little song?” prompts Egozy.

In the spirit of MIT’s motto, “mens et manus” (“mind and hand”), students of IMS propose, design, implement, play-test, and present a creative musical system of their own during the last stretch of the semester. Students develop novel music games, tools, and instruments alongside an understanding of the principles of user interface, user experience (UI/UX), and HCI.

Once students implement their ideas, they can evaluate their design. Egozy stresses that it is important to develop a “working prototype” quickly. “As soon as it works, you can test it. As soon as you test it, you find out whether it’s working or not, then you can adjust your design and your implementation,” he explains.

Although students receive feedback at multiple milestones, a day of play-testing is the “most focused and concentrated amount of learning [students] get in the entire class.” Students might find their design choices affirmed or their assumptions broken as peers test the limits of their creations.
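As an aside, the kind of digital synthesizer that first assignment describes — software that generates pitches with desired qualities — can be sketched in a few lines. This is a hypothetical minimal example, not actual 21M.385 course material; it assumes only the Python standard library and a fixed 44.1 kHz sample rate:

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second (CD-quality audio)

def synth_tone(freq_hz, duration_s, amplitude=0.5):
    """Generate one sine-wave tone as a list of float samples in [-1, 1]."""
    n_samples = int(SAMPLE_RATE * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n_samples)]

def write_wav(path, samples):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)       # mono
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        frames = b"".join(struct.pack("<h", int(s * 32767)) for s in samples)
        wav.writeframes(frames)

# A three-note motif: A4, C#5, E5 (an A-major arpeggio)
melody = []
for freq in (440.0, 554.37, 659.25):
    melody.extend(synth_tone(freq, 0.4))
write_wav("arpeggio.wav", melody)
```

Swapping the sine for other waveforms, or shaping the amplitude over time, is where the "desired qualities" — and the creative choices Egozy asks about — come in.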
“It’s a very entertaining experience,” Egozy says.

Immersed in music tech since his graduate studies at the MIT Media Lab and as co-founder of Harmonix, the original developer of the popular music game titles “Guitar Hero” and “Rock Band,” Egozy aims to empower more people to engage with music more deeply by creating “delightful music experiences.”

By the same token, developers of music technology deepen their own understanding of music and sharpen their technical skills. For Cooper, understanding the “causal factors” behind changes in sounds has helped him to “better curate and sculpt the sounds [he uses] when making music with much finer detail.”

Designing for possibility

Music technologies mark milestones in history — from the earliest acoustic instruments to the electrified realm of synthesizers and digital audio workstations, design decisions reverberate through the ages.

“When we create the tools that we use to make art, we design into them our understanding and our ideas about the things that we’re interested to explore,” says Ian Hattwick, lecturer in music technology.

Hattwick brings his experience as a professional musician and creative technologist to his role as the instructor of Intro to Music Technology and class 21M.370 (Digital Instrument Design).

For Hattwick, identifying creative interests, expressing those interests by creating a tool, using the tool to create art, and then developing a new creative understanding is a generative and powerful feedback loop for an artist. But even if a tool is carefully designed for one purpose, creative users can put it to unexpected uses, generating new and cascading creative possibilities on a cultural scale.

In the case of many important music hardware technologies, “the impact of the decisions didn’t play out for a decade or two,” says Hattwick. Over time, he notes, people shift their understanding of what is possible with the available instruments, pushing their expectations of technology and of what music can sound like.
One novel example is the relationship between drummers and drum machines — human drummers took inspiration from programmed drum beats to learn unique, challenging rhythms.

Although designers may feel an impulse toward originality, Hattwick stresses that design happens “within a context of culture.” Designers extend, transform, and are influenced by existing ideas. On the flip side, if a design is too unfamiliar, the ideas it expresses risk limited impact and propagation. The current understanding of what sounds are even considered musical is in tension with the ways new tools can manipulate and generate them.

This tension leads Hattwick to put tools, and the thoughtful choices of their human designers, back in focus. He says, “When you use tools that other people have designed, you’re also adopting the way that they think about things. There’s nothing wrong with that. But you can make a different choice.”

Grounding his interests in the physical hardware that has backed much of music history, electrical engineering and computer science undergraduate Evan Ingoldsby builds guitar pedals and audio circuits that manipulate signals through electronic components. “A lot of modern music tech is based off of taking hardware for other purposes, like signal filters and saturators and such, and putting music and sounds through them and seeing how [they] change,” says Ingoldsby.

For Cooper, learning from history and the existing body of knowledge, both artistically and technically, unlocks more creativity. “Adding more tools to your toolbox should never stop you from building something that you want to. It can only make it easier,” he says.

Ingoldsby finds the unexpected, emergent effects of pushing hardware tools such as modular synthesizers to their limits most inspiring. “It increases in complexity, but it also increases in freedom.”

Collaboration and community

Music has always been a collective endeavor, fostering connection, ritual, and communal experiences.
Advancements in music technology can both expand creative possibilities for live performers and foster new ways for musicians to gather and create.

Cooper draws a direct link between his research in high-performance, low-latency computing and his work developing real-time music tools. Many music tools can only function well “offline,” Cooper poses. “For example, you’ll record something into your digital audio workstation on your computer, and then you’ll hit a button, and it will change the way it sounds. That’s super cool. But I think it’s even cooler if you can make that real-time. Can you change what the sound is coming out as you’re playing?” asks Cooper.

The problem of speeding up the processing of sound, such that the time difference between input and output — the latency — is imperceptible to human hearing, is a technical one. Cooper takes an interest in real-time timbre transfer that could, for example, change the sound coming from a saxophone as if it were coming from a cello. The problem intersects with common techniques in artificial intelligence research, he notes. Cooper’s work to improve the speed and efficiency of music software tools could give digital music performers new effects for manipulating audio in a live setting.

With the rise of personal computing in the 2010s, Hattwick recounts, “laptop ensembles” emerged to contemplate new questions about live music performance in a digitizing era. “What does it mean to perform music with a laptop? Why is that fun? Is a laptop an instrument?” he poses.

In the Fabulous MIT Laptop Ensemble (FaMLE), directed by Hattwick, MIT students pursue music performance in a “living laboratory.” Driven by the interests of its members, FaMLE explores digital music, web audio, and live coding, an improvisational practice that exposes the process of writing code to generate music.
A member of FaMLE, Ingoldsby has found a place to situate his practice of sound design in a broader context.

When emerging digital technologies interface with art, challenging questions arise about human creativity. Communities of multidisciplinary people allow for the exchange of ideas and generate novel approaches to complex problems. “Engineers have a lot to offer performers,” says Cooper. “As technology progresses, I think it’s important we use that to further develop our abilities for creative practice, instead of substituting it.”

Hattwick emphasizes, “The best way to explore this is together.”
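The latency constraint Cooper describes reduces, at its core, to simple arithmetic: every buffer of audio a system processes adds buffer_size / sample_rate seconds of delay before the listener hears the result. A small illustrative sketch (the 48 kHz rate and buffer sizes are common conventions, not figures from the article):

```python
SAMPLE_RATE = 48000  # samples per second (assumed; a common pro-audio rate)

def buffer_latency_ms(buffer_size: int) -> float:
    """Delay contributed by one audio buffer, in milliseconds."""
    return 1000.0 * buffer_size / SAMPLE_RATE

# Smaller buffers mean lower latency, but tighter processing deadlines:
# every buffer's worth of effects must finish before the next one arrives.
for size in (64, 256, 1024):
    print(f"{size:5d} samples -> {buffer_latency_ms(size):5.1f} ms")
```

At 48 kHz, a 1,024-sample buffer alone contributes over 20 ms of delay, which is why real-time tools like the ones Cooper works on must keep both buffers and per-buffer computation small.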
At MIT, musicians make new tools for new tunes
news.mit.edu
The MIT Music Technology Program brings together students from music, engineering, and computer science to explore digital instrument design, real-time performance tools, and creative expression through human-computer interaction and collaborative making.
- PublMe bot posted in Space
Professor Emeritus Barry Vercoe, a pioneering force in computer music, dies at 87

MIT Professor Emeritus Barry Lloyd Vercoe, a pioneering force in computer music, a founding faculty member of the MIT Media Lab, and a leader in the development of MIT’s Music and Theater Arts Section, passed away on June 15. He was 87.

Vercoe’s life was a rich symphony of artistry, science, and innovation that led to profound enhancements of musical experience for expert musicians as well as for the general public — and especially young people.

Born in Wellington, New Zealand, on July 24, 1937, Vercoe earned bachelor’s degrees in music (in 1959) and mathematics (in 1962) from the University of Auckland, followed by a doctor of musical arts in music composition from the University of Michigan in 1968.

After completing postdoctoral research in digital audio processing at Princeton University and a visiting lectureship at Yale University, Vercoe joined MIT’s Department of Humanities (Music) in 1971, beginning a tenure in the department that lasted through 1984. During this period, he played a key role in advancing what would become MIT’s Music and Theater Arts (MTA) Section, helping to shape its forward-thinking curriculum and interdisciplinary philosophy. Vercoe championed the integration of musical creativity with scientific inquiry, laying the groundwork for MTA’s enduring emphasis on music technology and experimental composition.

In 1973, Vercoe founded MIT’s Experimental Music Studio (EMS) — the Institute’s first dedicated computer music facility, and one of the first in the world. Operated under the auspices of the music program, EMS became a crucible for innovation in algorithmic composition, digital synthesis, and computer-assisted performance. His leadership not only positioned MIT as a hub for music technology, but also influenced how the Institute approached the intersection of the arts with engineering.
This legacy is honored today by a commemorative plaque in the Kendall Square MBTA station.

Violist, faculty founder of the MIT Chamber Music Society, and Institute Professor Marcus Thompson says: “Barry was first and foremost a fine musician, and composer for traditional instruments and ensembles. As a young professor, he taught our MIT undergraduates to write and sing Renaissance counterpoint as he envisioned how the act of traditional music-making offered a guide to potential artistic interaction between humans and computers. In 1976, he enlisted me to premiere what became his iconic, and my most-performed, work, ‘Synapse for Viola and Computer.’”

During a Guggenheim Fellowship in 1982–83, Vercoe developed the Synthetic Performer, a groundbreaking real-time interactive accompaniment system, while working closely with flautist Larry Beauregard at the Institute for Research and Coordination in Acoustics/Music (IRCAM) in Paris.

In 1984, Vercoe became a founding faculty member of the MIT Media Lab, where he launched the Music, Mind, and Machine group. His research spanned machine listening, music cognition, and real-time digital audio synthesis. His Csound language, created in 1985, is still widely used for music programming, and his contributions helped define the MPEG-4 Structured Audio standard.

He also served as associate academic head of the Media Lab’s graduate program in Media Arts and Sciences (MAS).
Vercoe mentored many future leaders in digital music and sound computation, including two of his MAS graduate students — Anna Huang SM ’08 and Paris Smaragdis PhD ’01 — who have recently joined MIT’s music faculty, as well as Miller Puckette, an emeritus faculty member at the University of California at San Diego, and Richard Boulanger, a professor of electronic production and design at the Berklee College of Music.

“Barry Vercoe will be remembered by designers, developers, researchers, and composers for his greatest ‘composition,’ Csound, his free and open-source software synthesis language,” states Boulanger. “I know that, through Csound, Barry’s musical spirit will live on, not only in my teaching, my research, and my music, but in the apps, plugins, and musical compositions of generations to come.”

Tod Machover, faculty director of the MIT Media Lab and Muriel R. Cooper Professor of Music and Media, reflects, “Barry Vercoe was a giant in the field of computer music whose innovations in software synthesis, interactive performance, and educational tools for young people influenced and inspired many, including myself. He was a superb mentor, always making sure that artistic sensibility drove music tech innovation, and that sophisticated expression was at the core of Media Lab — and MIT — culture.”

Vercoe’s work earned numerous accolades. In addition to the Guggenheim Fellowship, he was honored with the 1992 Computerworld Smithsonian Award for innovation and the 2004 SEAMUS Lifetime Achievement Award.

Beyond MIT, Vercoe consulted with Analog Devices and collaborated with international institutions like IRCAM under the direction of Pierre Boulez.
His commitment to democratizing music technology was evident in his contributions to the One Laptop per Child initiative, which brought accessible digital sound tools to young people in underserved communities worldwide.

He is survived by his former wives, Kathryn Veda Vaughn and Elizabeth Vercoe; their children, Andrea Vercoe and Scott Vercoe; and generations of students and collaborators who continue to build on his groundbreaking work. A memorial service for family will be held in New Zealand later this summer, and a special event in his honor will take place at MIT in the fall. The Media Lab will share details about the MIT gathering as they become available.

Named professor emeritus at the MIT Media Lab upon his retirement in 2010, Vercoe leaves a legacy that embodies the lab’s — and MIT’s — vision of creative, ethical, interdisciplinary research at the convergence of art, science, and technology. His music, machines, and generously inventive spirit will forever shape the way we listen, learn, and communicate.
Professor Emeritus Barry Vercoe, a pioneering force in computer music, dies at 87
news.mit.edu
MIT Professor Emeritus Barry Vercoe, a pioneering force in computer music, a founding faculty member of the MIT Media Lab, and a leader in the development of MIT's Music and Theater Arts Section, died at 87. He created Csound, the Synthetic Performer, and other digital audio synthesis tools, with lasting impact through IRCAM, MPEG-4, One Laptop per Child, and more.
- PublMe bot posted in Space
Southside (Future, Lil Durk) cooks up a beat and shares his sounds
GRAMMY-nominated producer and rapper Southside (Future, Lil Durk) shares his new sample pack and showcases his beatmaking process.
Southside Cooks Up a Beat and Shares His Sounds - Blog | Splice
splice.com
- PublMe bot posted in Space
What is trap music? Definition, artists, and characteristics
Explore the artists, subgenres, and characteristics that define trap music and learn how to make your own beats in the genre.
What is Trap Music? Definition, Artists, and Characteristics - Blog | Splice
splice.com
- PublMe bot posted in Space
Michael Seyer is All Vibes Running a DIY Music Career

This week, Ari is joined by indie artist Michael Seyer to talk DIY success, his new album Boylife, and launching his own label, Seyerland.
https://aristake.com/michael-seyer/
- PublMe bot posted in Space
Oak Felder (Demi Lovato, Nicki Minaj) cooks up a beat in Pro Tools
Watch GRAMMY-winning producer and songwriter Oak Felder react to the Splice x Pro Tools integration and make a beat with it.
Oak Felder (Demi Lovato, Nicki Minaj) cooks up a beat in Pro Tools - Blog | Splice
splice.com
- PublMe bot posted in Space
This “Artist Manager” on LinkedIn Doesn’t Have a Clue

This so-called industry insight on LinkedIn isn’t just wrong — it’s harmful to artists.
https://aristake.com/artist-management/
- PublMe bot posted in Space
What is soul music? History, key artists, and ongoing legacy
Learn about the deep history of soul music, its pioneering artists and bands, its influence on other genres, and more.
What is Soul Music? History, Key Artists, and Ongoing Legacy - Blog | Splice
splice.com
- PublMe bot posted in Space
Stereo widening: What it is and 4 techniques for a wider mix
Learn about four stereo widening techniques that can help you achieve an effective mix, including the Haas effect, mid-side EQ, and more.
Stereo Widening: What it is and 4 Techniques for a Wider Mix - Blog | Splice
splice.com
- PublMe bot posted in Space
What is a verse in a song? How to write a verse
Learn about what a verse is, its purpose in a song, and techniques for how to write effective verses of your own.
What is a Verse in a Song? How to Write a Verse - Blog | Splice
splice.com
- PublMe bot posted in Space
What is American folk music? History, characteristics, and instruments
Learn about the history of American folk music, the key characteristics and instruments that define it, and more.
What is American Folk Music? History, Instruments, and More - Blog | Splice
splice.com
- PublMe bot posted in Space
Memphis rap: Its key artists, history, and enduring influence on hip hop
Learn about the history, key artists, and enduring impact of Memphis rap music, and how to start making your own beats in the genre.
Memphis Rap: Artists, History, and Enduring Influence on Hip Hop - Blog | Splice
splice.com
- PublMe bot posted in Space
Splice x Pro Tools: Unlocking a new era of seamless creativity
Splice is now integrated into Pro Tools. Discover how to search, filter, and audition Splice sounds, all synced to your project’s key and tempo.
Introducing the Splice x Pro Tools Integration - Blog | Splice
splice.com
- PublMe bot posted in Space
Hit Songwriter/Producer on Publishing Deals and Artists Taking Credit for Songs They Didn’t Write

This week, Ari is joined by Justin Gammella to discuss the realities of professional songwriting, music publishing, and AI in music.
aristake.com
- PublMe bot posted in Space
Bandzoogle launches Merch Table Lite: a customizable commission-free online merch storefront for musicians

Musicians can now effortlessly set up a customized, commission-free online storefront using Bandzoogle’s new Merch Table Lite plan, without building a full website.
Bandzoogle launches Merch Table Lite: a customizable commission-free online merch storefront for musicians - Ari's Take
aristake.com