• Get a free electric piano for Splice INSTRUMENT
    Download our free electric piano preset for the Splice INSTRUMENT plugin. Grab it during the drop window and it’s yours to keep forever.

  • Seeing sounds
    Growing up in Mexico and Texas, Mariano Salcedo ’25 couldn’t readily indulge his passion for creating music. “There are no bands in Mexican public schools,” he says. While some families could pay for instruments and lessons, others, like Salcedo’s, were less fortunate.

    “I’ve always loved music,” he continues. “I was a listener.” Salcedo, the Alex Rigopulos (1992) Fellow in Music Technology and Computation, earned a BS in artificial intelligence and decision-making from MIT, where he explored signal processing in machine learning and how a classical understanding of signals can inform how we understand AI. Now he’s one of five master’s students in the Music Technology and Computation Graduate Program’s inaugural cohort.

    The program, directed by professor of the practice in music technology Eran Egozy ’93, MNG ’95, is a collaboration between the Music and Theater Arts Section in the School of Humanities, Arts, and Social Sciences (SHASS) and the School of Engineering. It invites practitioners to study, discover, and develop new computational approaches to music. It also includes a speaker series that exposes students and the broader MIT community to music industry professionals, artists, technologists, and other researchers.

    Rigopulos ’92, SM ’94 is a video game designer, musician, and former CEO of Harmonix Music Systems, a company he co-founded with Egozy in 1995. Harmonix is now part of Epic Games, where Rigopulos is the director of game development music. “MIT is where I was first able to pursue my passion for music technology decades ago, and that experience was the springboard for a long and fulfilling career,” says Rigopulos. “So, when MIT launched an advanced degree program in music technology, I was thrilled to fund a fellowship to help propel this exciting new program.”

    Salcedo’s research focuses on neural cellular automata (NCA), which merge classical cellular automata with machine learning techniques to grow images that can regenerate. When paired with a stimulus like music, these images can “show” sounds in action. “This approach enables anyone to create music-driven visuals while leveraging the expressive and sometimes unpredictable dynamics of self-organized systems,” Salcedo says. Through the web interface Salcedo designed, users can adjust the relationship between the music’s energy and the NCA system to create unique visual performances using any music audio stream. “I want the visuals to complement and elevate the listening experience,” he says.

    Egozy is enthusiastic about Salcedo’s work and his commitment to further exploring its possibilities. “He is a beautiful example of a multidisciplinary researcher who thinks deeply about how to best use technology to enhance and expand human creativity,” he says.

    Salcedo has been selected to deliver the student address at the 2026 Advanced Degree Ceremony for SHASS. “It’s an honor, and it’s daunting,” he says. “It feels like a huge responsibility,” though one he’s eager to embrace. His selection also pleases Egozy. “I am super excited that Mariano was chosen to deliver this year’s keynote,” he enthuses.

    Changing gears

    Salcedo began his MIT journey as a mechanical engineering (MechE) student, applying to MIT through the QuestBridge program. “I heard if you like engineering and science that attending MIT would be a great choice,” he recalls. “Nerds are welcomed and embraced.” While he dutifully worked toward completing his MechE curriculum, music and technology came calling after a chance encounter with a large language model (LLM). “I was introduced to an LLM chatbot and was blown away,” he recalls. “This was something that was speaking to me. I was both awed and frightened.” After his encounter with the chatbot, Salcedo switched his major from mechanical engineering to artificial intelligence and decision-making. “I basically started over, after being two-thirds of the way through the MechE curriculum,” he says.

    He learned about the possibilities available with AI but also confronted some of the challenges bedeviling researchers and developers, including its potential power, ensuring its responsible use, human bias, limited access for people from underrepresented groups, and a lack of diversity among developers. He decided he might be able to change that picture. “I thought, one more person in the field could make a difference,” he says.

    While completing his undergraduate studies, Salcedo’s love of music resurfaced. “I began DJing at MIT and was hooked,” he says. While he hadn’t learned to play a traditional instrument, he discovered he could create engaging soundscapes with technology. “I bought a digital audio workstation to help me make music,” he continues.

    Egozy and Salcedo met in 2024, while Salcedo completed an Undergraduate Research Opportunities Program rotation as a game developer in Egozy’s lab. “He was incredibly curious and has grown tremendously over a very short time period,” Egozy says. Egozy became an informal, although important, mentor to Salcedo. “He brings great energy and thoughtfulness to his work, and to supporting others in the [music technology and computation graduate] program,” Egozy notes.

    Salcedo also took a class with Egozy, 21M.385/21M.585/6.4450 (Interactive Music Systems), which further fed his appetite for the creativity he craved while also allowing him to indulge his fascination with music’s possibilities. By taking advantage of courses in the SHASS curriculum, he further developed his understanding of music theory and related technologies. “I took a class with professor Leslie Tilley, 21M.240 (Critically Thinking in Music), which helped establish a valuable framework for understanding music making,” he says, “while a class like 6.3000 (Signal Processing) helped me connect intuition with science.”

    Working across disciplines

    While Salcedo is passionate about his music and his research, he’s also invested in building relationships with his fellow students. He’s a member of the fraternity Sigma Nu, where he says he “found a home and community.” He also took a MISTI trip to Chile in summer 2023, where he conducted music technology research. Salcedo praises the culture of camaraderie at MIT and is grateful for its influence on his work as a scholar. “MIT has taught me how to learn,” he says.

    Professors encouraged him to present his research and findings. He presented his work — Artificial Dancing Intelligence: Neural Cellular Automata for Visual Performance of Music — at the Association for the Advancement of Artificial Intelligence conference in Singapore in January 2026.

    Salcedo believes his research can potentially move beyond music visualization. “What if we could improve the ways we model self-organized systems?” he asks. “That is, systems like multicellular organisms, flocks of birds, or societies that interact locally but exhibit interesting behaviors.” Any system, Salcedo says, where the whole is more than the sum of its parts. Developing the technology used to design his application can potentially help answer important ethical questions regarding AI’s continued expansion and growth.

    The path to his work’s development is both daunting and lonely, but those challenges feed his work ethic. “It’s intimidating to pursue this path when the academy is currently focused on LLMs,” he says. “But it’s also important to explain and explore the base technology before digging into more nuanced work, which can help audiences understand it better.” Knowing that he has the support of his professors helps Salcedo maintain excitement for his ideas. “They only ask that we ground our interests in research,” he says.

    His investigations are impacting his work as a musician. “My music has gotten more interesting because of the classes I’m taking,” he says. He’s also interested in understanding whose music the academy and the world hears, exploring biases toward Western music in the canon and how to reduce biases related to which kinds of music are valued. “The work we do as technologists is far less subjective than we’re led to believe,” he believes.

    Salcedo is especially grateful for the support he’s received during his time at MIT. “Program faculty encourage a variety of pursuits,” he says, “and ask us to advance our individual aims, rather than focusing on theirs.” During his time in the graduate program, he notes with enthusiasm how often he’s been challenged to pursue his ideas.

    Ultimately, Salcedo wants people to experience the joy he feels working at the intersection of the humanities and the sciences. Music and technology impact nearly everyone. Inviting audiences into his laboratory as participants in the creative and research processes offers the same kind of satisfaction he gets from crafting a great beat or solving a thorny technical challenge. Helping audiences understand his work’s value fuels his drive to succeed. “I want users to feel movement and explore sounds and their impact more fully,” he says.

    Carlos Mariano Salcedo, a student in the MIT Music Technology and Computation Graduate Program, is designing AI to visualize music and other sounds.

  • Get free dry acoustic drums for Splice INSTRUMENT
    Download our free acoustic drums preset for the Splice INSTRUMENT plugin. Grab it during the drop window and it’s yours to keep forever.

  • KSHMR talks producing for Justin Bieber and Beyoncé, making sample packs, and more
    Globally acclaimed producer and artist KSHMR discusses his landmark collaborations, greatest learnings, new sample pack, and more.

  • What are chord inversions? A music theory guide
    Learn what chord inversions are and how to use them for better voice leading and more nuanced emotional impact.

  • Music modes: What they are and how to use them
    Learn what musical modes are, explore the seven common diatonic modes, and discover how to use them to achieve new colors in your music.

  • How This UK Indie Label Sold 25K Records For 1 Band First Week
    This week, Ari is joined by Mark Orr, founder of Lab Records, to break down modern indie label deals, marketing, and artist strategy.

  • How to make beat videos that go viral
    Expert music producer and content creator Issac Duarte showcases how to create effective and replicable beat videos for social media.

  • This Artist-Run Record Label is Competing with the Majors in a Big Way
    This week, Ari is joined by Michael Turner, founder of Rebellion, to discuss viral marketing, short-form content, and new artist-first label models.

  • Watch KSHMR transform a rubber chicken into an epic arrangement
    Watch KSHMR put his sampling skills to the test, flipping everything from his own sounds to a rubber chicken into a larger-than-life arrangement.

  • Esther Anaya on the electric violin, sound design, and transcending genres
    In this exclusive interview, Esther Anaya discusses her approach to the electric violin, sound design philosophy, and more.

  • Recreating the forms and sounds of historical musical instruments
    What if there were a way to create accurate replicas of ancient and historical instruments that could be played and heard? In late 2024, senior MIT postdoc Benjamin Sabatini wrote to MIT professor Eran Egozy to ask just that, proposing a collaborative research project between the Center for Materials Research in Archaeology and Ethnology (CMRAE) and the MIT School of Humanities, Arts, and Social Sciences (SHASS) to CT scan, chemically and structurally characterize, and produce replicas of the ancient and historical musical instruments housed at the Museum of Fine Arts, Boston (MFA).

    He was soon introduced to Mark Rau, a newly hired MIT professor in music technology and electrical engineering. Sharing similar interests, the two contacted Jared Katz, the Pappalardo Curator of Musical Instruments at the MFA, to propose a cross-institutional project.

    Rau, an avid museum-goer, particularly of musical instrument collections, has always wanted to hear the instruments on display. “My biggest qualm is often there are no accompanying audio examples,” he comments. “I want to hear these instruments; I want to play these instruments.” Katz, fortuitously, specializes in ancient musical practices and has developed a technique for 3D scanning and printing playable replicas of ancient instruments for his research. He had long dreamed of having access to a CT scanner to better understand how ancient instruments were constructed. The MFA was also an ideal institution for the project: according to Katz, its musical instrument collection began in 1917 and has since grown to just over 1,450 instruments from six continents, with the earliest dating to approximately 1550 BCE. Rau and Sabatini soon applied to and were funded by the MIT Human Insight Collaborative (MITHIC) with Katz’s support.

    The team of five, including Nate Steele, program associate in the MFA’s Department of Musical Instruments, and MIT postdoc Jin Woo Lee, now meets regularly at the MFA to scan and acoustically measure the instruments. Using a CT scanner from Lumafield, a company founded by MIT alumni, the team measures both internal and external dimensions. When combined with non-destructive vibration and acoustic testing and numerical simulations, these measurements are used to digitally replicate the instruments’ sound accurately. “For example, if we’re trying to recreate a violin, we can use an impact hammer — a very small hammer with a transducer in it — so we’re imparting a known force signal into the instrument, and then measure the resulting [surface] vibrations with a laser Doppler vibrometer,” says Rau.

    To replicate the instruments physically, the team uses 3D-printed copies to create plaster mold negatives, into which slip is cast, as with the Paracas whistle, a ceramic artifact from Peru dating from 600-175 BCE. The team demonstrated a playable replica at the MITHIC Annual Event in November. They also intend to build replicas of wooden instruments using old-growth wood in collaboration with local luthiers.

    Sabatini, a member of CMRAE, sees the humanistic implications of the project and the importance of studying the instruments from a materials and archaeological perspective: exploring and understanding the cultures involved in their production. “[From our] perspective, we want to understand the people who made these instruments through both the materials that they’re made of, but also the sound that they have,” he states. With his team of Undergraduate Research Opportunities Program (UROP) students, including Irene Dong and Mouhammad Seck, Sabatini reproduced several ancient and historical clay instruments in the CMRAE archaeology lab, including the Paracas whistle, which was showcased at the MITHIC event.

    So far, the team has scanned approximately 30 instruments from the MFA’s collection, with the goal of scanning at least 100 instruments over the duration of the project, documenting them, and supporting future study. The data from the scans are used to reconstruct the instruments, both physically and in software, matching their physical form and sound.

    “They’re both visually beautiful and striking objects, but they are meant to be heard,” Katz says. His “hope for this research is to provide us with a way to protect the original instrument while still allowing them to be heard and experienced in the way they were intended to be experienced.” Katz also sees potential for outreach and community engagement through these playable replicas, a goal written into the project’s proposal: “[I]t shows how powerful it can be when art and science come together to create new understandings and to help us reactivate these instruments in exciting ways.”

    Students have also been drawn to the project, including Victoria Pham, a second-year undergraduate in materials science and engineering, who is working with Sabatini as a UROP student. Pham was “drawn to this project because I love history,” she says. “I love wandering through the halls of the MFA and immersing myself in the descriptions of paintings and artifacts. I find learning about ancient peoples to be fascinating, especially in how their legacy affects us today.” Her work involves finite element modeling of a Veracruz poly-globular flute, dating to 500-900 CE, to investigate its acoustics non-destructively. “[M]y work is fulfilling because I was able to learn new software and problem-solve to improve my model, which was very satisfying,” she notes. Pham thinks that “contributing to the new, budding field of music technology scratches an itch in my brain, and I hope that my work inspires others to get interested in archaeology, material science, or music technology.”

    Alexander Mazurenko, a second-year undergraduate majoring in music and mathematics, has also been working on the project. He began last summer and continued during this year’s Independent Activities Period in January. Mazurenko notes that his involvement has furthered his interdisciplinary education at MIT: “[T]he opportunity to participate in this UROP with Professor Rau was the perfect chance to begin to work in the intersection of my passions.” His work, and that of Pham, will be presented at upcoming conferences and is expected to produce academic papers under the guidance of Sabatini and Rau.

    Through an interdisciplinary collaboration between MIT and Boston's Museum of Fine Arts, researchers are creating playable physical and synthesized replicas of historical and prehistoric musical instruments.

  • How to write better lyrics: 9 tips for your songwriting
    From experimenting with new rhyme schemes to being more intentional about your message, we explore nine tips and techniques that can help you improve your lyrics.

  • How SXSW Works For Music
    This week, Ari is joined by Brian Hobbs and Dev Sherlock of SXSW to break down everything you need to know about the festival.

  • Best plugins for Pro Tools (free and paid)
    Explore high-quality free and paid plugins that are compatible with Pro Tools, spanning essentials for composition, mixing, mastering, and more.
