Community Space Reactions
How to make ambient music: Tools and techniques
Learn how to make ambient music that's immersive, expressive, and ever-evolving via our in-depth guide.
New Global Booking Agency from ATC and Arrival Artists Now Reps 800 Artists
This week on the New Music Business podcast, Ari sits down with Ethan Berlin and Skully Kaplan of Roam Artists to break down the modern landscape of booking and live touring.
https://aristake.com/roam-artists/
What are harmonics? Exploring the fabric of music
Let's explore what harmonics and overtones are, how they constitute the very fabric of music, and how you can apply an understanding of them to your tracks.
Creating tension, suspense, and release: Tips from a pro film composer
Learn how expert film composer Dave Kropf (Chopped, The Bachelor) uses cinematic effects to create tension, suspense, and release in his cues.
Analog vs. digital synthesizers: What’s the difference and which should you choose?
Learn about the strengths and limitations of analog vs. digital synthesizers, and when you'd want to reach for each.
MIT engineers' virtual violin produces realistic sounds

There is no question that violin-making is an art form. It requires a musician's ear, a craftsperson's skill, and a historian's appreciation of lessons learned over time. Making a violin also takes trust: Violin makers, or luthiers, often must wait until the instrument is finished before they can hear how all their hard work will sound.

But a new tool developed by MIT engineers could help luthiers play around with a violin's design and tweak its sound even before a single part is carved.

In a study appearing today in the journal npj Acoustics, the MIT team reports on a new "computational violin" — a computer simulation that captures the detailed physics of the instrument and realistically produces the sound of a violin when its strings are plucked.

While there are software programs and plug-ins that enable users to play around with virtual violins, their sounds are typically the result of sampling and averaging over thousands of notes played by actual violins. In contrast, the new computational violin takes a physics-based approach: It produces sound based on the way the instrument, including its vibrating strings, physically interacts with the surrounding air.

As a demonstration, the researchers applied the computational violin to play two short excerpts: one from Bach's "Fugue in G Minor," and another from "Daisy Bell" — a nod to the first song ever produced by a computer-synthesized voice.

The computational violin currently simulates the sound of plucked strings — a type of playing that musicians know as "pizzicato." Violin bowing, the researchers say, is a much more complicated interaction to model. However, the computational violin represents the first physics-based foundation of a strung violin sound that could one day be paired with a model of bowing to produce realistic, bowed violin music.

For now, the team says the new virtual violin could be used in the initial stages of violin design. Luthiers can tweak certain parameters such as a violin's wood type or the thickness of its body, and then listen to the sound that the instrument would make in response.

"These days, people try to improve designs little by little by building a violin, comparing the sound, then making a change to the next instrument," says Yuming Liu, senior research scientist at MIT. "It's very slow and expensive. Now they can make a change virtually and see what the sound would be."

"We're not saying that we can reproduce the artisan's magic," adds Nicholas Makris, professor of mechanical engineering at MIT. "We're just trying to understand the physics of violin sound, and perhaps help luthiers in the design process."

Makris and Liu's MIT co-authors include Arun Krishnadas PhD '23 and former postdoc Bryce Campbell, along with Roman Barnas of the North Bennet Street School.

Sound matrix

The quality of a violin's sound is determined by its dimensions and design. The instrument is made from thoughtfully crafted parts and materials that all work to generate and amplify sound. In recent years, scientists have sought to understand what artisans have intuited for centuries, in terms of what specific parameters shape a violin's sound.

In one early effort in 2006, scientists working on the Strad3D project put a rare Stradivarius violin through a CT scanner. The violin was crafted in 1715 by the master violinmaker Antonio Stradivari, during what is considered the "Golden Age" of violin making. To better understand the violin's anatomy and its relation to sound, the scientists scanned the instrument and produced 600 "slices," or views, of the violin. The CT scans are available online for people to view and use as data for their own experiments.

For their study, Makris and his colleagues first imported the CT scans into a solid modeling software program to generate a detailed three-dimensional model of the violin. They then ran a finite element simulation, essentially dividing the violin into millions of tiny individual cubes, or "elements." For each cube, they noted its material type — for example, whether a cube from the violin's back plate is made from maple or spruce, or whether a string is made from steel or natural fibers. They then applied physics-based equations of stress and motion to predict how each material element would move in relation to every other element across the instrument.

They also carried out a similar process for the air surrounding the violin, dividing up a roughly cubic-meter volume of air and applying acoustic wave equations to predict how each tiny parcel of air would move and contribute to generating sound.

"The entire thing is a matrix of millions of individual elements," explains Krishnadas. "And ultimately, you see this whole three-dimensional being, which is the violin and the air all connected and interacting with each other."

A plucky model

The team then simulated how the new computational violin would sound when plucked. When a violinist plucks a string, they pull the string sideways and let it go, causing the string to vibrate. These vibrations travel across the instrument and inside it; the air's vibrations are amplified as they travel out of the violin and into the surroundings, where a listener hears them as sound.

For their purposes, the engineers simulated a simple string pluck by directing one of the virtual violin's strings to stretch out and then rebound. The simulation computed all the resulting motions and vibrations of the millions of elements in the violin, and the sound that the pluck would produce. For notes that require pressing down on a violin's fingerboard, they simulated the same plucking and, in addition, included a condition in which the string is held fixed at the point on the fingerboard where a violinist's finger would press down.

The researchers carried out this computational process to virtually pluck out the notes in several measures of "Daisy Bell" and Bach's "Fugue in G Minor."

"If there's anything that's sounding mechanical to it, it's because we're using the exact same time function, or standard way of plucking, for each note," says Makris, who is himself a lute player. "A musician will adapt the way they're plucking, to put a little more feeling on certain notes than others. But there could be subtleties which we could incorporate and refine."

As it is, the new computational model is the first to generate realistic sound based on the laws of physics and acoustics. The researchers say that violin makers could use the model to test how a violin might sound when certain dimensions or properties are changed. For instance, when the researchers varied the thickness of the virtual violin's back plate or changed its wood type, they could hear clear differences in the resulting sounds.

"You can tweak the model, to hear the effect on the sound," Makris says. "Since everything obeys the laws of physics, including a violin and the music it makes, this approach can add an appreciation of what makes violin sound. But ultimately, we get most of our inspiration from the artisans."

This work was supported, in part, by an MIT Bose Research Fellowship.
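The "stretch out and then rebound" pluck described above can be illustrated with a toy one-dimensional model. This is emphatically not the MIT team's simulation — their model couples millions of 3D solid and air elements — but a minimal finite-difference sketch of an ideal string pulled into a triangular shape and released, with fixed ends standing in for the nut and bridge. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def pluck_string(n_points=100, n_steps=2000, c=320.0, length=0.33,
                 pluck_pos=0.25, pluck_amp=0.002):
    """Explicit finite-difference solve of the 1D wave equation for a
    plucked string; returns the displacement history (illustrative only)."""
    dx = length / (n_points - 1)
    dt = 0.9 * dx / c                      # satisfy the CFL stability condition
    r2 = (c * dt / dx) ** 2

    x = np.linspace(0.0, length, n_points)
    # Initial triangular "pluck": string pulled aside at pluck_pos, then released.
    u = np.where(x < pluck_pos * length,
                 pluck_amp * x / (pluck_pos * length),
                 pluck_amp * (length - x) / ((1.0 - pluck_pos) * length))
    u_prev = u.copy()                      # released from rest (zero velocity)

    history = [u.copy()]
    for _ in range(n_steps):
        u_next = np.zeros_like(u)
        # Interior points: standard explicit update for the wave equation.
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        # Fixed ends (nut and bridge). A "fingered" note, as in the article,
        # would additionally pin an interior point to zero.
        u_next[0] = u_next[-1] = 0.0
        u_prev, u = u, u_next
        history.append(u.copy())
    return np.array(history)

hist = pluck_string()
print(hist.shape)  # (2001, 100)
```

The full model replaces this single wave equation with coupled stress-and-motion equations for every solid element and acoustic wave equations for the surrounding air, but the time-stepping idea — advance every element from its neighbors' current state — is the same.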
Splice is now integrated with MCP: Artist-made sounds, wherever you create
Splice embraces extensibility by supporting MCP, which allows users to build new creative experiences with Splice Sounds.
Tori Letzler and Steven Richard Davis showcase their film scoring setups
Expert composers Tori Letzler and Steven Richard Davis give us an exclusive tour of their two incredible home studios.
Distortion vs. saturation: The differences and when to use each
Learn about the similarities and differences between distortion and saturation, as well as when to use each.
Kris Bowers on finding honesty in music and the art of film scoring
In this exclusive interview, award-winning composer Kris Bowers (Bridgerton, The Wild Robot) shares his insights on writing character themes, finding inspiration in honesty, and more.
How to use the new Splice Sounds Plugin (beta)
Learn how to use the new Splice Sounds Plugin to find loops and one-shots in new ways and get even more out of the Splice Sounds library.
What is a bridge in a song? How to write a bridge
Let's take a look at what a bridge is, what functions it serves, and how to write an effective one for your own music.
Q&A: MIT SHASS and the future of education in the age of AI

The MIT School of Humanities, Arts, and Social Sciences (SHASS) was founded in 1950 in response to "a new era emerging from social upheaval and the disasters of war," as outlined in the 1949 Lewis Committee Report. The report's findings emphasized MIT's role and responsibility in the new nuclear age, which called for doubling down on genuine "integration" of scientific and technical topics with humanistic scholarship and teaching. Only that way, the committee wrote, could MIT tackle "the most difficult and complicated problems confronting our generation."

As SHASS marks its 75th anniversary, Dean Agustín Rayo answers questions about why the need for developing students with broad minds and human understanding is as urgent as ever, given pressing challenges in the midst of a new technological revolution.

Q: Many universities are responding to artificial intelligence by launching new technical programs or updating curricula. You've suggested the change is deeper than that. Why?

A: Artificial intelligence isn't just changing the way students learn — it's transforming every aspect of society. The labor market is experiencing a dramatic shift, upending traditional paths to financial stability. And AI is changing the ways we bring meaning to our lives: the ways we build relationships, the ways we pay attention, and the things we enjoy doing.

The upshot is that the most important question universities need to ask is not how to adapt our pedagogy to AI — although we certainly need to address that. The most important question we need to ask is how to provide an education that brings real value to students in the age of AI. We need to ensure that universities provide students with the tools they need to find a path to financial security and to build meaningful lives.

We need to produce students with minds that are both nimble and broad. We need our students to not only be able to execute tasks effectively, but also to have the judgment to determine which tasks are worth executing. We need students who have a moral compass, and who understand how the world works in all of its political, economic, and human complexity. We need students who know how to think critically, and who have excellent communication and leadership skills.

Q: What role do the humanities, arts, and social sciences play in preparing MIT students for that future?

A: They're essential, and are rightly a core part of an MIT education: MIT has long required its undergraduates to take at least eight courses in HASS disciplines to graduate.

Fields like philosophy, political science, economics, literature, history, music, and anthropology are crucial to developing the parts of our lives that are essentially human — the parts that will not be replaced by AI. They are crucial to developing critical thinking and a moral compass. They are crucial to understanding people — our values, institutions, cultures, and ways of thinking. They are crucial to creating broad thinkers who understand the way the world works. They are crucial to developing students who are excellent communicators, able to describe their projects — and their lives — in a way that endows them with meaning.

Our students understand this. Here is how one of them put the point: "Engineering gives me the tools to measure the world; the humanities teach me how to interpret it. That balance has shaped both how I do science and why I do it."

Q: Some people worry that emphasizing humanistic study could dilute MIT's technological edge. How do you respond to that concern?

A: I think the opposite is true. MIT is an important engine for social mobility in the United States, and a catalyst for entrepreneurship, which has added billions of dollars to the American economy. That cannot be separated from the fact that we are a technical institution, which brings together the country's most talented undergraduates — regardless of socioeconomic background — and transforms them into the next generation of our country's top scientific and engineering leaders. MIT plays an incredibly important role in our country. So, the last thing I want to do is mess with our secret sauce.

But I also think that the age of AI is forcing us to rethink what it means to be a top engineer. Think about artificial intelligence itself. The challenges we face are not just technical. Issues like bias, accountability, governance, and the societal impact of automation are no less important. Understanding those dimensions helps technologists design better systems and anticipate real-world consequences.

Strengthening the humanities at MIT isn't a departure from our core mission — it's a way of ensuring that our technical leadership continues to matter in the world.

Q: What kinds of changes is MIT SHASS pursuing to support this vision?

A: There's a lot going on! We've launched the MIT Human Insight Collaborative (MITHIC) as a way of strengthening research in the humanities, arts, and social sciences, and of deepening collaboration with colleagues across MIT.

We're shaping the undergraduate experience to ensure that every MIT student engages with the big societal questions shaping our time, from democratic resilience to climate change to the ethics of new technologies.

We're building stronger connections through initiatives like the creation of shared faculty positions with the MIT Schwarzman College of Computing (SCC). And we recently launched a new Music Technology and Computation Graduate Program with the School of Engineering.

We're partnering with SERC (the SCC's Social and Ethical Responsibilities of Computing) to design new classes on the intersection of computing and human-centered issues, such as ethics.

And we're elevating the humanities — for their own sake, and as a space for experimentation, bringing together students, faculty, and partners to explore new forms of research, teaching, and public engagement.

This is a very exciting time for SHASS.
What is modulation in music?
Learn about the different definitions and creative use cases of modulation in music.
Gavin Brivik on scoring Faces of Death, sound design techniques, and collaboration
In this interview, award-winning composer Gavin Brivik discusses his creative process behind composing the score for Faces of Death.
brunodossantosoficial55
@Bruds21
pauldesca
@pauldscrca
DJ ZX
@id_9879
RAMARIOANTHONY
@id_4297