Community Space Reactions
OFFstep Is a Tech-Savvy, User-Friendly Hub for Music Distribution
aristake.com: The following was developed in collaboration with OFFstep, a company Ari’s Take is proud to be partnering with. OFFstep is the simplest, easiest-to-use, and most affordable music distribution platform for independent artists. As a sister company to ONErpm, OFFstep uses the same advanced technology to deliver a seamless experience. OFFstep provides a full range of tools and features, with three plans to choose from: Basic for essential distribution needs, and Intermediate and Advanced for those seeking more robust technology and features. Why OFFstep? Affordable, hassle-free distribution: OFFstep believes that every artist deserves the chance to share their music without [...]
Access to your Splice library—now in Studio One Pro 7
splice.com: The first update to the Splice x Studio One Pro Integration (v1.1.1) brings several key feature additions and improvements that will make your in-DAW Splice experience even better. You can now access your Splice library, Collections, and Likes directly inside Studio One Pro 7. Click to find the full list of updates to Splice in Studio One.
Win $1,000 and get your song pitched to K-pop artists by MNDR, Margo XS, and Jess Corazza
splice.com: We're excited to announce that we're partnering with MNDR, We Make Noise, and Soundtoys for our latest K-pop challenge.
A model of virtuosity

A crowd gathered at the MIT Media Lab in September for a concert by musician Jordan Rudess and two collaborators. One of them, violinist and vocalist Camilla Bäckman, has performed with Rudess before. The other — an artificial intelligence model informally dubbed the jam_bot, which Rudess developed with an MIT team over the preceding several months — was making its public debut as a work in progress.

Throughout the show, Rudess and Bäckman exchanged the signals and smiles of experienced musicians finding a groove together. Rudess’ interactions with the jam_bot suggested a different and unfamiliar kind of exchange. During one duet inspired by Bach, Rudess alternated between playing a few measures and allowing the AI to continue the music in a similar baroque style. Each time the model took its turn, a range of expressions moved across Rudess’ face: bemusement, concentration, curiosity. At the end of the piece, Rudess admitted to the audience, “That is a combination of a whole lot of fun and really, really challenging.”

Rudess is an acclaimed keyboardist — the best of all time, according to one Music Radar magazine poll — known for his work with the platinum-selling, Grammy-winning progressive metal band Dream Theater, which embarks this fall on a 40th anniversary tour. He is also a solo artist whose latest album, “Permission to Fly,” was released on Sept. 6; an educator who shares his skills through detailed online tutorials; and the founder of software company Wizdom Music. His work combines a rigorous classical foundation (he began his piano studies at The Juilliard School at age 9) with a genius for improvisation and an appetite for experimentation.

Last spring, Rudess became a visiting artist with the MIT Center for Art, Science and Technology (CAST), collaborating with the MIT Media Lab’s Responsive Environments research group on the creation of new AI-powered music technology. Rudess’ main collaborators in the enterprise are Media Lab graduate students Lancelot Blanchard, who researches musical applications of generative AI (informed by his own studies in classical piano), and Perry Naseck, an artist and engineer specializing in interactive, kinetic, light- and time-based media. Overseeing the project is Professor Joseph Paradiso, head of the Responsive Environments group and a longtime Rudess fan. Paradiso arrived at the Media Lab in 1994 with a CV in physics and engineering and a sideline designing and building synthesizers to explore his avant-garde musical tastes. His group has a tradition of investigating musical frontiers through novel user interfaces, sensor networks, and unconventional datasets.

The researchers set out to develop a machine learning model channeling Rudess’ distinctive musical style and technique. In a paper published online by MIT Press in September, co-authored with MIT music technology professor Eran Egozy, they articulate their vision for what they call “symbiotic virtuosity”: for human and computer to duet in real time, learning from each duet they perform together, and making performance-worthy new music in front of a live audience.

Rudess contributed the data on which Blanchard trained the AI model. Rudess also provided continuous testing and feedback, while Naseck experimented with ways of visualizing the technology for the audience.

“Audiences are used to seeing lighting, graphics, and scenic elements at many concerts, so we needed a platform to allow the AI to build its own relationship with the audience,” Naseck says. In early demos, this took the form of a sculptural installation with illumination that shifted each time the AI changed chords. During the concert on Sept. 21, a grid of petal-shaped panels mounted behind Rudess came to life through choreography based on the activity and future generation of the AI model.

“If you see jazz musicians make eye contact and nod at each other, that gives anticipation to the audience of what’s going to happen,” says Naseck. “The AI is effectively generating sheet music and then playing it. How do we show what’s coming next and communicate that?”

Naseck designed and programmed the structure from scratch at the Media Lab with assistance from Brian Mayton (mechanical design) and Carlo Mandolini (fabrication), drawing some of its movements from an experimental machine learning model developed by visiting student Madhav Lavakare that maps music to points moving in space. With the ability to spin and tilt its petals at speeds ranging from subtle to dramatic, the kinetic sculpture distinguished the AI’s contributions during the concert from those of the human performers, while conveying the emotion and energy of its output: swaying gently when Rudess took the lead, for example, or furling and unfurling like a blossom as the AI model generated stately chords for an improvised adagio. The latter was one of Naseck’s favorite moments of the show.

“At the end, Jordan and Camilla left the stage and allowed the AI to fully explore its own direction,” he recalls. “The sculpture made this moment very powerful — it allowed the stage to remain animated and intensified the grandiose nature of the chords the AI played. The audience was clearly captivated by this part, sitting at the edges of their seats.”

“The goal is to create a musical visual experience,” says Rudess, “to show what’s possible and to up the game.”
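The article describes the sculpture’s control only in broad strokes: its choreography follows “the activity and future generation of the AI model,” and its petals tilt and spin at speeds ranging from subtle to dramatic. As a purely illustrative sketch (every name and mapping below is hypothetical, not the MIT team’s implementation), one way to drive such a display is to summarize the model’s look-ahead buffer and map note density and loudness onto tilt and spin commands:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PredictedNote:
    """One note event from the model's look-ahead buffer (hypothetical format)."""
    pitch: int       # MIDI pitch, 0-127
    velocity: int    # MIDI velocity, 0-127
    start: float     # seconds from now
    duration: float  # seconds

@dataclass
class PetalCommand:
    tilt_deg: float  # 0 = furled, 90 = fully unfurled
    spin_rpm: float  # rotation speed of a petal

def choreograph(upcoming: List[PredictedNote], horizon: float = 2.0) -> PetalCommand:
    """Map the AI's *upcoming* notes to a petal pose (illustrative only).

    Sparse, quiet material -> a gentle sway; dense, loud material ->
    wide tilt and fast spin, so the audience sees what is about to sound.
    """
    window = [n for n in upcoming if n.start < horizon]
    if not window:
        return PetalCommand(tilt_deg=10.0, spin_rpm=1.0)  # idle sway

    density = len(window) / horizon                                    # notes per second
    loudness = sum(n.velocity for n in window) / (127 * len(window))   # 0..1

    tilt = min(90.0, 10.0 + 60.0 * loudness + 5.0 * density)
    spin = min(30.0, 2.0 + 10.0 * density * loudness)
    return PetalCommand(tilt_deg=tilt, spin_rpm=spin)

# Example: a dense, loud upcoming phrase opens the petals and speeds up the spin.
phrase = [PredictedNote(60 + i, 110, 0.2 * i, 0.2) for i in range(8)]
print(choreograph(phrase))
```

In this toy mapping, a quiet or empty look-ahead produces the gentle sway mentioned above, while loud, dense generations unfurl the panels, roughly the contrast described between the improvised adagio and busier passages.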
Musical futures

As the starting point for his model, Blanchard used a music transformer, an open-source neural network architecture developed by MIT Assistant Professor Anna Huang SM ’08, who joined the MIT faculty in September.

“Music transformers work in a similar way as large language models,” Blanchard explains. “The same way that ChatGPT would generate the most probable next word, the model we have would predict the most probable next notes.”

Blanchard fine-tuned the model using Rudess’ own playing of elements from bass lines to chords to melodies, variations of which Rudess recorded in his New York studio. Along the way, Blanchard ensured the AI would be nimble enough to respond in real time to Rudess’ improvisations.
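Blanchard’s analogy maps directly onto autoregressive sequence modeling: tokenize a performance into discrete note events, train a transformer to predict the next event, then sample continuations one token at a time. The sketch below is not the team’s code; it is a generic, hypothetical illustration in PyTorch (toy vocabulary, tiny untrained model) of what “predict the most probable next notes” looks like in practice:

```python
import torch
import torch.nn as nn

# Hypothetical vocabulary of tokenized note events (pitch, duration, and
# velocity bundled into discrete tokens), as in typical music-transformer setups.
VOCAB_SIZE = 512

class TinyMusicTransformer(nn.Module):
    """A toy causal transformer over note-event tokens (illustrative only)."""
    def __init__(self, vocab=VOCAB_SIZE, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab)

    def forward(self, tokens):                       # tokens: (batch, time)
        x = self.embed(tokens)
        causal = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.encoder(x, mask=causal)
        return self.head(h)                          # (batch, time, vocab)

@torch.no_grad()
def continue_performance(model, prompt_tokens, n_new=32, temperature=1.0):
    """Sample a continuation event by event, exactly like next-word prediction."""
    seq = prompt_tokens.clone()
    for _ in range(n_new):
        logits = model(seq)[:, -1, :] / temperature  # scores for the next event
        probs = torch.softmax(logits, dim=-1)
        nxt = torch.multinomial(probs, num_samples=1)
        seq = torch.cat([seq, nxt], dim=1)
    return seq[:, prompt_tokens.size(1):]            # only the newly sampled events

# Example: continue a (random) 16-event prompt with 8 sampled events.
model = TinyMusicTransformer()
prompt = torch.randint(0, VOCAB_SIZE, (1, 16))       # stand-in for the player's last phrase
print(continue_performance(model, prompt, n_new=8))
```

Under this framing, fine-tuning on one player’s bass lines, chords, and melodies would amount to continuing training on sequences tokenized from those recordings, so sampled continuations lean toward that player’s vocabulary; staying responsive in live performance then becomes a question of model size and sampling latency.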
“We reframed the project,” says Blanchard, “in terms of musical futures that were hypothesized by the model and that were only being realized at the moment based on what Jordan was deciding.”

As Rudess puts it: “How can the AI respond — how can I have a dialogue with it? That’s the cutting-edge part of what we’re doing.”

Another priority emerged: “In the field of generative AI and music, you hear about startups like Suno or Udio that are able to generate music based on text prompts. Those are very interesting, but they lack controllability,” says Blanchard. “It was important for Jordan to be able to anticipate what was going to happen. If he could see the AI was going to make a decision he didn’t want, he could restart the generation or have a kill switch so that he can take control again.”

In addition to giving Rudess a screen previewing the musical decisions of the model, Blanchard built in different modalities the musician could activate as he plays — prompting the AI to generate chords or lead melodies, for example, or initiating a call-and-response pattern.

“Jordan is the mastermind of everything that’s happening,” he says.
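The controllability features are described only in prose: a screen previewing the model’s decisions, a kill switch, a restart, and selectable modes such as chords, lead melodies, or call-and-response. The outline below is a hypothetical sketch with invented names, not Blanchard’s actual software, showing how such controls could be wired around a generator so the performer always has the final say:

```python
from enum import Enum, auto

class Mode(Enum):
    CHORDS = auto()             # ask the model for chordal accompaniment
    LEAD = auto()               # ask it for lead melodies
    CALL_AND_RESPONSE = auto()  # alternate phrases with the player

class JamController:
    """Hypothetical wrapper that keeps the performer in charge of the model.

    The model keeps proposing a short "musical future"; the performer can
    preview it, let it play, regenerate it, or mute the model and take over.
    """
    def __init__(self, generate_fn):
        self.generate_fn = generate_fn  # callable(mode, context) -> list of note events
        self.mode = Mode.CHORDS
        self.preview = []               # queued future, shown on screen before it sounds
        self.muted = False              # kill-switch state

    def set_mode(self, mode: Mode):
        self.mode = mode

    def propose(self, context):
        """Generate a candidate future from recent playing and stage it for preview."""
        self.preview = self.generate_fn(self.mode, context)
        return self.preview

    def regenerate(self, context):
        """Performer rejects the previewed future and asks for a new one."""
        return self.propose(context)

    def kill_switch(self, engaged: bool):
        """Silence the model immediately so the performer takes back control."""
        self.muted = engaged
        if engaged:
            self.preview = []

    def next_events(self):
        """Events actually sent to the synth on the next beat (none while muted)."""
        if self.muted:
            return []
        events, self.preview = self.preview, []
        return events

# Example wiring with a stand-in generator:
ctrl = JamController(lambda mode, ctx: [("C3", 0.0), ("E3", 0.0), ("G3", 0.0)])
ctrl.set_mode(Mode.CHORDS)
ctrl.propose(context=["recent", "notes"])  # appears on the preview screen first
ctrl.kill_switch(True)                     # performer takes over; model goes silent
assert ctrl.next_events() == []
```

The point the article emphasizes is that the model only hypothesizes futures; nothing reaches the audience until the performer lets it through.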
What would Jordan do

Though the residency has wrapped up, the collaborators see many paths for continuing the research. For example, Naseck would like to experiment with more ways Rudess could interact directly with his installation, through features like capacitive sensing. “We hope in the future we’ll be able to work with more of his subtle motions and posture,” Naseck says.

While the MIT collaboration focused on how Rudess can use the tool to augment his own performances, it’s easy to imagine other applications. Paradiso recalls an early encounter with the tech: “I played a chord sequence, and Jordan’s model was generating the leads. It was like having a musical ‘bee’ of Jordan Rudess buzzing around the melodic foundation I was laying down, doing something like Jordan would do, but subject to the simple progression I was playing,” he recalls, his face echoing the delight he felt at the time. “You’re going to see AI plugins for your favorite musician that you can bring into your own compositions, with some knobs that let you control the particulars,” he posits. “It’s that kind of world we’re opening up with this.”

Rudess is also keen to explore educational uses. Because the samples he recorded to train the model were similar to ear-training exercises he’s used with students, he thinks the model itself could someday be used for teaching. “This work has legs beyond just entertainment value,” he says.

The foray into artificial intelligence is a natural progression for Rudess’ interest in music technology. “This is the next step,” he believes. When he discusses the work with fellow musicians, however, his enthusiasm for AI often meets with resistance. “I can have sympathy or compassion for a musician who feels threatened, I totally get that,” he allows. “But my mission is to be one of the people who moves this technology toward positive things.”

“At the Media Lab, it’s so important to think about how AI and humans come together for the benefit of all,” says Paradiso. “How is AI going to lift us all up? Ideally it will do what so many technologies have done — bring us into another vista where we’re more enabled.”

“Jordan is ahead of the pack,” Paradiso adds. “Once it’s established with him, people will follow.”

Jamming with MIT

The Media Lab first landed on Rudess’ radar before his residency because he wanted to try out the Knitted Keyboard created by another member of Responsive Environments, textile researcher Irmandy Wickasono PhD ’24. From that moment on, “It’s been a discovery for me, learning about the cool things that are going on at MIT in the music world,” Rudess says.

During two visits to Cambridge last spring (assisted by his wife, theater and music producer Danielle Rudess), Rudess reviewed final projects in Paradiso’s course on electronic music controllers, the syllabus for which included videos of his own past performances. He brought a new gesture-driven synthesizer called Osmose to a class on interactive music systems taught by Egozy, whose credits include the co-creation of the video game “Guitar Hero.” Rudess also provided tips on improvisation to a composition class; played GeoShred, a touchscreen musical instrument he co-created with Stanford University researchers, with student musicians in the MIT Laptop Ensemble and Arts Scholars program; and experienced immersive audio in the MIT Spatial Sound Lab. During his most recent trip to campus in September, he taught a masterclass for pianists in MIT’s Emerson/Harris Program, which provides a total of 67 scholars and fellows with support for conservatory-level musical instruction.

“I get a kind of rush whenever I come to the university,” Rudess says. “I feel the sense that, wow, all of my musical ideas and inspiration and interests have come together in this really cool way.”
news.mit.edu: Acclaimed keyboardist Jordan Rudess’s collaboration with the MIT Media Lab culminated in a live improvisation between an AI “jam_bot” and the artist.
How To (Officially) Report Shady Spotify Playlists
aristake.com: With an estimated 100,000 songs being uploaded to Spotify every day, it’s harder than ever to break through the (literal) noise. And now that AI companies are flooding the DSPs with their so-called “music,” it’s getting even harder to find an audience.
Alissia (Bruno Mars, Kaytranada) makes a track in Studio One for the first time
splice.com: World-renowned producer, songwriter, and multi-instrumentalist Alissia creates a track in Studio One for the first time.
What is reverb? Definition, types, parameters, and best plugins
splice.com: Reverb is one of the most fundamental and popular effects in music—in this in-depth guide, learn about its definition, common types, best plugins, and more.
Noga Erez on her Major Label Debut Album and Dealing With Hate
aristake.com: This week, Ari is joined by alt-pop singer, rapper & multi-hyphenate Noga Erez to discuss the creative process and working with labels.
How to make horror game music and sound effects
splice.com: From Dead Space to Silent Hill, we break down the auditory elements of several iconic video games to gain insights into how to make our own horror game music and sound effects.
Beat block: What it is and how to overcome it
splice.com: Veteran producer Isaac Duarte showcases how he leverages the latest tools to find inspiration and overcome beat block.
How Partisan Records Markets Artists
aristake.com: This week, Ari is joined by Zena White, the COO of Partisan Records, to discuss how labels can support and market their artists effectively.
Where creativity meets workflow: A letter from our CEO
splice.com: Over the past few months, we’ve studied how we can get deeper into existing workflows in order to simplify the experience while making it more creatively fulfilling.
Amp simulator plugins: A guide to the best amp sims for guitar
splice.com: We explore what guitar amp simulator plugins are, how they sound, their pros and cons, and some of the best amp sims out there today.
PR is changing: Alternatives for Today’s Indie Artists
aristake.com: The following was developed in collaboration with Groover, a company Ari’s Take is proud to be partnering with. When you purchase from links on this page, we may earn an affiliate commission. In music, good PR is extremely powerful. A strong relationship with your audience is enough to give your career legs and have it walk you straight into numerous opportunities; a label’s interest is piqued when they notice a decent online following, venues are further incentivized to book you when they know you’ll bring in a great crowd, other artists will invite [...]
Billboard Exec on Linkin Park and the state of Music Journalism
aristake.com: This week, Ari is joined by Jason Lipshutz, the Executive Director of Music at Billboard, to discuss the current state of music journalism.