Reactions

  • Everything you need to know about iZotope RX
    RX is an audio restoration toolset whose seeds were sown in 2003 with a research project that resulted in three innovative tools: De-click, De-clip and Spectral De-noise. It was 2007, however, when those tools were bundled with a suite of more conventional audio processors and a powerful spectral audio editor, and released to the world as RX.
    We’re used to working with audio in the time domain, but by adding access to the frequency domain, spectral editors make it possible to fix audio glitches and problems in ways that are impossible with purely time-domain processing.

    READ MORE: Review: iZotope RX 12’s focus on improved accuracy and quality pays off

    As a result, it’s no big surprise that RX was a big hit in industries where being able to make a quick repair could save hundreds – if not thousands – in re-shooting, re-recording, and/or re-mixing costs. We’re talking TV and film post-production facilities, along with a bit of audio mastering, but there was less initial interest from music producers.
    This started to change with the 2017 release of RX 6, the first version of the software to harness the power of machine learning (ML) to enable functionality that seemed to border on the magical.
    Machine Learning Meets Audio Editing
    Image: Press
    ML works by being trained to recognise patterns. Initially recognition is very poor, but over thousands of rounds of training, and refinement of the resulting neural networks, the system becomes more and more able to recognise different types of sound. From there, it’s relatively straightforward to create algorithms targeted at the frequencies that constitute a given type of sound, be it to attenuate that sound, as in noise reduction/removal, or to lift it out of the audio entirely, as in stem separation.
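    To get a feel for the underlying principle, here’s a minimal Python sketch of frequency-domain attenuation: a noisy tone is cleaned up by masking everything outside the tone’s frequency band. To be clear, the hand-written mask here is purely illustrative — RX’s ML modules instead learn to predict such masks from training data, and iZotope’s actual algorithms are far more sophisticated.

```python
import numpy as np

# Illustrative sketch of spectral attenuation, the basic idea behind
# frequency-domain noise reduction. NOT iZotope's actual algorithm:
# RX's ML modules predict their masks with trained neural networks.

sr = 8000                                   # sample rate (Hz)
t = np.arange(sr) / sr                      # one second of audio
tone = np.sin(2 * np.pi * 440 * t)          # the "wanted" sound
rng = np.random.default_rng(0)
noise = 0.3 * rng.standard_normal(sr)       # broadband "unwanted" sound
mixed = tone + noise

# Move to the frequency domain...
spectrum = np.fft.rfft(mixed)
freqs = np.fft.rfftfreq(sr, 1 / sr)

# ...and attenuate every bin outside a narrow band around the tone.
# (Hand-written mask for illustration; an ML model would predict it.)
mask = (np.abs(freqs - 440) < 20).astype(float)
cleaned = np.fft.irfft(spectrum * mask)

# The residual error against the pure tone shrinks dramatically.
err_before = np.sqrt(np.mean((mixed - tone) ** 2))
err_after = np.sqrt(np.mean((cleaned - tone) ** 2))
print(f"RMS error before: {err_before:.3f}, after: {err_after:.3f}")
```

    Real tools work on short overlapping windows (a spectrogram) rather than one big FFT, which is what lets them remove a click at one moment without touching the same frequencies elsewhere.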
    In RX’s case, musical stem separation is handled by the Music Rebalance module, which allows in-place volume adjustment of vocal, drums, bass and “other” stems (a godsend for mastering engineers!), or can separate those stems entirely. Two similar tools are aimed at TV and film production: Dialogue Isolate, which removes background noise from dialogue, and the new Scene Rebalance, which operates similarly to Music Rebalance but recognises dialogue, music and effects.
    Working with separated stems in RX 12 has been massively improved thanks to the new Stems View, which allows you to work with separated stems as lanes within a single window. I talk more about this excellent new feature in my review.
    Not all of RX’s tools use ML, but the vast majority do in some form or other, and iZotope are steadily working through those that don’t, adding ML where there’s some advantage in doing so. For example, in RX 12, the De-bleed and Breath Control modules are the latest to receive an ML-based overhaul, making them both quicker to deploy and more accurate in their results (see my review for more about this).
    Will RX Be Useful For Me?
    Scene Rebalance in iZotope RX 12. Image: Adam Crute
    Although capable of straightforward editing and processing – cut/copy/paste operations, gain and EQ adjustment, and such – RX’s main focus is on restoring damaged and poor quality audio, with a side-order of enhancement tools that can add polish in ways other plugins cannot. A good example of the latter would be using Music Rebalance’s stem splitting during mastering to tame an overly-dynamic drum stem without impacting the rest of the mix.
    It is in restoring audio that RX is truly at its best, however. It’s easy to see the value of being able to rescue a take or performance that’s perfect but for the squeaking of a piano pedal, the occasional mic pop, or a performer delivering an unexpected spike in volume that causes some clipping. Also invaluable is the ability to repair glitches that you didn’t notice during a recording session, long after the performer(s) have gone off to do whatever-it-is they do while we’re topping-up our studio tans!
    The newly-released RX 12 brings enhancements and improvements that only add to these restorative abilities. The ML-based modules benefit from increased accuracy and transparency thanks to iZotope’s focus on improving its models through countless rounds of training (using ethically sourced and properly licensed training material). The ML algorithms run more efficiently too, so the improved results are delivered with less hanging-around than before.
    So, if you work in a studio with faultless equipment and perfect noise isolation, RX may be surplus to requirements, but if you work in a studio in the real world then there’s no doubt it will come in useful.
    What Do I Get With RX?
    De-bleed in iZotope RX 12. Image: Adam Crute
    RX comes in three editions aimed at different types of user. Elements is for those who may need to fix common audio problems such as clicks, pops, and overly-reverberant signals, but have no need for a full-featured spectral editor. As such, the package comprises a set of plugins for loading into your DAW, but no standalone RX editor software.
    Standard edition does include the spectral editor along with all of the restoration and enhancement tools you’re likely to need in a music production context. All of these are available as modules within RX, and many also have realtime plugin counterparts for use in your DAW, including Music Rebalance, Spectral De-noise, De-plosive, and Breath Control.
    Advanced edition includes everything found in Standard along with some very attractive and powerful nuggets such as EQ Match, Ambience Match, Spectral Recovery, and Scene Rebalance. If you often work with audio for visual media then these tools will likely prove exceptionally useful (although the price may make you wince!), but for everyone else it’s unlikely the Advanced-only modules and plugins will prove useful.
    A full list of the modules and plugins included in each edition of RX, along with explanations as to their functions, can be found on iZotope’s website, as can pricing details and upgrade options. Also, be sure to check out my review to find out more about RX 12’s new features.
    The post Everything you need to know about iZotope RX appeared first on MusicTech.


  • Restoration of former Beastie Boys recording studio underway, and you can help
    A restoration project to revive G-Son Studios in Atwater Village, LA, is underway. G-Son was the former rehearsal and recording space for the Beastie Boys, and was also the headquarters for their own label, Grand Royal.
    The project is being led by LA local, Adam Englander, with a vision to reopen G-Son as a living cultural venue for LA: a place for performances, screenings, workshops, DJ sets, gallery nights, rehearsals, and community programming. A public fundraising round will begin in May via Kickstarter.

    READ MORE: “It’s still our quest to one day have Despacio here”: In Ibiza, Soulwax discuss the future of their famed sound system

    G-Son Studios became “one of the most mythic creative hideouts in LA”. On the outside it may have seemed like an unassuming old ballroom, but it was not only a studio, but a clubhouse, skate spot, basketball court, and experimental bunker where the Beastie Boys crafted some of their most famous works, including Check Your Head, Ill Communication, and Hello Nasty, as well as music videos like Pass the Mic.

    After the Beastie Boys left, the building continued to house creative tenants, including Mad Decent Recordings, before eventually facing the risk of being gutted entirely. Other artists who utilised the G-Son space include Beck, Biz Markie, Run-DMC, Redd Kross, and Luscious Jackson.
    According to the restoration team, construction crews were already inside preparing to tear things down when Englander and his project partner, Alex, stepped in. Since then, they’ve been slowly restoring the space by removing a collapsing stage, repairing infrastructure, fixing lighting, and uncovering original artwork that had been covered up, as pictured above. You can see more images from inside the venue below.
    Credit: G-Son Studios
    Credit: G-Son Studios
    In the UK, similar action is being taken to save London’s iconic Battery Studios, which is currently under threat of demolition and redevelopment into residential flats. A petition has been launched to save the venue, which was originally founded in 1967 as Morgan Studios. In the petition, organisers describe the studio as an “irreplaceable cultural landmark”.


  • Five synths that define Nine Inch Nails’ sound
    One of the most anticipated sets of Coachella 2026 was that of Nine Inch Noize, the collaboration between Trent Reznor’s iconic industrial rock band, Nine Inch Nails, and the trendsetting techno veteran, Boys Noize.

    READ MORE: Six synths that define Radiohead’s sound

    Boys Noize, Trent Reznor and Reznor’s longtime collaborator and Nine Inch Nails bandmate Atticus Ross had performed together as an opening act for NIN’s recent world tour, but Coachella 2026 marked the debut of their full set.
    “The creative fulfillment of working on the Challengers and TRON scores with Boys Noize led me to think that including him in the Peel It Back tour could be an interesting way to express NIN in more purely electronic terms live – a concept I’ve wanted to explore for some time,” Reznor said in a statement.
    Together, they played a series of Nine Inch Nails songs as well as one cut from Reznor and Ross’s other band, How To Destroy Angels, and even a track from the famed synth-pop outfit, Soft Cell. Except every song was remixed with a new kind of industrial flavor to reflect the fresh union of artists.
    This kind of renewed approach defines Reznor’s career. Nine Inch Nails first formed in 1988, and in the subsequent 38 years, he has consistently tried new pieces of gear, ensuring no two releases sound the same.
    “There’s always a good song in everything, an interesting experience to be had,” Reznor told Synth History in 2022.
    And Reznor found good songs in the following five pieces of gear that ended up on albums such as Pretty Hate Machine (1989), The Downward Spiral (1994), and Hesitation Marks (2013).
    1. E-mu Emax

    The Emax was Reznor’s first real sampler, and according to him, he “got an Emax and that was Pretty Hate Machine.” Every single drum sound on the now platinum-selling album was built from outside-source samples loaded into the Emax.
    Another noticeable instance of the Emax comes on Terrible Lie. The screechy synth line at the end of the song started as a woodblock, which Reznor ran through a distortion pedal, sampled with the Emax, and pitched down.
    Emaxes were manufactured between 1986 and 1995, so an original often costs over $1,000. However, a software version is available for $49.
    2. Kurzweil K2000

    Another key piece of Reznor’s arsenal for his earlier, groundbreaking records such as The Downward Spiral was the Kurzweil K2000 (it was also a favorite of another electronic icon, Jean-Michel Jarre). The synth has Variable Architecture Synthesis Technology (VAST), which gives users access to 31 algorithms of different component configurations. These built-in options empower them to extensively manipulate their chosen sound without getting into the technical minutiae of modular.
    One sound on The Downward Spiral listeners might not expect to come from a synth is the squeaky clean acoustic guitar on Hurt. It’s actually the acoustic guitar patch from the K2000 Orchestral ROM. Going in the complete opposite direction, the crunchy computerized drums on The Becoming are also all K2000.
    This vintage synth was mass-produced between 1991 and 2000, so a real one can be pricey, and as of now, there are no direct software versions.
    3. ARP Odyssey

    Reznor’s CV goes far beyond Nine Inch Nails; he has even won two Academy Awards for Best Original Score alongside Atticus Ross. Most recently, they won for the Disney/Pixar film Soul (2020), but their first was for Aaron Sorkin’s Facebook origin story, The Social Network (2010). Just as Mark Zuckerberg, played by Jesse Eisenberg, sits down drunk at his computer to build a website where users compare women’s looks, the piece In Motion begins – a piece created in part with an ARP Odyssey.
    Given the synth was “lying around” in Reznor’s studio as recently as 2022, it naturally made its way onto other records as well – namely The Hand That Feeds, the Grammy-nominated cut from With Teeth (2005), whose synth line and drum break were fueled by the ARP Odyssey.
    Being such a classic, it makes sense that Korg has an ARP Odyssey software version readily available for $49.99.
    4. Waldorf MicroWave

    The Waldorf MicroWave came about when Waldorf set out to make its own version of the PPG Wave 2, and Nine Inch Nails has used several versions of the MicroWave since. Reznor auctioned off a MicroWave XT and a MicroWave Access Programmer around 2009. Waldorf also made a custom MicroWave for Nine Inch Nails.
    They used the MicroWaves extensively on The Fragile (1999). Charlie Clouser, who was in Nine Inch Nails from 1994 to 2000, said of the MicroWave:
    “I basically rely on three synths: the Nord, the (Access) Virus, and the MicroWave.”
    5. Native Instruments Maschine

    For Nine Inch Nails’ 2013 album, Hesitation Marks, Reznor composed the majority of the songs using Native Instruments’ Maschine. Despite having a massive arsenal of synths (and the money to buy anything he wanted), he chose to work on a single device, using the limitation to his advantage. Beyond the sound banks that come with the product, he also used the Soundtoys Native Effects bundle.
    “I liked the limitation that everything was in Maschine; I liked the fact that it could be easily automated with fingers on knobs, and you don’t have to spend time assigning stuff,” Reznor once told Soundtoys. “And I liked the fact that it felt pattern-based.”


  • Baby Audio Atoms is 60% OFF at Plugin Boutique until May 13
    Plugin Boutique is running a 60% discount on Baby Audio Atoms, bringing the physical modeling synth down from $99 to $39 until May 13. Atoms is Baby Audio’s second virtual instrument, following the analog-flavored BA-1, and it’s a very different animal. Instead of oscillators, Atoms generates sound using a physical model of interconnected masses and [...]


  • “Creators’ needs are evolving”: Avid introduces Pro Tools 2026.4, with Track Pin – a simple new way to navigate complex sessions – front and centre
    Avid has unveiled the next version of Pro Tools – version 2026.4 – bringing a number of upgrades to immersive audio production, workflow improvements and more.
    Headlining the new features for Pro Tools 2026.4 is the introduction of Track Pin, a new way to stay focused while working with large, complex sessions. 
    Professional audio projects can become unwieldy, often amassing hundreds of tracks in a single session. Even if you’re not working with quite that many tracks, you’ll know that it becomes more and more difficult to navigate a session the more complex it becomes.
    With Track Pin, you can now lock important tracks in place within the edit window so that they remain visible onscreen as you navigate your session. Crucial elements of a mix – a lead vocal, for example – can be pinned and easily referred to, making for far less unnecessary scrolling and more efficient editing and mixing.

    READ MORE: Is agentic AI coming to Pro Tools? Avid announces “strategic partnership” with Google Cloud to further integrate AI into its portfolio

    Elsewhere, Pro Tools 2026.4 enhances the DAW’s AI-powered speech-to-text capabilities, making lyric- and dialogue-driven workflows “faster, more flexible, and more efficient”. Additionally, transcription data is now automatically carried through to newly rendered files when producers perform processing operations like AudioSuite, Commit and Consolidate.
    Pro Tools 2026.4 also introduces support for MPEG-H, a fast-growing broadcast standard, with a new MPEG-H Renderer plugin, as well as Dolby Headphone Personalisation for producers working with Dolby Atmos.
    Credit: Avid
    There’s also a raft of new instruments and sound content to inspire your projects, including the introduction of Native Instruments’ Massive X Player, a Pro Tools Massive X expansion pack for all users, and NI Lo-fi & Chill Plucks and Haze expansion packs for active subscribers and perpetual license holders.
    “We’re accelerating innovation across our audio portfolio and giving creators faster and smarter tools to deliver their best work – whether they’re weekend warriors creating new tracks, or professionals making the world’s most demanding audio content,” says Chris Winsor, Director of Pro Tools Product Management at Avid, speaking exclusively to MusicTech.
    Winsor notes how producers are increasingly looking for AI tools to help optimise their workflows.
    “Creators’ needs are evolving – they’re looking for faster workflows, more creative freedom and practical AI tools that help them save time and work more efficiently,” he says. “That’s why we’re building strategic partnerships that help artists and producers accelerate delivery and reduce friction with intelligent, integrated technologies. Our Splice integration is a game-changer, putting the best royalty-free sound library and millions of samples at the fingertips of music creators.”
    While AI adoption among musicians and producers has been the subject of some debate in recent years, Winsor reassures Pro Tools users that Avid’s approach to AI is always with the creator in mind.
    “AI and automation are top of mind for the industry, but our approach to AI is simple: put the creator in the driver’s seat and help them accelerate the mundane,” he continues. “Our technologies like Speech-to-Text or AI-powered chord symbols, and carefully curated AI partner integrations are all designed to give creators more control, speed, and freedom in the moments that matter.”
    Learn more about Pro Tools 2026.4 at Avid.


  • Release Title:
    Экстрасенсы
    Main Artist:
    Даня Юрк
    Release Date:
    22/04/2026
    Primary Genre:
    Hip Hop/Rap
    Secondary Genre:
    Alternative Rap
    https://publme.lnk.to/437285-
    #newmusic #Release #Music #independent #artist #hiphop #rap

    Listen to Экстрасенсы by Даня Юрк.

  • Video details
    Video title:
    Милый Котик
    Artist(s):
    Alice MassLove
    Genres:
    Pop, Electronica
    Release Date:
    10 Apr, 2026
    https://www.youtube.com/watch?v=_wBXxithO40
    #newmusicvideo #VEVO #Music #LifeCycleLtd #artist #pop #dance

  • Claude can now be plugged into Ableton to assist with your music projects
    Claude – the AI assistant and chatbot from Anthropic – can now be directly plugged into Ableton, as well as a raft of other creative platforms, including Blender and Photoshop.
    The move follows the launch of Claude Design, a new product by Anthropic Labs that lets you collaborate with Claude to create “polished visual work” like designs, one-pagers and more.
    With the new set of connectors for Claude, the popular chatbot is able to plug into Ableton and act as an AI assistant within your music projects. Anthropic says the move was made with a “coalition of partners”, which also includes Blender, Adobe (Photoshop and Premiere Pro) and Affinity by Canva.

    READ MORE: Focusrite unveils ISA C8X, its first ISA audio interface built on Rupert Neve’s preamp legacy

    Interestingly, Splice is also named in the list of brands integrating Claude into its products. It means producers can now search Splice’s catalogue of royalty-free samples directly within Claude.
    According to a blog post on the Anthropic website, within these platforms, Claude can be used in a variety of ways. Users can ask Claude complex questions about the software, with the chatbot acting as a virtual tutor to help you better understand your workflow.
    Elsewhere, Claude Code can write scripts, plugins, and generative systems for these platforms.
    And perhaps most importantly for creatives, Claude can be used to take care of manual, repetitive tasks that get in the way of the creative process.
    “Claude can’t replace taste or imagination, but it can open up new ways of working – faster and more ambitious ideation, a more expansive skillset, and the ability for creatives to take on larger-scale projects,” Anthropic says [via The Verge]. 
    “AI can also help shoulder the parts of the creative process that eat up time by handling repetitive tasks and eliminating manual toil.”
    Check out the video below for a walkthrough on how to integrate Claude into Splice:

    Anthropic has also now become a Corporate Patron of the Blender Development Fund, helping the open-source platform to stay free, and to allow developers to “keep pursuing projects independently, and to focus on building tools for artists and creators”. Anthropic will give Blender €240,000 every year.


  • I imagine this is quite a logical thing: generally, people don't want to listen to #music that's simply and automatically generated by #AI machines. There is some good stuff for #creativity support, and it is interesting in general as well. But I think it is good to smartly combine the analog, digital and AI worlds... or not... there is no one right answer

  • Analog vs. digital synthesizers: What’s the difference and which should you choose?
    Learn about the strengths and limitations of analog vs. digital synthesizers, and when you'd want to reach for each.

    What’s the real difference between an analog vs. digital synthesizer? Learn which synth type best fits your workflow and production needs.

  • RealOpen and TRON verify $9.4M in USDT for crypto-enabled real estate purchases
    RealOpen, the leading platform for buying real estate with crypto, today announced the conclusion of its collaborative "Fast Moves, Fast Payments" Holiday Campaign with TRON.


  • Google gains 25M subscriptions in Q1, driven by YouTube and Google One
    Google added 25M paid subscriptions in Q1, reaching 350M total, as YouTube and Google One grow.


  • Using a VT-100 Today
    You may not know what an ADM-3, a TV910, or an H1420 are, but you have probably at least heard of a VT-100. They are all terminals from around the same time, but the DEC VT-100 is the terminal that practically everything today at least somewhat emulates. Even though a real VT-100 is rare, since it defined what have become ANSI escape sequences, most computers you’ve used in the last few decades speak some variation of the VT-100’s language. [Nikhil] wanted to see if you could use a VT-100 for real work today.
    While the VT-100 wasn’t a general-purpose computer, it did have an 8080 inside, with only about 3K of RAM – enough for it to act as a serial terminal. Add a USB serial adapter and a machine running modern Linux, and how hard could it be?

    As it turns out, there were a few issues. macOS apparently assumes terminals can take data at 9600 baud with no handshaking. The slow link also means that any application that assumes redrawing the whole terminal is fast will be sorry for that choice.
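    A little back-of-the-envelope Python shows why full-screen redraws hurt at that speed (assuming the common 8N1 serial framing of ten bits on the wire per character):

```python
# Rough throughput math for a serial terminal at 9600 baud.
# Assumes 8N1 framing: 1 start + 8 data + 1 stop = 10 bits per character.

BAUD = 9600
BITS_PER_CHAR = 10                      # 8N1 framing assumption
chars_per_sec = BAUD / BITS_PER_CHAR    # 960 characters per second

screen_chars = 80 * 24                  # one full VT-100 text screen
redraw_secs = screen_chars / chars_per_sec

print(f"{chars_per_sec:.0f} chars/s, full redraw ~{redraw_secs:.1f} s")
# → 960 chars/s, full redraw ~2.0 s
```

    Roughly two seconds per repaint, and that’s before counting escape sequences – so any program that redraws the whole screen on every keystroke feels glacial.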
    Of course, there are commands modern VT-100-like terminals accept that the original didn’t. However, as you’ll see in the post, all of these things you can either live with or solve.
    It is easy to make your own VT-100 replica. While the VT-100 may seem simple today, it was a marvel compared to even older terminals.


  • Audio Fusion Bureau releases RoomDiY, a FREE acoustic room simulation plugin
    From developer Audio Fusion Bureau comes RoomDiY, a free acoustic room simulation plugin for macOS and Windows. RoomDiY offers advanced real-time acoustic modelling and room analysis. In short, the plugin allows you to design the ideal acoustic space for any given project. We’ve covered many convolution reverb plugins that offer impulse responses of real-world spaces, [...]


  • UMG generated $3.39 billion in Q1, up 8.1% YoY – driven by BTS, Olivia Dean, Taylor Swift, and more
    Universal Music Group has published its Q1 2026 results for the three months ending March 31.
