
So here’s the deal: The Associated Press just dropped a story about the rising threat of AI fakes in the music production space, and it really caught my attention. On Wednesday, a group of tech execs, record industry bosses, and country music icon Martina McBride sat in front of Congress and said what a lot of people in music, media, and tech have been saying privately for a while: AI deepfakes are getting out of hand. It’s one thing to play around with voice clones for memes. It’s another to have your voice or face used in a video or song you never made.
That’s why they’re all backing a new bill called the No Fakes Act.
Here’s the heart of it: if someone makes a digital replica of you—your voice, your image, your performance—without your say-so, this bill would make them legally responsible. Doesn’t matter if it’s some kid in their basement or a billion-dollar AI firm. If you fake someone’s likeness and use it in a way that looks or sounds real, you’re on the hook.
Martina McBride Isn’t Playing Around
McBride told lawmakers the tech’s legit, but the way people are using it? That’s a problem. She broke it down in real terms: families getting scammed, girls being targeted with fake explicit images, artists like her having their voices stolen to make fake tracks.
And she’s not alone. The Human Artistry Campaign says nearly 400 artists and performers have come out in support of this bill. That includes big names like Missy Elliott, Scarlett Johansson, and Bette Midler.
Here’s where it gets interesting.
The No Fakes Act would also hold platforms liable if they knowingly host that kind of content. That means places like YouTube, TikTok, or wherever else could be forced to pull content fast—or risk legal trouble.
It also creates a takedown system. So if your likeness gets used in some shady way, there’s a real path to get it removed without needing a team of lawyers. It’s simple: someone deepfakes you, you notify the platform, they take it down.
Tech Giants Are (Mostly) On Board
Suzana Carlos, who handles music policy at YouTube, showed up to say that YouTube supports the bill—because they want AI to be used right. She pointed out that AI tools can be powerful and helpful, but they can also create a mess if there are no rules. This bill, she said, gives everyone the structure they need.
That structure matters. Right now, it’s kind of the Wild West. AI tools are making everything from fake Drake songs to political deepfakes, and there’s not much anyone can do about it once they go viral. The No Fakes Act tries to fix that by giving creators real protections and a way to fight back.
The Bigger Picture
This hearing came just two days after President Trump signed the Take It Down Act, a bill aimed at stopping revenge porn and AI-made explicit deepfakes. The No Fakes Act builds on that momentum. It widens the lens—now we’re talking about voice clones, fake performances, and impersonations that mess with trust and creativity.
Mitch Glazier from the RIAA nailed it when he said this bill gives people a way to act fast, without waiting for courts. That’s the key here. This isn’t about blocking innovation. It’s about giving artists, creators, and everyday people the power to stop their image from being misused.
AI isn’t going anywhere. But if Congress gets this right, we’ll finally have a way to make sure it’s working for people—not steamrolling them.
The post The No Fakes Act Could Be a Game-Changer for Artists Battling AI Deepfakes appeared first on Magnetic Magazine.