October 16, 2025
Jake Paul’s Sora Stunt Previews Risks and Rewards of a Deepfake Marketplace
New video apps like Sora could turn faces into moneymaking assets and hint at a future where everyone can rent out their digital likeness
Boxer Jake Paul shows his Instagram livestream during a press conference in Riyadh, Saudi Arabia, on February 23, 2023, three days before his match against Tommy Fury.
Francois Nel/Getty Images
Jake Paul is all over the Internet. Paul, a social media personality turned actor and professional boxer, made history last year when he squared off with boxing legend Mike Tyson in the most streamed sporting event ever. Now he’s back in the limelight, with even more eyes on him. A wave of viral videos posted last week shows him giving makeup tutorials, shoplifting from Taco Bell and holding up a 7-Eleven.
But none of these videos are real—they’re deepfakes minted on OpenAI’s Sora app. The software, which launched September 30, uses artificial intelligence to generate videos and allows people to upload their image as a “cameo” that others can use.
Paul willingly uploaded his cameo for others to use, and then on October 8 he posted a TikTok video in which he threatened to sue anyone spreading deepfakes of him doing things he would never do. As he spoke, he began clumsily applying makeup, a wink at what many of the deepfakes portrayed him doing. The next day he announced on X that he was a “proud OpenAI investor” and “the first celebrity NIL cameo user” (NIL stands for name, image and likeness) and that the videos generated with his likeness had, in just six days, received more than a billion views.
Such endeavors may be the start of a new digital economy for deepfakes—and Sora may be the key driver. The “cameo” feature, which at first seemed a fun add-on to Sora’s abilities, now appears to be one of the main attractions, and OpenAI’s CEO has announced plans to monetize it. The development presents opportunities for some but big risks for others. Depending on how the system is implemented, cameo owners could set terms and cost, which could give some control back to those whose images have been used without their consent. Anyone could share their NIL and, in effect, put their digital double in the equivalent of a stock photo library. Other app users could license these likenesses in small, trackable uses, adhering to rules set by rights holders (no nudity, for example) and paying them for each use. Deepfakes, which until now have largely been used to defame or extort people, would at least be generating royalties for their subjects.
Of course, with the technology moving faster than many users realize—and certainly faster than regulators can keep up with—the risks are substantial. A person could consent to be duplicated and still be harmed by deepfakes that are selectively edited or made with malicious prompts. Their image could easily be stolen and used elsewhere to defraud or misrepresent, and exposed biometric data can’t simply be reset like a leaked password.
Establishing a market for NILs doesn’t resolve the potential harms, but it may create a commercial incentive to prevent them through regulation, if only to maintain the integrity of the marketplace. And in some cases, using a person’s NIL will be off-limits for moral reasons, not just legal ones. In the first week of Sora’s launch, families protested deepfakes of the dead, and OpenAI signaled that it would add tools to honor those requests.
And a deepfake economy will likely change things in ways no one can foresee. Music streamers such as Spotify and SoundCloud transformed the music industry, changing how songs are shared and even designed; they moved musicians toward recording shorter songs that listeners are less likely to skip, for instance. Will something similar happen with commodified deepfakes? A marketplace may evolve in which people’s likenesses are sold, traded or even repriced depending on how in demand they become. Would the value of a face rise and fall with the fluctuations of a person’s popularity? It sounds dystopian, but we already live in a culture where image and attention are monetized, and that trend appears likely to keep evolving.
A number of celebrities have already begun exploring commercial deepfakes. In 2023 the musician Grimes offered a 50–50 royalty split to anyone who used her AI voice to create a “successful” song. YouTube’s Dream Track project lets creators make soundtracks with the AI singing voices of Charlie Puth, Demi Lovato and John Legend, among others. Deepfakes of sports stars David Beckham and Peyton Manning have appeared in commercials, and musician FKA twigs created a deepfake to handle her social media interactions while she focuses on making music.
Which brings us back to the logic of Paul’s decision to let people make videos with his cameo on Sora: if attention is the scarce, prized commodity in a saturated digital market, then letting the world generate you on demand gets you more eyes and increases your value. He is both asset and architect, earning attention today so he can be paid in royalties tomorrow.