The New Era of Digital Identity Security
In an industry where fame is the primary currency, the concept of “real estate” is shifting. For Hollywood stars and digital creators, their most valuable asset is no longer just a contract or a film role; it is their likeness itself. As AI-generated deepfakes become more sophisticated, the industry is moving toward a model of digital security that mirrors physical security.
YouTube has stepped into this gap by introducing a likeness protection tool. This system allows public figures, creators, and officials to opt in and upload their likeness, enabling the platform to scan for and flag potential replicas. It is essentially “digital security” for the face and voice, providing a layer of protection against the unauthorized co-opting of a person’s identity.
Navigating the Fine Line Between Parody and Disparagement
One of the most complex trends in synthetic media is determining what stays and what goes. The industry is currently grappling with the distinction between creative fan engagement and harmful content replacement.

YouTube’s current approach distinguishes between “parody and satire,” which are generally allowed under community guidelines, and “realistic and consequential disparagement.” A critical focus is “content replacement”—where a deepfake is used to create a replica of a celebrity’s known work, potentially limiting their livelihood by replacing their actual output with synthetic versions.
We have already seen the volatility of this tech. The launch of AI tools like OpenAI’s Sora led to a flood of IP-driven deepfakes, including synthetic depictions of figures like Martin Luther King Jr. More recently, videos created by Seedance 2.0—such as a viral clip of Brad Pitt fighting Tom Cruise—served as a wake-up call for the industry regarding the speed of AI progression.
From Protection to Profit: The Future of Likeness Monetization
While the immediate priority is protection, the industry is already eyeing the next phase: monetization. The question is shifting from “How do we stop this?” to “How do we make money from it?”
Some agencies are already preparing for this shift. For example, CAA has developed the “CAA Vault,” a system designed to house the likenesses of clients for potential future monetization opportunities. This suggests a future where celebrities could license their AI likeness for fan-created content or official synthetic performances.
However, this path is fraught with complexity. Licensing becomes difficult when a single video features multiple talents with different consent levels or varying degrees of stardom. Despite these hurdles, the appetite for AI engagement is high. Executives at Warner Bros. Pictures have noted that AI-generated fan trailers—such as those for Practical Magic 2—can actually be a sign of strong audience desire and engagement.
The Evolution of Fan Engagement
Deepfakes are not exclusively viewed as a threat; they are also becoming tools for celebration. Many large creators report that a significant portion of AI-generated content featuring them is benign or even supportive.
As the technology matures, the relationship between stars and fans is evolving. Rather than total suppression, the trend is moving toward a “mechanism to respond.” By having visibility into how their likeness is used on one of the world’s largest platforms, talent can choose to embrace positive fan engagement while shutting down malicious clones.
Frequently Asked Questions
How does YouTube’s likeness protection tool work?
Eligible public figures or their teams opt in and upload their likeness. YouTube then scans the platform for replicas and flags them for the team to review; the team can then request removal if the content is disparaging or replaces the person’s professional work.
Will all deepfakes be removed from YouTube?
No. Content that falls under parody or satire is generally permitted under community guidelines. Only content that is realistically disparaging or constitutes “content replacement” is eligible for takedown.
Can celebrities make money from deepfakes of themselves?
Currently, YouTube’s tool focuses on protection rather than monetization. However, some agencies are already creating “vaults” of client likenesses to explore future licensing and monetization opportunities.
Join the Conversation
Do you think celebrities should be able to monetize fan-made deepfakes, or should all synthetic likenesses require strict consent? Let us know your thoughts in the comments below or subscribe to our newsletter for more insights into the intersection of AI and entertainment.
