Digital necromancy in Hollywood: the backlash against Val Kilmer’s AI “return.”
The estate authorized it. The daughter signed off. The AI executed it. And no one needed to ask permission from the only person who mattered.
Val Kilmer did not die. He was promoted to a digital asset.
The question is not whether the technology allows this.
The question is who profits when the dead cannot say no.
Val Kilmer died in 2025, at 65, of pneumonia, years after the throat cancer that took his voice. He had been cast in As Deep as the Grave before his death but was too ill to film. The estate authorized generative AI to complete the role.
The trailer premiered at CinemaCon 2026 in Las Vegas.
Kilmer’s face is obscured by a deep hood as he rides on horseback. The lighting is tight and controlled. It looks like the work of a production trying to hide what the technology cannot yet sustain.
“Do not fear the dead, and do not fear me,” Kilmer’s digital version tells a child.
No one seemed to notice the irony.
1. This is not a tribute. It is a post-mortem rights transfer with a script attached.
Kilmer’s daughter stated that her father “always viewed emerging technologies with optimism.” She used that as justification for authorizing the production.
That sentence is doing a great deal of work with very little evidence.
Optimism about emerging technology is not consent to star in a film you never read, in scenes you never approved, with a director you never had the final conversation with.
Kilmer did, in fact, use AI to clone his own voice after throat cancer took his speech. But there is a considerable distance between using AI to restore your own voice and using AI to build an entire character from someone’s digital corpse.
One is autonomy. The other is proxy.
2. The precedent is not new. The barrier to entry is what changed.
Carrie Fisher. Paul Walker. James Dean. Hollywood has resurrected the dead before. The difference is that, until now, this required blockbuster budgets, expensive CGI, and VFX teams that took years to deliver imperfect results.
Now any independent production has access to the same tools.
The democratization of generative AI did not democratize creative talent. It democratized access to the dead.
What used to be a decision made by a billion-dollar studio with a legal department and a PR team for damage control has become a decision any festival filmmaker can make with a laptop and family authorization.
CinemaCon 2026 is exactly where this becomes visible: it was not Disney. It was an independent production company called First Line Films.
3. The union fought for 118 days over AI protections. For the dead, there is no union.
Actor Jackson Rathbone put it bluntly: “Hey SAG-AFTRA, remember that strike we had.”
The 2023 strike lasted 118 days. One of its central demands was protection against unauthorized AI replication of actors. The agreement included safeguards for living performers.
For dead actors, the legal structure is ambiguous or nonexistent. Post-mortem personality rights vary by state, by country, and by pre-existing contract. The person whose consent matters is no longer here to exercise it.
What exists instead is the estate. And the estate has financial interests.
Rathbone was brutally specific when he questioned Kilmer’s daughter: “Or are you exploiting your father’s death for financial gain?”
The business structure does not allow outsiders to distinguish tribute from exploitation. That is the problem.
4. The real problem is not aesthetic. It is accountability.
When the technology still shows its seams—the deep hood, the controlled lighting, the overly sharp visual style—the audience notices. And the perception of having been deceived, even partially, creates a reaction different from aesthetic criticism.
“Grotesque digital necromancy” was the phrase used by analyst JP Gownder on Bluesky.
It is dramatic. But it names something real: there is a moral difference between preserving someone’s memory and putting words in the mouth of someone who can no longer correct them.
5. Where the money is, and what businesses you can build right now
This is the part most commentators ignore because it is uncomfortable to name.
The Kilmer case is not just an ethical scandal. It is a market map.
The technology exists. The regulatory vacuum exists. Demand exists. What does not yet exist at scale is the business infrastructure built around this in a structured, transparent, and legally defensible way.
Whoever builds that infrastructure first will capture a market that no one wants to publicly own yet, but everyone is already moving toward behind the scenes.
The viable businesses right now, each with a realistic entry cost and a clear revenue model:
Digital legacy management for public figures
Advisory and legal-technological structuring so that artists, athletes, executives, and influencers can define while alive what may or may not be done with their image, voice, and performance after death. This includes rights contracts, a vault for voice and image assets, and licensing policy. Revenue comes through a monthly retainer or a one-time fee with renewal. Immediate market: artists undergoing medical treatment, athletes nearing retirement, influencers with meaningful audiences.
Post-mortem digital asset licensing for independent producers
Intermediation between estates and production companies that want to use the image or voice of deceased public figures. The business does not produce the content. It structures the contract, defines usage limits, negotiates the revenue split, and monitors compliance. It is a digital rights agency for the dead. Model: percentage of the licensed value.
Auditing and certification of AI-generated content involving real people
A service for platforms, film festivals, and distributors that need to know whether the content they are exhibiting has a documented chain of rights. CinemaCon 2026 screened the Kilmer trailer without any visible verification standard. That will change when the first lawsuit arrives. The business positioned as certifier when that happens will capture compliance contracts across the industry.
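A documented chain of rights only has value if a third party can verify it mechanically. A minimal sketch of what that could look like, where each grant records the hash of the previous one so a festival or distributor can detect a broken or tampered chain. All names, parties, and field choices here are invented for illustration; this is one possible design, not an existing standard.

```python
import hashlib

def link_hash(grantor, grantee, scope, prev_hash):
    """Fingerprint of one grant, bound to its predecessor."""
    payload = f"{grantor}|{grantee}|{scope}|{prev_hash}"
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def build_chain(grants):
    """grants: ordered list of (grantor, grantee, scope) tuples."""
    chain, prev = [], "GENESIS"
    for grantor, grantee, scope in grants:
        h = link_hash(grantor, grantee, scope, prev)
        chain.append({"grantor": grantor, "grantee": grantee,
                      "scope": scope, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    """True only if every link matches its contents and its predecessor."""
    prev = "GENESIS"
    for link in chain:
        if link["prev"] != prev:
            return False
        if link["hash"] != link_hash(link["grantor"], link["grantee"],
                                     link["scope"], prev):
            return False
        prev = link["hash"]
    return True

# Hypothetical chain: estate licenses a likeness to a producer,
# who sub-licenses trailer exhibition to a festival.
chain = build_chain([
    ("Estate of J. Example", "Example Films", "likeness: one feature film"),
    ("Example Films", "Example Festival", "exhibition: trailer only"),
])
# verify_chain(chain) → True; editing any link's scope breaks verification.
```

A certifier built on something like this sells the audit, not the content: the producer submits the chain, the service checks it end to end and signs off before exhibition.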
Structured consent tools for content creators
A simple SaaS product: a creator records during life what they authorize or forbid to be done with their image and voice through AI after death or incapacity. The document is legally valid, stored on blockchain or in an auditable repository, and accessible to the estate and partner platforms. Low production cost. Revenue through subscription or one-time fee. Initial audience: Substack creators, YouTubers, podcasters with established audiences.
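The core of that SaaS product is less the storage layer than the record itself: a structured declaration that can be fingerprinted so the estate can later prove it was not altered. A minimal sketch, assuming a simple hashed JSON record; every field name and value here is a hypothetical illustration, not a legal template.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class ConsentRecord:
    """Declaration made in life about permitted AI uses after death."""
    subject: str                    # legal name of the creator
    recorded_on: str                # ISO date of the declaration
    permitted_uses: list            # e.g. ["voice_restoration"]
    forbidden_uses: list            # e.g. ["posthumous_performance"]
    requires_estate_approval: bool  # extra gate even for permitted uses

    def digest(self) -> str:
        """Tamper-evident fingerprint: hash a canonical JSON form.
        Anchoring this digest in an auditable log (or a blockchain, as
        the text suggests) is what makes the record provable rather
        than merely asserted."""
        canonical = json.dumps(asdict(self), sort_keys=True,
                               separators=(",", ":"))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

record = ConsentRecord(
    subject="Jane Example",
    recorded_on="2026-01-15",
    permitted_uses=["voice_restoration_for_medical_reasons"],
    forbidden_uses=["posthumous_performance", "advertising"],
    requires_estate_approval=True,
)
fingerprint = record.digest()
# Any change to any field yields a different fingerprint.
```

The product's moat is not this code; it is the legal validity of the declaration and the distribution deals that make platforms check the fingerprint before publishing.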
Crisis consulting for estates and families after leaks or unauthorized use
When content escapes before control is established—deepfake, unauthorized commercial use, digital resurrection without consent—the estate needs someone who simultaneously understands AI, image rights, and crisis PR. That professional does not yet exist at scale. It is a consulting vertical with a high ticket and guaranteed growing demand.
The central point: none of these businesses requires you to produce AI content. They exist because someone needs to structure, certify, intermediate, or protect what others are producing. Governance infrastructure has higher margins and less reputational exposure than direct production.
6. Trends to monitor and the real impact of what is moving
This is not mid-term science fiction. These are movements that already have visible structure in 2026 and will determine who is positioned and who will be left scrambling when regulation arrives.
Regulation of post-mortem personality rights will be approved in at least one major market by 2028
The EU’s GDPR explicitly leaves protection of deceased persons’ data to member states (Recital 27), which makes extending data and identity protections beyond death the logical next step. California already recognizes post-mortem publicity rights and, since 2024, requires estate consent for digital replicas of deceased performers. The trigger will be a high-profile case that generates litigation and global coverage. The Kilmer case may not be that case. But it is preparing the ground. Whoever has infrastructure in place when regulation arrives will have a first-mover advantage in compliance.
Platforms will require documented chains of rights for AI-generated content involving real people
YouTube, Netflix, and film distributors are already under growing pressure from unions and entertainment lawyers. The next step is documentation requirements before distribution, not after a complaint. That creates a certification and auditing market that does not yet formally exist.
The “digital estates” market will professionalize
Today, what an estate does with a celebrity’s digital assets is often an ad hoc decision made under emotional and financial pressure by heirs who frequently lack the expertise to evaluate what they are giving away. A category of firms specialized in the digital patrimony of public figures will emerge, analogous to what talent agencies do for living artists. This market will consolidate quickly because the number of deceased celebrities with residual commercial value is finite, and whoever controls the management contracts will have a defensible position.
Public reaction will create a tolerance index by type of use
Not every use of AI involving deceased people will generate the same reaction. A James Dean deepfake in a car commercial triggered backlash. Cloning a musician’s voice to complete an unfinished album triggered a more ambiguous reaction. Carrie Fisher’s resurrection in Star Wars sparked debate, not a boycott. The market will calibrate what is tolerated and what is punished, and that calibration will define which products are viable without high reputational risk.
Voice AI will arrive before image AI in terms of normalized adoption
Because it is cheaper, more convincing, and easier to integrate into audio products, podcasts, audiobooks, and dubbing. The market for cloned voices of deceased public figures in audio products will grow faster and face less resistance than the image market. That suggests where infrastructure should be built first.
The consent debate will split the market into two permanent segments
Products with documented lifetime consent versus products authorized post-mortem by third parties. That distinction will matter to platforms, regulators, and the public. It will create two markets with different prices, risks, and reputations. Positioning yourself in the first segment now, before the distinction becomes mandatory, is a real competitive advantage.
Synthesis
No law was broken here.
The estate authorized it. The filmmakers documented it. The daughter gave statements to Variety. The trailer was screened at an official industry event.
Everything happened within the system.
The problem is that the system was not designed for this.
And systems not designed for a new phenomenon create two kinds of actors: those who complain that the system is broken, and those who build the infrastructure the system will need when it finally decides to regulate.
The Kilmer case is not about a bad film.
It is about who gets there first when the post-mortem digital identity market needs structure.
Questions for you to answer:
Is a remark made in life about being “optimistic about technology” enough to authorize a full posthumous performance?
If the technology reaches the point where the result is indistinguishable, does the debate change or become more urgent?
Who should have veto power when the estate authorizes something and the public rejects it?
Would you rather be on the side of those who complain, or on the side of those who charge to solve it?
#TechGossip #ValKilmer #AIandEthics #DigitalNecromancy #DigitalRights #CinemaCon2026 #DigitalLegacy #DigitalPower #AIMarket
Follow: www.techgossip.com.br


