In short
- 2Wai’s app generates conversational video avatars from minutes of recordings, drawing comparisons to “Black Mirror.”
- Critics warn that the technology exploits grieving, vulnerable users and operates in a legal gray area with weak post-mortem privacy protections.
- The launch intensifies scrutiny of a nascent industry struggling with consent, data ownership and the risks of AI-generated “digital ghosts.”
An artificial intelligence startup co-founded by a former Disney Channel actor has launched a mobile app that lets users create interactive digital replicas of their dead loved ones, prompting swift online condemnation and renewed scrutiny of the growing “grief tech” sector.
2Wai, founded by Calum Worthy – known for portraying Dez in the Disney series “Austin & Ally” from 2011 to 2016 – and producer Russell Geyser, released its iOS beta on November 11. The app’s “HoloAvatar” feature generates conversational video avatars from three minutes of uploaded footage and supports real-time chat, audio and video in multiple languages.
While the app is marketed as a tool for heritage preservation, its ability to recreate the deceased has dominated headlines, evoking comparisons to the dystopian 2013 “Black Mirror” episode “Be Right Back,” in which a grieving woman recreates a digital version of her late partner.
The promotional video, posted by Worthy to his X account with 1.2 million followers, has garnered 22 million views and more than 5,800 replies. It depicts a pregnant woman video-calling an AI recreation of her late mother for advice, then fast-forwards to the avatar reading bedtime stories to her newborn and, later, counseling that child as a grown man, played by Worthy.
“What if the loved ones we lost could be part of our future?” the clip asks. “With 2Wai, three minutes can last forever.” Worthy continued: “At 2wai, we’re building a living archive of humanity, one story at a time.”
Mechanics and origins
HoloAvatars run on 2Wai’s proprietary FedBrain technology, which processes interactions on the device to protect privacy and limits responses to user-approved data, reducing AI “hallucinations.” The app also lets living users create avatars of themselves for fan engagement or training; Worthy’s own digital twin shares Disney anecdotes.
Currently free in beta, the app is expected to move to a tiered subscription model. Pricing has not been disclosed, but comparable AI services suggest roughly $10 to $20 per month.
The company traces its origins to the 2023 SAG-AFTRA strikes, during which performers protested unauthorized AI likenesses.
“Having worked as an actor, writer and producer for the past 20 years, I have experienced firsthand how challenging it is to create a meaningful relationship with fans around the world,” Worthy said at the June launch. “Language barriers, time zones and budgets limit the ability to truly connect.”
2Wai raised $5 million in pre-seed funding in June from undisclosed investors, and the firm says it is working with the likes of British Telecom and IBM.
Ethical and privacy concerns
Public reaction has been overwhelmingly negative, with X users criticizing the app as “nightmare fuel,” “demonic” and “dystopian,” and as an exploitative commercialization of grief.
One viral response called it “one of the most evil, psychotic things I’ve ever seen,” arguing that it simulates the dead rather than helping the living process loss. Another labeled it “beyond vile,” insisting that ordinary videos already preserve memories without the need for AI recreations.
Legal experts point out that such “deathbots” occupy a legal and ethical gray area: they can be built without the explicit consent of the deceased, they expose deeply personal data belonging to both the dead and the grieving, and they create ambiguity over who owns the digital avatar and its accompanying data.
Privacy laws typically protect the living and offer few or no post-mortem safeguards, leaving surviving loved ones vulnerable to the commercial exploitation of grief through subscription models and unregulated access to interviews, voice recordings and other sensitive materials. Moreover, the ability of such bots to interact, learn and drift from the recorded data raises risks for the legacy of the deceased and complicates how society navigates mourning, memory and closure.
The app includes opt-in requirements and family approval for avatars of the deceased, but critics question how those safeguards will be enforced. “You prey on the deepest human feelings, looking for ways to exploit them for your profit,” one X user wrote, calling the creators “parasites.”
Investor and industry views
2Wai’s funding reflects cautious optimism among AI investors, but monetizing grief remains a “third rail” niche. Venture firms have been burned by similar startups amid ethical pitfalls; Eternal Digital Assets, a graveyard-AI hybrid, shut down last year due to high churn.
2Wai joins a crowded grief-tech field. HereAfter AI, founded in 2019, creates “Life Story Avatars” from pre-death interviews, emphasizing consent. StoryFile offers interactive video from recorded sessions and has been used at memorials such as Ed Asner’s; it filed for Chapter 11 bankruptcy in 2024, owing $4.5 million, but is reorganizing with new safeguards.
Replika, a chatbot service launched in 2017, lets users simulate the dead through text or calls, but it faced backlash after a 2023 update “killed” personalized bots, and the 2023 suicide of a Belgian man was linked to his climate-anxiety conversations with a chatbot.
No federal rules govern posthumous digital likenesses, but California’s AB 1836, signed in September 2024, prohibits unauthorized AI reproductions of deceased performers’ voices or likenesses in audiovisual works without estate consent, with penalties of $10,000 or actual damages, whichever is greater. Lawmakers remain wary of extending such protections to non-celebrities, even as concerns over election deepfakes grow.
2Wai did not immediately respond to Decrypt’s request for comment.