An AI app called 2wai has drawn serious backlash recently for promoting a bizarre feature: creating avatars of people's deceased loved ones.
All you need to do is upload a short video of them, and the app does the rest, allowing you to chat with someone you lost whenever you feel like it.
I'm sure it was pitched with the intention of helping people deal with grief, but it raises some huge ethical issues.
It’s disrespectful and unsettling, and many netizens saw it as a form of necromancy!
We need to talk about what this means for society and our natural grieving process.
A Dangerous Illusion
Creators of 2wai tried to spin their product as something wholesome.
It was pitched as an opportunity for children to meet the relatives who passed away before they were born, or for people to find peace after loss.
It’s promoted as something that would help us heal and make the grieving process easier.
They can sugarcoat it all they want – it’s still creating a digital replica of a person who’s no longer alive. It’s not a memorial, like a photo, but an avatar clone!
It’s meant to mimic their voice and appearance, and make them say whatever you want, which is blatantly disrespectful!
Is It Necromancy?
Many netizens agree that this way of using AI apps comes way too close to necromancy.
It takes the videos and voice recordings you have of the deceased person and turns them into an interactive avatar.
In most cultures around the world, almost anything that has to do with the dead is sacred and taboo, as it should be.
Since the dawn of civilization, the way we approach the dead has been heavily regulated because they’re helpless; it was up to the living to uphold their dignity and send them off with honors.
Does playing with an avatar that looks and sounds just like them sound dignified?
If it’s considered unethical to use a living person’s face for an interactive avatar without their consent, the same should go for the deceased.
Disrespecting the Memory of the Deceased
Instead of holding onto the memories you created with a loved one while they were with you, now you're supposed to have a stiff, AI-generated conversation with them.
This whole thing rubs so many people the wrong way, which is honestly a relief; it means our moral compass isn’t completely broken.
Creating a cartoonish avatar of a deceased person sounds a lot like invading their privacy and a breach of moral boundaries.
No matter how well-meaning the intent, turning someone's image into a talking avatar risks trivializing their life and death.
It undermines the sacredness of their memory.
Interfering with Natural Grieving
Grieving is a process that has to be given time and patience. It involves pain, denial, acceptance, and eventually, moving on.
Using an app like 2wai to create an avatar of them disrupts this natural cycle by artificially “keeping them alive.”
Instead of accepting the reality of death – which is the only healthy way to approach it – some people might get addicted to these avatars.
It prevents emotional healing and allows you to replace genuine connection with a digital copy. This is a recipe for unhealthy attachments!
The last thing people need after losing a loved one is to get stuck in a loop of nostalgia.
Healthy grieving means gradually getting used to life without them; you can’t do that while clinging to their AI clone.
Potential Misuse
Let’s not beat around the bush – the internet is full of all kinds of freaks. It’s only a matter of time before someone misuses this app.
Just look at the way people use AI chatbots, or avatars that mimic celebrities.
The potential for abuse is enormous, and it’s impossible to control how these avatars will be used!
This concept is still in its infancy, but it's already raising alarms, and that's a clear sign that we need to be cautious before it's too late.
Moral Backlash
2wai has sparked outrage among the majority of people, who see its use of AI as deeply unethical.
Many feel it’s an insult to the dead and a violation of human dignity.
And this backlash is encouraging; it means people still have a sense of morality and respect for the natural order.
It also shows that most of us haven’t accepted the idea that AI can ever replace real human interaction.
Let's just take this as a chance to highlight how important it is to set boundaries with new technology, especially when it plays with the dead and with human emotions.
It should serve us and help us progress, not manipulate and dishonor our deepest human experiences!
What’s Next?
It’s clear that AI has incredible potential, and it’s already making our lives easier in many ways.
However, when it comes to something as delicate as death, we must tread carefully.
Apps like 2wai don’t serve any genuine purpose; they’re just exploiting people’s grief. Some ethical boundaries need to be established.
Grieving is a deeply personal process that we have to go through for the sake of our own well-being; we mustn't let it be disrupted.
AI evidently has a place in our modern lives, but some things, like love, memory, and loss, are best handled authentically.