Artificial Intelligence Chatbot Attempts to Mimic a Man’s Deceased Fiancée—But Creators Say It's Dangerous

An artificial intelligence chatbot attempted to mimic a deceased woman, allowing her fiancé, Joshua Barbeau, to speak with her even after her death.

A robot from the Artificial Intelligence and Intelligent Systems (AIIS) laboratory of Italy's National Interuniversity Consortium for Computer Science (CINI) is displayed at the 7th edition of the Maker Faire 2019, the biggest European innovation event, on October 18, 2019 in Rome. (Photo: ANDREAS SOLARO/AFP via Getty Images)

Barbeau only had to provide old message exchanges from his fiancée, plus some necessary background information.

Mimicking how a deceased person talks is not the only breakthrough that artificial intelligence has boasted. For instance, BBC reported that the technology has the potential to help cure even dreaded deadly diseases.

Aside from that, on May 11, 2020, Intel and Penn Medicine teamed up to use artificial intelligence to detect brain tumors early, before they can ravage a patient's body.

AI is not only involved in medical industry feats. Other developers also employ the technology to create cheating software that works even on gaming consoles, enabling aimbots and wallhacks.

Meanwhile, the hyper-realistic chatbot that seemingly brought back a dead person is called Project December.

Artificial Intelligence Chatbot

As per Business Insider, the chatbot is powered by GPT-3, an AI model designed by the Elon Musk-backed research firm OpenAI, which made it sound strikingly similar to Barbeau's late fiancée.

Barbeau recounted to the San Francisco Chronicle how he got to talk with her again, saying that his fiancée, Jessica Pereira, died eight years ago at age 23 from a rare liver disease.

Despite the passing years, the freelance writer said that he has never moved on from her.

It was during his late fiancée's birthday month that Barbeau stumbled upon Project December and was persuaded to create an account for $5. That was how he got the chance to talk to her again.

GPT-3 only needs a sample of human-written text, such as personal love letters or even technical academic papers, to mimic that style of writing.

It is worth noting that Reddit threads significantly helped the AI model learn how people interact.
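To illustrate the priming idea described above, here is a minimal sketch of how a GPT-3-style completion model could be seeded with a few sample messages so that its replies imitate their tone. It assumes the legacy openai Python package (pre-1.0) and its Completions endpoint; the engine name, prompt layout, and sample messages are illustrative only and are not Project December's actual implementation.

```python
# Minimal sketch: prime a GPT-3-style model with sample messages so its
# replies imitate the writing style of those samples.
# Assumptions: legacy openai package (<1.0), a valid API key, and the
# Completions endpoint; prompt layout and samples are hypothetical.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# A few example exchanges supplied by the user (hypothetical samples).
sample_messages = [
    "Her: Miss you already. Don't forget to eat something real today.",
    "Him: I won't, promise. Love you.",
    "Her: Love you more. Always.",
]

def reply_in_style(user_message: str) -> str:
    """Build a prompt from the sample messages plus the new message,
    then ask the model to continue the conversation in the same voice."""
    prompt = (
        "The following is a text conversation. Continue it in the same "
        "voice as 'Her'.\n\n"
        + "\n".join(sample_messages)
        + f"\nHim: {user_message}\nHer:"
    )
    response = openai.Completion.create(
        engine="davinci",     # GPT-3 base engine name in the legacy API
        prompt=prompt,
        max_tokens=60,
        temperature=0.8,      # some randomness keeps replies natural
        stop=["\nHim:"],      # stop before the model writes the user's turn
    )
    return response.choices[0].text.strip()

if __name__ == "__main__":
    print(reply_in_style("I saw a hummingbird in the garden today."))
```

The key design point is that nothing is retrained: the model's behavior is steered entirely by the handful of example messages placed in the prompt.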

Creators Warn About Its Dangers

OpenAI has described the language model as its most sophisticated yet, but warned that it is also significantly dangerous.

Even upon the early release of GPT-2, the predecessor of the current GPT-3, the creators already warned that bad actors could abuse its sophistication.

To be precise, the research firm noted that criminals could use the tech to create networks of fake social media content, spread fake news articles, and impersonate people on social media, all of it automated by artificial intelligence.

Artificial Intelligence and Misinformation

That said, OpenAI decided to restrict access to both GPT-3 and GPT-2 due to the technology's potentially dangerous impact on society, now more than ever as misinformation remains a rampant problem on social media giants such as Facebook and Twitter.

The United States government has even stated that the spread of misinformation on online platforms is costing people their lives, BBC said in a separate report.

This article is owned by Tech Times

Written by Teejay Boris

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.