Black Mirror episode, except the episode is real life.

Reading Time: 3 minutes

In episode 1 of season 2 of “Black Mirror” we observe a rather usual situation for a sci-fi series: 

Martha’s boyfriend Ash is killed while returning a hire van… Martha learns she is pregnant, and through a service that aggregates Ash’s many social media posts and online communications, an artificial intelligence (AI) imitation of Ash is created. Martha interacts with him via instant messaging and video call… The next stage is a physical android version of Ash, which begins to make Martha uncomfortable. She argues with him and takes him to a cliff, where she orders him to jump, but Martha is then frustrated because the real Ash would not have obeyed such a command. Several years later, on Martha’s daughter’s birthday, the daughter takes a slice of cake to the android Ash kept in the attic, which she is only permitted to visit on weekends.

The idea of talking with the deceased isn’t new: from the beginning of humanity, when we were still living in tribes and using opium, cannabis, alcohol, tobacco, and various fungi to experience enlightenment, through the Salem witch trials and other religious persecutions, to the notorious Victorian spiritualism practices. People still try to reach out to their deceased loved ones; some still use a spirit board, and some use an AI.

We all probably understood that the plot of the Black Mirror episode might become real at some point, but most of us couldn’t imagine that within just a few years somebody would really take a swing at such a sacred and intimately emotional thing as death and the artificial resurrection of a loved one. 

Isn’t it kind of a Tower of Babel thing? 

But at the same time, don’t the most vulnerable places of human nature bring in the biggest money, huh?

And what’s more, this somebody is none other than Microsoft, which means the project will probably see a big turnover in the distant future. The company isn’t planning to turn the technology into an actual product in the near future, though, because, as Tim O’Brien, Microsoft’s general manager of AI programs, said, it “predates the AI ethics reviews we do today.” They may be afraid of too strong a negative reaction to this product, but should we blame them?

According to the patent information, the tool would cull “social data” (images, social media posts, messages, voice data and written letters) of the individual. That data would be used to train a chatbot to “converse and interact in the personality of the specific person.” It could also rely on outside data sources if the user asks a question that can’t be answered based on the person’s social data.

“Conversing in the personality of a specific person may include determining and/or using conversational attributes of the specific person, such as style, diction, tone, voice, intent, sentence/dialogue length and complexity, topic and consistency, as well as using behavioral attributes such as interests and opinions and demographic information such as age, gender and profession,” the patent states.
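To make the patent’s description more concrete, here is a minimal toy sketch of the pipeline it outlines: aggregate a person’s “social data,” attach conversational attributes (here just a style tag), answer from that data, and fall back to an outside data source when the question can’t be answered from the person’s own posts. All names and the matching logic are my own assumptions for illustration, not anything from Microsoft’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """Toy stand-in for the patent's 'specific person' profile."""
    name: str
    # topic -> something the person actually said about it (the "social data")
    social_data: dict[str, str] = field(default_factory=dict)
    # one conversational attribute (style); the patent lists many more
    style: str = "casual"

def answer(persona: Persona, topic: str, outside_source: dict[str, str]) -> str:
    """Reply in the persona's voice; consult outside data if their own posts can't answer."""
    if topic in persona.social_data:
        return f"[{persona.style}] {persona.social_data[topic]}"
    # Topic never appeared in the person's own data: fall back to an outside source.
    fallback = outside_source.get(topic, "I really couldn't say.")
    return f"[{persona.style}] {fallback}"

ash = Persona("Ash", social_data={"music": "The Bee Gees are a guilty pleasure."})
general_knowledge = {"weather": "Looks like rain today."}

print(answer(ash, "music", general_knowledge))    # drawn from his own posts
print(answer(ash, "weather", general_knowledge))  # drawn from outside data
```

A real system would of course replace the dictionary lookup with a trained language model conditioned on those attributes; the sketch only shows the data flow the patent describes.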

But the real question is: would you use this technology in the future?

https://www.syfy.com/syfy-wire/microsoft-AI-chatbot-patent-talk-to-dead-people

https://www.washingtonpost.com/technology/2021/02/04/chat-bots-reincarnation-dead/

2 thoughts on “Black Mirror episode, except the episode is real life.”

  1. 46362 says:

    I remember watching this episode and thinking how creepy it was! Obviously, there’s always a lot of potential to profit from people who are at their most vulnerable, but I think it’s highly unethical. I understand why someone would like to use it, but in my opinion it can potentially disrupt the grieving process and, in the end, mess with someone’s mind, since they’re at their lowest. The technology is nevertheless fascinating, but I wish it were used for something other than a dead-people chatbot.

  2. 46302 says:

    It’s amazing that we can “train” a bot to behave the way a dead person did. It’s also fascinating to see that every person has something like a pattern of writing, but at the same time it’s scary that a company can have access to all that kind of information and use it to (in a way) steal someone’s identity. I think a time of grief is definitely really hard, but I don’t consider it a good solution for a suffering person to have a constant reminder of their loss. Of course, everyone is different and it may be helpful for some people.
