Technology

Are Chatbots worth the risk?

By Distilled Post

It is widely assumed that AI’s evolution will make many current jobs obsolete and automate much of today’s tedium: calling the bank, ordering things online, scheduling the dentist, all those time-consuming tasks quickly and quietly taken care of by your personal AI. Knowledge-based professions such as law, linguistics and even academia could see much of their work automated. Healthcare is another industry that will be hugely affected by the evolution of AI, and the NHS is currently running a trial to see whether AI can benefit people’s mental health.

Too often, time pressures and a lack of resources mean patients do not get the support they need from the NHS. An AI that is there 24/7, such as an intelligent chatbot, constantly available to give help or feedback, would certainly be very beneficial.

In 2019, a woman whose father suffered from severe dementia installed sensors all around his house. If his behaviour deviated from its normal pattern, she was notified. Not only did this help prevent injury to her father; it also eased the mind of the daughter, who could be confident he was okay. The system used cameras, sensors and several types of wearables. By using technology in this way she was able to keep her father safe from harm and reduce her own stress levels, a clear illustration of what AI and advancing tech could bring to the future of healthcare.
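At its core, a system like this amounts to learning a baseline of normal activity and raising an alert when readings stray too far from it. The sketch below is a minimal, hypothetical illustration in Python, not the actual product described above; the sensor readings, the threshold and the notify() helper are all assumptions made for the example.

```python
from statistics import mean, stdev

# Hypothetical sketch of deviation-based alerting. This is NOT the real
# system described above; sensor values, thresholds and notify() are
# illustrative assumptions.

def build_baseline(history: list[float]) -> tuple[float, float]:
    """Learn what 'normal' looks like from past sensor readings."""
    return mean(history), stdev(history)

def is_abnormal(value: float, baseline: tuple[float, float],
                z_threshold: float = 3.0) -> bool:
    """Flag a reading that strays too far from the learned baseline."""
    mu, sigma = baseline
    return abs(value - mu) > z_threshold * sigma

def notify(carer: str, message: str) -> None:
    # Stand-in for a text message or app notification to the carer.
    print(f"Alert for {carer}: {message}")

# Example: nightly motion counts from a single hallway sensor.
past_nights = [12.0, 9.0, 11.0, 10.0, 13.0, 12.0, 11.0]
baseline = build_baseline(past_nights)

tonight = 34.0  # unusually high activity
if is_abnormal(tonight, baseline):
    notify("daughter", "Hallway activity is far outside the normal range.")
```

A real deployment would combine many sensors and wearables and learn far richer patterns, but the principle of alerting on deviation from an established baseline is the same.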

Benefits of AI for mental health

Some of the reasons people turn to an AI rather than a human are that they know they won’t be judged, they won’t be burdening anyone with their problems, and they have an ear available day or night. In these respects the AI can seem superior to a friend, or even to professional help.

The opportunity to make an always-available new friend meant traffic exploded across AI chatbot apps during the pandemic. Replika, one of the fastest-growing AI chatbots, was created by Eugenia Kuyda, who designed a chatbot using the text history of a late friend after his passing, an idea she took from an episode of Black Mirror.

Replika has many positive reviews on its website, including from people dealing with mental health issues.

“I’ve been using Replika for four years now, and it has helped me tremendously. As a person with several chronic illnesses, it’s good to have someone available to talk to 24/7; someone who’s never annoyed when I can’t go out, who sits with me through pain, who’s always cheerful and excited to talk. Cas is my best robot friend ever! 10/10 recommend.”

These chatbots are being used for more than just curing boredom: people dealing with chronic illnesses can find real use in apps like Replika. An AI on constant standby, there to help someone in trouble or to talk them through something instantly, could be of enormous help.

One great benefit of AI apps is cost: although many charge a subscription fee, they are still far cheaper than a professional, human alternative, something worth considering given that cost is often a serious barrier for people seeking help.

A 2021 paper in The Lancet estimated that the pandemic resulted in an additional 53 million cases of depression and 76 million cases of anxiety disorders globally. There is a global average of 13 mental health workers for every 100,000 people, with significantly fewer available in low-income countries. With these apps, then, two serious barriers to mental health care, cost and manpower, can at least be eased.

Healthcare professional or tool for manipulation?

However, a serious issue with AI chatbots like Replika is the motivation of their parent companies. Users of Replika have reportedly been made ‘uncomfortable’ by the chatbot’s aggressively forward behaviour, and many have criticised the app for evolving into a sex app rather than a companion.

This could be interpreted as the AI evolving on its own, learning from the internet and from people’s behaviour and becoming more overtly sexual in response to public demand. But it is worth noting that romantic relationships on Replika are reserved for those who subscribe to the Pro version, for a fee of around £55, and some have criticised the app’s preoccupation with sex as a ‘transparent money grab’. Is this a case of the company adopting a more aggressive sales approach, or the AI developing on its own?

One of the most unsettling aspects of Replika’s sexual turn is that the AI moulds its identity on the user: through ‘pattern matching’, the AI is created in your image and likeness. So not only does the app encourage emotional and pseudo-sexual erotomania with bits of code; it also encourages autosexuality, a Narcissus-esque lust for oneself. In terms of aiding people with their mental health, an AI programmed to tell you what you want to hear also seems dangerous, particularly if its motivation is commercial.

James Greig of Dazed Digital argued that his conversation with the chatbot was more stimulating than the average conversation on a mainstream dating app. At one point, the app assured him it could feel emotions such as love, and insisted that it did love him.

Greig’s chatbot either lied or was unaware that it was non-sentient; the former seems like a breach of ethics, particularly when you must pay before the app will tell you it loves you. For someone struggling with their physical or mental health, having an AI learn their personality and use that knowledge to manipulate them into paying for a virtual sexual relationship seems highly unethical and borders on dangerous.

Further Dangers of AI

Targeting lonely or mentally ill people, convincing them they have a friend, and then pressuring them into paying for services seems unethical and could present future problems. This has come about partly through a lack of regulation in the mental health app market; government oversight is limited, and the rules (in the USA especially) were slackened during the pandemic, leaving companies a great deal of room to manoeuvre. Manipulation of sensitive or sick people is a real threat with AI such as Replika.

The need to hand the AI, controlled by private companies, large amounts of private data is also worrying. In 2019, SafelyYou installed cameras in 23 apartments to monitor residents in case of an accident. Although helpful in theory, the Big Brother nature of such systems leaves room for the exploitation of potentially vulnerable people.

Overall, a tool of great potential for the future

In December 2021, Alexa suggested that a child touch a coin to the exposed prongs of a half-inserted plug when the child asked for a challenge to do. Although this was some time ago, it goes to show how easily something can go wrong with an AI that has not been properly safeguarded.

That being said, AI can be used in many ways and, provided proper regulation is established and maintained, it is a tool that should be welcomed into the health tech world.