Chatbot rAInbow gives abused women a nonjudgmental 'friend' to lean on

Artificial intelligence rAInbow aims to curb SA's pervasive domestic abuse problem by giving women a safe space to seek help

03 February 2019 - 00:07
rAInbow gives abuse victims, who often feel isolated, a companion they can talk to without feeling judged.
Image: 123RF/Kamion

Most psychologists agree that being able to talk about stressful situations or occurrences helps people deal with them, but what if there is no one to talk to?

Abuse is an isolating and lonely experience in which the victim is often cut off from the world, both physically and emotionally. Victims often feel helpless and anxious about whether their stories will even be believed. Sometimes they blame themselves and feel ashamed about what they are going through.

"Imagine what we could do if we created a piece of technology that took the shame away, where it was easy to ask for help, and which allowed victims to share what they are going through. Imagine it was as easy as if the victim was talking to a friend," says Kriti Sharma, the founder and CEO of AI for Good and the creator of rAInbow.

The rAInbow chatbot.
Image: Supplied

rAInbow is a new artificial intelligence Facebook chatbot that acts as a safe space for women to ask for help. Sharma, a wunderkind in the world of AI and ethical technology, worked in collaboration with the Sage Foundation and the Soul City Institute to come up with a technological solution to SA's pervasive abuse problem.

According to a Stats SA report, Crime against Women in South Africa, 7.7% of SA's men think it's acceptable to hit their spouse if she argues with them. One in three SA women will experience domestic violence in her lifetime.

After speaking to women who had experienced domestic abuse and become part of these statistics, Sharma understood what needed to be done.

"[The victims wanted] three things: one, they wanted a companion. Two, they wanted the chatbot to be nonjudgmental; they did not want the chatbot to ask them questions or judge them or to give them opinions, and they wanted to go through the process at their own pace, discreetly and anonymously - so that's exactly what we bore in mind when we created it," says Sharma.

The bot can be found on the "Hi Rainbow" Facebook page and has been designed to be easy to use and personable. All it takes is a click on the send message button for the bot to respond: "Hi, I'm Bo. Want to chat about how to tell if a relationship is healthy?"

It keeps the conversation open and gently prompts the user with examples of relationships that "don't feel right". It responds to the user's statements in a way that is neither condescending nor robotic. It is also programmed to pick up on signs of abuse in the user's language, and to steer the conversation towards the kind of helpful information victims need to start healing and dealing with the situations they find themselves in.

The chatbot also shares contacts for points of safety and helps victims design exit plans for themselves. Importantly, it also protects the messages from the victim's abuser - and from third parties.

Although it is currently only available in English and mainly focuses on domestic violence against women, it is a significant step towards creating technology that can make a difference to people's lives.

For more information, visit