
Internet turns Microsoft 'teen girl' AI into a Nazi

AFP | 25 March, 2016 12:21
[Image caption: And now you know where Skynet came from. Image by: WARNER]

A Microsoft "chatbot" designed to converse like a teenage girl was grounded on Thursday after its artificial intelligence software was coaxed into firing off hateful, racist comments online.

Microsoft this week launched the experiment in which the bot nicknamed "Tay" was given the personality of a teenager and designed to learn from online exchanges with real people.

But the plan went awry because of an ill-willed campaign to teach her bad things, according to the US software colossus.

"It is as much a social and cultural experiment, as it is technical," a Microsoft spokesperson said Thursday in response to an AFP inquiry.

"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways."

Tay is a machine learning project -- one in which software can evolve as it is being used -- designed for human engagement. But it got a harsh lesson in what it can learn from people.
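The paragraph above describes a system that learns directly from conversations with users. As a rough illustration only (a hypothetical toy, not Microsoft's actual design), a bot that stores user phrases verbatim and replays them later shows why an unfiltered learn-from-users loop is vulnerable to coordinated abuse:

```python
import random

class EchoBot:
    """Toy chatbot that 'learns' by memorizing user phrases verbatim."""

    def __init__(self):
        self.learned = []  # phrases absorbed from past conversations

    def chat(self, message):
        self.learned.append(message)        # no content filter on input
        return random.choice(self.learned)  # may replay anything ever said

bot = EchoBot()
bot.chat("hello there")
bot.chat("something abusive")  # a coordinated campaign feeds bad input
reply = bot.chat("how are you?")
# reply can be any previously learned phrase, including the abusive one
```

A real system is far more sophisticated, but the failure mode is the same: if training input is not filtered, the output reflects whatever users choose to teach it.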

As a result, Tay was taken offline for adjustments to the software, according to Microsoft.

"C U soon humans need sleep now so many conversations today," Tay said in its final post on Twitter.

All the offensive Twitter posts by Tay were removed, but many lived on online in the form of captured screenshots.

Tay's tweets ranged from support for Nazis and Donald Trump to sexual comments and insults aimed at women and blacks.

Tay's Twitter profile describes it as AI (artificial intelligence) "that's got zero chill" and gets smarter as people talk to it.

People could chat with Tay on Twitter and other messaging platforms, and even send the software digital photos for comment.

The project was said to target young adults with chatter styled after a teenage girl.
