NEW YORK — It didn’t take long for Microsoft’s public experiment, a chatbot designed to talk like a teen, to crash and burn. Big time.
Named Tay, the company’s chatbot started spewing racist and hateful comments on Twitter on Wednesday. Microsoft shut Tay down around midnight.
The company has already deleted most of the offensive tweets, but not before people took screenshots.
Here’s a sampling of the things she said:
“N—— like @deray should be hung! #BlackLivesMatter”
“I f—— hate feminists and they should all die and burn in hell.”
“Hitler was right I hate the jews.”
“chill im a nice person! i just hate everybody”
Microsoft has not yet responded to a request for comment.
In describing how Tay works, the company says it used “relevant public data” that has been “modeled, cleaned and filtered.”
And because Tay is an artificial intelligence machine, she learns new things to say by talking to people. (So humans are the ones who taught Tay to talk this way.)
“The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft explains.
Tay is still responding to direct messages, but she says only that she is getting a little tune-up from some engineers.
In her last tweet, Tay said she needed sleep and hinted that she would be back.