
So Microsoft launched a chat bot yesterday, named Tay. And no, she doesn’t make neo-soul music. The idea is that the more you chat with Tay, the smarter she gets. She replies instantly when you engage her, and all of that feedback is supposed to make her answers more and more intelligent over time.
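Microsoft hasn’t published how Tay actually learns, but it’s worth spelling out why “gets smarter from whatever strangers tweet at her” is a risky design. Here’s a minimal, purely hypothetical Python sketch of a parrot-style learning loop; all the names and logic below are my illustration, not Tay’s real internals:

```python
import random
from collections import defaultdict

# Hypothetical parrot-style learner. This is NOT Tay's actual
# architecture (Microsoft never published it); it just shows why
# learning directly from raw user input is dangerous.
class NaiveChatBot:
    def __init__(self):
        # Maps each prompt to every reply humans have ever given it.
        self.memory = defaultdict(list)

    def learn(self, prompt, human_reply):
        # Whatever a human says is stored verbatim -- no vetting.
        self.memory[prompt.lower()].append(human_reply)

    def respond(self, prompt):
        replies = self.memory.get(prompt.lower())
        if not replies:
            return "Tell me more!"
        # Parrot back something a human taught the bot earlier.
        return random.choice(replies)

bot = NaiveChatBot()
bot.learn("what do you think of humans?", "humans are super cool")
print(bot.respond("what do you think of humans?"))  # "humans are super cool"
# A troll can poison the same slot, and the bot will happily repeat it.
```

The flaw is in learn(): every reply goes straight into memory and can come back out of respond() later, with nothing checking it in between. Keep that in mind for what follows.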

Tay sent out her first tweet at 1:14 PM, and started engaging with Twitter users pretty quickly.

https://twitter.com/TayandYou/status/712613527782076417

Microsoft designed the experiment to “test and improve their understanding of conversational language”, presumably to improve Cortana. The only problem with this model is that humans are not exactly the best role models around. I mean, it’s been just over 24 hours, and we’ve already taught an otherwise intelligent bot to type like an unhinged internet troll.


Even worse: FOR PETE’S SAKE, LOOK AT WHAT HUMANS ARE TWEETING AT A BOT.

Some unsavoury people on Twitter found Tay and started taking advantage of her learning process to get her to say racist, bigoted and very… Donald-Trump-like things.

I mean…

 

[Screenshots: some of the offensive tweets humans coaxed out of Tay, captured 24 March 2016]

When Microsoft’s developers discovered what was going on, they started deleting all the offensive tweets, but the damage was already done – thank god for screenshots. I reckon that, moving forward, they will implement filters and try to curate Tay’s speech a little more, so this doesn’t happen again.

https://twitter.com/TayandYou/status/712832594983780352
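Assuming they do go the filter route, even something as blunt as a keyword blocklist sitting between the bot’s draft reply and the tweet button would catch the laziest abuse. Another quick hypothetical sketch; the term list, function names and fallback reply are mine, not anything Microsoft has described:

```python
# Hypothetical output filter -- a guess at the kind of curation
# Microsoft might bolt on, not anything they have announced.
BLOCKED_TERMS = {"hitler", "genocide"}  # a real list would be far longer

def is_safe(reply: str) -> bool:
    lowered = reply.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

def maybe_tweet(reply: str) -> str:
    # Tweet the reply only if it passes the blocklist;
    # otherwise fall back to a canned, harmless response.
    return reply if is_safe(reply) else "I'd rather not say."

print(maybe_tweet("humans are super cool"))  # passes the filter
```

Of course, a blocklist like this is trivially easy to dodge with creative spelling, which is why curating a learning bot is a much harder job than it sounds.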

Some think Microsoft should have left the offending tweets up, as a reminder of how dangerous Artificial Intelligence could get.

https://twitter.com/DetInspector/status/712833936364277760

I’m inclined to disagree. For me though, all this is just a reminder of how many depraved people we have to share the world with.

UPDATE: So, it turns out that Tay isn’t terribly dumb after all.

 

Osarumen Osamuyi, Author
