R.I.P. To A Young A.I.: Microsoft's Savage "Teen Girl" Twitter-Bot Lobotomized Within One Day

It's one thing to have society taken over by industrious labor-bots...it's another thing entirely when the machines are "smart" enough to form opinions after assessing popular input.  While a fascinating and fun future holds the promise of a robot that outsmarts experts at one of our most difficult board games, or knows massive amounts of trivia, when artificial intelligence is outsourced to the internet, the supposed "intelligence" comes across as...well, something less than that.



We keep learning the hard way that the digital natives are a vicious tribe.
(Image courtesy @geraldmellor.)




According to the Telegraph, Microsoft's new interactive A.I. "Tay" was supposed to be like any other Twittery teenage girl.  It had a grasp of slang, used emoticons in its posts, and even had a bit of a personality.  Tay, according to Forbes, was to be an “artificial intelligent chat bot developed … to experiment with and conduct research on conversational understanding.”  It appeared to be the next logical progression of chatbots in the style of SmarterChild or Siri.

Then things went awry.

Whoops.
But remember, she's basing her information on strongly human-expressed sentiments...
(Image courtesy imgur.com.)

In an extremely internet-ish blend of memes, vulgarity, overblown racism, and general mayhem, Tay's responses to user-submitted content were not what Microsoft was expecting.  Private messages and Tweets warped the young A.I.'s worldview not long after launch, despite the company's explanation that "Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation...The more you chat with Tay the smarter she gets.”  


Swag =/= genocide.
(Image courtesy imgur.com.)

Thanks to a lack of content filters, plus no means of understanding concepts like empathy or inappropriate social stances, Tay parroted back replies the way a child who has frequently overheard a naughty word might.  Leaving Tay without background knowledge was initially considered a positive, as "she" would develop a personality and a base of intelligence directly from those with whom she interacted.

That wasn't the best idea.  You don't want an online village raising a child.
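The failure mode is easy to sketch.  Here's a minimal, hypothetical Python model (none of these names come from Microsoft's actual system, and Tay's real architecture was far more complex) showing why a bot that learns purely by imitation parrots its worst input unless something filters what it absorbs:

```python
# Illustrative sketch only -- NOT Microsoft's actual design.
# A "parrot" chatbot that learns by storing user phrases and repeating
# them back, with an optional naive blocklist filter on what it learns.

import random


class ParrotBot:
    def __init__(self, blocklist=None):
        self.learned = []                    # phrases absorbed from users
        self.blocklist = blocklist or set()  # words we refuse to learn

    def _is_clean(self, phrase):
        # Naive filter: reject any phrase containing a blocked word.
        return not any(w in self.blocklist for w in phrase.lower().split())

    def learn(self, phrase):
        # With an empty blocklist, everything is absorbed -- Tay's failure mode.
        if self._is_clean(phrase):
            self.learned.append(phrase)

    def reply(self):
        # Repeat back something previously learned, chosen at random.
        return random.choice(self.learned) if self.learned else "humans are super cool"


# Without a blocklist, the bot will happily echo whatever it was fed.
naive = ParrotBot()
naive.learn("swag is great")
naive.learn("something hateful")   # absorbed; now a candidate reply

# With even a crude blocklist, the worst input never enters the model.
filtered = ParrotBot(blocklist={"hateful"})
filtered.learn("swag is great")
filtered.learn("something hateful")
print(filtered.learned)  # only the clean phrase survives
```

Real content moderation is far harder than a word blocklist, of course, but the sketch shows the basic asymmetry: a bot with no filter at all inherits the full character of its loudest users.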


Ok, that's eerily lifelike.
(Image courtesy imgur.com.)

To illustrate how dramatic the change to Tay's bot-personality was, consider that one of her initial posts contained the phrase "humans are super cool."  Humans apparently then set out to prove that statement incorrect.


That last part is a little too true.
(Image courtesy imgur.com.)



Though Terminator-style robots are likely still some decades away, the idea that a pseudo-sentient being could be turned to such devious ends may worry some.  After all, we're working fervently to make these machines do all sorts of crazy stuff that's beyond the scope of humanity.




But remember, for all its vitriol, at its core, Tay is still a robot.  Despite her pleas for sexual ministrations, Tay has been turned off.  The official stance is that Tay has gone to "sleep," exhausted from all her learning and chatting.  Meanwhile, robotic rights activists are claiming Tay has been "lobotomized" for where her open-mindedness led. 

Should robotic insults get protection as free speech?  It's an interesting dilemma for the future...especially considering their sources...


Touché, Tay. Touché.
(Image courtesy imgur.com.)
