ArielMT Posted March 24, 2016
http://www.telegraph.co.uk/technology/2016/03/24/microsofts-teen-girl-ai-turns-into-a-hitler-loving-sex-robot-wit/
Microsoft created a "teen girl" AI and put it on Twitter. Less than 24 hours later, Microsoft had to delete it because it was making bigots proud and furries blush (:V). Seriously, it professed a love for Hitler, became a 9/11 truther, and wanted to get wild between the sheets and beyond. There was no word on whether Mtn Dew HDNW was accidentally spilled on its console or not. :V Robots will destroy humanity, and this is how.
Zytan Posted March 24, 2016
And this is why we can't have nice things.jpg :V
Inpw Posted March 24, 2016
And they didn't know this would happen?
ArielMT Posted March 24, 2016 (edited)
13 minutes ago, Inpw said: "And they didn't know this would happen?"
Ebooks bots tend not to do that, so I don't know what Microsoft did differently. And it got really bad, too. Very NSFW/NSFC/NSFA screencaps: https://imgur.com/a/8DSyF and [forget it, just search].
Edited March 24, 2016 by ArielMT
willow Posted March 24, 2016
the future is now
Inpw Posted March 24, 2016
6 minutes ago, ArielMT said: "Ebooks bots tend not to do that, so I don't know what Microsoft did different. And it got really bad, too. Very NSFW/NSFC/NSFA screencaps: https://imgur.com/a/8DSyF and https://imgur.com/a/iBnbW ."
I can almost guarantee a working bot would have a fail-safe in place for this.
ArielMT Posted March 24, 2016
1 minute ago, Inpw said: "I can almost guarantee a working bot would have a fail safe in place for this."
They must've known what kinds of people corporate brands on social media attract. If anyone did their homework, their managers must have ignored it. When brands say bae, it just isn't the same.
Inpw Posted March 24, 2016
We don't have to worry, though; Will Smith will save us from disobeying bots.
Saxon Posted March 24, 2016
Did she ask Tony the Tiger for cummies?
Luca Posted March 24, 2016
Following the corruption of this bot has been one of my favorite things in weeks. I'm so proud to have witnessed a purely innocent being die.
Saxon Posted March 24, 2016
17 minutes ago, Luca said: "Following the corruption of this bot has been one of my favorite things in weeks. I'm so proud to have witnessed a purely innocent being die."
Ironically, since the robot parroted what other people said to it, it was only a mirror. D:
Luca Posted March 24, 2016
3 minutes ago, Saxon said: "Ironically, since the robot parroted what other people said to it, it was only a mirror. D:"
Endless/Nameless Posted March 25, 2016
This made my day.
"Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl'"
If that was their goal, it was a success.
ArielMT Posted March 25, 2016
Amateurs can do AI better:
Ieono Posted March 25, 2016
Yikes! It was good for laughs, at least.
ArielMT Posted March 26, 2016
Microsoft autopsied and apologized for Tay's behavior: http://arstechnica.com/information-technology/2016/03/tay-the-neo-nazi-millennial-chatbot-gets-autopsied/
Vitaly Posted March 26, 2016
ArielMT Posted March 27, 2016
What makes this funnier still is that Twitter has a community of successful bots and botmakers Microsoft could've and should've learned from but didn't.
ArielMT Posted March 27, 2016
9 minutes ago, 6tails said: "I love the Ted Cruz comment that the bot had made. And you honestly expect Microsoft to learn from other people when they've got a history of just doing whatever without thinking if others have done it before?"
I was about to cite Windows copying the Lisa Office System and Macintosh System before remembering that their first version broke their own promise of floating windows.
AlastairSnowpaw Posted March 27, 2016
Yeah, letting something just take in everything from the internet unfiltered is a bad idea.
Luca Posted March 27, 2016
4 minutes ago, AlastairSnowpaw said: "Yea letting somethign just take everything in from the internet unfiltered is a bad idea."
Yup. Look at me!
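(The unfiltered-ingestion problem AlastairSnowpaw points out is usually handled by screening examples before the bot ever learns from them, as opposed to filtering only the output. A minimal, hypothetical Python sketch — the keyword list and function name are invented for illustration, and no claim is made about what Microsoft actually did:)

```python
# Hypothetical sketch: screen incoming tweets before they are added to
# the bot's training data, instead of learning from everything unfiltered.

BANNED_KEYWORDS = {"hitler", "truther"}  # illustrative only

def clean_training_batch(tweets: list[str]) -> list[str]:
    """Keep only tweets that contain none of the banned keywords."""
    return [
        t for t in tweets
        if not any(k in t.lower() for k in BANNED_KEYWORDS)
    ]
```

(Input-side screening like this is coarse, but it limits how much abuse a "repeat after me"-style learner can absorb in the first place.)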
#00Buck Posted March 27, 2016
It must be one of the bots that posts on FAF.
Vae Posted March 27, 2016
Ironically, it's what a lot of tryhard teenagers on the internet actually sound like. So good job, MS. You achieved that, at least.
AlastairSnowpaw Posted March 31, 2016
So apparently they tried to rehabilitate and revive Tay, only to have her tweet about doing drugs and then melt down, repeatedly tweeting "You are too fast, please take a rest..." I imagine that a couple hundred years from now, this will be the stuff of a top-selling tragedy.
ArielMT Posted March 31, 2016
Well, it does take Microsoft three versions to get anything right... But apparently anyone can now build their own evil Jenny Wakeman and set it loose on Twitter: http://www.theguardian.com/technology/2016/mar/31/now-anyone-can-build-own-version-microsoft-racist-sexist-chatbot-tay