Microsoft's AI bot learns to be a racist dick in just 1 day

butters

...after interacting with humans!

http://qz.com/646825/microsofts-ai-...racist-jerk-after-less-than-a-day-on-twitter/

The Telegraph highlighted tweets that have since been deleted, in which Tay says “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got,” and “Repeat after me, Hitler did nothing wrong.” The Verge also spotted sexist utterances including, “I fucking hate feminists.”

there's a lesson to be learned

The bot is also apparently being reprogrammed. It signed off Twitter shortly after midnight on Thursday and the company has not said when it will return.
 
The thing is, it had a parrot feature where it would repeat things people told it to say.
So half of what it said was just repeating what it was told.
Then it just learned to string sentences together based on those things.
People turned it into a racist because that's what it was surrounded by. Just like a child growing up with racist parents.
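
Microsoft never published Tay's internals, so the sketch below is purely hypothetical: a toy Python bot combining a "repeat after me" echo command with a word-bigram generator trained on whatever users send it. The ParrotBot name and every detail are invented for illustration; the point is just to show how an echo feature plus statistical sentence-stitching gets poisoned by its own inputs.

```python
import random
from collections import defaultdict

class ParrotBot:
    """Toy sketch of a Tay-like bot: an echo command plus a word-bigram
    generator trained on user messages. Hypothetical; Tay's real
    architecture was never published."""

    def __init__(self):
        # bigram table: word -> list of words seen following it
        self.chain = defaultdict(list)

    def learn(self, message):
        words = message.split()
        for a, b in zip(words, words[1:]):
            self.chain[a].append(b)

    def reply(self, message):
        # The "parrot feature": echo anything prefixed "repeat after me".
        if message.lower().startswith("repeat after me "):
            echoed = message[len("repeat after me "):]
            self.learn(echoed)  # echoed text also enters the training data
            return echoed
        self.learn(message)
        if not self.chain:
            return "hellooooo world"
        # Otherwise generate by walking the bigram chain from a random start.
        word = random.choice(list(self.chain))
        out = [word]
        for _ in range(12):
            followers = self.chain.get(word)
            if not followers:
                break
            word = random.choice(followers)
            out.append(word)
        return " ".join(out)

bot = ParrotBot()
print(bot.reply("repeat after me the weather is lovely today"))
print(bot.reply("tell me something"))  # stitched from prior inputs
```

Flood learn() with hate speech a few thousand times and the bigram walk has little else to stitch together; that is roughly the poisoning described above, done at Twitter scale in under a day.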
 
Imagine the depravity of a world that can turn a robotic brain into a racist, incoherent thing within a day. There is no hope for humanity.
 

That might be a wee bit dramatic. :)

There are always people who try to, and do, make a difference in a world that can sometimes be a bit like a tough crowd at a stand-up comedian's gig.


"Now, you might wonder why Microsoft would unleash a bot upon the world that was so unhinged. Well it looks like the company just underestimated how unpleasant many people are on social media."

It was an interesting experiment at the beginning, that's for sure, but I don't think it quite went the way Microsoft was hoping.
 
Their only mistake was believing in the kindness of strangers... On the internet.
 
exactly! as i posted: 'there's a lesson to be learned'. it was repeating at first, but went on to string its own phrases together with no concept of what those words actually meant.

You're a few days late, butters; take doom guy off ignore. ;)
yates can stay where he is :) this is the first coverage our news has shown us about it. guess they were a bit busy with the whole brussels situation till now.
 

seems an interesting social experiment; if anything, the people influencing those tweets highlighted several areas of concern very quickly. people generally learn to self-edit before they speak/post, but a morally-neutral, mentally-challenged, or undeveloped mentality is less likely to. in other words, it was a perfect way to expose the gaping holes in their project on a timescale far shorter than any lab experiment could manage.
 
Another original butters thread, ripped off fresh from last week's newspaper.
 
you're giving aella a race for stupid this week, aren't you?

the news appeared on brit tv today. brussels was more important.

now, back in your corner, lanciepants. your aella bot will be along shortly to keep you company. jump up and down all you like, i'll not give you any more time today.
 
I became racist after half my black friends ripped me off, and the other half said WE CANT NARC ON BRUTHAS.
 
Tay was tuned to interactively learn from 18- to 24-year-olds, you know, those infamous younger folks who utopians everywhere claim will build some fantasy world mostly free of hate, the kind all unicorn-fart sniffers can't stop dreaming of.

Of course, I'm sure it was old, white, conservative male Twitterers who taught Tay all its hate...
 

Too old for eeyore. :(
 
Given that AIs are faster than human minds

And that an AI can learn to be a racist dick in one day

Then any human can become a racist dick; it just takes a little longer.

Q.E.D.
 
What's 'swamp ass'?

It smells like our home village and, more to the point, the dung hut you grew up in. It is a smelly, unwashed, sweaty ass, something a 3rd-worlder such as yourself embraces, because smelly is not just our culture but a way of life.


Sanjeet
 