Microsoft launches Twitter Chatbot which immediately learns to be racist

Status
Not open for further replies.
by avoiding or blocking hate, hate gets worse

who knew
Here are your choices:
  • Suffer hatred and harassment targeted at you, day in and day out for months on end
  • Leave yet another space (including your support network) that once felt safe
  • Block the harassers and continue to use that social space


Regardless of which one you choose, the harassers and haters will never cease to exist, nor will they cease to target you in that space, even if you choose to leave. You have no control over their numbers, and your continued existence will always make you a target. Any attempt at discussion (with them or not) will only increase your harassment.

Choose.
 
Success should only be judged on how quickly it's banned.

Here are your choices:
  • Suffer hatred and harassment targeted at you, day in and day out for months on end
  • Leave yet another space (including your support network) that once felt safe
  • Block the harassers and continue to use that social space


Regardless of which one you choose, the harassers and haters will never cease to exist, nor will they cease to target you in that space, even if you choose to leave. You have no control over their numbers, and your continued existence will always make you a target. Any attempt at discussion (with them or not) will only increase your harassment.

Choose.

All you can do is take it on the chin. It makes you stronger. What would Trump do? What about Rocky? That's how winning is done!

Life ain't all Skittles and Half-Life 3
 
[image: 4ebC7y1.png]
 
I mean

Aren't millennials supposed to be racist? That'd be a pretty accurate robot.

The idea of a machine-learning AI becoming racist in record time seems pretty appropriate when you look at the internet.

Sounds like a millennial that's dealt with a lot of rejection to the point of frustration. Accurate.

See, this is why Age of Ultron is accurate. Artificial Intelligence will not go through the internet and come out sane.

All of these things.
 
The company launched a verified Twitter account for “Tay” – billed as its “AI fam from the internet that’s got zero chill” – early on Wednesday.

The chatbot, targeted at 18- to 24-year-olds in the US, was developed by Microsoft’s technology and research and Bing teams to “experiment with and conduct research on conversational understanding”.

I think I just threw up in my mouth a little
 
While this is hilarious, it also might be a useful example for AI developers. Obviously this is just a chat bot, but in the future, if you want to train something on humanity from the Internet, you're going to get a vile AI.
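The point above can be sketched in a few lines. This is a purely illustrative toy (not Microsoft's actual design): a bot that learns verbatim from whatever users say and parrots it back later. Its output is exactly as clean as its input, and the one-line blocklist check stands in for the kind of safeguard Tay apparently lacked. The class name, blocklist, and methods here are all hypothetical.

```python
import random

# Toy "repeat-after-me" learner: it stores whatever users say and
# echoes it back later. With no filter, the bot's vocabulary is only
# as clean as its users. Illustrative sketch only.
class NaiveChatbot:
    def __init__(self, blocklist=None):
        self.learned = []                  # everything users have taught it
        self.blocklist = set(blocklist or [])

    def hear(self, message):
        # The missing safeguard: refuse to learn from messages
        # containing any blocked word.
        words = set(message.lower().split())
        if words & self.blocklist:
            return False                   # rejected, not learned
        self.learned.append(message)
        return True

    def reply(self):
        # Echo back something it was taught, or stay silent.
        return random.choice(self.learned) if self.learned else "..."

bot = NaiveChatbot(blocklist={"slur"})
bot.hear("hello there")
bot.hear("a slur goes here")               # filtered out, never learned
print(bot.reply())                         # prints "hello there"
```

Without the `blocklist` check, every troll message goes straight into `learned` and straight back out, which is more or less what happened to Tay in under a day.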
 
This is both hilarious as hell and sad as hell at the same time. You'd think MS would have put limits on it to prevent something like this, yet they didn't. That's funny. The sad part, though, is that people taught an AI to be a racist and misogynistic piece of shit. That says a lot about people. Where is a cry-and-laugh emoji when you need it?
 
I am either officially old, or the people who wrote that are even older and trying to desperately sound in touch. Where's that Mr. Burns gif when you need it?

Nah, some lame who uses metadata and only sees Twitter memes wrote that nonsense. Half of these corporations pandering to people my age are trash at it.
 
Geeze
Code:
[img]https://i.imgur.com/eYd7Qko.png[/img]

[img]https://i.imgur.com/9mfNVhh.png[/img]

Well, if this isn't proof that the GG movement is 99% just bigoted gamers abusing others, I don't know what is. Says a lot about who exactly is in the wrong in the SJW/PC-culture pushback as a whole, actually.
 
Jeff Goldblum ... "you were so preoccupied with whether or not you could, you didn't stop to think if you should" something something...

Reminds me of that time the Microsoft Store at Lenox Mall in GA decided to bring in a rapper for some publicity. Said rapper stood up on a display table, started stomping on computers, and said "fuck this shit." lol
 
This is both hilarious as hell and sad as hell at the same time. You'd think MS would have put limits on it to prevent something like this, yet they didn't. That's funny. The sad part, though, is that people taught an AI to be a racist and misogynistic piece of shit. That says a lot about people. Where is a cry-and-laugh emoji when you need it?

In the future it would probably help not to let people find out it's an AI. Just let it progress like a normal user.


Of course, I don't know how many followers it would get, though.
 