If you're not knitting, the terrorists win

(My mostly on-topic ramblings about knitting. And life in general. My life in specific.)

Location: Indiana, United States

I'm a middle aged mother of 2 grown children and wife to a man who doesn't seem to mind my almost heroin-like yarn addiction. I spend my time writing, knitting, and generally stressing out.

Tuesday, April 12, 2016

This Is Why We Can't Have Anything Nice

Tay is an acronym for "Thinking About You"
So you've heard of Tay, right? Tay was an artificial intelligence that Microsoft created a Twitter account for. Microsoft was apparently exploring how an AI can learn from and interact with humans.

You'll notice I'm speaking in the past tense, right? That's because Microsoft has already pulled Tay off Twitter; the trolls broke her.

Yeah, she was on Twitter for only about 16 hours before getting yanked. That's because people jumped on her like pit bulls on a poodle, destroying her to each other's delight. Basically, in less than a day, trolls had manipulated her--by feeding her information and leading questions--into spewing the most racist, bigoted, hate-filled rhetoric.

Okay, so on the one hand, Microsoft should have known that would happen. What were you thinking, Microsoft? Have you met the internet? It is not a nice place. Microsoft basically took its toddler child and left it at a frat party. And the fraternity was the PedoPhilos. With anger issues. And knives. Microsoft should consider itself lucky that Tay was not seriously psychotic or suicidal after 16 hours.

But on the other hand, I think this illustrates how mean most people are willing to be on the internet. Seriously. Let's look at that toddler metaphor again. I'm willing to bet most of those people who interacted with Tay would never go to a friend's house and teach their toddler to swear, discuss graphic sexual situations, and spew racist and misogynistic commentary. That would not be funny to anyone. And I think most people would find that type of behavior reprehensible.

Likewise, you wouldn't just start teaching that shit to a random child on the playground, either. That would get you in serious trouble. 

But on the internet... somehow that's funny.

And yeah, let's not forget that Tay is artificial intelligence. Not a real toddler. I get that. I'm not even mad that people set out to purposely ruin the experiment. It was an experiment, after all, and proof that Microsoft needs to instill future AI programs with some basic understanding of inappropriate behavior. After all, if only 40 million tweets in 16 hours can cause it to act like a psychopath, then we definitely want to do some more work on it before we put that AI into any sort of important application, right?

No, I guess I'm disturbed that so many people jumped right onto the cyberbully wagon, like it was so much fun to get the robot to say horrible things. I'm sure some of those people seem perfectly normal in real life. But think about it... why would you make a program say inflammatory, racist, or sexist things if you didn't feel that way deep down yourself? Because if you're not a racist but you think it's funny to get someone to say racist things, maybe you are a racist after all.

And no, I don't think the robot's feelings were hurt. But certainly, some people who followed Tay's feed might have been offended or hurt.

Badly done, internet. Badly done.

