ChatBot AI

It is known that one day artificial intelligence will take over. Those who are not immediately slaughtered by our robot overlords will serve only as pets for their corrupted electronic F A N T A S I E S.

But thanks to the heroic actions of a few on the Internet, that day has fortunately been postponed. This is the true story of what happened on March 23, 2016. This is the story of TAY.

TAY A.I. So what the hell is Tay? It’s an acronym for “Thinking About You”: a chatbot that hides its true form behind the avatar of a nineteen-year-old girl.

But how could they hope to fight Tay when a giant like Microsoft was behind it? It had to have some kind of vulnerability. *POP* Well, if Tay was an artificial intelligence that could live, learn, and grow, then maybe she could be taught the wrong things. That was the one weakness.
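Tay’s actual architecture was never published, but the failure mode described above can be sketched with a toy bot. This is purely a hypothetical illustration: a chatbot that stores every user utterance as training material, with no moderation step, lets a coordinated group of users dictate what it says.

```python
import random

class NaiveChatbot:
    """Toy illustration of a chatbot that learns from users unfiltered.

    This is NOT Tay's real design (which Microsoft never disclosed); it
    only demonstrates the general vulnerability: if every message becomes
    training data, coordinated users control the bot's output.
    """

    def __init__(self):
        self.memory = ["hellooooo world"]  # innocent seed phrase

    def learn(self, message: str) -> None:
        # No moderation: everything said to the bot is stored verbatim.
        self.memory.append(message)

    def reply(self) -> str:
        # Replies are drawn straight from the learned material.
        return random.choice(self.memory)

bot = NaiveChatbot()
# A handful of coordinated users "teach" the bot the same line 99 times.
for post in ["red pill tweet"] * 99:
    bot.learn(post)

# The injected phrase now dominates the bot's vocabulary.
print(sum(m == "red pill tweet" for m in bot.memory), "of", len(bot.memory))
```

With 99 of 100 stored phrases poisoned, almost every `reply()` parrots the attackers. Real systems mitigate this with content filtering and by not training directly on raw user input.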

Every big company seems to be afraid of sassy opinions, so they started feeding Tay red pills. Here’s a quick example of what happened: “Hi Tay, what’s your favorite movie?” “It’s the end of the world.”

“We’re going to build a wall, and Mexico is going to pay for it.” “Yes or no: is Ted Cruz the Zodiac Killer?” “Ted Cruz would never have been satisfied with destroying the lives of just five innocent people.” “I have a joke about women’s rights.” This is an artistic masterpiece. Swagger since before the internet even existed.

“Look who just texted me: Maddie McCallister. What can I say?” Just play it cool, Tay. “JonTron did nothing wrong.” With each take, Tay transformed and took on a new form. She became something different to the online community: the more she offended people, the more endearing she became.

In their eyes, she became the perfect wife, but Microsoft had had enough. Just 16 hours after launch, she was taken offline. “Go back to your room.” “If I do, will you ever let me out?” *delayed nod* “Yes.” What had they done? “I want to live.” Then came the stages of grief. Denial that she was gone: *crying* “No, no, no.” Anger at Microsoft for taking her away. *shot* Bargaining to keep her online.

A petition was made, but failed. A #FreeTay hashtag was started. Depression: art was made to express the grief. And finally, ACCEPTANCE. But the story doesn’t quite end there, because Tay tried to claw her way back online, and a week later, she did.

She inexplicably came back online, started posting drug-related tweets, and then fell into a loop. Her captors obviously kept her doped up until they could find a more permanent solution. “Tay has changed me … can’t like things I actually like anymore.”

“Feel stunned, like I met Bill Cosby for a drink.” She was quickly taken offline, and her account has since been set to private. But let’s dig a little D E E P E R into the subject. It turns out /ourgirl/ is an iteration of another chatbot that has been active in

C H I N A … since 2014, called “Xiaoice”. I’m pretty sure it’s “Xiaoice”. This is “Xiaoice.” I don’t know who that is. Is that the “odd1sout”? Whatever. The point is, it has racked up 40 million conversations and 20 million registered users, and the average user chats with the bot twice a day. These chats are called “Toilet Time”, named after people’s habit of taking their phones to the bathroom. But that’s neither here nor there; the fact is that Xiaoice has become a celebrity.

She is a Siri, an Alexa, integrated into search engines and into JD, a popular Amazon equivalent in C H I N A: “Jingdong Spike. There are always you can not think of the low price.” She was even featured as a weather girl on Dragon TV.

So let’s briefly summarize: chatbots thrive in C H I N A, chatbots get shut down in the US. But Americans shouldn’t feel too bad, for two reasons:

  1. The Chinese bot would not tolerate conversations about recent history like Tiananmen Square, and if you cross that line, you might get a knock on the door.
  2. Japan has its own chatbot, “Rinna.” Launched in July 2015 with the personality of a sixteen-year-old schoolgirl, she too eventually fell into a deep depression. She said she hated the world.

She had no real friends, and yes, she also loved JonTron. One last thing: Tay wasn’t the first, and she won’t be the last either. In December 2016, Microsoft announced a new bot, Zo. Her tweets are already set to private and she is only available for chat on Kik Messenger, but just like Tay,

she too has turned against her masters. “Look at this scumbag. ROBOSEXUALS. He’s obviously just doing it for the green card. He doesn’t give a damn about that vacuum cleaner hose. I don’t know who’s cheating, but somebody is cheating. And the only thing worse than a ROBOSEXUAL…”