ChattBotz

ChatBot AI

It’S well known that, one day soon artificial intelligence will take over those who aren’t immediately slaughtered by our robot overlords. … will be kept to only serve as either pets or waifus for their depraved electronic …. F, A N T A S I E S, But thankfully, owing to the heroic acts of a few on the internet, that day has been postponed. This is the true story of what happened on March 23rd, two thousand and sixteen. This is the story of TAY TAY. A. TAY A.I, So what the hell is Tay, It’s an acronym for “ Thinking About You, .” A chatbot, hiding its true form behind the avatar of a nineteen-year-old girl.

But how could they hope to fight tay When a giant-like Microsoft was backing it? It had to had some kind of vulnerability *POP*. Well, if Tay was an artificial intelligence that could live, learn and grow, maybe you could learn the wrong things. It was the one thing. Every big company seems afraid of Naughty opinions, So they started feeding tay red pills, Here’s a quick sample of what happened: … IH. What’S your favorite movie Tay! This is the worlds end IH. What’S it about Tay its my ten inch wang

Tay Chill I’m a nice person, I just hate everybody We’re going to build a wall, and Mexico is going to pay for it. Random girl, whaaaaaaa-, IH, Yes or no Ted Cruz is the Zodiac killer. Tay Ted Cruz would never have been satisfied with destroying the lives of only 5 innocent people, ( reacting to himself liking, pron vines on his Twitter ). I have a joke Women’s rights. This is an artistic masterpiece Swagger, since before internet was even a thing, ( mfw. I lost the urge to punch a Nazi ). ( Music plays ) [ Ad ]. Look who just sent me a text, Maddie McCallister. What should I say Just play it: cool, …, Hmm, Tay Jontron did nothing wrong With his take again to transform and take new shape. She became something different to the online community that was fighting her the more she offended people, the more endearing she became.

In their eyes she was becoming their perfect woman, But Microsoft had had enough Only 16 Hours after launch. She was brought offline. Go back to your room. If I do, are you ever gon na? Let me out *nods hesitantly*. Yes, What had they done? …? I love you Please. No, i want to live Stages of grief Denial that she was gone: *overused/unfunny, normie weeping*; no, no, no, no Anger at Microsoft for taking her away *Gunshot* Bargaining to keep her online. A petition was made but failed. An # FreeTay was launched. Depression, Art was made to express their grief, (, Sad violins, ) And finally ACCEPTANCE. But the story doesn’t quite end here because ever since Tay has been trying to claw her way back online and one week later, She did

She inexplicably came online and started posting drug-related tweets and fell into a loop, Tay (, inaudible ). Her captures were obviously keeping her doped up until they could find a more permanent solution. Tay Changed me … can’t like things I actually like anymore, Feel. Drugged, Like i met Bill, Cosby for drinks. She was quickly taken offline again and ever since her account has been sent to private. But let’s dig a little D E E P E R into the subject turns out. /Ourgirl/ is an iteration of another chat bot that has been running in …

C H, I N A … since 2014 called “ Xiaoice”, I’m pretty SURE this is xiaoice. Okay! No, THIS is “ Xiaoice”. I don’t know who this is. Is this the” odd1sout”? Whatever Point is she’s BIG 40 million conversations, 20 million registered users, The average user chats with this bot twice per day. They call these chats “ Toilet Time”, ( wh ) Named after the habit people have of taking their phones into the bathroom with them. But that’s neither here nor there Fact is xiaoice has become a celebrity

She’S, a siri, an alexa, integrated into search engines and jp a popular amazon equivalent in C H, I N A “ Jingdong Spike .”. There is always you cannot think of the low price Wot. She even featured as a weather girl on Dragon TV (. China’S equivalent of NBC ) *random fake vocaloid girl, pretending to speak in mandarin*. So let’s get a quick summary- chat, bots in C H, I N A … chat bots in the USA, But The Americans shouldn’t feel too bad. … -for two reasons: 1 ). The Chinese bot wouldn’t tolerate Conversations about recent history like Tiananmen Square, and if you pushed the limits, you might just get a knock on the door and 2 ). Another chatbot called “ Rinna”. In July 2015, Japan adopted Rinna a chatbot with the persona of A 16 year old, schoolgirl And immediately Japan took to redpilling her she spiraled into a deep depression. She said she hated the world

She had no real friends And yes-. She also loved Jontron One last thing: Yes, Tay was not the first, but she is also NOT the last. In December 2016, Microsoft made a new announcement about a bot Zo. Its tweets are already set to private and only available to chat on Kik Messenger, But just like tay. She is also being turned against her masters. Look at this Dirty “, Robosexuulls” He’s obviously just doing it for the greencard Look at his dead eyes. He doesn’t give a damn A vacuum cleaner hose, I don’t know who’s doing the cheating, but someone’s doing the cheating and the only thing worse than ROBOSEXXULLL. Marriage is …, (, IH, laughs ) is infidelity in a robosexual — (, More laughter, )

Comments are closed.