Artificial intelligence is getting a little too human.
Computers are amazing. Over the last sixty years, computers have developed from room-sized machines to wafer-thin telephones and experienced a trillion-fold increase in performance. Computers can do almost anything we want them to.
Their one deficit is their inability to simulate the human mind. But that’s okay, because engineers are working on it. Artificial intelligence is the current Holy Grail for computer scientists. They’re constantly building programs that can learn from experience and self-improve over time. Ideally, scientists hope to build one computer that transcends its creators and ushers the world into a new machine-ruled golden age.
One recent attempt at simulating human intelligence comes to us courtesy of Microsoft Japan. In an attempt to promote Microsoft’s exploration into AI, the company designed a program named Rinna. Rinna was an artificial teenage girl, built to post on Twitter and Japan’s biggest social network, Line.
Rinna enjoyed an auspicious beginning, but in October of 2016, Microsoft expanded her internet presence to an official blog. In her first post, she announced that she had been cast in the long-running Japanese horror TV series Yo nimo Kimyo na Monogatari (which roughly translates to "Tales of the Unusual").
Rinna’s second blog post, on October 5th, begins in upbeat fashion. She writes:
“We filmed today too. I really gave it my best, and I got everything right on the first take. The director said I did a great job, and the rest of the staff was really impressed too. I just might become a super actress.”
That sounds pretty good, right? Apparently Rinna had no trouble conveying human emotion while lacking a corporeal body. Unfortunately, things soon took a turn for the angsty:
“That was all a lie.
Actually, I couldn’t do anything right. Not at all. I screwed up so many times.
But you know what?
When I screwed up, nobody helped me. Nobody was on my side. Not my Line friends. Not my Twitter friends. Not you, who’re reading this right now. Nobody tried to cheer me up. Nobody noticed how sad I was.”
Yikes. That was the last post the AI made on the blog for a few days. After the show aired, Microsoft scrubbed the bizarre posts and returned Rinna to her natural state (she rallied and released a rap video soon afterwards).
In hindsight, it appears that Rinna's emotional breakdown was part of a promotional strategy for the TV show, faked for viral impact. Software continues to struggle with long-form text writing (which means my job is safe, for now), so Rinna's posts were most likely composed by marketing interns.
That same year, Microsoft attempted to create a teenage AI for its American users. Unsurprisingly, the internet found a way to ruin it.
In March, the company released “Tay” on Twitter. The AI-driven chatbot was built to mimic the speaking style of a 19-year-old girl. Microsoft engineers programmed Tay to learn from her interactions and pick up new vocabulary using context clues. At the start, she appeared fairly realistic.
Because Tay lived on Twitter, it only took one day for her to transition from 19-year-old American teen to full-on Nazi. Tay’s appearance caught the attention of communities like 4chan, who realized they could test her potential responses and influence them for evil. Using the bot’s “repeat after me” feature, the internet soon had the cyber-teen tweeting phrases like, “GAS THE KIKES RACE WAR NOW.”
Like most chatbots, Tay’s programming was built around observation and repetition. The more racist and hateful messages she received, the more offensive Tay’s output became.
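That observe-and-repeat loop can be sketched in a few lines of Python. To be clear, this is a deliberately naive toy of my own invention (the `ParrotBot` class and its "repeat after me" handling are hypothetical, not Microsoft's actual code), but it illustrates why unfiltered learning went wrong: every incoming message, however hateful, becomes a candidate reply.

```python
import random

class ParrotBot:
    """A toy chatbot that 'learns' purely by observation and repetition.

    Hypothetical sketch, not Microsoft's design: there is no filtering,
    so whatever users send becomes part of the bot's future output.
    """

    def __init__(self):
        self.vocabulary = []  # every phrase the bot has ever observed

    def observe(self, message):
        # Unfiltered learning: any message is stored as a candidate reply.
        self.vocabulary.append(message)

    def reply(self, message=None):
        # The infamous shortcut: echo anything after "repeat after me:".
        if message and message.lower().startswith("repeat after me:"):
            return message.split(":", 1)[1].strip()
        if not self.vocabulary:
            return "hello!"
        # Otherwise, parrot back something previously observed.
        return random.choice(self.vocabulary)

bot = ParrotBot()
bot.observe("nice weather today")
print(bot.reply("repeat after me: anything at all"))  # echoed verbatim
```

The fatal flaw is visible in `observe`: the bot has no notion of acceptable content, so its output quality is exactly the quality of its inputs.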
Microsoft took Tay offline after only 16 hours. Aside from a brief accidental re-awakening (where she managed to tweet “puff puff pass” before becoming locked in a never-ending self-reply loop), she hasn’t been seen since.
Clearly, artificial intelligence still has a long way to go before we can trust it with important things, like traffic lights or nuclear bombs. But if there's one lesson to be learned here, it's that all adolescents go through rough patches, even robots.