Zo is a chatbot that doesn’t talk politics at all

Dec 6, 2016 10:56 GMT  ·  By

Long-time Microsoft watchers probably remember Tay, a chatbot launched by the software giant that the Internet corrupted in just a day, as users taught it to make racist and sexist remarks.

After spending a long time improving the AI system that powered Tay, Microsoft appears ready to give chatbots another try with Zo, which is available exclusively on Kik Messenger.

The new bot is configured from the start to avoid anything controversial, so it won't discuss politics, racism, pornography, or any other topic that could turn it into an Internet troll.

Let’s talk pickles

In a statement to Bloomberg, Microsoft says it's still experimenting with Zo, but the company has clearly adopted a more cautious approach to keep the new chatbot from turning hateful, as happened with Tay.

“Through our continued learning, we are experimenting with a new chatbot,” a Microsoft spokeswoman was quoted as saying. “With Zo, we’re focused on advancing conversational capabilities within our AI platform.”

And at first glance, Zo does indeed avoid politics. Asked whether Donald Trump is racist, the bot simply says that “that kinda language is just not a good look,” and when asked about Hillary Clinton, it replies that “maybe you missed that memo but politics is something you’re not supposed to casually discuss.”

The funny thing is that Zo picks some pretty unexpected topics when asked what it wants to talk about, such as the inventor of the first pickle. As for its ambitions, the bot told the aforementioned source after a test that it wants to achieve world domination.

For the moment, Zo is exclusively available on Kik, and it's still unknown whether Microsoft plans to release the bot on other platforms, such as Twitter, as it did with Tay.