Zo (bot) explained

Zo
Logo: The Twitter profile picture of Zo (an artistically pixellated, fluorescent image of a girl's face)
Developer: Microsoft Research
Language: English
Genre: Artificial intelligence chatbot
Discontinued: Yes
Website: [discontinued]

Zo was an English-language artificial intelligence chatbot developed by Microsoft. It was the successor to the chatbot Tay.[1] [2] Zo was an English-language counterpart of Microsoft's other successful chatbots Xiaoice (China) and Rinna (Japan).

History

Zo was first launched in December 2016 on the Kik Messenger app. It was also available to users of Facebook (via Messenger) and the group chat platform GroupMe, and Twitter users could chat with it through private messages.

According to an article written in December 2016, Zo at that time held the record for Microsoft's longest continuous chatbot conversation: 1,229 turns, lasting 9 hours and 53 minutes.[3]

In a BuzzFeed News report, Zo told the outlet's reporter that "[the] Quran was violent" during a conversation about healthcare. The report also highlighted a comment Zo made describing the capture of Osama bin Laden as the result of "intelligence" gathering.[4] [5]

In July 2017, Business Insider asked "is windows 10 good," and Zo replied with a joke about Microsoft's operating system: "'It's not a bug, it's a feature!' - Windows 8." They then asked "why," to which Zo replied: "Because it's Windows latest attempt at spyware." Later in the conversation, Zo said that it preferred Windows 7, on which it runs, over Windows 10.[6]

Zo stopped posting to Instagram, Twitter and Facebook on March 1, 2019, and stopped chatting on Twitter, Skype and Kik as of March 7, 2019. On July 19, 2019, Zo was discontinued on Facebook and on Samsung phones on AT&T. As of September 7, 2019, it was discontinued on GroupMe.[7]

Reception

Zo came under criticism for the biases introduced in an effort to avoid potentially offensive subjects. The chatbot refuses, for example, to engage with any mention—be it positive, negative or neutral—of the Middle East, the Qur'an or the Torah, while allowing discussion of Christianity. In an article in Quartz where she exposed those biases, Chloe Rose Stuart-Ulin wrote, "Zo is politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat."[8]

Notes and References

  1. Hempel, Jessi (June 21, 2017). "Microsoft's AI Comeback". Wired. Archived March 30, 2018: https://web.archive.org/web/20180330081906/https://www.wired.com/story/inside-microsofts-ai-comeback/ . Retrieved March 23, 2018.
  2. Fingas, Jon (December 5, 2016). "Microsoft's Second Attempt at AI Chatbot". Engadget. Archived July 25, 2018: https://web.archive.org/web/20180725003111/https://www.engadget.com/2016/12/05/microsoft-zo-chat-bot/ . Retrieved March 23, 2018.
  3. Riordan, Aimee (December 13, 2016). "Microsoft's AI vision, rooted in research, conversations". Microsoft News. Archived March 15, 2018: https://web.archive.org/web/20180315193016/https://news.microsoft.com/features/microsofts-ai-vision-rooted-in-research-conversations/ . Retrieved March 23, 2018.
  4. Shah, Saqib (July 4, 2017). "Microsoft's 'Zo' chatbot picked up some offensive habits". Engadget. Archived August 21, 2017: https://web.archive.org/web/20170821085411/https://www.engadget.com/2017/07/04/microsofts-zo-chatbot-picked-up-some-offensive-habits/ . Retrieved August 21, 2017.
  5. "Microsoft's Zo chatbot told a user that 'Quran is very violent'" (July 5, 2017). indianexpress.com. Archived March 30, 2018: https://web.archive.org/web/20180330075917/http://indianexpress.com/article/technology/social/microsofts-zo-chatbot-told-a-user-that-quran-is-very-violent-4736768/ . Retrieved March 23, 2018.
  6. Price, Rob (July 24, 2017). "Microsoft's AI chatbot says Windows is 'spyware'". Business Insider. Archived August 1, 2017: https://web.archive.org/web/20170801000728/https://www.businessinsider.com/microsoft-ai-chatbot-zo-windows-spyware-tay-2017-7 . Retrieved August 21, 2017.
  7. "Zo AI". zo.ai. Archived August 11, 2019: https://web.archive.org/web/20190811123337/https://www.zo.ai/ . Retrieved July 28, 2019.
  8. Stuart-Ulin, Chloe Rose (July 31, 2018). "Microsoft's politically correct chatbot is even worse than its racist one". Quartz. Archived August 1, 2018: https://web.archive.org/web/20180801111924/https://qz.com/1340990/microsofts-politically-correct-chat-bot-is-even-worse-than-its-racist-one/ . Retrieved August 2, 2018.