Microsoft releases new AI chatbot
@ 2016/12/15
Zo has protection from racists
After an early incarnation of its AI chatbot, Tay, turned racist, Microsoft has had another go with a social bot called Zo.
Zo is built on the same technology that powers Microsoft's other chatbots, Xiaoice in China and Rinna in Japan. Zo is meant to learn from her interactions with humans and can respond to conversations with her own personality.