Microsoft's "Zo" chatbot picked up some offensive habits

It seems that creating well-behaved chatbots isn't easy. Over a year after Microsoft's "Tay" bot went full-on racist on Twitter, its successor "Zo" is suffering a similar affliction.

https://www.engadget.com/2017/07/04/...ensive-habits/