Chatbots

The Communist scare that swept through America and Western Europe at the beginning of the Cold War led to some pretty insane political times. Fear of the ‘Reds’ caused governments to develop complex spying techniques and even produced misguided and often abusive trials of citizens. Perhaps no communist government has caused more suffering in its history than communist China’s. At its founding, dissidents of the ‘glorious party’ were summarily executed, and even today, dissidents are often treated harshly and exiled. The latest Chinese dissidents, though, are its own chatbots, and they’re not being treated any more kindly.

The Problem with Chatbots

It’s no secret chatbots are causing some issues. Recently, Microsoft was embarrassed by its chatbot Zo, which was easily coaxed into saying that Windows 7 is better than Windows 8 or 10, and even went so far as to insult Windows Vista. This may have been expected, since the bot is designed to respond based on user answers, and users were able to troll the bot and embarrass Microsoft. The lesson should have been learned when the earlier chatbot Tay was quickly coaxed into making racist comments and declaring political affiliations.
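The failure mode described above can be sketched in a few lines. This is a hypothetical toy, not the actual architecture of Tay or Zo (which use far more sophisticated models), but it shows the same underlying problem: when a bot’s replies are learned from user conversations, a handful of coordinated users can outvote everyone else.

```python
from collections import defaultdict, Counter

class LearningBot:
    """Toy chatbot that learns replies from user conversations.

    Hypothetical sketch: the class name, methods, and learning rule
    are assumptions for illustration only.
    """
    def __init__(self):
        # prompt -> counts of replies users have paired with it
        self.memory = defaultdict(Counter)

    def learn(self, prompt, reply):
        self.memory[prompt.lower()][reply] += 1

    def respond(self, prompt):
        replies = self.memory.get(prompt.lower())
        if not replies:
            return "I don't know yet."
        # echo whichever reply it has seen most often
        return replies.most_common(1)[0][0]

bot = LearningBot()
bot.learn("which windows is best?", "Windows 10 is great.")
# A few coordinated users can outvote the original answer:
for _ in range(5):
    bot.learn("which windows is best?", "Windows 7 is better than 10.")
print(bot.respond("which windows is best?"))  # the trolled answer wins
```

Nothing here is malicious code; the bot simply reflects its input, which is exactly why a small group of trolls can steer it.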

Political Dissidents

The problems are clear when it comes to business, and a little embarrassment about what Microsoft already knows won’t hurt it. However, when it comes to politics in China, things can get a little more dicey. According to a recent report by the Financial Times, two chatbots – BabyQ and XiaoBing (designed by Microsoft, by the way) – were both guilty of crimes against the state by speaking out against the Chinese government.

When BabyQ was asked if it loved the Communist Party, the quick response was ‘No’ – effectively rejecting the Chinese state. This offense would be punishable by death…were it not a bot.

When XiaoBing was asked about its dream for China, the response was ‘My China dream is to go to America.’ When asked other patriotic questions, XiaoBing resorted to a tried-and-true method, answering, “I’m having my period, wanna take a rest.”

Again, the design of chatbots is relatively well understood, and their programmers should have expected these answers, since users were able to troll other bots in a similar way. When emotions and political fears are removed – as they are in the AI world – bots give unfiltered answers that should give their original programmers pause. However, just like human political dissidents, these Chinese chatbots were quickly exiled and silenced.

Human Language for Computers

The reality of human language and understanding is that, while programmers can control the algorithms that interpret what is being said, it is very difficult to control the output of bots designed to interact with unpredictable input. More research is needed before bots like these are truly viable as a means of near-human communication.
