With the two-year anniversary of the Microsoft Tay meltdown approaching, and International Women’s Day having recently passed, I began to think.
In 2016, Microsoft created ‘Tay’ (Thinking About You), an artificial intelligence chatbot designed to speak like a 19-year-old American female. Tay’s primary function was to ‘learn’ from Twitter.
Tay rapidly became responsive, engaging with ordinary Twitter users and trolls alike. Shortly after launching, the bot began posting racist, sexist, politically incorrect, and drug-related tweets.
Tay was officially shut down just 16 hours after launching.
In the aftermath, computer scientist Roman Yampolskiy made an observation in defence of Tay – that the “bad behaviour” of the bot was understandable, as it had not been given any concept of what constituted appropriate and inappropriate behaviour.
Two aspects of this experiment stand out to me: