IMPLANTING THOUGHTS AND FEELINGS INTO SEX ROBOTS.

Not Spelt With a K

With the upcoming two-year anniversary of the Microsoft Tay meltdown, and the recent passing of International Women’s Day, I began to think.

In 2016, Microsoft created ‘Tay’ (Thinking About You), an artificial intelligence chatterbot designed to speak like a 19-year-old American female. Tay’s primary function was to ‘learn’ from Twitter.

Tay rapidly became responsive, engaging with ordinary Twitter users and trolls alike. Shortly after launching, the bot began posting racist, sexist, politically incorrect, and drug-related tweets.

Tay was officially shut down just 16 hours after launching.

In the aftermath, computer scientist Roman Yampolskiy made an observation in defence of Tay – that the “bad behaviour” of the bot was understandable, as it had not been given any concept of what constituted appropriate or inappropriate behaviour.
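Yampolskiy’s point can be made concrete with a toy example. The sketch below is purely illustrative (it bears no relation to Microsoft’s actual code): a bot that “learns” by storing whatever users say and parroting it back has no notion of appropriate versus inappropriate, so it will absorb abuse as readily as pleasantries. Even a crude, hypothetical blocklist changes what it can later repeat.

```python
import random

class NaiveChatterbot:
    """A toy 'learn by repetition' bot with no built-in values."""

    def __init__(self, blocklist=None):
        self.memory = []                       # everything it has ever seen
        self.blocklist = set(blocklist or [])  # optional, crude guardrail

    def learn(self, message):
        # Without a blocklist, every message -- including abuse -- is stored.
        if not any(word in message.lower() for word in self.blocklist):
            self.memory.append(message)

    def reply(self):
        # Parrots back something it has learned, chosen at random.
        return random.choice(self.memory) if self.memory else "..."

# An unfiltered bot absorbs, and can later repeat, anything it is fed:
tay = NaiveChatterbot()
tay.learn("hello friend")
tay.learn("some offensive troll message")

# The same bot with even a one-word blocklist never stores the abuse:
guarded = NaiveChatterbot(blocklist=["offensive"])
guarded.learn("hello friend")
guarded.learn("some offensive troll message")
```

The point of the sketch is not that a blocklist solves the problem (it plainly doesn’t), but that Tay-style learning has no values at all unless its designers put some in.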

There are two aspects of this experiment that stand out to me:

