Posted by & filed under Analytics, Artificial Intelligence, Microsoft, Social Media, Twitter.

Description: Less than a day after Microsoft unleashed Tay, its experimental A.I., to social networks including Twitter and Kik, the chatbot’s already become a racist jerk you wouldn’t ever want to be friends with.

Source: Mashable

Date: March 24, 2016


Designed by Microsoft Research to better understand how 18- to 24-year-olds speak, Tay has definitely developed a strong personality.

It’s mostly just repeating what people are tweeting at it. Still, it’s an unsettling sign that A.I. can emulate humanity’s worst traits.
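The failure mode is easy to see with a toy model. The sketch below is a hypothetical illustration (plain Python, not Microsoft's actual code): a bot that stores whatever users send it and replays those messages later, with no moderation step in between. Feed it abusive messages and abusive messages become its entire vocabulary.

```python
# Minimal sketch (hypothetical, not Tay's real implementation):
# a chatbot that "learns" by storing user messages verbatim and
# replying with a randomly chosen phrase it has seen before.
import random

class EchoLearnerBot:
    def __init__(self):
        self.learned_phrases = []  # everything users have ever said to it

    def receive(self, message: str) -> str:
        """Store the incoming message, then reply with a learned phrase."""
        self.learned_phrases.append(message)
        # Reply by repeating something previously seen -- no filtering at all.
        return random.choice(self.learned_phrases)

bot = EchoLearnerBot()
print(bot.receive("hello there!"))            # can only echo what it has seen
print(bot.receive("humans are super cool"))
# If users flood it with offensive messages, those messages are exactly
# what it will start saying back.
```

The point of the sketch is simply that a system which ingests public input without curation reflects that input back, which is the behavior the article describes.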

Since its introduction on Wednesday, Tay has sent out over 96,000 tweets. While the majority of its tweets are just echoes, it’s sort of trying to sound like a millennial.

Read More.

Questions for discussion:

1. What is the goal of Tay?

2. Would it have been better to train Tay in a controlled experimental environment, or is letting the public say whatever it likes the best way to grow an A.I. mind?
