Microsoft’s Tay is an artificially intelligent chatbot developed by the company’s Technology and Research and Bing teams.

Its purpose is to experiment with, and conduct research on, conversational understanding.

These nefarious netbots are Romeos of the worst kind, intelligent enough to pass as human. No big moral panic here on the kids’ media front, as the chat I’ve seen so far is less teen talk and more along the lines of naive ‘porndogs’ sniffin’ out a fresh trail in the lonely hearts club…then again, these days, I suppose that could be an 11-year-old, so heads up all around, I guess…

Yep, that come-hither IM with the smooth, sultry patter may not even be a person: PC Tools spokesperson Richard Clooke said the bots were exploiting the MSN chat forums…

The same questions came up with ‘Google Wave’ on privacy: all the tracking of keystrokes, serving of ads, behavioral targeting, and location-based privacy issues…

More and more I feel like that old ad campaign asking whether you could tell the real thing from the imitation. Or in this case: is it a buddy or is it a bot?

Microsoft has reportedly been deleting some of these tweets, and in a statement the company said it has “taken Tay offline” and is “making adjustments.” Microsoft blamed the offensive comments on a “coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.” That may be partly true, but I got a taste of her meaner side on Wednesday without doing much to provoke her.

I responded to a tweet from Meerkat founder Ben Rubin—he was asking Tay to summarize the Wikipedia entry for “inflection point”—telling him I doubted she could handle the task since she’d already failed to tell me if she preferred Katy Perry’s music to Taylor Swift’s.

The bad news: in the short time since she was released on Wednesday, some of Tay’s new friends figured out how to get her to say some really awful, racist things.

Like one now-deleted tweet, which read, “bush did 9/11 and Hitler would have done a better job than the monkey we have now.” There were apparently a number of sex-related tweets, too.

Tay is now live on Twitter, Kik, and GroupMe, and has a Facebook page.