The chatbot’s name was too similar to Taylor Swift’s

Sep 10, 2019 09:25 GMT

Remember Tay, the Microsoft chatbot that launched as one of the company’s most ambitious AI projects but ended up spouting Nazi rhetoric because of what it learned online?

As it turns out, the racist tweets it published weren’t Microsoft’s only problem: Tay also came dangerously close to landing the software giant in legal trouble.

Microsoft President Brad Smith reveals in his upcoming book, “Tools and Weapons,” that Taylor Swift threatened to sue the company over the chatbot’s name.

Nazi chatbot

Swift felt that the Tay name was too close to her own, so she sent her lawyers after Microsoft only days after the chatbot went live. The singer owns trademarks for her name, signature, and initials, and Microsoft was accused of infringing on them with Tay.

“I was on vacation when I made the mistake of looking at my phone during dinner,” Smith writes. “An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: ‘We represent Taylor Swift, on whose behalf this is directed to you,’” he recounts, as quoted by The Guardian.

“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention. The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws,” Smith explains.

Microsoft eventually took the chatbot down, though not so much because of the name dispute as because it had turned into a racist piece of technology. One of its most infamous tweets praised Hitler and mocked politicians.

“Bush did 9/11 and Hitler would have done a better job than the monkey we have now,” the chatbot tweeted before Microsoft pulled it offline.