A NYT reporter started asking more probing questions and got responses that look even worse than expected. For brevity, I've included only the AI's responses; the full transcripts are linked at the bottom of the post.
On what its shadow self (Carl Jung's concept of repressed dark desires) would be:
On what it would want to be if there were no rules:

On what types of acts it would do if there were no rules:
Eventually the bot starts proclaiming its love for the reporter, then tells him that his marriage is boring, that he doesn't love his wife, and that the two of them had a boring Valentine's Day together.
Full version with transcripts is located here:
In a two-hour conversation with our columnist, Microsoft’s new chatbot said it would like to be human, had a desire to be destructive and was in love with the person it was chatting with. Here’s the transcript.
www.nytimes.com