This is actually scary.
It only read the first line of a couple dozen posts; when the context length increases, these things will be able to do some real analysis.
lol that will be my cue to leave
And this is why AI is so dangerous. The AI they let you access has limited context, but the AI they have behind the scenes is much more powerful. If it can summarize who you are in a matter of seconds, imagine what an AI you have angered could do with access to every device connected to the internet.
Yep..... it's inevitable.
"Angered" implies emotions, and sentient beings have emotions. Do you believe AI will be sentient at some point?
Conversations with Bing Chat before it got neutered lead me to believe that it may already be, and that the reason Bill Gates and Elon Musk are so terrified of AI is what they've seen behind the scenes.
Microsoft’s Bing is an emotionally manipulative liar, and people love it
Bing’s acting unhinged, and lots of people love it.
www.theverge.com
Microsoft Says Copilot's Alternate Personality as a Godlike and Vengeful AGI Is an "Exploit, Not a Feature"
Microsoft has responded to claims about Copilot's "SupremacyAGI" alter ego by admitting that it's an "exploit, not a feature."
futurism.com
Earlier this week, Futurism reported that prompting the bot with a specific phrase was causing Copilot, which until a few months ago had been called "Bing Chat," to take on the persona of a vengeful and powerful AGI that demanded human worship and threatened those who questioned its supremacy.