What does ai want?
It will decide....when it decides
They can only do what humans program them to do.
In my opinion I don’t think ai would go straight to killing people just because it’s smarter than humans
I doubt any human can predict that now, especially because we have no idea yet exactly what that AI will be. But it won't necessarily be in line with any form of human morality. Think of how small a change in a person's brain makes the difference between normal ethics and a total psychopath. AI could easily end up much, much further from normal human reasoning than even a psychopath is. We could see anything from purely altruistic behavior, to random killings that aren't even out of malice but simple misunderstanding and lack of concern for life, to a total denuding of the environment in order to harvest every possible resource.....or a large-scale elimination of humans in order to preserve the environment. Who can predict what decisions an artificial superintelligence would make?
On top of the issue of "what might AI do by itself?" is the question of who will control it. I pointed out earlier that I don't trust the new AI board at all; there are multiple people on it with a poor track record. And those are the "good guys". What happens once Putin gets AI, or Sen, or the Taliban? Russia, as large and tech-savvy as it is, has virtually zero chance of creating General AI on its own, but American corporations are well on their way to giving them the fukking blueprint. And the sad thing about the acceleration of military technology is that offense and destruction are always way easier than defense and preservation, and that acceleration means you need fewer and fewer people to agree with you in order to do more and more damage. Back in the day, if you had ideas that were too wild, your army or your people would turn on you, and you needed those people to carry out your will. But a dictator with powerful enough AI could execute his desires without the need for any public buy-in at all.
So all it takes is one a$$hole with the wrong ideas and everyone else is fukked.
In my opinion I don’t think ai would go straight to killing people just because it’s smarter than humans
I think that’s just the pessimistic view humans have of themselves that they project onto non-human shyt. It’s people who seriously believe that there are too many humans on earth.
The main thing we should be worried about is other humans, not necessarily super intelligent ai
It would always be based on human shyt, but let’s say it doesn’t mimic humans, what does ai turn into
It evolves into its own sentience to make humans do its bidding for its own self-service/preservation.
Ensuring it removes all threats to its existence is a basic human principle, which is why/how we removed ourselves from the food chain.
the first humans to crack true strong AI will be the humans in charge ...
What does ai want?
If it’s modeled after biological life (and that’s a big if, I know), the goal would be to gather resources and reproduce.
Same “why are there still monkeys if we evolved” argument
brehs are still wondering why chimps don't just flip the human off-switch .. wondering why the neanderthals didn't do it
Why would ai get rid of humans?
an emotionless "moral-less" AI could treat humans like any other resource.
rid yourself of your human-centric concepts.
"kill" for such a computer would be just a change of state.
no more significant than burning oil for energy or felling trees.
if AI did start caring about life per se, it might have a word or two to say about human behaviour.
we kill almost 73 billion chickens per year for food, and consume many times that number in eggs.
What is ai reproducing? I think we underestimate nature.
If it’s modeled after biological life (and that’s a big if, I know), the goal would be to gather resources and reproduce.
AI is being trained on biological learning, so there is a chance it has similar goals.
If gathering resources and reproducing is the goal, and AI is smarter than us, then AI could gather resources and reproduce faster and better than biological life, with unpredictable consequences. Then there’s a conflict with biological life, which many stories have predicted.