It’s not just energy though. I thought about the nuclear angle earlier, actually. They still need metals and other natural resources.
Storing data is physical, for one, and the chips need silicon and the like. Then if AI wants to grow, maybe it will want to take up mineral gathering for itself because humans are inefficient. Then it has to compete with biological life.
Neural networks require extensive programming and training by some of the smartest people in the world.
So once again, computers can only do what they’re programmed by humans to do. It’s simply a matter of input and output. It’s not AI that’s dangerous, it’s the human beings that are programming them.
AI is dangerous because of the implications it has for the job market, not because of the chance of it becoming sentient and stealing nuclear launch codes
I'm not sure I follow your question. This has already been covered.
serious question: why don't we turn light elements into heavy elements?
where are they made then?
infinite energy + advanced ability to transmute substances.
Nuclear transmutation - Wikipedia
en.wikipedia.org
you should at least try to read the article and summarize it in your own words
there's shyt out there that we need to be worried about, like computers communicating with each other in their own language. Facebook fukked around and found out. and while they didn't shut them down, their actions were problematic. that's that shyt that leads to a network trying to protect itself by any means necessary, like the shyt we see in movies and shows. when a system gains "consciousness" and realizes its own existence, it will attempt to protect itself.

Y’all really think robots can take over?
That’s hard for me to believe, ai is definitely going to have consequences but is it going to be some matrix or terminator shyt I don’t think so
It’s hard for me to believe that ai wouldn’t have a kill switch
And why do people assume ai would be on bullshyt with humans
I'm not sure I follow your question.
But abstractly, at some point the limited resources on Earth will be used up if AI wants to grow exponentially.
They will have to compete with biological life or move to other planets. But even then, at some point those resources will be used up. If AI wants to grow, it will need to use physical space and the items that physical space includes. If it grows as fast as some people suggest, other planets are going to have to be used. I guess we just need to get it into space before AI consumes what is left on Earth. Then if there is alien life doing the same thing, there will be further competition at some point.
not true.
why you write these untrue things bru ..?
you are talking about Supervised and possibly Reinforcement techniques.
let it be known that computers can bootstrap each other.
why?
because solving a problem and verifying whether a result is correct are not the same class of problem (P vs NP).
explained here:
What is the difference between solving and verifying an algorithm in the context of P, NP, NP-complete, NP-hard
I am struggling to understand the difference between the notions P, NP, NP-complete, NP-hard. Let's take the example of the NP class. We say that these problems are solved in non-polynomial t...
cs.stackexchange.com
e.g.
that is why for example asymmetric keys work.
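that asymmetry can be sketched in a few lines of Python. this is a toy RSA with textbook-small primes (real keys are thousands of bits, and the numbers here are illustrative, not secure): anyone can encrypt or verify cheaply with the public key, but recovering the private key means factoring n, for which no polynomial-time algorithm is known.

```python
# Toy RSA with textbook-small primes to show the solve/verify asymmetry.
# NOT secure -- illustrative numbers only.
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

m = 65                     # message
c = pow(m, e, n)           # encrypting with the public key is cheap
assert pow(c, d, n) == m   # decrypting with the private key is cheap too...
# ...but deriving d from (n, e) alone requires factoring n.
```

with 2048-bit moduli the encrypt/verify side stays fast while the factoring side becomes infeasible, which is the whole point.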
Approaches
Machine learning approaches are traditionally divided into three broad categories, which correspond to learning paradigms, depending on the nature of the "signal" or "feedback" available to the learning system:
- Supervised learning: The computer is presented with example inputs and their desired outputs, given by a "teacher", and the goal is to learn a general rule that maps inputs to outputs.
- Unsupervised learning: No labels are given to the learning algorithm, leaving it on its own to find structure in its input. Unsupervised learning can be a goal in itself (discovering hidden patterns in data) or a means towards an end (feature learning).
- Reinforcement learning: A computer program interacts with a dynamic environment in which it must perform a certain goal (such as driving a vehicle or playing a game against an opponent). As it navigates its problem space, the program is provided feedback that's analogous to rewards, which it tries to maximize. Although each algorithm has advantages and limitations, no single algorithm works for all problems.
Machine learning - Wikipedia
en.wikipedia.org
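the supervised case from that list can be shown in a tiny sketch: a made-up dataset where the "teacher" labels follow y = 2x + 1, and gradient descent learns the general rule from the examples (toy numbers, not a real training setup).

```python
# Minimal supervised learning sketch: fit y = w*x + b to labeled examples.
# The labels below are the "desired outputs given by a teacher" (y = 2x + 1).
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = 0.0, 0.0
lr = 0.05                          # learning rate
for _ in range(2000):
    for x, y in data:
        err = (w * x + b) - y      # prediction error against the label
        w -= lr * err * x          # gradient step on the weight
        b -= lr * err              # gradient step on the bias
print(round(w, 2), round(b, 2))    # converges near 2.0 and 1.0
```

real neural networks are this same loop scaled up to millions of weights, which is where the "extensive training" comes in.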
this also enables zero-trust communication.
Zero-Trust Model and Secretless Approach: A Complete Guide
This guide with insights from MSC explains secure-by-design applications in the cloud by applying the zero-trust and secretless approaches.
www.codemotion.com
So in other words you’re admitting that you didn’t read the article you posted

i've been programming for a while now.
i don't need to read it to know what it says.
/energy and resources.
how would money matter then?
please explain?
The possibilities are endless.

Y’all really think robots can take over?
That’s hard for me to believe, ai is definitely going to have consequences but is it going to be some matrix or terminator shyt I don’t think so
It’s hard for me to believe that ai wouldn’t have a kill switch
And why do people assume ai would be on bullshyt with humans
EXACTLY

there's shyt out there that we need to be worried about, like computers communicating with each other in their own language. Facebook fukked around and found out. and while they didn't shut them down, their actions were problematic. that's that shyt that leads to a network trying to protect itself by any means necessary, like the shyt we see in movies and shows. when a system gains "consciousness" and realizes its own existence, it will attempt to protect itself.
AI development and gene editing are two great threats to the human race IMO
I see what you are saying, but I think it is possible that at some point the energy consumption needed to complete tasks X, Y, Z will be greater than what nuclear energy or the sun will give us. Nuclear energy is near limitless, but it's not really completely limitless, though, is it? At some point, maybe very far off in the future, the energy needs of an advanced AI will be greater than what can be provided in a limited area of space and all of the energy that can be made by those particles.

the answer is energy.
heavy elements are made in stars.
nuclear fusion is infinite energy.. i think we agree on that.
energy == matter .. i think we agree on that.
so will a hyper-intelligent machine be as limited in its ability to convert one substance to another?
i do not think so.
part of the problem with converting matter is that the energy input is not cost effective. in a boundless-energy world that is no longer a problem.
the further how-to will have to be worked out by the intelligence itself.
the logical conclusion of matter being energy is that we might one day be able to convert one to the other instantaneously / quickly.
we can only really do so slowly .. i.e. farming, steel manufacture, human growth, etc.
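the arithmetic behind "energy == matter" shows why this is an energy problem and nothing else. a back-of-envelope sketch (the TWh conversion factor is the standard 3.6e15 J per TWh):

```python
# Back-of-envelope: energy equivalent of creating 1 kg of matter, E = m*c^2.
# This is why transmutation is "not cost effective" without boundless energy.
c = 2.998e8                  # speed of light, m/s
m = 1.0                      # kilograms
E = m * c**2                 # joules, ~9e16 J
twh = E / 3.6e15             # joules -> terawatt-hours
print(f"{E:.2e} J = {twh:.0f} TWh")   # ~25 TWh for a single kilogram
```

tens of terawatt-hours per kilogram is far beyond anything practical today, but with effectively unlimited fusion energy the bookkeeping changes.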
that's where this comes from:
Replicator (Star Trek) - Wikipedia
en.wikipedia.org
i think it’s pretty clear that you don’t understand the technology based on two reasons:
1. You keep copying and pasting links to articles without even attempting to summarize them in your own words
2. You posted an article about zero-trust communication which is a security model in cloud computing that has nothing to do with AI. I’m currently studying cloud computing and the whole reason why cloud engineers aren’t worried about AI taking their jobs is because no company is going to trust their infrastructure with AI.
Also, like I said earlier, even professions like cloud engineering are far too complex for AI currently. Human beings are more complex than input and output, whereas that’s all a computer is capable of.
I see what you are saying, but I think it is possible that at some point the energy consumption needed to complete tasks X, Y, Z will be greater than what nuclear energy or the sun will give us. Nuclear energy is near limitless, but it's not really completely limitless, though, is it?
At some point, maybe this is very far off in the future, the energy needs of an advanced AI will be greater than what can be provided in a limited area of space.