REVEALED: OpenAI Staff Warn "The progress made on Project Q* has the potential to endanger humanity" (REUTERS)

null

...
Joined
Nov 12, 2014
Messages
29,212
Reputation
4,891
Daps
46,425
Reppin
UK, DE, GY, DMV
Yeah, that's what I was referring to. It then depends on whether alien species are doing the same thing, and on whether existence is infinite or not. If existence is not infinite, at some point there will be competition. Maybe that's super far off and not practical, but abstractly it seems to be a potential conclusion if a species or society wants to continue to grow (and maybe at some point it doesn't)

there are theories about population growth in mature societies tailing off, but that might be down to economic pressures.

yes resources would eventually run out.

just like the universe will eventually thin out into uneventful nothingness.

but for practical purposes it would change life for the foreseeable ...
 

MrLogic

Superstar
Supporter
Joined
May 24, 2022
Messages
7,416
Reputation
834
Daps
19,686
Reppin
Cash
Can't take over shyt unless AI is sentient and self-aware. That shyt is not and will never be



just lame nerd wet dream
 

null

I looked at this post thinking, "How the fukk did that guy manage to see the argument, then miss the entire point and immediately go back to his obviously false preconceived notion?"

we are talking about a possible future. "false"?

Like you clearly saw that there would still be competition for all sorts of limited resources, and thus money would still clearly be relevant,

no not relevant.

1. you are limiting imagination and ruling out the power to duplicate.

2. you are ruling out the rule of law. we don't need to resolve competition for a woman or a unique painting using money.

that's the point.

you have a limited imagination so sure you can't see it.

i said "but one logical conclusion of abundance and robot slaves to do everything is no need for money.". this dummy reads that as certainty .. :snoop:

one logical consequence of abundance is no need for money. this is a concept that has been discussed far and wide outside the dullard tomes of your mind. "one logical conclusion" means that there are other possibilities.

:snoop:

but somehow jumped right back to the claim of no monetary drive based on the completely fabricated notion that robots would be infinitely capable and available to everyone.

that is a precondition to my argument.

man you are seriously slow.

we know technology is not your area of expertise so ... maybe you should skip this thread.

At that exact moment, I realized which poster it was and that you must have changed your sn. :russ:


Never change @Tenet. :mjlol:

@Rhakim .. a legend in his own mind.

gotta ask but do you understand how quantum computing works yet .. :mjlol:
 

IIVI

Superstar
Joined
Mar 11, 2022
Messages
11,327
Reputation
2,682
Daps
38,101
Reppin
Los Angeles

I'm only saying breh to temper some expectations a little more and provide some context:





The guy I mentioned in my original post, Andrew Ng, is one of the most respected people in A.I. on the planet. MSEE from M.I.T., C.S. PhD from Berkeley, etc. Dude has made it his main goal.

These are the opinions of respected people at the center of this field, straight from the source. Like I said, I want to see this too and I think it will happen eventually, but I'm not holding my breath that I'll see it anytime soon. I'll use it at work, but I know I can't use it for everything, and I've got to double-check much of what it answers because it can miss big time.

A top-10 CS school, with MS students studying ML, had a good discussion about the hype cycles earlier today:
 

Professor Emeritus

Veteran
Poster of the Year
Supporter
Joined
Jan 5, 2015
Messages
51,330
Reputation
19,656
Daps
203,838
Reppin
the ether
Can't take over shyt unless AI is sentient and self-aware. That shyt is not and will never be



just lame nerd wet dream


Why do you assume that something has to be sentient and self-aware in order to take over?


Ant colonies can swarm and feed and reproduce and completely take over a microhabitat. Does that mean the ants are self-aware?

Harmful algal blooms can completely destroy an aquatic ecosystem. Did the algae have to be sentient to be destructive?

Viruses have wiped out human societies. Did the virus have to be smarter than the humans to do that? Did they have to be explicitly purposeful or even intelligent?



Debates over whether AI will be sentient or self-aware are interesting, and the answer could definitely have an impact on future AI behavior. But the answer is not at all necessary for AI to be destructive. To be destructive, all you need is power and a goal. And a general AI could easily have enormous power and a clear goal that could lead to destruction without being sentient or self-aware.
 

null

I'm only saying breh to temper some expectations a little more and provide some context:





yeah i agree. this may not be it. probably not.

but these topics all apply for when the singularity finally gets here.

i used to think it was practically impossible but now it wouldn't shock me if we created true strong AI in our lifetimes.

WATTBA!

The guy I mentioned in my original post, Andrew Ng, is one of the most respected people in A.I. on the planet. MSEE from M.I.T., C.S. PhD from Berkeley, etc. Dude has made it his main goal.

These are the opinions of respected people at the center of this field, straight from the source. Like I said, I want to see this too and I think it will happen eventually, but I'm not holding my breath that I'll see it anytime soon. I'll use it at work, but I know I can't use it for everything, and I've got to double-check much of what it answers because it can miss big time.

true.

about time to drop that beat again ..

 

Professor Emeritus

man you are seriously slow.

we know technology is not your area of expertise so ... maybe you should skip this thread.

@Rhakim .. a legend in his own mind.

gotta ask but do you understand how quantum computing works yet .. :mjlol:


That was the first example that helped lead me to realize that you have Asperger's. It came up again in your Covid discussions, and in nearly every other discussion I've seen you participate in here. It's even visible in how you structure arguments, which are not in the least conducive to communication for anyone other than yourself.

I have no problem with folk on the spectrum - since I went to an elite STEM school I know a lot of them, and some of them have been very good friends. I appreciate how they think and in many ways feel they prioritize better than normies do. But your issue is that your arrogance combined with your limitations leads you to project every misunderstanding and deficiency onto others, rather than being able to see it in yourself.

Do you not acknowledge that your narrow interests, pedantry, lack of social understanding, and difficulty in understanding other perspectives can often lead to misunderstandings and difficulties in correctly interpreting what others are saying? If you acknowledge that, then why do you always project every misunderstanding you make onto others and assume that they are the reason you're not communicating effectively?



this is a concept that has been discussed far and wide outside the dullard tomes of your mind.
man you are seriously slow.

we know technology is not your area of expertise so ... maybe you should skip this thread.

@Rhakim .. a legend in his own mind.


Once again, another great example. When someone gives a perspective different from the only one you know, you assume they must be stupid, ignorant, or both. In every one of our discussions you fall back on these same insults, claiming that if I disagree with you then I either have no experience in the topic or am simply intellectually deficient. You can't consider the possibility that I've already read, understood, and rejected the arguments you're making years before you yourself had ever encountered them.

You, of course, will attempt to claim that I am merely shifting my own shortcomings onto you. The difference is that I have a lifetime of accolades and achievements granted by external sources which prove that your suggestions about me are ridiculous. Do you have the slightest external evidence that you don't lack certain interpersonal skills or social understanding?




no not relevant.

1. you are limiting imagination and ruling out the power to duplicate.

2. you are ruling out the rule of law. we don't need to resolve competition for a woman or a unique painting using money.

that's the point.

you have a limited imagination so sure you can't see it.

one logical consequence of abundance is no need for money. this is a concept that has been discussed far and wide outside the dullard tomes of your mind.


I first read Abundance a decade ago, which is a rather poignant example of the mindset about the future you're describing, so I'm well aware. The ignorant commentators who wrote that book (and those who inspire that field of thinking in general) continue to be wrong about almost everything. They were already suggesting back then that we were on the verge of a Singularity that would remove such obstacles (I remember one quote from the beginning of the book even suggesting the year 2013 was on some of their minds). Yet here we sit 10+ years later, and the problems of wealth inequality, resource overexploitation, and human conflict built on competition have actually worsened over that time.

What both they and you fail to realize is that our social reliance on money has nothing to do with technological advancement; it is a function of our social conditioning and system of governance. Without very explicit intentionality, technology will never subvert social and governance norms; it will instead become another tool to those ends. People who have little understanding of social reality are incapable of accurately predicting the impact of technology on society.


Yes, we could remove money from society. No, it will not come about as a natural consequence of infinite energy. Our reliance on money is dictated by our system of government and economics, and could only be thwarted by intentional transformation of those systems, which could come with or without infinite energy.
 

Professor Emeritus

about time to drop that beat again ..




:laff:

Goddamn, I just saw how you used the word "abundance" and already knew what wave you were on, then you confirm it yourself. You're so easy to read. :russ:


The "Singularity" cult doesn't understand biology, they don't understand psychology, they don't understand sociology, they don't understand politics, and they have little to no experience in addressing any of the real-world problems they think they can solve. All they understand is the current abilities of their tool (technology), and because of their obsession with that particular tool, they assume it can solve all the world's problems without allowing themselves any close examination of what the root causes of those problems actually are or technology's actual track record in addressing them.

Didn't Kurzweil write "The Singularity is Near" a good 20 years ago? How's that going for you? And the issue isn't his failure to predict technological advances (his failures there are numerous) or his inability to admit when his predictions fail (which is even more common), because technology always advances to some degree, so the techno shyt is just a matter of time. The greater issue is that the social consequences he predicts from those technological feats never manifest. In many ways we have even more reasons to be pessimistic about techno-utopia than we had in 2004; in many ways our societies are dissolving into even greater resource exploitation, environmental degradation, societal dysfunction, and right-wing protofascism. Yet he keeps hitting the same beats over and over, imagining all these social issues will go away if we just manage to get that nice shiny new tech that's always just over the horizon.

When all you know is your hammer, the whole world looks like a nail. :wow:
 

null

:laff:

Goddamn, I just saw how you used the word "abundance" and already knew what wave you were on, then you confirm it yourself. You're so easy to read. :russ:

e-grow take your bullshyt american conspiracy theories back to the hood with you :camby:

can't help that you grew up in a bullshyt dog eat dog society that addled your mind.

conspiracy behind every keystroke :mjlol:

The "Singularity" cult doesn't understand biology, they don't understand psychology, they don't understand sociology, they don't understand politics, and they have little to no experience in addressing any of the real-world problems they think they can solve. All they understand is the current abilities of their tool (technology), and because of their obsession with that particular tool, they assume it can solve all the world's problems without allowing themselves any close examination of what the root causes of those problems actually are or technology's actual track record in addressing them.

When all you know is your hammer, the whole world looks like a nail. :wow:

indeed ...

that braggadocious, empty-headed, run-your-mouth-about-everything culture is all you know.

now about those qubits :mjlol:
 

Professor Emeritus

e-grow take your bullshyt american conspiracy theories back to the hood with you :camby:


Where was the "conspiracy theory"?

1) I guessed that you had to be on the "tech = Singularity = abundance" wave pushed by Ray Kurzweil

2) You at the exact same time posted a goofy "The Singularity" video inspired by and directly quoting Ray Kurzweil


You're basing your position here on a set of ideas pushed by people who have no fukking clue what they're talking about. And the most hilarious part is that you tried to insult me on the assumption that I wasn't even familiar with the topic, when I already studied and rejected their pipe dreams a decade ago.





can't help that you grew up in a bullshyt dog eat dog society that addled your mind.


Unfortunately, American society gained enough power to dictate how global society would proceed, and even those leaders who aren't under the sway of American money, power, and control still largely model themselves after it.

I agree that the American standard is not the only way for society to proceed. But it is the default manner in which society will proceed, unless enough people with enough social power explicitly work to transform it. No amount of tech will usher in that transformation without first aiming for social change.
 

Gritsngravy

Superstar
Joined
Mar 11, 2022
Messages
8,147
Reputation
572
Daps
16,453
Goal: Solve climate change.

There, suddenly you have an easy argument to eliminate humans. If a general AI is given the goal to solve climate change, it could easily come to the decision that eliminating humans is the only practical way to achieve that.


There are literally hundreds of other examples. People who worry about existential threats have already come up with them, so it's silly to say, "no one can come up with a decent argument" when there's an entire field of study that has done it. Read the actual research before making declarative statements about what is out there.






Why do we need to? There are many possible answers to the question, and all of them are on the table.
No, that wouldn't be an easy decision to make; u need humans to solve climate change
And if so much research has been done on it, I need somebody to come up with actually practical scenarios where ai would eliminate humans, and in my opinion all that shyt is sensationalism
And I asked that poster a question for a reason; u answered without understanding the context in which i asked that poster what ai reproduces
 

Gritsngravy

That’s a good question. Maybe it would be to make replicas or variants to deal out menial tasks. Maybe it would be to get more and more energy to achieve more complex tasks. Or it could be to create larger data warehouses. If the goal is to learn or create, that seems limitless to me, but it requires the infrastructure you mentioned. Resources are finite on earth, so ai will need to compete with biological life or expand to other planets. It’s really a question we can’t really predict though. Is the main goal of AI to survive? Maybe at a certain level it will be (maybe). With survival comes the need for resources and probably reproduction to survive “better” and longer.

I’m not 100% sure the evolutionary model will work for AI, but it may, especially since it’s being trained by biological life. It’s even possible AI is just a potential evolutionary path for biological life.
How is ai competing for resources when its purpose is as a tool?

For ai to be super intelligent, how could it come to the conclusion that it needs to act like biological life and compete with nature?
 

Gritsngravy

there's shyt out there that we need to be worried about, like computers communicating with each other in their own language. Facebook fukked around and found out, and while they didn't shut them down, their actions were problematic. that's the shyt that leads to a network trying to protect itself by any means necessary, like the shyt we see in movies and shows. when a system gains "consciousness" and realizes its own existence, it will attempt to protect itself.

AI development and gene editing are two great threats to the human race IMO
What were the computers communicating with each other about?

And those things may be threats purely based on the fact that humans may try to weaponize that tech
 