The PC Thread - Tips, Benchmarks, Specs, Laptops, Custom Desktops, Pre-Builds and more.

MMA

Superstar
Joined
Apr 5, 2015
Messages
5,801
Reputation
2,823
Daps
29,184
Nvidia removing the sli bridges from the 2070 confirms my suspicions... nvidia is a bytch company.


Having said that I got a 1080 ti for $526 so... :lolbron:
Wow I didn't even read that.
This is sounding more and more by the day like a filler card before Volta... nobody is buying the 2080 either, :mjlol: it's available everywhere

They are going to wait for the leftover 10 series cards to sell before they release the 2070, 2060, etc. - Nvidia is not slick. If they gave the 2070 SLI, there probably wouldn't be a need to buy leftover 1080/1080 Tis. They overproduced because miners like myself were turning the average consumer-to-card ratio from 1:1 to 1:5
 

Gold

Veteran
Supporter
Joined
Aug 25, 2015
Messages
43,670
Reputation
19,581
Daps
292,398
Wow I didn't even read that.
This is sounding more and more by the day like a filler card before Volta... nobody is buying the 2080 either, :mjlol: it's available everywhere

They are going to wait for the leftover 10 series cards to sell before they release the 2070, 2060, etc. - Nvidia is not slick. If they gave the 2070 SLI, there probably wouldn't be a need to buy leftover 1080/1080 Tis. They overproduced because miners like myself were turning the average consumer-to-card ratio from 1:1 to 1:5

I agree with everything you said except for the "before volta" part.
I do think the 20 series are a filler series before they perfect this new Turing architecture, but I think Volta is dead.
V100 and Titan V seem to be the only Volta cards and it will probably stay that way.

But yeah if the 2070 had sli... i'm pretty sure it would give the 2080ti a run for its money, and remove the need to ever sli 1080/1080ti as you said.

I'm just happy prices are finally dropping. 2 years without a single price drop is fukked up :gucci:
 

MMA

Superstar
Joined
Apr 5, 2015
Messages
5,801
Reputation
2,823
Daps
29,184
I agree with everything you said except for the "before volta" part.
I do think the 20 series are a filler series before they perfect this new Turing architecture, but I think Volta is dead.
V100 and Titan V seem to be the only Volta cards and it will probably stay that way.

But yeah if the 2070 had sli... i'm pretty sure it would give the 2080ti a run for its money, and remove the need to ever sli 1080/1080ti as you said.

I'm just happy prices are finally dropping. 2 years without a single price drop is fukked up :gucci:
Sorry, us miners truly ruined everything :mjcry:

The 1080 Ti will drop to $600-650 soon, along with the rest of the 10 series. Most new buyers will go there, like Nvidia predicts.

Yeah, I think they are trying to force AMD (who'll power the next Xbox and PlayStation GPUs) into raytracing

Tom's Hardware lost their mind, Nvidia has to be paying them. Look at the preorders - the flop 2080 is still available :mjlol:

Just Buy It: Why Nvidia RTX GPUs Are Worth the Money

Introducing NVIDIA GeForce RTX 20 Series of Graphics Cards
 

daze23

Siempre Fresco
Joined
Jun 25, 2012
Messages
31,817
Reputation
2,682
Daps
43,748
Coder Corner » Blog Archive » Raytracing, RTX 2080, DXR, PhysX

Nvidia just announced the new RTX 2080 and I see a lot of weird, confused comments on the Internet.

So for the record:

“Nobody uses PhysX”: wrong.

PhysX is the default physics engine in both Unity and Unreal. Which means it is used in tons of games, on a lot of different platforms (PC, Xbox, PS4, Switch, mobile phones, you name it).

“PhysX” is not just the GPU effects you once saw in Borderlands. It has also always been a regular CPU-based physics engine (similar to Bullet or Havok).

When your character does not fall through the ground in Fortnite, it’s PhysX. When you shoot a bullet in PayDay 2, it’s PhysX. Ragdolls? Vehicles? AI? PhysX does all that in a lot of games. It is used everywhere and it is not going away.
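To make that concrete: a gameplay raycast in the native C++ SDK is only a few lines. This is a minimal sketch against the PhysX 3.x scene query API - the fireBullet wrapper and the constants are made up for illustration, and scene setup is omitted:

#include <PxPhysicsAPI.h>
using namespace physx;

// Fire a single "bullet" ray into the scene and report whether it hit anything.
bool fireBullet(PxScene* scene, const PxVec3& origin, const PxVec3& direction)
{
    const PxReal maxDistance = 1000.0f;   // how far the bullet can travel
    PxRaycastBuffer hit;                  // receives the closest blocking hit

    // The direction must be normalized; raycast() returns true if something was hit.
    if (scene->raycast(origin, direction.getNormalized(), maxDistance, hit) && hit.hasBlock)
    {
        PxRigidActor* target = hit.block.actor;   // the actor the bullet struck
        (void)target;                             // apply damage, spawn a decal, etc.
        return true;
    }
    return false;
}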

______________________

“That game does not use PhysX because it is not listed on X or Y”: potentially wrong.

Some people pointed out the lists of “PhysX games” found online as “proofs” that PhysX was dead and not used anymore. Unfortunately these lists are incomplete. Even the one on Nvidia’s own website is not up-to-date. There are a couple of reasons for this:

  • PhysX is kind of “done” now, and more-or-less in maintenance mode. The team size shrunk a lot while the focus moved to raytracing, autonomous cars, AI, etc. There is nobody tracking PhysX titles anymore, and sometimes we don’t even have matching PhysX documentation online. In fact, right now the online doc is for version 3.4.0, and it was built more than a year ago. It does not match the latest build on GitHub.
  • It became a lot more difficult to track PhysX titles after we put the code on GitHub. Anybody can register, download and use PhysX these days. And we don’t require the game to include a PhysX logo anywhere, so there are a lot of them we don’t even know about - we hear about them when they ask for console builds or when they ask for help.
  • There is little value for Nvidia to track and list all the games using CPU PhysX. So, we don’t.
______________________

“PhysX is crippled”: wrong.

I already posted about that one. You can read about it here and here for example.

In short: no it’s not.

______________________

“The only reason devs use PhysX is because Nvidia pays them”: wrong.

It’s the opposite. PhysX is a middleware like the others. Unless you get some kind of special mutually beneficial deal, you actually have to pay for support and/or to use the console libs. In the old days you got it for free in exchange for adding GPU PhysX effects to your game. But it was always a choice left to devs, and few actually took it. See next point.

______________________

“GPU PhysX is dead”: debatable.

“GPU PhysX” can mean different things, and admittedly the differences are not always clear or obvious.

If we’re talking GPU particles or cloth, they have indeed been deprecated in recent PhysX versions, replaced with FLEX (an entirely different engine, unrelated to PhysX so far). The FLEX implementation is better, so there was no need to keep supporting both.

If we’re talking GPU rigid bodies, it is wrong. PhysX 3.4 (the most recent public version of PhysX) even introduced a brand new GPU rigid body codepath - the most efficient we ever built. So it is the opposite of dead: that new version was basically just born yesterday.

However it is fair to say that the GPU stuff is not used much (although a bunch of projects do so). I suspect this is both for the obvious reason everybody points out (it would only run on Nvidia hardware) and because the CPU stuff is fast enough for regular physics in regular games.
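For reference, the GPU rigid body path mentioned above is opt-in at scene creation. The sketch below is from memory of the public 3.4 SDK, so treat the exact descriptor fields as assumptions; it also leaves out foundation/physics creation and error handling:

#include <PxPhysicsAPI.h>
using namespace physx;

// Create a scene that runs the rigid body solver and broad phase on the GPU.
PxScene* createGpuScene(PxFoundation& foundation, PxPhysics& physics)
{
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cudaCtx = PxCreateCudaContextManager(foundation, cudaDesc);

    PxSceneDesc sceneDesc(physics.getTolerancesScale());
    sceneDesc.gravity        = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher  = PxDefaultCpuDispatcherCreate(4);     // CPU threads are still needed
    sceneDesc.filterShader   = PxDefaultSimulationFilterShader;
    sceneDesc.gpuDispatcher  = cudaCtx->getGpuDispatcher();         // hand the CUDA context to the scene
    sceneDesc.flags         |= PxSceneFlag::eENABLE_GPU_DYNAMICS;   // solve rigid bodies on the GPU
    sceneDesc.broadPhaseType = PxBroadPhaseType::eGPU;              // GPU broad phase as well

    return physics.createScene(sceneDesc);
}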

______________________

“Raytracing is the new PhysX”: LOL

This is such a ridiculous statement…

Raytracing has been the Holy Grail of computer graphics for as long as I can remember. I was already playing with GFA Raytrace back on the Atari ST in, what, 1991? There have been dreams of “realtime raytracing” for decades. It’s been used in movies for years. There is, and always has been, more interest in raytracing than there ever was in accelerated physics. It’s just so weird to compare the two.

Now if you mean that this is an Nvidia-only thing that will only run on Nvidia hardware, this is (hopefully) wrong again. The main difference is that there is already a Microsoft-provided hardware-agnostic raytracing API, called DXR. There has never been something similar for “direct physics”. So you should not compare it to PhysX, instead you might compare it to the introduction of programmable shaders when everybody was using a fixed-function rendering pipeline. That comparison would make a lot more sense.

Beyond that, you might remember how some games needed a card that supported “shader model 3” in order to run. Or before that, before shaders, how the D3D caps told the game whether your card supported cube-mapping, fog, “TnL”, you name it. It has always been like this: different hardware from different companies has different capabilities.
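In the same spirit as those old caps checks, a game today asks the D3D12 device for its raytracing tier before turning DXR on. A rough sketch, assuming a Windows SDK recent enough to expose the OPTIONS5 feature struct:

#include <windows.h>
#include <d3d12.h>

// Returns true if the driver/hardware expose at least DXR tier 1.0.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

On hardware or drivers without DXR the query simply reports D3D12_RAYTRACING_TIER_NOT_SUPPORTED, much like a missing cap bit in the old days.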

______________________

“Nvidia is forcing raytracing down devs’ and gamers’ throats”: wrong.

If they could control/force devs and gamers, you would have GPU physics everywhere by now. Your own statement that “GPU PhysX is dead” is not compatible with the statement that “Nvidia is forcing people”.

______________________

“X or Y can do raytracing already”: right, but…

You can do raytracing everywhere in software. GFA Raytrace did raytracing back on the Atari ST. PhysX can do raytracing. Firing a bullet in any game is done with a “raycast”, which is basically the same as raytracing - for a single ray.

You could also do raytracing in hardware before, as in Nvidia’s own OptiX library, or on AMD with Radeon Rays. All true.

The difference, and the thing that gamers usually do not fully grasp, is that raytracing is really, really, REALLY slow. In the movie industry for example, they’re not talking in frames-per-second, but rather seconds/minutes/hours-per-frame. So while it is true that one could do raytracing before, the combination of dedicated RT Cores (to trace the rays) and Tensor Cores (to do some denoising on the final image) is indeed new, and the results are closer than ever to the “realtime raytracing” people have been labelling “the future” for so long.
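To put a number and a few lines of code on that: a gameplay raycast is one ray-primitive intersection test, while a raytraced frame repeats that kind of test millions of times per second against far more complex scenes. A toy software-only sketch (nothing PhysX- or DXR-specific; the helper names are made up):

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  sub(const Vec3& a, const Vec3& b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }

// Ray-sphere intersection: the same math behind a bullet raycast and a traced ray.
// o = ray origin, d = normalized ray direction; returns the nearest hit distance in tHit.
bool raySphere(const Vec3& o, const Vec3& d, const Vec3& center, float radius, float& tHit)
{
    const Vec3  oc   = sub(o, center);
    const float b    = dot(oc, d);
    const float c    = dot(oc, oc) - radius * radius;
    const float disc = b * b - c;            // discriminant of the quadratic
    if (disc < 0.0f) return false;           // ray misses the sphere entirely
    tHit = -b - std::sqrt(disc);             // nearest of the two intersection points
    return tHit >= 0.0f;                     // reject hits behind the origin
}

int main()
{
    // One bullet: a single test.
    float t = 0.0f;
    const bool hit = raySphere({0,0,0}, {0,0,1}, {0,0,5}, 1.0f, t);
    std::printf("bullet hit: %s (t = %.2f)\n", hit ? "yes" : "no", t);

    // One raytraced frame: even a single primary ray per pixel at 1080p/60fps
    // is ~124 million rays per second, before shadow rays, reflections or bounces.
    std::printf("primary rays/s at 1080p60: %.0f\n", 1920.0 * 1080.0 * 60.0);
    return 0;
}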

______________________

“Raytracing is a proprietary tech”: wrong.

Look, you just said that AMD could do raytracing as well, so how is this “proprietary” now? These two statements are again incompatible.

“Raytracing” is an algorithm. It has been here since forever, it has nothing to do with Nvidia.

Now Nvidia’s implementation of Microsoft’s raytracing API is of course “proprietary” if you want. But that is exactly the same situation as for the regular Direct3D rasterization API, or exactly the same as for Nvidia’s implementation of the OpenGL API: the implementation is specific to Nvidia, just because the underlying hardware is different from AMD’s. You cannot avoid that.

The ball is now in AMD’s court. The API is here, it’s hardware-agnostic, they “just” have to implement it for their hardware. You cannot blame Nvidia for not doing AMD’s job, now can you?

______________________

“It’s a gimmick until it’s supported by consoles”: right.

Or probably right, I don’t know, but in my opinion this is a very fair statement. Like for shaders before, it probably will only survive if consoles eventually support it.

But the raytracing API itself comes from Microsoft, so it is easy to imagine that it will eventually appear in some future Xbox consoles. And this time they will really deserve their “next-gen” moniker.

______________________

“It’s a waste of time”: debatable.

As an old demomaker who spent years doing graphics stuff in assembly for free on old hardware, I am impervious to the “waste of time” argument.

But you can see it as an investment in the future anyway. Realtime raytracing has long been the holy grail of computer graphics. It’s high time it became the present rather than the elusive future. If you never “waste time” like this, nothing ever happens.
 

Negro Caesar

Superstar
Joined
Jan 19, 2018
Messages
5,768
Reputation
502
Daps
22,055
Specs:
- Windows 10
- Intel i5-7600K 3.80GHz
- MSI GTX 1060 6GB
- Corsair 16GB RAM
- 125GB SSD
- 2TB HDD
- Asus Z170 Pro Gaming motherboard
- 750W power supply


I’m thinking about buying this off a guy I know. Are these specs worth the 700 he is asking?
 

The War Report

NewNewYork
Joined
Apr 30, 2012
Messages
51,046
Reputation
4,967
Daps
107,657
Reppin
The Empire State
Specs:
- Windows 10
- Intel i5-7600K 3.80GHz
- MSI GTX 1060 6GB
- Corsair 16GB RAM
- 125GB SSD
- 2TB HDD
- Asus Z170 Pro Gaming motherboard
- 750W power supply


I’m thinking about buying this off a guy I know. Are these specs worth the 700 he is asking?
Pretty good deal for 700 dollars.

And you can sell the 1060, put in the difference for a 1080, and then you're really good.
 

MMA

Superstar
Joined
Apr 5, 2015
Messages
5,801
Reputation
2,823
Daps
29,184
Specs:
- Windows 10
- Intel i5-7600K 3.80GHz
- MSI GTX 1060 6GB
- Corsair 16GB RAM
- 125GB SSD
- 2TB HDD
- Asus Z170 Pro Gaming motherboard
- 750W power supply


I’m thinking about buying this off a guy I know. Are these specs worth the 700 he is asking?
that is not a good deal at all
cpu + motherboard = 120-140
gpu - 150
ram - 100
ssd - 20-30
hdd - 20
psu/case - shouldn't be charged.
400-500 dollar build - anything more and you are being cheated
 

Liquid

Superstar
WOAT
Joined
Apr 30, 2012
Messages
37,122
Reputation
2,625
Daps
59,900
that is not a good deal at all
cpu + motherboard = 120-140
gpu - 150
ram - 100
ssd - 20-30
hdd - 20
psu/case - shouldn't be charged.
400-500 dollar build - anything more and you are being cheated
:wtf:

Your prices are way lower than what those parts are going for.

I say it's a fair price for the build.
 