The Witcher 3: Wild Hunt (PC, Xbox Series X|S, Xbox One, Switch, PS5, PS4)

Ciggavelli
|∞||∞||∞||∞| · Supporter
Joined: May 21, 2012 · Messages: 27,987 · Reputation: 6,572 · Daps: 57,316 · Reppin: Houston

daze23
Siempre Fresco
Joined: Jun 25, 2012 · Messages: 31,780 · Reputation: 2,681 · Daps: 43,700
Well you might not be disappointed, but judging by reactions elsewhere, people are pretty pissed that 780s can't run this game very well. shyt titan xs aren't even running this well. I just got the new drivers, so hopefully they're good.

If I just bought a 970 and found out I can't run this at ultra 60fps, I'd be pissed. Just like I'm gonna be pissed if I can't run this at ultra 1440p with my titan x sli setup.

The 780s aren't that old. To have to turn down so much is kinda crazy. I'm sure it'll run fine at medium/high (with hairworks off). I just expected it to be a bit more optimized. The game looks good, but not that good.

We'll see in 6 hours
people with unrealistic expectations get disappointed every day b

I don't agree that a game isn't "optimized" just because the latest expensive GPU can't run it with all the settings turned all the way up

I'll quote Durante (again...)

Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games on how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32 bit precision and up to 4096x4096 instead of the 16 bit and 1024² default. But what would this actually cause to happen? Basically, it will improve IQ and image stability, especially at very high resolution. However, let's assume for the sake of argument that it also halves the framerate of my port, when enabled.

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shyt port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fukking insane this is. As a developer aware of this, I basically have 2 options:
  1. Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
  2. Simply don't bother with higher-end settings at all.
The first point wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down by including high-end options, then why bother at all?

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
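
To put rough numbers on the shadow-map example above (my own back-of-the-envelope math, not from Durante's post, and assuming a single standard depth-only shadow map):

1024 x 1024 texels x 2 bytes (16-bit) = 2 MB per shadow map
4096 x 4096 texels x 4 bytes (32-bit) = 64 MB per shadow map

That's 32x the memory and 16x the texels to rasterize and filter for every shadow-casting light or cascade, so one "max" toggle cutting the framerate in half isn't a sign of a badly optimized port.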
 

Ciggavelli
|∞||∞||∞||∞| · Supporter
Joined: May 21, 2012 · Messages: 27,987 · Reputation: 6,572 · Daps: 57,316 · Reppin: Houston
the benchmark in that link doesn't seem to include the 780
look at the second set of benchmarks. It's there. It's the benchmarks without nvidia gameworks stuff on. Not many people are gonna be able to run this with nvidia gameworks anyway, so it's a better benchmark
 

daze23
Siempre Fresco
Joined: Jun 25, 2012 · Messages: 31,780 · Reputation: 2,681 · Daps: 43,700
look at the second set of benchmarks. It's there. It's the benchmarks without nvidia gameworks stuff on. Not many people are gonna be able to run this with nvidia gameworks anyway, so it's a better benchmark
ok I see it now. it could be because Maxwell is supposed to be better at 'compute' than Kepler
 

Kamikaze Revy
Bwana ni mwokozi wangu · Supporter
Joined: Sep 4, 2012 · Messages: 29,653 · Reputation: 9,356 · Daps: 75,923 · Reppin: Outer Heaven
[GIF]
 

daze23
Siempre Fresco
Joined: Jun 25, 2012 · Messages: 31,780 · Reputation: 2,681 · Daps: 43,700
yeah, Project Cars was the same way

http://www.techspot.com/review/1000-project-cars-benchmarks/page2.html

it's just weird that a 960 is better than a 780. Logically that makes no sense, ya know?
these Project Cars benchmarks say otherwise

http://www.computerbase.de/2015-05/project-cars-guide-grafikkarte-prozessor-vergleich/2/

http://www.pcgameshardware.de/Project-CARS-PC-238576/Specials/Benchmark-Test-1158026/

http://pclab.pl/art63572-10.html

could just be anecdotal and/or an issue with their testing methodologies. we'll have to wait for more benchmarks I guess (don't you do meta-analysis?)
 

winb83
52 Years Young · Supporter
Joined: May 28, 2012 · Messages: 45,029 · Reputation: 3,748 · Daps: 68,242 · Reppin: Michigan
I'm glad I dumped my 780 when I did to upgrade to a 980. Kinda feel bad for the 780 crowd cause it's a good card too.
 