Rise of the Tomb Raider [Official Thread] PS4 finally experiencing greatness

daze23

Siempre Fresco
Joined
Jun 25, 2012
Messages
31,826
Reputation
2,692
Daps
43,770
I think it's more Nvidia vs AMD. AMD has features that Nvidia lacks, which may require new driver downloads, etc... and vice versa. I heard Nvidia limited some aspects of the game, which is hurting AMD GPUs.

AMD has a lot of features for gamers that are absent from Nvidia, overrated ass company. FreeSync and compute shyt which can help the developer.
uh, Nvidia had "G-Sync" before AMD copied it and called theirs "FreeSync"

Nvidia also had "Dynamic Super Resolution" (DSR) first, and AMD copied that, calling theirs "Virtual Super Resolution" (VSR)
 

Fatboi1

Veteran
Supporter
Joined
May 6, 2012
Messages
60,121
Reputation
7,898
Daps
110,108
Reading online from reddit to other sites that this is a bad port.
That's not true smh. People don't know anything about PC gaming and expect their cards from three years ago to "Max" out a new game. If there were no options it'd be stagnant.
 

Ciggavelli

|∞||∞||∞||∞|
Supporter
Joined
May 21, 2012
Messages
27,999
Reputation
6,572
Daps
57,324
Reppin
Houston
That's not true smh. People don't know anything about PC gaming and expect their cards from three years ago to "Max" out a new game. If there were no options it'd be stagnant
Shouldn't a 980ti be able to max this out at 1080p though? It can't at 60fps :wtf:
 

daze23

Siempre Fresco
Joined
Jun 25, 2012
Messages
31,826
Reputation
2,692
Daps
43,770
Shouldn't a 980ti be able to max this out at 1080p though? It can't at 60fps :wtf:
should they not include/hide graphics options that don't allow the <insert expensive GPU> to get 60fps at 1080p?
 

Ciggavelli

|∞||∞||∞||∞|
Supporter
Joined
May 21, 2012
Messages
27,999
Reputation
6,572
Daps
57,324
Reppin
Houston
should they not include/hide graphics options that don't allow the <insert expensive GPU> to get 60fps at 1080p?
No, they shouldn't hide anything, but this is an X1 port, not Crysis 4. I'd expect a little better optimization from a game like this. This is harder to run than Witcher 3. That's absurd in my opinion :yeshrug:
 

Ciggavelli

|∞||∞||∞||∞|
Supporter
Joined
May 21, 2012
Messages
27,999
Reputation
6,572
Daps
57,324
Reppin
Houston
Watch Dogs did something similar by capping the graphics, right?
Watch Dogs was a messed up game from start to finish. Nobody could play it at high or ultra until 6GB cards came out. It was a stuttering mess. fukk that game :pacspit:
 

daze23

Siempre Fresco
Joined
Jun 25, 2012
Messages
31,826
Reputation
2,692
Daps
43,770
No, they shouldn't hide anything, but this is an X1 port, not Crysis 4. I'd expect a little better optimization from a game like this. This is harder to run than Witcher 3. That's absurd in my opinion :yeshrug:
it includes options not found in the X1 version

once again, I'll quote Durante...

Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games on how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096, instead of the 16-bit and 1024x1024 default. But what would this actually cause to happen? Basically, it will improve IQ and image stability, especially at very high resolution. However, let's assume for the sake of argument that it also halves the framerate of my port, when enabled.

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shyt port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fukking insane this is. As a developer aware of this, I basically have 2 options:
  1. Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
  2. Simply don't bother with higher-end settings at all.
The first point wouldn't be too bad, but it seems like the much more rare choice. If the prevailing opinion of my game's technical quality actually goes down by including high-end options, then why bother at all?

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
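To put a rough number on the shadow-buffer example Durante uses above, here's a quick back-of-the-envelope sketch (my own illustration, not from his post; the resolutions and bit depths are the ones he quotes) of how much a single shadow depth buffer grows between the default and the hypothetical "max" option:

# A minimal sketch, assuming one square shadow depth buffer per setting
def shadow_map_bytes(resolution, bits_per_texel):
    """Size in bytes of a resolution x resolution depth buffer."""
    return resolution * resolution * (bits_per_texel // 8)

default_map = shadow_map_bytes(1024, 16)  # 16-bit, 1024x1024 default
max_map = shadow_map_bytes(4096, 32)      # hypothetical 32-bit, 4096x4096 "max" option

print(f"default: {default_map / 2**20:.0f} MiB")  # 2 MiB
print(f"max:     {max_map / 2**20:.0f} MiB")      # 64 MiB, i.e. 32x the memory

That 32x blow-up is only the storage side; the extra fill rate and cache pressure are what actually eat the framerate, which is why a single "max" toggle like this can tank a benchmark without saying anything about how well the port is optimized.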
 