> Rise of the Tomb Raider Graphics & Performance Guide
> that's gonna lead to a lot of people with 4gb (or less) cards claiming the game isn't well optimized

U mean the majority of the PC gaming world? Lol
> I think its more Nvidia vs AMD. AMD has features that Nvidia lacks, which may require downloads for new drivers etc... Vice versa. I heard Nvidia limited some aspects of the game which is hurting AMD GPUs.

uh, Nvidia had "gsync" before AMD copied it and called theirs "freesync"
AMD has a lot of features for gamers that are absent from Nvidia, overrated ass company. FreeSync and compute shyt that can help the developer.
> Reading online from reddit to other sites that this is a bad port.

That's not true smh. People don't know anything about PC gaming and expect their cards from three years ago to "Max" out a new game. If there were no options it'd be stagnant.
> That's not true smh. People don't know anything about PC gaming and expect their cards from three years ago to "Max" out a new game. If there were no options it'd be stagnant

Shouldn't a 980ti be able to max this out at 1080p though? It can't at 60fps
> U mean the majority of the PC gaming world? Lol

yes, the majority should not be using "very high" textures
> Shouldn't a 980ti be able to max this out at 1080p though? It can't at 60fps

should they not include/hide graphics options that don't allow the <insert expensive GPU> to get 60fps at 1080p?
> should they not include/hide graphics options that don't allow the <insert expensive GPU> to get 60fps at 1080p?

No they shouldn't hide anything, but this is an X1 port, not Crysis 4. I'd expect a little better optimization from a game like this. This is harder to run than Witcher 3. That's absurd in my opinion
> Watchdogs did something similar by capping the graphics right?

Watch Dogs was a messed up game from start to finish. Nobody could play it at high or ultra until 6gb cards came out. It was a stuttering mess. fukk that game
> yes, the majority should not be using "very high" textures

I don't think the shyt looks great enough to need a damn 980ti to run it decent on very high but hey. that's the story of this gen. lol.
> No they shouldn't hide anything, but this is an X1 port, not Crysis 4. I'd expect a little better optimization from a game like this. This is harder to run than Witcher 3. That's absurd in my opinion

it includes options not found in the X1 version
Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.
These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games on how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096 instead of the 16-bit, 1024x1024 default. But what would this actually do? Basically, it would improve image quality and stability, especially at very high resolutions. However, let's assume for the sake of argument that it also halves the framerate of my port when enabled.
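To put a rough number on that trade-off, here is a minimal sketch (my own illustration, not from the post; the preset names and the assumption of a single square, depth-only shadow map are hypothetical) comparing the memory footprint of the default and "max" shadow-map settings described above.

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical shadow-map quality presets matching the example in the post:
// a 16-bit 1024x1024 default versus a 32-bit 4096x4096 high-end option.
struct ShadowMapSetting {
    const char* name;
    uint32_t    resolution;     // square shadow map: 1024 or 4096 texels per side
    uint32_t    bytesPerTexel;  // 2 for 16-bit depth, 4 for 32-bit depth
};

// Memory footprint of a single shadow map at the given setting.
static uint64_t shadowMapBytes(const ShadowMapSetting& s) {
    return uint64_t(s.resolution) * s.resolution * s.bytesPerTexel;
}

int main() {
    const ShadowMapSetting presets[] = {
        { "default (16-bit, 1024x1024)", 1024, 2 },
        { "max     (32-bit, 4096x4096)", 4096, 4 },
    };
    for (const auto& p : presets) {
        std::printf("%s: %.1f MiB per shadow map\n",
                    p.name, shadowMapBytes(p) / (1024.0 * 1024.0));
    }
    // Prints 2.0 MiB vs 64.0 MiB: a 32x difference in memory (and roughly in
    // fill/bandwidth cost) per shadow map, before counting multiple cascades
    // or lights.
    return 0;
}
```

The exact framerate impact depends on the renderer, but a setting with a 32x per-map cost is exactly the kind of option that can plausibly halve performance without saying anything about how well the rest of the game is optimized.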
In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shyt port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.
I hope everyone can see how fukking insane this is. As a developer aware of this, I basically have 2 options:

- Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
- Simply don't bother with higher-end settings at all.

The first option wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down when I include high-end options, then why bother at all?
Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
> I don't think the shyt looks great enough to need a damn 980ti to run it decent on very high but hey. that's the story of this gen. lol.

Agreed. It doesn't look good enough to justify the specs