> Did u keep the box for the case u have?...

Yeah, I'm using the box that came with the case
Performance at "max" settings, without context and a deep understanding of what those settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used for exactly that. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title, it seems as good a time as any.
These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games on how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096, instead of the 16-bit, 1024² default. But what would this actually do? Basically, it would improve IQ and image stability, especially at very high resolutions. However, let's assume for the sake of argument that it also halves the framerate of my port when enabled.
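To put rough numbers on that hypothetical (a back-of-the-envelope sketch, assuming one square depth buffer per shadow map; real renderers cascade and reuse buffers, so this is only the per-map cost):

```python
def shadow_map_bytes(size, bits):
    """Memory for one square shadow depth buffer of the given resolution/precision."""
    return size * size * bits // 8

default = shadow_map_bytes(1024, 16)  # the hypothetical default
maxed = shadow_map_bytes(4096, 32)    # the hypothetical "max" option

print(default // 2**20, "MiB")  # 2 MiB
print(maxed // 2**20, "MiB")    # 64 MiB -- 32x the memory per shadow map
```

A 32x jump in per-map memory (and a corresponding jump in fill cost) is exactly the kind of thing a "max settings only" benchmark silently folds into one number.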
In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shyt port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.
I hope everyone can see how fukking insane this is. As a developer aware of this, I basically have 2 options:
- Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
- Simply don't bother with higher-end settings at all.

The first option wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down when I include high-end options, then why bother at all?
Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
> so I played through Gone Home last night (I've owned it for a minute, but never really played), after seeing it compared to The Vanishing of Ethan Carter (which I really liked)...

I agree the game sucks. I don't understand the hype either
as far as 'walking simulators': Ethan Carter >>>>>> Gone Home
the fukking ending....
all that to find out the bytch ran off with her dyke girlfriend
I'm far from "homophobic", but I can't help but think this game received so much praise just because it touched on 'gay' issues
> Copping a AMD 8530 after I complete my A+ cert.
> Then a GTX 770

you typed that wrong, you must have meant GTX 970.
from gaf on shadow of mordor:

Alright.
Ultra textures are not available yet, so the PC Gamer vid only uses "High".
I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 to 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only offer percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).
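The odd resolutions follow directly from the game offering render resolutions as percentages of whatever it detects as native; a quick illustrative sketch (the function and rounding are my assumption, not the game's actual behavior):

```python
def scaled_resolution(native_w, native_h, percent):
    """Resolution produced by taking a percentage of the detected native resolution."""
    return round(native_w * percent / 100), round(native_h * percent / 100)

# With 2720x1700 detected as native, the percentage steps skip clean 1080p:
print(scaled_resolution(2720, 1700, 70))  # (1904, 1190), not 1920x1080
```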
There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3 gigs. So the recommendation seems plausible.
So with ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might be just barely not enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.
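The 200% figure is per axis, so the pixel count (and roughly the framebuffer cost) quadruples; a minimal sketch, assuming the scale applies to both axes as the post implies:

```python
def supersampled(width, height, scale_percent):
    """Internal render resolution at a given per-axis supersampling scale."""
    s = scale_percent / 100
    return int(width * s), int(height * s)

w, h = supersampled(1920, 1080, 200)
print(w, h)                      # 3840 2160 -- effectively 4K
print((w * h) // (1920 * 1080))  # 4x the pixels to shade and store
```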
It also runs very nicely. With everything set to ultra except textures, and downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ on a R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably be at least slightly higher than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.
So much for not optimized.
So don't get your panties in a bunch ;-)
> So I guess I'll be able to try out Shadow of Mordor thanks to steam sharing but I'm afraid my CPU may not cut it as far as requirements go.

Thought u was getting a 4690k...shyt is heatrocks fam...
> from gaf on shadow of mordor.

no one has tested those 'ultra' textures yet, so who knows. I'm curious to see how they run on a 3gb card
@Ciggavelli sounds like you're good.
> Thought u was getting a 4690k...shyt is heatrocks fam...

I am but not now. These system requirements for the last couple of games soon to come out got me holding out, and plus right now I'm broke, so no CPU for me.
> you typed that wrong you must have meant GTX 970.

I didn't, but now that I look at the prices, how the hell is the 970 almost the same price as the 770?
> I didn't but now that i look at the prices how the hell is the 970 almost the same price as the 770?

The 970 is actually cheaper in some cases...this is what pisses me off about it...lol..I got this 770 and the 970 is better and cheaper...
> I am but not now. These system requirements for the last couple of games soon to come out got me holding out and plus right now I'm broke so no CPU for me.

Son don't believe that i7 bs...There's no need for that shyt...the 4690k literally just came out this summer...Its far more than enough..and I feel u bout being broke...lol..that's why I can't grab a 970 like I want to right now...