In most games, the major contributors to the frame-rate boil down to two questions:
- Can you simulate all of the action that's happening on the screen - physics, animation, HUD, AI, gameplay and so on?
- Can you render all of the action that's happening on the screen - objects, people, environment, visual effects, post effects and so on?
The first point covers the work that is usually handled by the CPU, while the second covers the work traditionally processed by the GPU. Over successive platform generations the underlying technology has changed, with each generation throwing up its own unique blend of issues:
- Gen1: The original PlayStation had an underpowered CPU and could draw a small number of simple shaded objects.
- Gen2: PlayStation 2 had a relatively underpowered CPU but could fill the standard-definition screen with tens of thousands of transparent triangles.
- Gen3: Xbox 360 and PlayStation 3 had the move to high definition to contend with; while the CPUs (especially the PlayStation 3's SPUs) were fast, the GPUs struggled to support HD resolutions with the kind of effects we wanted to produce.
In all of these generations it was difficult to maintain a steady frame-rate: the amount happening on-screen would make either the CPU or the GPU a bottleneck, and the game would drop frames. Most developers addressed this by altering how their games looked, or played, to compensate for the lack of power in one area or another and maintain the all-important frame-rate.
This shift started towards the end of Gen2, when developers realised that they could not simulate the world to the level of fidelity their designers wanted - the CPUs were not fast enough - but they could spend more time rendering it. The change in focus is clearly visible around 2005/2006, when games such as God of War, Fight Night Round 2 and Shadow of the Colossus arrived. These games were graphically great, but the gameplay was limited in scope and usually relied on tightly cropped camera positions to restrict the amount of simulation required.
Then, as we progressed into Gen3, the situation started to reverse. The move to HD took its toll on the GPU, as there were now more than four times as many pixels to render on the screen. So unless the new graphics chips were over four times faster than the previous generation's, we weren't going to see any great visual improvements, other than sharper-looking objects.
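To put rough numbers on that jump, here is a back-of-the-envelope sketch. The framebuffer sizes are illustrative assumptions - actual render targets varied from game to game (many PS2-era titles, for instance, rendered to buffers around 512x448, which makes the HD ratios even larger than a plain SD comparison suggests):

```cpp
#include <cstdio>

int main()
{
    // Illustrative framebuffer sizes; real render targets varied by game.
    const double sd     = 640.0  * 480.0;   // full standard-definition baseline
    const double ps2ish = 512.0  * 448.0;   // a common PS2-era render target (assumption)
    const double hd720  = 1280.0 * 720.0;   // 720p
    const double hd1080 = 1920.0 * 1080.0;  // full HD

    std::printf("720p  vs 640x480: %.1fx the pixels\n", hd720  / sd);      // ~3.0x
    std::printf("1080p vs 640x480: %.1fx the pixels\n", hd1080 / sd);      // ~6.8x
    std::printf("720p  vs 512x448: %.1fx the pixels\n", hd720  / ps2ish);  // ~4.0x
    std::printf("1080p vs 512x448: %.1fx the pixels\n", hd1080 / ps2ish);  // ~9.0x
    return 0;
}
```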
Again, developers started to realise this and refined the way games were made, which influenced the overall design. They learned how to get the most out of the machines' architecture and used the CPU power to add more layers of simulation, making games more complex and simulation-heavy - but this meant they were very limited in what they could draw, especially at 60fps. If you wanted high visual fidelity in your game, you had to make a drastic, fundamental change to the game's architecture and switch to 30fps.
Dropping a game to 30fps was seen as an admission of failure by many developers and by the general gaming public at the time. If your game couldn't maintain 60fps, it reflected badly on your development team, or suggested that your engine technology just wasn't up to the job. Nobody outside the industry really understood the significance of the change, or what it would mean for games; they could only see it as a sign of defeat. But was it?
Switching to 30fps doesn't necessarily mean that the game becomes much more sluggish or that there is less going on. While the game simulation might well still be running at 60fps to maintain responsiveness, the lower frame-rate allows for extra rendering time and raises the visual quality significantly. This switch freed a lot of titles to push their visuals without worrying about hitting the 60fps mark. Without this change we wouldn't have hit the visual bar that we did on the final batch of Gen3 games - a level of attainment that is still remarkable when you consider that the GPUs powering these games were released over seven years ago. Now if you tell the gaming press, or indeed hardcore gamers, that your game runs at 30fps, nobody bats an eyelid; they all understand the trade-off and what it means for a game.
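As a minimal sketch of how that decoupling tends to look in practice - assuming a fixed 60Hz simulation step and a presentation rate locked to 30fps by vsync; the function names here are hypothetical placeholders, not any particular engine's API:

```cpp
#include <chrono>

// Hypothetical stand-ins for the real work; any actual engine's equivalents will differ.
void UpdateSimulation(double dtSeconds) { /* physics, animation, AI, gameplay */ }
void RenderAndPresent()                 { /* draw the world; vsync at 30Hz blocks for ~33ms */ }

void GameLoop(const bool& running)
{
    using Clock = std::chrono::steady_clock;

    const double simStep = 1.0 / 60.0;  // simulation keeps ticking at 60Hz for responsiveness
    double accumulator = 0.0;
    auto previous = Clock::now();

    while (running)
    {
        const auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed 60Hz steps as the elapsed time demands -
        // with a 30fps present, that is typically two steps per rendered frame.
        while (accumulator >= simStep)
        {
            UpdateSimulation(simStep);
            accumulator -= simStep;
        }

        // One render per pass; dropping from 60fps to 30fps roughly doubles
        // the per-frame rendering budget, from ~16.7ms to ~33.3ms.
        RenderAndPresent();
    }
}
```

The point of the accumulator is that input response and simulation behaviour stay the same whichever presentation rate you choose; only how often the results reach the screen, and therefore the rendering budget, changes.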