Friday, September 8, 2017

Rantology: Framerates

More and more frequently we see the gap between PC and console narrowing, whether that be in commonly shared games, hardware, or even capabilities (like streaming on consoles).  Naturally that conversation always rears its ugly head back to performance (instead of more valuable conversations like the costs of digital delivery, Steam vs PS/MS storefronts, or cross platform play), and one of its primary components is framerates: how they affect gameplay and how they can make a game seem like an overall better experience.  As a software engineer this is something I know a thing or two about, so I figured what better way than to turn a heated debate into a passionate conversation.



So, the first thing that needs to be addressed when having conversations like these is the idea that a higher frame rate means a more polished game.  On the surface that can seem true, but when you get down to it that's not the case.  So what exactly does frame rate mean?  To put it simply, it's the number of times a screen refreshes in a certain period.  Most commonly we look at it in terms of frames per second, or fps; another term for this is Hertz, Hz.  Something that runs at 1 fps (1 Hz) is redrawing its screen once every second.  That doesn't necessarily mean it is changing what is on the screen, just that this is how fast the screen does a refresh.  The refresh rate is a restriction of the hardware, but most modern TVs will have a minimum of 60 Hz, with a large portion going up to 120 Hz or 240 Hz.
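
To make the fps/Hz relationship concrete, here's a tiny sketch (Python is just my choice for illustration, nothing from the post itself) that converts a refresh rate into the time each frame gets on screen:

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Time available per frame, in milliseconds, at a given refresh rate."""
    return 1000.0 / refresh_hz

# A 60 Hz screen redraws every ~16.7 ms, 120 Hz every ~8.3 ms, 240 Hz every ~4.2 ms.
for hz in (30, 60, 120, 240):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```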

So if that's what a Hz is, how does this affect framerates and our gameplay?  Simply put, when a game runs at 30 fps it is capable of rerendering the display once every 1/30 of a second.  That doesn't necessarily mean the screen will change, and in some PC games that rate will fluctuate while you play based on background processes.  That fluctuation is an issue because it can lead to tearing, and the rapid change in framerate breaks immersion in the game.  Ever play League of Legends and experience a perceived lag spike?  That's exactly what I'm talking about.  On the opposite side of it, the higher the framerate, the smoother the animations look and the better the experience you're likely to have.
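
Here's a rough sketch of how you might spot one of those perceived lag spikes in your own loop; the window size and spike threshold are my own placeholders, not anything League actually does:

```python
import time

SPIKE_FACTOR = 2.0   # assumption: flag any frame that takes 2x the recent average
frame_times = []     # recent frame durations in seconds

def tick(last_frame_start: float) -> float:
    """Record one frame's duration and warn when it spikes well past the recent average."""
    now = time.perf_counter()
    duration = now - last_frame_start
    if frame_times:
        avg = sum(frame_times) / len(frame_times)
        if duration > avg * SPIKE_FACTOR:
            print(f"lag spike: {duration * 1000:.1f} ms vs ~{avg * 1000:.1f} ms average")
    frame_times.append(duration)
    if len(frame_times) > 120:   # keep roughly the last two seconds of frames at 60 fps
        frame_times.pop(0)
    return now
```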



All of this sounds really simple so far, and it's pretty easy to see that framerates are one of the most noticeable aspects of your game.  But what if you have an effectively unlimited framerate, and other parts of your game are causing you to experience slowdowns and issues?  Say, for example, you're playing a very intensive RTS and you're in command of massive WWII sized armies on a battlefield.  This kind of game will undoubtedly be very processor intensive, but the graphics engine you're using allows you to run at 90 fps consistently.  That's a pretty good thing, but this game is hyper accurate and bases shots and damage on the individual and not on the squad/unit, so every time a bullet collision happens it needs to check whether the wound caused was fatal or not.  This causes a large slowdown, because now you're running very processor intensive operations every game cycle, and those calculations may not finish by the time you need to redraw.  Even in an optimal setting of only being able to redraw every other cycle, you've cut your framerate in half and are effectively running at 45 fps at a 90 Hz refresh.

So while you may be able to handle that sweet 90 fps, you're now effectively playing at 45 fps, which is a noticeable drop in how smooth your game runs.  This is actually a pretty generous example; some more intensive games like Sins of a Solar Empire can slow down to around 15 fps during max processing.  How can we combat this and still achieve the maximum amount of smoothness?  The easiest way is to drop your refresh rate to run at a lower Hz, which can give you a higher effective framerate while optimizing how smooth you can operate.  Let's use the above example.  Say our calculations actually take 1/50 of a second to run and prepare the screen.  If we drop our refresh rate to 50 Hz, we can actually increase our frame rate to 50 fps, up from the 45 fps we noticed before.  So in this case, while I can advertise my game as running at 90 fps, in practice running at 50 fps will give the player a better experience.
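
That math generalizes.  If the display only presents on refresh boundaries (vsync style), a frame that misses one refresh waits for the next, so your effective rate snaps to the refresh rate divided by a whole number.  Here's a small sketch of that arithmetic; it assumes the simple locked-to-refresh model from the example above, not any particular engine's behavior:

```python
import math

def effective_fps(frame_time_s: float, refresh_hz: float) -> float:
    """Effective framerate when each frame must wait for the next refresh to be shown."""
    refreshes_needed = math.ceil(frame_time_s * refresh_hz)  # refresh intervals one frame spans
    return refresh_hz / refreshes_needed

# A 1/50 s frame at 90 Hz spans two refreshes -> 45 fps; at 50 Hz it fits in one -> 50 fps.
print(effective_fps(1 / 50, 90))   # 45.0
print(effective_fps(1 / 50, 50))   # 50.0
```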

Some of your better developers will actually account for this and adjust and optimize the framerate based on their calculations.  League of Legends attempts this, as do most other MMOs, but a lot of those adjustments are based on network traffic, which is a conversation for another time, though it does play into your framerate when you need to redraw the screen.  Ideally though, in our perfect environment, we're working offline, as these are the games that tend to get the most flak for setting a "low ball" cap of 30 fps.  While many arguments on both sides can be made about capping framerates on consoles and PC, and just about everything around the subject, the base of it is all the same: when you're dealing with a difference in systems and computing power, it will all affect the framerate you deliver to your user.



So how can we change this and better use our resources to optimize our framerates?  The easiest way is to better handle data: more structured, better organized data is faster and easier to process.  The idea is that if we use a data structure that offers near constant access time to the data the player cares about, the better we are using our resources.  Another way is to do what I call prioritized processing: process the data the player will see first, and do the "invisible" calculations afterwards.  I know that's confusing, so let me explain.
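
For the data structure side, one common trick (purely my own illustration here, not something from the example game) is a spatial hash: bucket entities by grid cell in a dictionary, so asking for "everything near the camera" is roughly constant time per cell instead of a scan over every unit on the map:

```python
from collections import defaultdict

CELL_SIZE = 32.0  # assumption: world units per grid cell

def cell_of(x: float, y: float) -> tuple[int, int]:
    """Map a world position to its grid cell."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

# Buckets of entity ids, keyed by cell; dict lookups are ~O(1) per cell.
grid: dict[tuple[int, int], list[int]] = defaultdict(list)

def insert(entity_id: int, x: float, y: float) -> None:
    grid[cell_of(x, y)].append(entity_id)

def near(x: float, y: float) -> list[int]:
    """Entities in the cell containing (x, y) and its eight neighbours."""
    cx, cy = cell_of(x, y)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            found.extend(grid.get((cx + dx, cy + dy), []))
    return found
```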

The idea is that if we know our player has a viewbox covering, say, 60 degrees of a circle, and can only see that much, we start with the things happening in that view; the further something is from our view, the lower a priority it has to process, which lets data be updated relative to how important it is to our view.  Let's use our example above.  Say we break our 90 Hz into 90 time units, and in 1 unit I can process roughly 55% of our data; by the above optimization we run at 50 fps.  Of that 55%, 30% is our immediate view screen and the other 25% is external data.  Now say we really want to squeeze out an extra 10 fps, so we need to see how we can optimize our resources to achieve that.  We use this idea of easing out, or setting an ideal range away from the camera for data we deem important.  Let's say our ideal is half the screen from the current view; everything beyond that has the same priority of "doesn't need to happen now, but we shouldn't put it off forever".  So every time unit (remember we've now dropped to 60 Hz, targeting 60 fps) we start off by checking our bounds: everything inside has a priority of 1, then every object moving outwards gets its priority modified as we move out, until we reach that easing line, and we calculate its priority.  That is how we determine our movement calculations, who takes damage, what needs to animate, and what needs to disappear because it's been destroyed.  This way, if in that one time unit we only get 80% of the way through our calculations before we need to clean up and prepare for the next cycle, we slightly increase the priority of whatever was left over and carry it into the next cycle.  This lets us hit that sweet sweet 60 fps by optimizing our available resources.
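
Here's a minimal sketch of that prioritized processing loop; the budget, the easing radius, and the priority scale are all my own placeholder numbers, and it treats the whole 1/60 s time unit as available for simulation, which real engines wouldn't:

```python
import heapq
import time

FRAME_BUDGET_S = 1 / 60   # assumption: one 60 Hz time unit of simulation work
EASE_RADIUS = 0.5         # assumption: "half the screen" past the view, in normalized units

def priority(dist_from_view: float) -> int:
    """1 for anything in view, growing with distance until the easing line, then flat."""
    if dist_from_view <= 0.0:
        return 1
    if dist_from_view >= EASE_RADIUS:
        return 10             # "doesn't need to happen now, but don't put it off forever"
    return 1 + int(9 * dist_from_view / EASE_RADIUS)

def run_time_unit(entities, update):
    """Process entities in priority order until the budget runs out; bump whatever is left."""
    queue = [(priority(e["dist"]), i, e) for i, e in enumerate(entities)]
    heapq.heapify(queue)      # lowest number pops first, so priority 1 is most urgent
    deadline = time.perf_counter() + FRAME_BUDGET_S
    while queue and time.perf_counter() < deadline:
        _, _, e = heapq.heappop(queue)
        update(e)             # movement, damage checks, animation, cleanup of destroyed units
    # Whatever we didn't reach gets a small priority bump and carries into the next cycle.
    return [(max(1, prio - 1), i, e) for prio, i, e in queue]
```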



Framerates may not be the most important aspect of development, but they can look like it, and purely devoting resources to framerates isn't the only or ideal way to improve that metric.  They matter, just not as much as some other things.  I hope this has been educational and you can go out and win some arguments you've been having with your friends.  Till next time.
