A bit of a shorter diary this week as I'm busy working on a number of things in Dysnomia that I'm not quite ready to show yet! Also, this week is more about coding than anything else, so non-techies may want to wait until the next entry when I should be showing off a little more of the game!
Optimisation is a very important part of game development. However, for simpler games you may not ever need to worry about performance - let's face it, if your game is running smoothly on your target platform from start to finish then going back over the code and optimising for the sake of it is ultimately pointless. Put the energy into starting your next game!
Likewise, spending hours and hours poring over every line of code to ensure it's as efficient as possible will likely result in a convoluted, hard-to-follow program.
If you know your game is likely to use a lot of memory, either graphically or object-wise, then you'll need to keep performance and optimisation in mind right from the start. Because I'd never had to optimise a game before, I completely ignored all of this - until I started to get frame hitches when running on the 360. That's when I began to recognise, and put right, many of the mistakes I'd made.
Shawn Hargreaves, one of the XNA framework developers, has written a number of blog entries regarding .NET garbage collection and the 360. These three posts are the ones I found the most useful when starting to track down where I had gone wrong.
I noticed a definite "jerk" in Dysnomia that was occurring once every few seconds or so. After going through the code and commenting out the heavier update and draw methods, I couldn't narrow it down to one particular place in the code. That's when I started to pay attention to the posts regarding garbage collection and how to either avoid or manage collections. I ran the CLR Profiler against the game and found that (on Windows at least) a generation 2 (full) collection was occurring every few seconds, which would definitely correspond to the glitches on the 360.
I started to look at my code and figure out where it all went wrong. To start with, I was using generic List&lt;T&gt; collections to hold my game classes. Enemies, spawn locations, lights, bullets - pretty much anything that changes in game is allocated on the fly and added/removed from its list. Each time an enemy died, it was removed from the list. When an enemy spawned, a new Enemy() was allocated and added to the list. Same with bullets. Those are quite complicated classes to begin with, and to top it off I was giving each enemy and bullet object its own Texture2D sprite. What the hell was I thinking?!
To fix the issues, I decided to follow Shawn's advice and went with his first "path" for avoiding garbage collection worries: Allocate as little as possible during the game loop.
I went back and re-worked how I managed my in-game objects. Rather than letting the List allocate for me, I created a Manager class for each of the lists. They work in much the same way as the ParticleEngine XNA sample in that the Manager classes pre-allocate all of the objects I will ever use in game, and use a queue to select the next object for use. When the object in question is no longer needed (if an enemy dies or a bullet hits something), it is set as inactive and placed back on the queue to be used again.
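To give a rough idea of what one of those Manager classes looks like, here's a minimal sketch of the pre-allocate-and-queue pattern. The names (BulletManager, Bullet, Spawn, Kill) are my illustrations, not Dysnomia's actual code:

```csharp
using System.Collections.Generic;

public class Bullet
{
    public bool Active;
    public void Reset() { Active = true; /* position, velocity, etc. */ }
}

public class BulletManager
{
    private readonly Queue<Bullet> freeBullets;

    public BulletManager(int capacity)
    {
        // Pre-allocate every bullet up front, during level load -
        // no "new" ever happens in the game loop.
        freeBullets = new Queue<Bullet>(capacity);
        for (int i = 0; i < capacity; i++)
            freeBullets.Enqueue(new Bullet());
    }

    public Bullet Spawn()
    {
        // Reuse the next free object rather than allocating.
        if (freeBullets.Count == 0) return null; // pool exhausted
        Bullet b = freeBullets.Dequeue();
        b.Reset();
        return b;
    }

    public void Kill(Bullet b)
    {
        // Mark inactive and put it back on the queue for reuse.
        b.Active = false;
        freeBullets.Enqueue(b);
    }
}
```

The trade-off is that you need to pick a sensible pool size up front, but in return the heap stays completely stable while the game runs.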
All of the pre-allocation is done when the level and content is loaded. After I've performed all my loading and allocating, I do a GC.Collect() before fading the level up and starting the game. This gives me a clean slate for the smaller allocations (Vector2s and value types mostly) that occur in-game.
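In code, that step is very small. This is just a sketch of where the call sits, assuming an XNA Game subclass; LoadLevelContent is a hypothetical stand-in for whatever does the heavy loading:

```csharp
protected override void LoadContent()
{
    base.LoadContent();
    LoadLevelContent();   // all the big allocations and pool pre-fills happen here
    System.GC.Collect();  // one deliberate full collection before gameplay starts
}
```

Since this runs once, behind a loading screen, the cost of the full collection is invisible to the player.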
Net result is that garbage collections no longer occur whilst the game is in progress. Okay, so now I'm starting to get into this optimisation stuff. I looked at the way I was using Texture2D for loading game content. I realised that having a spritesheet for each instance of a game class (enemy, door, light, bullet) was not the way to go. Too much unnecessary memory wasted.
Instead, in each of my game object Managers, I loaded the spritesheets for each type of object. For instance, I have a Spider enemy and a Slug enemy. Each enemy in my List&lt;Enemy&gt; in my enemy manager class can be a Spider or a Slug type. My manager loads one instance of a Texture2D for a Spider enemy, and one for a Slug enemy. In the EnemyManager.Draw() routine, I loop through all the active enemies in the List and choose the correct Texture2D to draw from, rather than having each Enemy class draw itself from its own Texture2D.
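That draw routine looks something like the sketch below. Again, the names (Enemy, EnemyType, spiderSheet, slugSheet) are my own illustrations of the idea, not Dysnomia's real code; it assumes XNA's SpriteBatch:

```csharp
public void Draw(SpriteBatch spriteBatch)
{
    foreach (Enemy enemy in enemies)
    {
        if (!enemy.Active) continue;

        // One Texture2D per enemy *type*, owned by the manager,
        // instead of one per enemy instance.
        Texture2D sheet = (enemy.Type == EnemyType.Spider)
            ? spiderSheet
            : slugSheet;

        spriteBatch.Draw(sheet, enemy.Position, enemy.SourceRect, Color.White);
    }
}
```

With only a handful of shared textures, texture memory stops scaling with the number of live objects, and the draw loop stays allocation-free.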
So that's just a couple of the ways I'm handling memory usage and garbage collection in Dysnomia. Allocate lots during load, allocate close to nothing during game. There are still some glitches to iron out in the AI routines that can get a little heavy, but I'm getting closer to that constant smooth 60fps gameplay I'm after.
Here's a bonus off-screen recording showing the action on the 360: