The One Man MMO Project
The story of a lone developer's quest to build an online world :: MMO programming, design, and industry commentary
Smooth like Buttah
By Robert Basler on 2011-09-08 19:00:30
Homepage: www.onemanmmo.com  Email: one at onemanmmo dot com

I've had a rendering bug that persisted for a long time. I tried a number of times to track it down, but no matter what I did, after the game had been running for a while the scene geometry would freak out, with polygons poking out all over the place. It kind of made the game look like a crazed porcupine.

I use VBOs for my rendering. They contain either indices or vertices. I only had one VBO for each, and had coded my renderer so that the VBOs would reallocate and expand whenever they needed to. The expansion code was sort of slow, since the VBOs had to be repopulated with vertex and index data and then transferred to the GPU whenever they grew. My thought was that I'd find out how big they needed to be during development, then set the starting size to that so they would never actually need to expand at runtime.
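
For illustration, here's roughly the shape of that grow-on-demand path. This is just a sketch, not the actual engine code, and the names (GrowableVbo, EnsureCapacity) are made up:

    // Sketch of a grow-on-demand VBO: when it runs out of room, allocate
    // bigger GPU storage and re-upload everything -- the slow path above.
    #include <GL/glew.h>
    #include <vector>

    struct GrowableVbo
    {
        GLuint handle = 0;
        size_t capacityBytes = 0;
        std::vector<unsigned char> shadow;  // CPU-side copy of the buffer contents

        void EnsureCapacity(size_t neededBytes)
        {
            if (neededBytes <= capacityBytes)
                return;
            capacityBytes = neededBytes * 2;    // grow with some headroom
            if (handle == 0)
                glGenBuffers(1, &handle);
            glBindBuffer(GL_ARRAY_BUFFER, handle);
            // Allocate the new, larger buffer...
            glBufferData(GL_ARRAY_BUFFER, capacityBytes, nullptr, GL_DYNAMIC_DRAW);
            // ...then repopulate it from the CPU-side copy.
            if (!shadow.empty())
                glBufferSubData(GL_ARRAY_BUFFER, 0, shadow.size(), shadow.data());
        }
    };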

I had known for a long time that the graphical corruption always occurred shortly after a VBO grew, so I figured there was something wrong with the code that did the expansion. I debugged and checked and checked, but no matter what, I couldn't find a problem.

Finally last week I had a breakthrough. I set the VBOs to huge sizes so they would never have to grow, and to my great surprise, after a long time running, the porcupine came out to play. My assumption was wrong! It wasn't growing the VBOs that was the problem!

I looked a little closer at the indices of the bad geometry: they were all in the 70,000 range. Hmm. I use 16-bit vertex indices, which only support up to 65,536 vertices. Crap.
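
A tiny illustration of the failure mode (not from the engine, just the arithmetic): anything written into a 16-bit index buffer silently wraps past 65,535.

    // 16-bit indices (GL_UNSIGNED_SHORT index buffers) wrap silently past 65535.
    #include <cstdio>

    int main()
    {
        unsigned int vertexIndex = 70000;                     // index into the oversized VBO
        unsigned short stored = (unsigned short)vertexIndex;  // what a 16-bit index buffer keeps
        std::printf("%u becomes %u\n", vertexIndex, (unsigned)stored);  // prints: 70000 becomes 4464
        return 0;
    }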

So I spent a day rewriting the VBO code to limit the number of vertices in a single VBO to 65,536 and to add support for multiple vertex VBOs. I'm optimistic this will perform better than the old code because as new models are added to my VBOs, I only have to update one small VBO out of many, rather than one giant VBO. This should save some memory bandwidth and reduce the chance of a stall from changing a VBO while OpenGL is still using it.
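
The bookkeeping for that is pretty simple. Something along these lines (again just a sketch with made-up names, assuming each model is packed whole into whichever buffer has room):

    // Cap each vertex VBO at the 65,536 vertices a 16-bit index can address,
    // and start a new buffer when a model won't fit in any existing one.
    #include <vector>
    #include <cstddef>

    static const size_t kMaxVerticesPerVbo = 65536;

    struct VertexVbo
    {
        size_t vertexCount = 0;   // vertices already packed into this buffer
        // ...GL handle, CPU-side copy, etc.
    };

    // Pick (or create) a VBO with room for modelVertexCount more vertices.
    size_t SelectVbo(std::vector<VertexVbo>& vbos, size_t modelVertexCount)
    {
        for (size_t i = 0; i < vbos.size(); ++i)
            if (vbos[i].vertexCount + modelVertexCount <= kMaxVerticesPerVbo)
                return i;
        vbos.push_back(VertexVbo());  // nothing had room, add a fresh small VBO
        return vbos.size() - 1;
    }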

Once I had that fixed, ahh the glory. Everything ran and ran and ran. Beauty. Perfection. Bliss. Damn, the framerate's stuttering.

My environment is streaming. As the player moves around, the scene geometry is loaded in chunks. Every time one of those chunks loaded, the framerate stuttered really badly.

Now, the chunk loading is done on a separate thread, so the actual loading shouldn't affect the framerate, but once the chunks are loaded there is a bunch of work to do to add the models to the VBOs.
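
The handoff looks roughly like this. It's a sketch under my own assumptions (the real code differs); the key point is that the worker thread never touches OpenGL, while the render thread, which owns the GL context, drains a queue and does the VBO work:

    #include <mutex>
    #include <queue>

    struct LoadedChunk { /* decoded vertex/index data, model list, etc. */ };

    std::mutex               g_chunkMutex;
    std::queue<LoadedChunk*> g_loadedChunks;

    // Worker thread: file I/O and decoding only, no GL calls here.
    void OnChunkLoaded(LoadedChunk* chunk)
    {
        std::lock_guard<std::mutex> lock(g_chunkMutex);
        g_loadedChunks.push(chunk);
    }

    // Render thread, once per frame: this is where the CPU-heavy VBO work happens.
    void ProcessLoadedChunks()
    {
        std::lock_guard<std::mutex> lock(g_chunkMutex);
        while (!g_loadedChunks.empty())
        {
            LoadedChunk* chunk = g_loadedChunks.front();
            g_loadedChunks.pop();
            // AddModelsToVbos(chunk);  // hypothetical helper: pack the models into the VBOs
        }
    }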

I thought the VBO updates might be the cause of my framerate dips. I had read that gDEBugger was free now, so I wanted to see if it could tell me that. Now gDEBugger is really cool, but it couldn't tell me that. However, it did tell me that my framerate dips matched up exactly with a big increase in CPU use. Since the chunks are made up of a number of separate models, and those models need to be processed by the CPU, I modified the loading code to load only one model per frame. Since all the models are offscreen anyway, it doesn't really matter if they are delayed a few frames. This helped, but I was still seeing long frame warnings in my output. Most of the time the game chugs along at 60fps, but when the chunks were loading, some frames took 120ms. Yikes.
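
Spreading the work out is just a per-frame budget on a pending queue. A rough sketch (names are made up; AddModelToVbo stands in for the real packing code):

    #include <deque>

    struct PendingModel { /* vertex/index data waiting to go into a VBO */ };

    std::deque<PendingModel> g_pendingModels;

    // Called once per frame: only touch a fixed number of models so the
    // remaining work spreads over the following frames.
    void ProcessPendingModels()
    {
        const int kModelsPerFrame = 1;
        for (int i = 0; i < kModelsPerFrame && !g_pendingModels.empty(); ++i)
        {
            // AddModelToVbo(g_pendingModels.front());  // hypothetical: CPU packing + upload
            g_pendingModels.pop_front();
        }
    }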

gDEBugger complained that I was calling glGetError a lot. Actually, I call it after every GL call in my debug build. It also said that the whole GL command queue has to flush to get the result for that call. That sounds slow. I modified things so that I could test without those calls. The framerate improved a bit, but it wasn't the magic bullet.
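
The usual way to keep the checks in debug builds and out of release builds is a wrapper macro, something like this (GL_CHECK and the _DEBUG switch are just my assumptions, not the engine's actual setup):

    #include <GL/glew.h>
    #include <cstdio>

    #ifdef _DEBUG
        // Debug builds: check for errors after every wrapped GL call.
        #define GL_CHECK(call)                                              \
            do {                                                            \
                call;                                                       \
                GLenum err = glGetError();                                  \
                if (err != GL_NO_ERROR)                                     \
                    std::fprintf(stderr, "%s failed: 0x%x\n", #call, err);  \
            } while (0)
    #else
        // Release builds: no glGetError, so no pipeline flush.
        #define GL_CHECK(call) call
    #endif

    // Usage:
    // GL_CHECK(glBindBuffer(GL_ARRAY_BUFFER, handle));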

Looking at the trace, I noticed that calls to glLinkProgram were taking between 30 and 90ms. Wow. That's really slow. I have a shader system that makes up different shaders for every model depending on its material needs. It turns out I wasn't reusing shader programs; I was making multiple copies of otherwise identical shader programs. gDEBugger told me that. Hmm.

So I spent another day heavily modifying the shader management so that if the game asks for a shader and it happens to be the same as a previously created shader, it doesn't generate a new one; it just adds a reference to the existing one. Well damn if that didn't fix all those long frames!
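
The core of that kind of cache is keying programs by their material permutation and reference-counting them. A sketch, with the concatenated source standing in for the key and LinkNewProgram as an assumed helper that does the compile and link:

    #include <GL/glew.h>
    #include <map>
    #include <string>

    // Assumed helper (not shown): compiles both stages and calls glLinkProgram once.
    GLuint LinkNewProgram(const std::string& vertexSrc, const std::string& fragmentSrc);

    struct CachedProgram
    {
        GLuint program  = 0;
        int    refCount = 0;
    };

    std::map<std::string, CachedProgram> g_programCache;

    GLuint GetOrLinkProgram(const std::string& vertexSrc, const std::string& fragmentSrc)
    {
        std::string key = vertexSrc + "\n--\n" + fragmentSrc;
        std::map<std::string, CachedProgram>::iterator it = g_programCache.find(key);
        if (it != g_programCache.end())
        {
            ++it->second.refCount;    // reuse: no glLinkProgram, no 30-90ms hitch
            return it->second.program;
        }
        CachedProgram entry;
        entry.program  = LinkNewProgram(vertexSrc, fragmentSrc);  // the slow path, now taken once per permutation
        entry.refCount = 1;
        g_programCache[key] = entry;
        return entry.program;
    }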

Now in debug the frame rate still drops a bit when a chunk loads, but a release build is smooth as butter.

So lessons learned: watch the number of vertices in your VBOs, use gDEBugger, avoid glGetError in production code, and glLinkProgram is wayyyyy too slow to use at runtime.
