The One Man MMO Project: Terrain 2.0 - Part I
It's a milestone. I'm working on my final rendering feature! With gamma-correct Ambient/Diffuse/Specular throughout the world, hardware particles for effects (got those working a couple of weeks ago) and some lovely shadows, I have but one big rendering issue left to address: Terrain 2.0.
My goals for Terrain 2.0 are better texturing, a higher-resolution heightmap, smaller terrain data on disk, and a faster import tool.
The first improvement to texturing was to add normal maps to make the terrain bumpy. That was surprisingly easy. The only tricky parts were figuring out the tangent and bitangent, and figuring out how to blend multiple normal-mapped textures. (At this point I do the normal mapping first and then blend the resulting colors, rather than trying to blend the normal maps themselves, but that may change. If you're looking for a good article on blending normal maps, check out this one.) The funny thing about adding normal maps was that I needed a second sun for the lighting: one position for shadows, and a much farther-away one for lighting. Trying to use the same sun position for both does not work.
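For a regular heightmap grid the tangent and bitangent turn out to be simple, which may be why this part wasn't too painful. Here's a minimal sketch, assuming a Y-up world with the grid axes along +X and +Z and slopes taken from central differences of the neighboring heights; the names and `spacing` parameter are mine, not the engine's:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Tangent runs along +X: step 2*spacing horizontally, rise (hRight - hLeft).
// 'spacing' is the distance between heightmap samples (e.g. 2.5m).
Vec3 terrainTangent(float hLeft, float hRight, float spacing) {
    return normalize({ 2.0f * spacing, hRight - hLeft, 0.0f });
}

// Bitangent runs along +Z, built the same way from the near/far neighbors.
Vec3 terrainBitangent(float hNear, float hFar, float spacing) {
    return normalize({ 0.0f, hFar - hNear, 2.0f * spacing });
}
```

On flat ground these reduce to (1,0,0) and (0,0,1), and together with the surface normal they form the tangent-space basis the normal map is sampled in.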
Next on the list was increasing the resolution of the heightmap. I wanted to get it to 1.25m between samples, but there are some worrying issues. First is that it's a lot of data: a rough estimate for the raw terrain data is about 400GB. My laptop has two 500GB drives, so that's a little fat to be convenient. Another issue is that it's 64x as many polygons per terrain block. That means longer load times, more triangles to render, and serious potential performance issues. The real dealbreaker though is the size of the game data - 288GB based on the current data.
So I'm trying for 2.5m instead: that's 100GB of source data, 16x the polygons, and 72GB of game data.
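The scaling here is just the square of the spacing ratio per axis, which is worth sanity-checking. A small sketch, assuming the current spacing is 10m between samples (which is what the 64x and 16x figures imply; the function name is mine):

```cpp
#include <cmath>

// Shrinking the sample spacing multiplies the sample (and polygon) count
// by the square of the one-axis ratio: halving the spacing means 4x the data.
long long scaleFactor(double oldSpacing, double newSpacing) {
    double ratio = oldSpacing / newSpacing;   // samples gained per axis
    return std::llround(ratio * ratio);       // per-area multiplier
}
```

With a 10m baseline, 1.25m gives a factor of 64 (400GB raw, 288GB game data) and 2.5m gives 16 (100GB raw, 72GB game data), matching the numbers above.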
Now the good news is that I discovered some issues with my current terrain data that gave me a couple of ways to make it smaller. The first was that I was storing normals for the edge samples within the block data (I calculate the internal normals at load time). Those normals are three 32-bit floats each, and they were needed for the lighting since I didn't have a good way to calculate them at load time. I was able to rip out those floats and instead store the heights for one row around the entire block, so that the normals for the edge samples can also be calculated at load time. That's only one float per sample, and the values are quantized so they compress down to just a few bits. Since I was in there, I rearranged the data to be more compressible (structure of arrays rather than array of structures) and ran ZIP compression on each block. I'm not quite finished with everything yet, but in my initial tests with my original heightmap, I shed 64% of my on-disk data.
My original terrain didn't store any duplicate height data, so there was a gap between each terrain block that I had to stitch together after adjacent blocks were loaded. To make it even more complicated, when a diagonal block was loaded, the stitching data had to be rebuilt to fill in the corner. I never liked this system. It worked fine, but it was too complicated. Since I was adding extra heights to calculate the normals, I decided I'd also store the first column/row of height data from the adjacent blocks so I could get rid of the stitching. I enjoyed ripping out all the stitching code.
The next big problem was that with the original data, the terrain import tool used 33GB of RAM and ran for 15 hours - mainly due to swapping to disk. If I increased the data size 16x, that was going to be unmanageable. The problem was that some of the terrain-building steps needed the entire terrain to be loaded at the same time. It required quite a bit of rearranging, but I've changed the tool to process blocks sequentially and free each block's memory as it finishes. The added bonus is that where previously only block writing was done in parallel, now all the processing for the blocks can be handled in parallel - 8 of them at a time. It runs faster and in much less memory. I haven't run it on the full dataset yet, so I'm not sure how much of an improvement I have, but I estimate it should run in 5.8GB of RAM.
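The shape of that change can be sketched with a fixed pool of workers pulling block indices off a shared counter, so peak memory is roughly eight in-flight blocks rather than the whole world. This is my own hypothetical sketch of the pattern, not the tool's actual code:

```cpp
#include <atomic>
#include <thread>
#include <vector>

// Hand out block indices to a fixed pool of workers. Each worker loads,
// processes, writes, and frees one block at a time, so only 'workerCount'
// blocks are ever resident instead of the entire terrain.
void processAllBlocks(int blockCount, int workerCount,
                      void (*processBlock)(int)) {
    std::atomic<int> next{0};
    std::vector<std::thread> workers;
    for (int w = 0; w < workerCount; ++w) {
        workers.emplace_back([&] {
            for (int i = next.fetch_add(1); i < blockCount;
                 i = next.fetch_add(1)) {
                processBlock(i);  // load, build, compress, write, free
            }
        });
    }
    for (auto& t : workers) t.join();
}
```

The atomic counter doubles as cheap load balancing: a worker that lands on a slow block simply claims fewer blocks overall, which matters when coastal blocks cost more than flat ocean.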
I've been playing with World Machine to procedurally generate the terrain data rather than using USGS data. It's looking promising so far. I really like the erosion modelling.
More to come in Part II.
Copyright (C)2009-2013 onemanmmo.com. All Rights Reserved