Monday, September 24, 2018

Creepy Purple Head Is Creepy

This took a lot longer than I hoped. This post is about level of detail (LOD). That's where you take out some polygons and simplify a shape, usually based on how far away it is from the viewer, to save your graphics card the trouble of calculating things for polygons that are so small they only cover a few pixels or less anyway. I've had spheres doing this in my "engine" for a long time, because for spheres it's relatively easy to calculate the appropriate low(er)-poly mesh or meshes. For general meshes, though, the easy approach is to create a mesh for each LOD using external tools or by hand. I was thinking of taking an automatic approach to generating these meshes and calculating the appropriate view distances. This would, in my mind, save me the trouble of having to do this outside the engine and tune each model's selection of meshes. (It would also save me from the problem of how to handle shape keys on lower-poly versions of a mesh, although I'd probably just ignore that.)
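The post doesn't show the engine's selection logic, but the idea of picking a LOD from view distance can be sketched roughly like this: project the model's bounding sphere to a size in pixels and drop a level each time it shrinks below a threshold. The function name, thresholds, and the perspective-projection formula here are my assumptions, not the engine's actual code.

```python
import math

def lod_level(distance, radius, fov_y, screen_height,
              thresholds=(64.0, 16.0, 4.0)):
    """Pick a LOD index (0 = full detail) from projected size in pixels.

    distance: camera-to-object distance; radius: bounding-sphere radius;
    fov_y: vertical field of view in radians. The thresholds are
    hypothetical projected diameters (in pixels) below which we drop
    one level of detail.
    """
    # Projected diameter in pixels for a simple perspective camera.
    pixels = (2.0 * radius / (distance * math.tan(fov_y / 2.0))) \
             * (screen_height / 2.0)
    level = 0
    for t in thresholds:
        if pixels < t:
            level += 1
    return level
```

A sphere of radius 1 right in front of the camera stays at level 0; push it 1000 units away and it covers about a pixel, so it falls through every threshold to the coarsest level.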

So, after finishing up fog and transparency illumination in the last post, I noticed that my sphere LOD scales seemed to be a bit off, so I started debugging. That led to bug fixing, and that led to me starting to implement the auto-LOD generation I had been thinking about. Like I said, it turned out to take longer than I had hoped. Part of this was finding a reasonably fast and simple algorithm to implement, and part of it was discovering that Blender had output a separate vertex for every triangle corner in my mesh, even though most triangles share corners with other triangles. This meant a detour to merge these vertexes so I could do a bit of topology analysis in the algorithm (which also incidentally saved a bunch of memory, although I guess that's not such a big deal compared to things like texture memory).
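Welding those duplicated corner vertexes back together is mostly bookkeeping: bucket positions that are (nearly) identical, keep one copy of each, and remap the triangle indices. This is a minimal sketch of that idea; the helper name and the epsilon-quantization trick for bucketing near-duplicates are my own, not necessarily what the engine does.

```python
def weld_vertices(positions, triangles, eps=1e-6):
    """Merge positionally identical vertices so triangles share corners.

    positions: list of (x, y, z) tuples, one per triangle corner.
    triangles: list of (i, j, k) index triples into positions.
    Returns (unique_positions, remapped_triangles).
    """
    def key(p):
        # Quantize coordinates so near-identical floats land in the
        # same bucket instead of hashing to different keys.
        return tuple(round(c / eps) for c in p)

    index_of = {}   # quantized position -> new index
    unique = []     # deduplicated positions
    remap = {}      # old index -> new index
    for i, p in enumerate(positions):
        k = key(p)
        if k not in index_of:
            index_of[k] = len(unique)
            unique.append(p)
        remap[i] = index_of[k]
    new_tris = [(remap[a], remap[b], remap[c]) for a, b, c in triangles]
    return unique, new_tris
```

After welding, two triangles that share an edge reference the same two vertices, which is exactly what topology analysis (finding which triangles are neighbors) needs.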

The final effect is shown in the video above. It works... Ok? I guess? The effect is exaggerated for testing in the video, and normally the changes are basically invisible. I think simplifying material shaders for distant objects will probably be a bigger win (and I'll see if I can automate that too). But anyway, now you know how I spent the last couple of weeks and why clouds still aren't implemented.

Monday, September 03, 2018

Dramatic Sunset Lighting With Forward Scattering

It sort of has something to do with clouds, honest.

Among a bunch of under-the-hood fixes and additions over the past little while, I went back to my transparency and fog shaders. Both of these effects reference a 3D texture "illumination map" that I generate in screen space. The illumination map makes it possible for all the lights to at least have some effect on transparent objects, including fog. The original illumination map only holds the response of a directly illuminated sphere to incoming light, summed up for all lights. This would be appropriate for clouds made of fairly large particles, which is why I talked about dust in the description of the first video. Maybe blowing sand would be a better description.
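The "summed up for all lights" part could be sketched like this, per cell of the 3D map. The post only says per-light responses are summed, so the inverse-square falloff and the function shape here are assumptions for illustration.

```python
def illumination_at(cell_pos, lights):
    """Sum each light's contribution at one cell of the illumination map.

    lights: list of (light_position, rgb_color) pairs. Inverse-square
    falloff is an assumption; the post doesn't specify the engine's
    attenuation model.
    """
    total = [0.0, 0.0, 0.0]
    for (lx, ly, lz), (r, g, b) in lights:
        dx = lx - cell_pos[0]
        dy = ly - cell_pos[1]
        dz = lz - cell_pos[2]
        d2 = dx * dx + dy * dy + dz * dz
        atten = 1.0 / max(d2, 1e-6)  # clamp to avoid division by zero
        total[0] += r * atten
        total[1] += g * atten
        total[2] += b * atten
    return total
```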


Notice how the transparencies in this older video are black when viewed against the sun, and compare that to the behavior of the same type of object in the first part of the new video above.

I also wanted to be able to simulate things like steam or water clouds, which, because they are made of smaller (and transparent) particles, scatter light forward instead of just reflecting it back. On top of that, I wanted to support a mix of different types of cloud or transparency in the same scene. I've attacked this problem by rendering two illumination maps: the original, pure-reflection map, and another which represents (mostly) forward scattering. (Right now, both maps are applied as-is, but I plan to weight the response by the transparent material in a later update.) This is what makes the transparencies shine when illuminated from behind in the top video, and gives the nice (if pixellated) murky sunset effect at the end of the video.
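One standard way to model the forward-versus-backward split described above is the Henyey-Greenstein phase function; the post doesn't say which model the engine uses, so this is just an illustration of the idea, together with a sketch of the per-material blend of the two maps that the post says is planned but not yet implemented.

```python
import math

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function.

    g > 0 biases scattering forward (small particles like steam or
    water droplets), g < 0 backward, g = 0 is isotropic. cos_theta is
    the cosine of the angle between light direction and view direction.
    """
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)

def blend_maps(reflection_sample, forward_sample, forward_weight):
    """Hypothetical per-material blend of the two illumination maps.

    reflection_sample / forward_sample: RGB triples sampled from the
    two maps; forward_weight in [0, 1] would come from the material.
    """
    return [(1.0 - forward_weight) * r + forward_weight * f
            for r, f in zip(reflection_sample, forward_sample)]
```

With g around 0.8, the phase function returns far more energy looking toward the light (cos θ near 1) than away from it, which is the "shine when illuminated from behind" effect in the video.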

I was thinking of moving on to clouds next, but I could also implement an entire framework for visualizing and debugging the illumination maps, which would only take another week or two...