Monday, January 31, 2005

Terrain in the Membrane


I am completely and utterly fascinated by terrain generation.

Maybe it has to do with the view, perfectly suited to those with well-developed god complexes.

For those with a serious jonesing for worldmaking info, check out the Virtual Terrain Project. They cover a whole heaping helping of things which I often peruse but completely fail to understand. Why do I do this?

Here's an interesting fact! "Our method, dubbed Real-time Optimally Adapting Meshes (ROAM), uses two priority queues to drive split and merge operations that maintain continuous triangulations built from pre-processed bintree triangles." What the hell does that mean!? I only think I sorta know! [Read all about it here]
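My best attempt at a translation: the mesh starts as a coarse binary tree of triangles, and every frame the triangles that look worst on screen get split (and earlier splits get merged back) using priority queues, so the detail goes where your eyeballs need it. Here's a toy Python sketch of just the split-queue half of that sentence - the error numbers and the halving rule are invented by me, so don't mistake this for actual ROAM:

```python
import heapq

# Toy sketch of the "split" half of a ROAM-style refinement, not the real thing:
# a single priority queue ordered by an (invented) error value, where the worst
# triangle in a binary triangle tree keeps getting split into two children until
# an error budget is met. Real ROAM also runs a merge queue so the mesh can
# coarsen again from frame to frame.

class BinTri:
    """One triangle in a binary triangle tree (bintree)."""
    def __init__(self, depth, error):
        self.depth = depth
        self.error = error          # assumed pre-computed error for this triangle
        self.children = None

    def split(self):
        # Each child covers half the parent and (we pretend) halves its error.
        self.children = (BinTri(self.depth + 1, self.error * 0.5),
                         BinTri(self.depth + 1, self.error * 0.5))
        return self.children

def refine(root, max_error, max_depth=12):
    """Split the worst triangles first until every leaf is under max_error."""
    split_queue = [(-root.error, id(root), root)]   # max-heap via negated error
    leaves = []
    while split_queue:
        neg_err, _, tri = heapq.heappop(split_queue)
        if -neg_err <= max_error or tri.depth >= max_depth:
            leaves.append(tri)
            continue
        for child in tri.split():
            heapq.heappush(split_queue, (-child.error, id(child), child))
    return leaves

if __name__ == "__main__":
    mesh = refine(BinTri(depth=0, error=16.0), max_error=1.0)
    print(len(mesh), "leaf triangles")
```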

Middle-Earth Online has a perfectly coherent breakdown of what is probably the most common method of making terrain for videogames. They also explain some of the difficulties in creating miles and miles of world for players to trek through. I especially like the way they can adjust which specific distant objects get rendered, allowing for navigation by landmarks - I suppose we could name it something like Continually Refocusing Distant Object Rendering using a Dynamically Adjusted Priority Queue. Whee, this is fun!
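If I had to guess at what my made-up method would boil down to in code, it's probably just a scoring pass every frame: rank the far-off objects by some importance-over-distance score and render only the top handful. A hypothetical Python doodle - every name, cutoff, and weight here is mine, not Middle-Earth Online's:

```python
import heapq
import math

# Guess at "Continually Refocusing Distant Object Rendering": each frame, score
# far-away objects by how useful they are as landmarks and keep only the best
# few within a render budget. All the numbers below are invented.

def landmark_score(cam, obj):
    dist = math.dist(cam, obj["pos"])
    if dist < obj["near_cutoff"]:
        return float("-inf")        # nearby objects go through the normal detail path
    return obj["importance"] / dist # big, important things win even when distant

def pick_distant_objects(cam, objects, budget=8):
    return heapq.nlargest(budget, objects, key=lambda o: landmark_score(cam, o))

if __name__ == "__main__":
    mountains = [
        {"pos": (5000.0, 900.0, -3000.0),  "importance": 50.0, "near_cutoff": 500.0},
        {"pos": (200.0, 30.0, 100.0),      "importance": 5.0,  "near_cutoff": 500.0},
        {"pos": (-8000.0, 1200.0, 7000.0), "importance": 80.0, "near_cutoff": 500.0},
    ]
    for landmark in pick_distant_objects((0.0, 0.0, 0.0), mountains, budget=2):
        print(landmark["pos"])
```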

Designing reactive terrains is akin to making a matte painting. Different layers are laid down, each altering the behavior of the system. First you create the shape of your terrain with a triangle grid. Then you delineate impassable areas (cliffs and steep slopes).
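In code, that impassable layer could be as simple as walking a heightmap and flagging anything steeper than some threshold. A rough Python sketch, assuming a plain square grid of elevations:

```python
# Minimal sketch of an "impassable areas" pass over a square heightmap.
# Cells whose local slope exceeds a threshold get flagged so the movement
# code can refuse to path through them. Threshold and cell size are arbitrary.

def impassable_mask(heights, cell_size=1.0, max_slope=1.0):
    """heights: 2D list of elevations. Returns a same-sized grid of booleans."""
    rows, cols = len(heights), len(heights[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            steepest = 0.0
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    rise = abs(heights[nr][nc] - heights[r][c])
                    steepest = max(steepest, rise / cell_size)
            mask[r][c] = steepest > max_slope
    return mask

if __name__ == "__main__":
    cliffy = [[0.0, 0.1, 0.2],
              [0.0, 0.1, 3.0],   # sharp rise: a cliff face
              [0.0, 0.2, 3.1]]
    for row in impassable_mask(cliffy):
        print(row)
```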

Many games now use specialized procedures for generating vegetation based upon terrain type - so when you lay down semi-arid desert, the program decides what types of plants and animals to spawn in that area. This is especially useful for online worlds, which may introduce new zones or planets to traverse at regular intervals and therefore need ways to simplify content creation. With this system a developer isn't responsible for placing every single bush and rock and lizard - of course, a way to place specific trees or bushes should still be included.
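A toy version of that biome-driven spawning might look like the Python below - the spawn tables, densities, and the hand-placed override list are all invented for illustration:

```python
import random

# Sketch of biome-driven spawning, assuming each terrain cell is tagged with a
# biome name. The spawn tables and density are made up; hand_placed is the
# "put this specific bush right here" escape hatch mentioned above.

SPAWN_TABLES = {
    "semi_arid_desert": [("sagebrush", 0.6), ("barrel_cactus", 0.3), ("lizard", 0.1)],
    "temperate_forest": [("oak", 0.5), ("fern", 0.4), ("deer", 0.1)],
}

def populate(cells, density=0.2, seed=42, hand_placed=()):
    """cells: dict mapping (x, y) -> biome name. Returns a list of (x, y, thing)."""
    rng = random.Random(seed)          # deterministic, so the zone repopulates the same way
    placed = list(hand_placed)         # artist-placed props always survive
    for (x, y), biome in cells.items():
        if rng.random() >= density:
            continue
        names, weights = zip(*SPAWN_TABLES.get(biome, [("rock", 1.0)]))
        placed.append((x, y, rng.choices(names, weights=weights)[0]))
    return placed

if __name__ == "__main__":
    cells = {(x, y): "semi_arid_desert" for x in range(10) for y in range(10)}
    print(populate(cells, hand_placed=[(3, 3, "landmark_joshua_tree")]))
```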

Terrain textures are often governed by code that tiles them across an area while avoiding obvious seams (imagine bathroom tiles which, when properly placed, form one swirling abstract image but, if misaligned, clearly denote their square natures). Noise is introduced into the textures and blended through, altering their properties slightly while maintaining their general characteristics.
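A bare-bones Python sketch of the idea - a repeating tile lookup nudged by cheap value noise so the repetition stops being obvious. The noise here is deliberately crude and the "texture" is just a grid of brightness values:

```python
import random

# Break up texture repetition with noise: the tile lookup wraps around (that's
# the bathroom-tile repetition), and a smooth-ish pseudo-random value nudges
# each sample so the seams stop lining up into an obvious grid.

random.seed(0)
LATTICE = {(i, j): random.uniform(0.0, 1.0) for i in range(64) for j in range(64)}

def value_noise(x, y):
    """Bilinear interpolation of random lattice values: cheap, smooth-ish noise."""
    x0, y0 = int(x) % 64, int(y) % 64
    x1, y1 = (x0 + 1) % 64, (y0 + 1) % 64
    fx, fy = x - int(x), y - int(y)
    top = LATTICE[(x0, y0)] * (1 - fx) + LATTICE[(x1, y0)] * fx
    bot = LATTICE[(x0, y1)] * (1 - fx) + LATTICE[(x1, y1)] * fx
    return top * (1 - fy) + bot * fy

def sample_terrain_texture(tile, x, y, noise_strength=0.15):
    h, w = len(tile), len(tile[0])
    base = tile[int(y) % h][int(x) % w]                          # plain repeating tile
    return base + noise_strength * (value_noise(x * 0.1, y * 0.1) - 0.5)

if __name__ == "__main__":
    tile = [[0.2, 0.4], [0.6, 0.8]]
    print(round(sample_terrain_texture(tile, 10.5, 3.25), 3))
```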

But what of water?

Most terrain engines allow only one set water height. You can't have, for example, a lake in a volcano crater that drains down its side into a series of descending locks, at least not without some clever fudging or annoying zone loads.
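To see why that's painful: with a single water plane, "underwater" is one height comparison, and a crater lake above sea level simply can't exist. A hypothetical per-basin lookup (every name here is invented) shows the kind of workaround you'd need:

```python
# With one global water plane, "underwater" is a single height check, so any
# lake above sea level is impossible. One workaround is water heights keyed by
# basin; the basin lookup here is faked with a dict for illustration.

GLOBAL_SEA_LEVEL = 0.0

def underwater_global(z):
    return z < GLOBAL_SEA_LEVEL

BASIN_WATER_LEVELS = {"ocean": 0.0, "crater_lake": 350.0}

def underwater_per_basin(basin, z):
    return z < BASIN_WATER_LEVELS.get(basin, float("-inf"))

if __name__ == "__main__":
    print(underwater_global(340.0))                     # False: the crater lake can't exist
    print(underwater_per_basin("crater_lake", 340.0))   # True: now it can
```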

I am wondering about using particle systems to model, at the least, water in motion. They are already used most often to create ambience (a broken pipe sprinkling out droplets, for example). What if you created water volumes, then had a way of determining a cohesion level, a flow direction, and a limited number of definable bounds? When the cohesion level of a bound was breached, flow would travel perpendicular to that bound until the volume emptied or was blocked. Wherever a breach occurs, a particle system spawns and emits a number of particles determined by the total water volume. The size of the particles is determined by the constraints of the space the emitter is projecting into, and you could even give the emitter a pressure value that would help determine how much the particles stick.
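Here's a Python doodle of that volume-plus-bounds idea as I imagine it - every threshold, unit, and name is invented, and whether any of it would hold up in a real engine is exactly the open question:

```python
from dataclasses import dataclass, field

# Sketch of the idea above: a water volume with a cohesion level and a few
# bounds; when the "pressure" on a bound exceeds what cohesion can resist, an
# emitter is spawned whose particle budget comes from the amount of water in
# the volume rather than from a continuous emission rate.

@dataclass
class Bound:
    normal: tuple          # direction flow would take if this bound is breached
    strength: float        # how much pressure this bound can resist
    gap_size: float        # size of the opening, used to scale particle size

@dataclass
class Emitter:
    direction: tuple
    particle_count: int
    particle_size: float
    pressure: float

@dataclass
class WaterVolume:
    amount: float                      # abstract units of water
    cohesion: float                    # how strongly the volume holds together
    bounds: list = field(default_factory=list)

    def pressure_on(self, bound):
        return self.amount / max(bound.strength, 1e-6)

    def check_breaches(self, particles_per_unit=10.0):
        emitters = []
        for bound in self.bounds:
            if self.pressure_on(bound) > self.cohesion:
                emitters.append(Emitter(
                    direction=bound.normal,
                    particle_count=int(self.amount * particles_per_unit),
                    particle_size=bound.gap_size,   # constrained by the opening
                    pressure=self.pressure_on(bound),
                ))
        return emitters

if __name__ == "__main__":
    crater = WaterVolume(amount=100.0, cohesion=5.0, bounds=[
        Bound(normal=(1.0, 0.0, -1.0), strength=50.0, gap_size=0.5),   # holds
        Bound(normal=(0.0, 0.0, -1.0), strength=2.0,  gap_size=2.0),   # breaks
    ])
    for emitter in crater.check_breaches():
        print(emitter)
```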

I think the new (maybe) ideas here are: (1) The particles will not continuously emit; they are produced based upon a set number (namely, how much water you have in your volume). (2) Particles adhere into 'superparticles' which retain the behavior of the regular particles (like a Bose-Einstein condensate).
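And a quick sketch of idea (2): droplets merging into superparticles that conserve mass and average out position and velocity, so the blob still behaves like one bigger droplet. The merge radius and the brute-force pass are just for illustration:

```python
import math

# Nearby droplets merge into a "superparticle": masses add, positions and
# velocities combine as a mass-weighted average. A greedy O(n^2) pass is fine
# for a sketch; a real engine would want spatial hashing or similar.

def merge_particles(particles, merge_radius=0.5):
    """particles: list of dicts with 'pos', 'vel', 'mass'. Returns merged list."""
    merged = []
    for p in particles:
        for q in merged:
            if math.dist(p["pos"], q["pos"]) < merge_radius:
                total = p["mass"] + q["mass"]
                q["pos"] = tuple((p["mass"] * a + q["mass"] * b) / total
                                 for a, b in zip(p["pos"], q["pos"]))
                q["vel"] = tuple((p["mass"] * a + q["mass"] * b) / total
                                 for a, b in zip(p["vel"], q["vel"]))
                q["mass"] = total
                break
        else:
            merged.append(dict(p))
    return merged

if __name__ == "__main__":
    drops = [
        {"pos": (0.0, 0.0), "vel": (1.0, 0.0),  "mass": 1.0},
        {"pos": (0.2, 0.1), "vel": (0.0, 1.0),  "mass": 1.0},   # merges with the first
        {"pos": (5.0, 5.0), "vel": (0.0, -1.0), "mass": 2.0},   # stays separate
    ]
    print(merge_particles(drops))
```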

The difficulties of such a particle emitter (other than, y'know, would it actually work?) are: (1) Are current processors capable of such tricky computing? (2) How do we determine simple planes to define the bounds of a volume of water? Especially if it's inside of, say, a cylinder. (3) Water is cohesive (it forms strong bonds with itself) but only when constrained. How do we determine at what point water loses all cohesion and can effectively be destroyed in a virtual environment? Done wrong, we will end up with water drops that attract each other a la the T-1000.

I think I'm going to begin copyrighting titles of programming methods before they are even implemented, beginning with my own water particle system.

I'll title it: Rendering Non-Static Water Volumes using a Particle System with a strong Cohesion-And-Flow-Model (CAFM) and Processing Fluid Bounds in Realtime.

Yeah. That's got zing.
