12 Mesh Basics

Last time, I showed you how to use a z-buffer to draw two triangles so they would properly obscure each other in a realistic three-dimensional manner. But what’s realistic about a couple of triangles? Not much.

If we want to move beyond the really simple scenes that we’ve been using to date, we’re going to have to abandon our reliance on VertexBuffers. While Direct3D will still work with FVFs and Device.DrawPrimitives behind the scenes, we need a higher-level abstraction to work with if we’re going to be productive. That abstraction is a mesh.

At its core, a mesh is simply a predefined set of vertices and a set of textures and materials that go with them. Basically, it’s a way to save us from having to keep track of every single triangle and texture, when all we really want to do is to draw a tank or an ogre. As it turns out, Direct3D lets you get a whole lot more sophisticated than that, but even within this simple definition we can get a lot done.

The heart of this new abstraction layer is the Mesh class, which lives in the Microsoft.DirectX.Direct3DX assembly. Note the extra X on the end of that – the Direct3DX assembly is a helper assembly for Direct3D, and has a ton of goodies in it that provide the sort of advanced capabilities that we generally make use of in real applications.

We could create a Mesh by calling its constructor, stuffing a bunch of vertices in there, and adding textures and materials. Generally speaking, however, we’ll be loading complex objects that have been designed in some sort of 3D modeling tool.
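Just to make the by-hand route concrete, here's a minimal sketch (not part of this article's sample; the vertex data and counts are made up purely for illustration) of what building a trivial one-triangle Mesh directly might look like:

// Rough sketch of building a one-triangle Mesh by hand.
// The vertex values here are invented just to show the shape of the API.
Mesh triangle = new Mesh(1, 3, MeshFlags.Managed,
    CustomVertex.PositionColored.Format, device);

CustomVertex.PositionColored[] verts = new CustomVertex.PositionColored[3];
verts[0] = new CustomVertex.PositionColored( 0,  1, 0, Color.Red.ToArgb());
verts[1] = new CustomVertex.PositionColored( 1, -1, 0, Color.Green.ToArgb());
verts[2] = new CustomVertex.PositionColored(-1, -1, 0, Color.Blue.ToArgb());

triangle.SetVertexBufferData(verts, LockFlags.None);
triangle.SetIndexBufferData(new short[] { 0, 1, 2 }, LockFlags.None);

You can see why we'd rather not build a tank that way, one vertex at a time.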

You have two choices when it comes to loading a model: create your own or use someone else’s. It turns out there’s a whole pile of free ones available on the Internet, at places like 3D Cafe. They have tanks and guns and trees and bushes and lots of other things you can use in your experimentation.

If you can’t find something you like already available, you’ll have to create your own models. There are quite a few modelers out there, some free, like gMax, and some not, like 3D Studio Max or LightWave. They all have their strengths and weaknesses, although 3D Studio Max is something of an industry standard.

Regardless of whether you create your own or grab something off the Web, your models need to be stored on disk somehow. Tools like LightWave have their own proprietary file formats for storing shape and texture information, but most of them can be converted into the file format that Direct3D knows how to use: the .x format. (Some tools, like MeshX, save to the .x format directly.)

One of the most popular tools for converting between 3D graphics formats is PolyTrans. If you’re going to be doing any serious 3D work, you’ll want to add it to your toolbox. It supports conversion to and from just about any format you might run across. Unfortunately, it’s fairly expensive for people like me who just do graphics in their spare time. The DirectX SDK used to ship with a tool called conv3ds that would convert from the 3D Studio format to the Direct3D format. The tool is still available, but it has been moved to the Direct3D Extras package at the MSDN DirectX website, so you’ll need to download it separately. The Extras package also ships with add-ins for many popular 3D modeling packages – check it out.

A .x file holds everything we need to know in order to render an object. Although it can contain lots more, for our purposes we can think of it as holding vertex, texture, and material information. Given that these are the primitives that we’ve been working with so far, I think you can see that this is going in a good direction: we’re going to get the Mesh to deal with most of the details involved in coordinating these things.

Before looking at the code around how to load and render a mesh, we need to talk about subsets. Every mesh is divided up into one or more subsets. A subset is the portion of the mesh’s vertices that all use the same texture and the same material. You’ll recall that texture and material information are global to the Device. Since a given mesh could make use of several different textures and materials, it’s very important to keep track of which one is active right now, and to switch between them as little as possible for performance reasons; hence subsets.

Loading a mesh is a little bit less straightforward than loading a texture. Because we have to deal with multiple textures and materials – possibly one for each subset of the mesh – we need to load the mesh from within a loop, storing information about the materials and textures that correspond to each subset. The code to do that looks like this:

// Private fields to hold our mesh info for
// use during rendering
private Mesh mesh;
private Material[] materials;
private Texture[] textures;

protected void CreateMesh(string path)
{
  ExtendedMaterial[] exMaterials;
  mesh = Mesh.FromFile(path, MeshFlags.SystemMemory, device, out exMaterials);

  if (textures != null)
  {
    DisposeTextures();
  }

  textures = new Texture[exMaterials.Length];
  materials = new Material[exMaterials.Length];

  for (int i = 0; i < exMaterials.Length; ++i)
  {
    if (exMaterials[i].TextureFilename != null)
    {
      string texturePath = Path.Combine(Path.GetDirectoryName(path),
        exMaterials[i].TextureFilename);
      textures[i] = TextureLoader.FromFile(device, texturePath);
    }

    materials[i] = exMaterials[i].Material3D;
    materials[i].Ambient = materials[i].Diffuse;
  }
}

There’s a fair amount of code there, so let’s break it down line-by-line.

ExtendedMaterial[] exMaterials;

This line sets up an array of ExtendedMaterial objects. ExtendedMaterial is the type we’re going to use to find out about the textures and materials that each subset has applied to it. So if we’re loading a tank object that has a gun, turret, body, and two tracks, we’ll need different materials and textures for each of these parts. The ExtendedMaterial array is what will hold that information.

mesh = Mesh.FromFile(path, MeshFlags.SystemMemory, device, out exMaterials);

This line of code actually loads the mesh. We give it a path to a .x file (insert your favorite Mulder & Scully joke here), some MeshFlags (which I’ll leave for a later discussion), a reference to our Device, and an out parameter that’s the array of ExtendedMaterial objects we declared above.

When this method returns, we’ll have the Mesh object itself, as well as a description of the materials and textures for each of the subsets. But there’s more work to do, because the Mesh.FromFile API doesn’t actually create the Texture and Material objects that we need – it merely describes what they look like. So we need the following code:

textures = new Texture[exMaterials.Length];
materials = new Material[exMaterials.Length];

for (int i = 0; i < exMaterials.Length; ++i)
{
  if (exMaterials[i].TextureFilename != null)
  {
    string texturePath = Path.Combine(Path.GetDirectoryName(path),
      exMaterials[i].TextureFilename);
    textures[i] = TextureLoader.FromFile(device, texturePath);
  }

  materials[i] = exMaterials[i].Material3D;
  materials[i].Ambient = materials[i].Diffuse;
}

The code creates an array of Textures and an array of Materials, and then loops through, populating them based on the information in the ExtendedMaterial array. Note that we have to explicitly load each texture:

if (exMaterials[i].TextureFilename != null)
{
  string texturePath = Path.Combine(Path.GetDirectoryName(path),
    exMaterials[i].TextureFilename);
  textures[i] = TextureLoader.FromFile(device, texturePath);
}

This implies (correctly) that textures are not actually stored in the .x file itself; they are stored as separate files. Here, I’m assuming those files live in the same directory as the .x file.

Although it has nothing to do with DirectX, let me point out that you should always use Path.Combine to build up file paths. If nothing else, it saves you from having to check whether the directory name ends with a backslash and decide whether to append one.
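For example, assuming a hypothetical model directory, both of these calls produce the same result, whether or not the trailing backslash is there:

// Both yield @"C:\models\tank.bmp"; Path.Combine takes care of the separator.
string a = Path.Combine(@"C:\models", "tank.bmp");
string b = Path.Combine(@"C:\models\", "tank.bmp");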

The other slightly strange bit of code you might have noticed is this one:

materials[i] = exMaterials[i].Material3D;
materials[i].Ambient = materials[i].Diffuse;

What we’re doing here is simply copying the material stored in the .x file into our material array, and then setting the diffuse and ambient components of the material to be the same. This is just to ensure that the object reacts to diffuse and ambient light the same way (remember lighting?). You might not always want to do that, but usually you will.

With our Mesh, Material, and Texture objects safely created and tucked away, we can turn our attention to actually rendering the mesh. This turns out to be slightly easier than loading it:

protected void RenderMesh()
{
  for (int i = 0; i < materials.Length; ++i)
  {
    if (textures[i] != null)
    {
      device.SetTexture(0, textures[i]);
    }
    device.Material = materials[i];
    mesh.DrawSubset(i);
  }
}

Because of the way we loaded our mesh, we know that for every subset that the mesh has, there will be a corresponding entry in the materials array. So we simply iterate over the materials array, activating the appropriate texture (if any) and material, and then executing the key line of code:

mesh.DrawSubset(i);

This is sort of like our Device.DrawPrimitives call from all of our previous examples, but on steroids. This one call will take care of figuring out which of the potentially thousands of vertices that make up our model need to be drawn, and it’ll draw them. In fact, we could probably actually write our own Mesh class that would do the same thing by making the Device.DrawPrimitives call ourselves. Of course, we would never do that, as we have better things to do with our time than implement something that’s already done and has tons of optimizations in it, but you get the point: DrawSubset is what allows us to program at a higher level of abstraction, focusing on objects and scenery rather than triangles and vertices.
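Just to make that point concrete, here's a rough, unoptimized sketch of what drawing a single subset ourselves might look like, using the mesh's attribute table to find which faces belong to the subset. It assumes an indexed triangle-list mesh whose attribute table has been populated (for example, by optimizing the mesh), and it's purely illustrative; in real code, DrawSubset is what you'd call:

// Rough sketch only: draw subset i by hand instead of calling DrawSubset(i).
// Assumes an indexed triangle-list mesh with a populated attribute table.
AttributeRange[] table = mesh.GetAttributeTable();
AttributeRange range = table[i];

device.VertexFormat = mesh.VertexFormat;
device.SetStreamSource(0, mesh.VertexBuffer, 0);
device.Indices = mesh.IndexBuffer;
device.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0,
    range.VertexStart, range.VertexCount,
    range.FaceStart * 3, range.FaceCount);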

Oh, and just like when we were working with individual vertices, if we want to move the object around, we do so by setting the World transform on the device before rendering. (Remember transforms?)
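For example (the specific rotation and distance are just made-up values, and the angle variable is something you'd be animating each frame), we might do this right before calling RenderMesh:

// Spin the mesh around the Y axis and push it away from the camera a bit.
device.Transform.World = Matrix.RotationY(angle) * Matrix.Translation(0, 0, 5);
RenderMesh();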

And that’s the story! It’s a huge step forward for us – by making use of meshes and .x files, we’ve really freed ourselves from worrying about vertex geometry. Instead, we can deal with object geometry, which is a far more productive space to be in.

By now you’re probably used to me saying what we’ll talk about next time. And I do have a few things in mind to write about next. (Update: next time we’ll talk about managing device loss.) But they’re all sort of little details that fill in the cracks around what we’ve been discussing – I’ve actually reached the limit of what I’ve explored in Direct3D. Sometime soon, I hope to continue my research into areas like frame-based hierarchies and animation, which allow us to do things like make a monster mesh walk in a realistic manner. Until then, feel free to contact me with any questions, and I’ll do my best to give you an answer.

As usual, I’ve included a complete program below for you to experiment with. Before you run it, be sure to put a .x file and any corresponding texture files into the directory where the .exe you compile lives. You can find a few .x files in the samples directory of the DirectX SDK.

The Code

Update

If you're using the October 2004 SDK, you'll need to replace the call to Commit in SetupLights with a call to Update. This is due to the changes the DirectX team made to the SDK in the October 2004 release.
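In other words, the lighting setup line in the listing below changes like this:

// Pre-October 2004 SDK (as written in the listing below):
device.Lights[0].Commit();

// October 2004 SDK:
device.Lights[0].Update();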

using System;
using System.Drawing;
using System.Collections;
using System.ComponentModel;
using System.Windows.Forms;
using System.Data;
using System.IO;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

namespace WinDev.Candera.Direct3D
{
  public class Game : System.Windows.Forms.Form
  {
    private Device device;
    private PresentParameters pres;

    // Private fields to hold our mesh info for
    // use during rendering
    private Mesh mesh;
    private Material[] materials;
    private Texture[] textures;

    static void Main()
    {
      Game app = new Game();
      app.Width = 800;
      app.Height = 600;
      app.InitializeGraphics();
      app.Show();

      while (app.Created)
      {
        app.Render();
        Application.DoEvents();
      }

      app.DisposeGraphics();
    }

    protected bool InitializeGraphics()
    {
      pres = new PresentParameters();
      pres.Windowed = true;
      pres.SwapEffect = SwapEffect.Discard;
      pres.EnableAutoDepthStencil = true;
      pres.AutoDepthStencilFormat = DepthFormat.D16;

      device = new Device(0, DeviceType.Hardware, this,
        CreateFlags.SoftwareVertexProcessing, pres);
      device.RenderState.CullMode = Cull.None;

      CreateMesh(@"tank.x");

      return true;
    }

    protected void CreateMesh(string path)
    {
      ExtendedMaterial[] exMaterials;
      mesh = Mesh.FromFile(path, MeshFlags.SystemMemory, device, out exMaterials);

      if (textures != null)
      {
        DisposeTextures();
      }

      textures = new Texture[exMaterials.Length];
      materials = new Material[exMaterials.Length];

      for (int i = 0; i < exMaterials.Length; ++i)
      {
        if (exMaterials[i].TextureFilename != null)
        {
          string texturePath = Path.Combine(Path.GetDirectoryName(path),
            exMaterials[i].TextureFilename);
          textures[i] = TextureLoader.FromFile(device, texturePath);
        }

        materials[i] = exMaterials[i].Material3D;
        materials[i].Ambient = materials[i].Diffuse;
      }
    }

    protected void DisposeTextures()
    {
      if (textures == null)
      {
        return;
      }

      foreach (Texture t in textures)
      {
        if (t != null)
        {
          t.Dispose();
        }
      }
    }

    protected void SetupMatrices()
    {
      float yaw = Environment.TickCount / 500.0F;
      float pitch = Environment.TickCount / 312.0F;
      device.Transform.World = Matrix.RotationYawPitchRoll(yaw, pitch, 0);
      device.Transform.View = Matrix.LookAtLH(new Vector3(0, 0, -6),
        new Vector3(0, 0, 0), new Vector3(0, 1, 0));
      device.Transform.Projection = Matrix.PerspectiveFovLH((float)Math.PI / 4.0F,
        1.0F, 1.0F, 10.0F);
    }

    protected void SetupLights()
    {
      device.RenderState.Lighting = true;
      device.Lights[0].Diffuse = Color.White;
      device.Lights[0].Specular = Color.White;
      device.Lights[0].Type = LightType.Directional;
      device.Lights[0].Direction = new Vector3(-1, -1, 3);
      device.Lights[0].Commit();
      device.Lights[0].Enabled = true;
      device.RenderState.Ambient = Color.FromArgb(0x00, 0x00, 0x00);
    }

    protected void RenderMesh()
    {
      for (int i = 0; i < materials.Length; ++i)
      {
        if (textures[i] != null)
        {
          device.SetTexture(0, textures[i]);
        }
        device.Material = materials[i];
        mesh.DrawSubset(i);
      }
    }

    protected void Render()
    {
      device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, Color.BlueViolet, 1.0F, 0);
      device.BeginScene();

      SetupMatrices();
      SetupLights();
      RenderMesh();

      device.EndScene();
      device.Present();
    }

    protected void DisposeGraphics()
    {
      DisposeTextures();
      device.Dispose();
    }
  }
}