Last time, we talked about the basics of how to light a scene. We were able to get a rotating triangle to get darker and lighter as it turned away from and toward a light source. It looked something like this:
While pretty, this tricolored triangle is hardly useful for building realistic-looking scenes. Generally speaking, what we want is the ability to map images onto the surfaces of our triangles so they can look like brick, metal, leather, or whatever.
Direct3D has support for this capability through something called textures. A texture is simply an image – say from a BMP file – that is “glued” or “tacked” onto a triangle or series of triangles. If you do it right, you can have very detailed-looking objects that consist of only a few triangles, and therefore render very quickly.
To understand textures, we have to introduce one more coordinate system. Luckily, these texture coordinates are pretty straightforward. To start with, they’re only two-dimensional. These dimensions are often referred to with the variables u and v, in much the same way that we express object coordinates using x, y, and z.
Here’s the basic idea. Let’s say I have an image like this:
Which is what I look like when I’m in Hawaii. Let’s say for some strange reason that I want to map this image onto a triangle, like so:
Notice that the image is mapped sideways – it has been rotated 90 degrees to the right, and of course since it’s mapped onto a triangle, half of it has been lopped off. Which is what I feel like when I get back from Hawaii.
The way that we pull this off is by introducing another vertex format – PositionNormalTextured. By now you’ve seen enough vertex formats to have a pretty good idea of what this means: that each point on the triangle will contain a three-dimensional position, a surface normal for lighting, and some information having to do with textures.
The extra texture information that we’re adding to the vertices is the u, v texture coordinates I mentioned earlier. Texture coordinates refer to a position on the original image that the vertex should map to. The texture coordinates 0, 0 mean “the upper left corner of the image”, coordinates 1, 0 mean “the upper right corner of the image” and so on. Notice that u and v range between 0 and 1 regardless of the size of the image in pixels.
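If it helps to see the arithmetic, here’s a small sketch of how a pixel position in the source image translates into (u, v) coordinates. The helper and the 256 x 256 image size are purely for illustration – they aren’t part of the sample code:

// Illustration only: this helper is not part of the sample program.
// It converts a pixel position in the source image into (u, v) texture coordinates.
static void PixelToUV(int pixelX, int pixelY, int imageWidth, int imageHeight,
    out float u, out float v)
{
    // u and v are normalized: 0 is the left/top edge and 1 is the right/bottom edge,
    // no matter how many pixels the image actually has.
    u = (float)pixelX / imageWidth;
    v = (float)pixelY / imageHeight;
}

// For example, in a 256 x 256 bitmap the pixel at (128, 64) maps to roughly (0.5, 0.25).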
So to get the 90-degree rotation texture mapping that I showed above, what we want is something like this:
By setting the texture coordinates of the lower-right corner (for example) of the triangle to (1, 0), we’re telling it to map to the upper right corner of the image. Texture coordinates – like color values and normal values – are smoothly interpolated across the surface of the triangle. So a point halfway along the bottom edge of the triangle corresponds to a point halfway down the right side of the texture image.
That’s the theory. How do we put it into practice? It’s easy enough. To illustrate, I’ll keep it really simple: we’ll use transformed coordinates rather than cluttering the code with all the vertex normal and lighting stuff we talked about last time. Recall that transformed coordinates are merely screen coordinates.
Here’s what our InitializeGraphics routine is going to look like:
private Texture texture;

protected bool InitializeGraphics()
{
    PresentParameters pres = new PresentParameters();
    pres.Windowed = true;
    pres.SwapEffect = SwapEffect.Discard;

    device = new Device(0, DeviceType.Hardware, this,
        CreateFlags.SoftwareVertexProcessing, pres);

    // New code
    texture = CreateOverlayTexture(device);

    vertices = CreateVertexBuffer(device);

    return true;
}
There are really only two new bits of code here – I’ve called them out as such. CreateOverlayTexture is where we load the file we want to map onto our triangle, and the private field texture is where we store the resulting texture. Looking at CreateOverlayTexture, we can see that it’s very simple:
protected Texture CreateOverlayTexture(Device device)
{
    Texture t = TextureLoader.FromFile(device, "texture.bmp");
    return t;
}
The TextureLoader.FromFile call does all the hard work. We hand it a Device and a filename, and it hands us back a Microsoft.DirectX.Direct3D.Texture object. There’s just one catch: TextureLoader actually lives in the Microsoft.DirectX.Direct3DX assembly, not the Microsoft.DirectX.Direct3D assembly, like everything else we’ve done so far. That means that you’ll need to be sure to add a reference to Direct3DX or your project won’t compile. Direct3DX is a helper assembly that is chock full of useful stuff. We’ll be covering a number of its features in later articles in this series.
With the texture object in hand, we can go ahead and create our new textured object. We do this by defining a VertexBuffer.
protected VertexBuffer CreateVertexBuffer(Device device)
{
    VertexBuffer buf = new VertexBuffer(
        typeof(CustomVertex.TransformedTextured),
        3,
        device,
        0,
        CustomVertex.TransformedTextured.Format,
        Pool.Default);

    PopulateVertexBuffer(buf);

    return buf;
}

protected void PopulateVertexBuffer(VertexBuffer vertices)
{
    // Note use of TransformedTextured
    CustomVertex.TransformedTextured[] verts =
        (CustomVertex.TransformedTextured[])vertices.Lock(0, 0);

    int i = 0;
    verts[i++] = new CustomVertex.TransformedTextured(
        Width / 2, Height / 4, 0.5F, // Vertex position
        1,                           // rhw (advanced)
        0, 1);                       // texture coordinates
    verts[i++] = new CustomVertex.TransformedTextured(
        Width * 3 / 4, Height * 3 / 4, 0.5F,
        1,
        1, 0);
    verts[i++] = new CustomVertex.TransformedTextured(
        Width / 4, Height * 3 / 4, 0.5F,
        1,
        1, 1);

    vertices.Unlock();
}
Just about everything in here should look familiar. I’ve pointed out the only new thing – the creation of the TransformedTextured vertices, and even that is pretty much like what we’ve been doing all along.
I’m using transformed coordinates to keep the code clean. Recall that transformed coordinates are just screen coordinates, which means we don’t have to bother setting up a World, View, or Projection transform. So the constructor arguments for TransformedTextured vertices are an x, y screen position, a z value that doesn’t matter here, an rhw value that should always be 1, and finally our texture coordinates.
What I’ve done here is set up a more-or-less equilateral triangle where the topmost vertex is the lower-left corner of the image, the lower-right vertex is the upper-right of the image, and the lower-left vertex is the lower-right corner of the image. When rendered, it looks like this:
But how do I render it? After all, nowhere in the vertices did I say, “Use the picture of Hawaii Craig.” Textures don’t work like that in Direct3D. Instead, we do this:
protected void Render()
{
    device.Clear(ClearFlags.Target, Color.Black, 1.0F, 0);
    device.BeginScene();

    // New code
    device.SetTexture(0, texture);

    device.VertexFormat = CustomVertex.TransformedTextured.Format;
    device.SetStreamSource(0, vertices, 0);
    device.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);

    device.EndScene();
    device.Present();
}
Again, this code should look familiar except for the one new line. What SetTexture does is to say, “Hey, if someone wants to render a vertex with texture information, they should use this one.” As it turns out, you can actually have more than one texture active at the same time, which is what the first argument is for. Using multiple textures is an advanced topic that we’ll cover another time, though, so for now simply passing zero to indicate we’re setting the first and only texture is sufficient.
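To make that concrete, here’s a hypothetical fragment showing why it’s handy that the texture is device state rather than vertex data: switching images between draw calls is just another SetTexture call. The brickTexture and metalTexture fields and the second triangle aren’t part of this article’s sample – this is only a sketch of the pattern:

// Hypothetical fragment: assumes the vertex buffer holds two triangles (six vertices)
// and that brickTexture and metalTexture were loaded earlier with TextureLoader.FromFile.
device.SetTexture(0, brickTexture);
device.DrawPrimitives(PrimitiveType.TriangleList, 0, 1); // first triangle gets the brick image
device.SetTexture(0, metalTexture);
device.DrawPrimitives(PrimitiveType.TriangleList, 3, 1); // second triangle gets the metal image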
As you can see, the rest is the stuff we’ve been doing all along – calling BeginScene, using DrawPrimitives, etc. etc.
I’ve put the complete code listing below. Go ahead and experiment by changing the texture coordinates to control the way the image maps onto the surface. If you get really ambitious, try to map the texture onto a square by adding a second triangle.
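If you want a head start on that last experiment, here’s a rough sketch of how PopulateVertexBuffer might look for a square built out of two triangles. It sticks to the assumptions of the sample above (TransformedTextured vertices and a TriangleList); you’d also need to create the vertex buffer with six vertices instead of three and tell DrawPrimitives to draw two primitives instead of one:

// Sketch only: a square made of two triangles. Assumes the VertexBuffer was
// created to hold 6 vertices and DrawPrimitives is called with a count of 2.
protected void PopulateVertexBuffer(VertexBuffer vertices)
{
    CustomVertex.TransformedTextured[] verts =
        (CustomVertex.TransformedTextured[])vertices.Lock(0, 0);

    int i = 0;
    // First triangle: upper-left, upper-right, lower-left of the square
    verts[i++] = new CustomVertex.TransformedTextured(Width / 4,     Height / 4,     0.5F, 1, 0, 0);
    verts[i++] = new CustomVertex.TransformedTextured(Width * 3 / 4, Height / 4,     0.5F, 1, 1, 0);
    verts[i++] = new CustomVertex.TransformedTextured(Width / 4,     Height * 3 / 4, 0.5F, 1, 0, 1);

    // Second triangle: upper-right, lower-right, lower-left of the square
    verts[i++] = new CustomVertex.TransformedTextured(Width * 3 / 4, Height / 4,     0.5F, 1, 1, 0);
    verts[i++] = new CustomVertex.TransformedTextured(Width * 3 / 4, Height * 3 / 4, 0.5F, 1, 1, 1);
    verts[i++] = new CustomVertex.TransformedTextured(Width / 4,     Height * 3 / 4, 0.5F, 1, 0, 1);

    vertices.Unlock();
}

The texture coordinates here put (0, 0) at the square’s upper-left corner and (1, 1) at its lower-right, so the whole image should appear right side up.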
Next time, we’ll talk about how to go beyond triangles and into much more complicated shapes by using Meshes.
using System;
using System.Drawing;
using System.Collections;
using System.ComponentModel;
using System.Windows.Forms;
using System.Data;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

namespace Craig.DirectX.Direct3D
{
    public class Game : System.Windows.Forms.Form
    {
        private Device device;
        private VertexBuffer vertices;
        private Texture texture;

        static void Main()
        {
            Game app = new Game();
            app.Width = 800;
            app.Height = 600;
            app.InitializeGraphics();
            app.Show();

            while (app.Created)
            {
                app.Render();
                Application.DoEvents();
            }

            app.DisposeGraphics();
        }

        protected bool InitializeGraphics()
        {
            PresentParameters pres = new PresentParameters();
            pres.Windowed = true;
            pres.SwapEffect = SwapEffect.Discard;

            device = new Device(0, DeviceType.Hardware, this,
                CreateFlags.SoftwareVertexProcessing, pres);

            texture = CreateOverlayTexture(device);
            vertices = CreateVertexBuffer(device);

            return true;
        }

        protected VertexBuffer CreateVertexBuffer(Device device)
        {
            VertexBuffer buf = new VertexBuffer(
                typeof(CustomVertex.TransformedTextured),
                3,
                device,
                0,
                CustomVertex.TransformedTextured.Format,
                Pool.Default);

            PopulateVertexBuffer(buf);

            return buf;
        }

        protected void PopulateVertexBuffer(VertexBuffer vertices)
        {
            CustomVertex.TransformedTextured[] verts =
                (CustomVertex.TransformedTextured[])vertices.Lock(0, 0);

            int i = 0;
            verts[i++] = new CustomVertex.TransformedTextured(
                Width / 2, Height / 4, 0.5F, // Vertex position
                1,                           // rhw (advanced)
                0, 1);                       // texture coordinates
            verts[i++] = new CustomVertex.TransformedTextured(
                Width * 3 / 4, Height * 3 / 4, 0.5F,
                1,
                1, 0);
            verts[i++] = new CustomVertex.TransformedTextured(
                Width / 4, Height * 3 / 4, 0.5F,
                1,
                1, 1);

            vertices.Unlock();
        }

        protected Texture CreateOverlayTexture(Device device)
        {
            Texture t = TextureLoader.FromFile(device, "texture.bmp");
            return t;
        }

        protected void Render()
        {
            device.Clear(ClearFlags.Target, Color.Black, 1.0F, 0);
            device.BeginScene();

            device.SetTexture(0, texture);

            device.VertexFormat = CustomVertex.TransformedTextured.Format;
            device.SetStreamSource(0, vertices, 0);
            device.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);

            device.EndScene();
            device.Present();
        }

        protected void DisposeGraphics()
        {
            vertices.Dispose();
            device.Dispose();
        }
    }
}
The following alternate listing – which uses PositionTextured vertices and maps the texture onto a rotating, perspective-projected triangle – was contributed by adudley. Thanks!
using System;
using System.Drawing;
using System.Collections;
using System.ComponentModel;
using System.Windows.Forms;
using System.Data;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

namespace Craig.Direct3D
{
    public class Game : System.Windows.Forms.Form
    {
        private Device device;
        private VertexBuffer vertices;
        private Texture texture;

        static void Main()
        {
            Game app = new Game();
            app.InitializeGraphics();
            app.Show();

            while (app.Created)
            {
                app.Render();
                Application.DoEvents();
            }

            app.DisposeGraphics();
        }

        protected bool InitializeGraphics()
        {
            PresentParameters pres = new PresentParameters();
            pres.Windowed = true;
            pres.SwapEffect = SwapEffect.Discard;

            device = new Device(0, DeviceType.Hardware, this,
                CreateFlags.SoftwareVertexProcessing, pres);

            // Change this later: draw both sides of the triangle and skip lighting for now
            device.RenderState.CullMode = Cull.None;
            device.RenderState.Lighting = false;

            texture = CreateOverlayTexture(device);
            vertices = CreateVertexBuffer(device);

            return true;
        }

        protected Texture CreateOverlayTexture(Device device)
        {
            Texture t = TextureLoader.FromFile(device, "texture.bmp");
            return t;
        }

        protected void Render()
        {
            // Clear the surface before drawing
            device.Clear(ClearFlags.Target, Color.Bisque, 1.0F, 0);
            device.BeginScene();

            device.SetTexture(0, texture);

            // 3D rendering calls go here
            SetupMatrices();
            device.SetStreamSource(0, vertices, 0);
            device.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);

            device.EndScene();
            device.Present();
        }

        protected VertexBuffer CreateVertexBuffer(Device device)
        {
            device.VertexFormat = CustomVertex.PositionTextured.Format;

            VertexBuffer buf = new VertexBuffer(
                typeof(CustomVertex.PositionTextured),  // What type
                3,                                      // How many
                device,                                 // The device
                0,                                      // Default usage
                CustomVertex.PositionTextured.Format,   // Vertex format
                Pool.Default);                          // Default pooling

            PopulateVertexBuffer(buf);

            return buf;
        }

        protected void PopulateVertexBuffer(VertexBuffer vertices)
        {
            // Note use of PositionTextured
            CustomVertex.PositionTextured[] verts =
                (CustomVertex.PositionTextured[])vertices.Lock(0, 0);

            int i = 0;
            verts[i++] = new CustomVertex.PositionTextured(0, 1, 0, 0, 1);
            verts[i++] = new CustomVertex.PositionTextured(-0.5F, 0, 0, 1, 0);
            verts[i++] = new CustomVertex.PositionTextured(0.5F, 0, 0, 1, 1);

            vertices.Unlock();
        }

        protected void SetupMatrices()
        {
            float angle = Environment.TickCount / 500.0F;
            device.Transform.World = Matrix.RotationY(angle);
            device.Transform.View = Matrix.LookAtLH(
                new Vector3(0, 0.5F, -3),
                new Vector3(0, 0.5F, 0),
                new Vector3(0, 1, 0));
            device.Transform.Projection = Matrix.PerspectiveFovLH(
                (float)Math.PI / 4.0F, 1.0F, 1.0F, 5.5F);
        }

        protected void DisposeGraphics()
        {
            device.Dispose();
        }
    }
}