08 Lighting Basics

We talked about how to create and render three-dimensional objects last time, but they were all black because there were no lights in the scene. Between the time I wrote that and now, someone pointed out to me that we could simply have added this line of code to InitializeGraphics:

device.RenderState.Lighting = false;

By disabling lighting this way, we’d tell Direct3D not to bother trying to figure out how to make things look like light is shining on them, but simply to render them using whatever color information the vertices provide. Of course, proper lighting is a big part of making any scene realistic, so let’s talk about it.

There are four different sorts of lights in Direct3D: point lights, directional lights, spot lights, and ambient lights.

Point lights emanate light equally in all directions from a particular point in space. You might use this sort of light if you were modeling a light bulb.

Directional lights don’t emanate from a particular point in space. Rather, they’re assumed to be infinitely far away. Because of this, all of the light rays are parallel to each other; they all travel in the same direction. You’ll often use a directional light to model natural lighting, like sunlight or moonlight.

Spot lights are pretty much that…spotlights. Like point lights, they emanate from a particular point in space. But rather than shining in all directions, they’re constrained to point in a particular direction. The light forms a cone in space, and there are parameters you can control to determine how wide the cone is and how quickly the light fades at the edges.

Ambient light is different. Ambient light is meant to model the light that bounces off of objects, hits other objects and bounces off those, and so on and so on. The end result is a sort of overall lighting that lights everything more or less equally. In fact that’s exactly how Direct3D chooses to model it – as a light with no source that illuminates everything evenly.

Now, each of these different lights has several different components. Each light has both a diffuse and a specular color. As it turns out, lights also have an ambient color, but that’s a little more advanced than what we want to get into right now. We’ll leave specularity for our discussion of materials. That just leaves diffuse lighting.

I was confused by the term “diffuse” when I first saw it. After looking through some computer graphics textbooks, I found out where the term came from, and then promptly forgot what the origin is. Now, I just think of it as the “ordinary” component of a light, since that’s what it is.

The ultimate goal in Direct3D is to draw pixels on the screen, a process known as rendering. The lighting in a scene strongly affects what color these pixels will be: if the light is very close or very bright, an object will be drawn with lighter pixels (assuming the object itself isn’t jet black). If the light is dimmer or farther away from an object, then obviously the object should be rendered darker.

However, it’s not as simple as that. Take this picture of a sphere resting on a black surface:

I’ve put in a single directional light coming from the right side. Notice how the object gets darker away from the light? It’s obvious once you think about it, but you might never have noticed before.

What’s going on here is that Direct3D is using something called the surface normal to calculate how light interacts with an object. The word “normal” in this context has nothing to do with being ordinary. Rather, it’s a mathematical term that means “a vector that’s perpendicular to something”. So a surface normal is a vector that points straight out of the surface. In the case of our sphere, the surface normal always points straight out from the center of the sphere, but for more complex objects, this wouldn’t necessarily be the case.

The way that surface normals play into lighting is quite straightforward. For every point on an object’s surface, Direct3D calculates the angle between the light and the surface normal at that point. The closer to zero the angle is, the more brightly lit the object is at that point. The closer the angle is to ninety degrees, the more dimly lit the object is at that point. For angles past ninety degrees, the normal is pointing away from the light source, and the object isn’t lit at all at that point.
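Put another way, the brightness at each point is the cosine of the angle between the normal and the direction to the light, clamped to zero so points facing away receive no light at all. Here’s a minimal sketch of that calculation in Python rather than the article’s C# (`diffuse_intensity` is just a name I’ve picked for illustration):

```python
import math

def diffuse_intensity(normal, to_light):
    """Clamped-cosine (Lambertian) term: the cosine of the angle between
    the surface normal and the direction towards the light, with angles
    past ninety degrees clamped to zero (no light at all)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    norm = lambda v: tuple(x / math.sqrt(dot(v, v)) for x in v)
    return max(0.0, dot(norm(normal), norm(to_light)))

print(diffuse_intensity((0, 0, 1), (0, 0, 1)))   # angle 0: fully lit -> 1.0
print(diffuse_intensity((1, 0, 1), (0, 0, 1)))   # angle 45: partially lit
print(diffuse_intensity((1, 0, 0), (0, 0, 1)))   # angle 90: unlit -> 0.0
print(diffuse_intensity((0, 0, -1), (0, 0, 1)))  # facing away: clamped to 0.0
```

Note how the result falls off gradually between zero and ninety degrees, which is exactly the smooth darkening you see across the sphere.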

You can see this on our sphere. At the right side, where the surface normal points towards the light, the sphere is almost pure yellow (its real color). Near the left edge, the object is not lit very brightly, because the surface normal there makes an angle with the light source that approaches or exceeds ninety degrees.

And actually, it’s not quite that simple. This rendering of the sphere makes it look almost perfectly smooth. But we already know that objects are made up of vertices, which in turn make up triangles. Unless I’m using a really ridiculous number of vertices, the object should really look more like this:

Vertex shading is a wonderful thing. It’s what lets us use a model that’s actually faceted, but looks smooth. And it all has to do with the way the lighting model works.

It turns out that Direct3D models don’t actually store a surface normal for every position on the object. That would be hugely expensive. Instead, we can store a normal at every vertex. Then, Direct3D smoothly interpolates the normals across the surface of the triangle, the same way it interpolated colors for our multicolored triangle. Since the normal changes gradually across the surface, so does the intensity of the light, making it look smooth.
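Whether the pipeline blends the normals themselves or the per-vertex lit colors, the key is a value that varies smoothly across the face. Here’s a tiny Python sketch of the idea, with a made-up `lerp_normal` helper (linear interpolation plus renormalization; a simplification of what the hardware actually does):

```python
import math

def lerp_normal(n0, n1, t):
    """Blend two unit normals with weight t in [0, 1], then rescale the
    result back to unit length so it's still a valid normal."""
    v = tuple((1 - t) * a + t * b for a, b in zip(n0, n1))
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

left  = (-0.6, 0.0, 0.8)  # normal tilted left at one vertex
right = ( 0.6, 0.0, 0.8)  # normal tilted right at the other
# Halfway across the face the blended normal points straight out, so
# the lighting shades gradually instead of jumping at the triangle edge.
print(lerp_normal(left, right, 0.5))  # (0.0, 0.0, 1.0)
```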

Of course, it’s an illusion – the surface isn’t actually curved. If you look carefully at the outline of the first sphere, you can see the facets – it’s shaped a little more like a stop sign than it is like the circle it would be if it were a true sphere. These sorts of artifacts are particularly noticeable when you’re using a small number of triangles. Say like this:

Obviously, the illusion is starting to break down. This is because I’ve used fewer triangles this time around – as you can see clearly when I turn off shading:

How many triangles to use is a crucial decision if you care about performance. The more you use, the better your scene will look, but the slower it will render. Striking a balance will depend on figuring out your particular requirements.

OK, so how do we do all this in code? Well, let’s add a new method to our code, called SetupLights. Here it is:

protected void SetupLights()
{
    // Use a white, directional light coming from over our right shoulder
    device.Lights[0].Diffuse = Color.White;
    device.Lights[0].Type = LightType.Directional;
    device.Lights[0].Direction = new Vector3(-3, -1, 3);
    device.Lights[0].Commit(); // Or Update() with the 2004-Oct SDK
    device.Lights[0].Enabled = true;

    // Add a little ambient light to the scene
    device.RenderState.Ambient = Color.FromArgb(0x40, 0x40, 0x40);
}

Let’s break it down a line at a time. We want to set up a white, directional light that’s coming from somewhere over our “right shoulder”. Here’s how we start:

device.Lights[0].Diffuse = Color.White;

OK. What we’re doing here is accessing the Lights array of the Device. Lights are a limited resource – you only have so many of them. On my system, the number is fixed at eight, but it could be different on your system. However many there are, you access them individually via Lights.

Here we’re setting the diffuse color for the light. Remember, diffuse means “ordinary” (I need to avoid the word “normal”, since we’ve overloaded that now). It’s just the color the light is. You’ll usually use white, although other colors might be useful in some situations.

Next we set the type of light. Here we’re using a directional light, but we could also have used a point light or a spot light.

device.Lights[0].Type = LightType.Directional;

Directional lights need a direction. So we set that:

device.Lights[0].Direction = new Vector3(-3, -1, 3);

If we were using a point light, we wouldn’t need to do that, since they shine in all directions equally. However, we would need to set a position, which we don’t need to do for a directional light, since they’re assumed to be very, very far away. Spot lights would need both a position and direction.

Once we’ve set up the light, we just need to tell Direct3D to accept the changes:

device.Lights[0].Commit();

Update: if you’re using the October 2004 DirectX SDK instead of the Summer 2003 version on which these tutorials were written, you’ll find that they renamed the Commit method to Update, so that last line of code will need to read like this:

device.Lights[0].Update();

instead.

Then we turn the light on:

device.Lights[0].Enabled = true;

It’s probably also a good idea to add a small bit of ambient light to the scene:

device.RenderState.Ambient = Color.FromArgb(0x40, 0x40, 0x40);

As we discussed before, ambient light illuminates the entire scene evenly, regardless of surface normals. We use it because without it, parts of objects that are pointing away from a light source are totally black, which doesn’t look very realistic. Even the underside of your desk gets a little bit of light from reflections off the floor and other bits of your desk.
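To see why the ambient term helps, here’s a rough Python sketch of how a single color channel might combine one diffuse light and the ambient level (the real fixed-function equation has more terms, and `lit_channel` is my own name for the helper):

```python
def lit_channel(material, diffuse_light, ambient_light, n_dot_l):
    """One 0-255 color channel lit by a diffuse light plus ambient:
    material * (diffuse * max(0, N.L) + ambient), clamped at 255."""
    lit = material / 255 * (diffuse_light * max(0.0, n_dot_l) + ambient_light)
    return min(255, int(lit))

# Facing away from the light (N.L < 0): the 0x40 ambient level keeps
# the surface dimly visible instead of jet black.
print(lit_channel(255, 255, 0x40, -0.5))  # 64
# Facing the light: diffuse dominates and the channel saturates.
print(lit_channel(255, 255, 0x40, 1.0))   # 255
```

Without the ambient term, the first case would come out as zero: pure black.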

At this point, our lighting is all set. All we need to do is make sure we call SetupLights at some point – either during InitializeGraphics if they’re never going to change or during the call to Render if we want the brightness or position of the light to evolve with the scene. However, if that was the only change we made, we’d find that our object still rendered as completely black. What’s going on?

Recall that lighting interacts with a surface according to its surface normal. Well, so far we haven’t talked about how exactly to set up surface normals on an object, other than to mention briefly that the information is stored with the vertices. Luckily, it’s fairly straightforward. We just have to use one of the vertex formats that lets us store normal information, like CustomVertex.PositionNormalColored. The constructor of a PositionNormalColored vertex takes the same arguments as the PositionColored we’ve been using, plus three more: the x, y, and z components of the vertex surface normal.

Because we're using this vertex format, we'll need to make sure we tell the Device, by calling

device.VertexFormat = CustomVertex.PositionNormalColored.Format;

Otherwise, the Device will be expecting vertices in some other format, and Bad Things will happen. You can put this call pretty much anywhere in your code, since once it's set, it stays set until you change it again. InitializeGraphics is a reasonable place for it.

For reasons that will become clear in a later article, I've split the code that creates the vertex buffer off from the code that populates the buffer with vertices. Here’s the code:

protected VertexBuffer CreateVertexBuffer(Device device)
{
    VertexBuffer buf = new VertexBuffer(
        typeof(CustomVertex.PositionNormalColored), 3, device, 0,
        CustomVertex.PositionNormalColored.Format, Pool.Default);
    PopulateVertexBuffer(buf);
    return buf;
}

protected void PopulateVertexBuffer(VertexBuffer vertices)
{
    CustomVertex.PositionNormalColored[] verts =
        (CustomVertex.PositionNormalColored[]) vertices.Lock(0, 0);
    int i = 0;
    verts[i++] = new CustomVertex.PositionNormalColored(
        0, 1, 0,              // vertex position
        0, 0, 1,              // vertex normal
        Color.Red.ToArgb());  // vertex color
    verts[i++] = new CustomVertex.PositionNormalColored(
        -0.5F, 0, 0, 0, 0, 1, Color.Green.ToArgb());
    verts[i++] = new CustomVertex.PositionNormalColored(
        0.5F, 0, 0, 0, 0, 1, Color.Blue.ToArgb());
    vertices.Unlock();
}

This is almost exactly like the code we’ve been using all along, with the addition of the PositionNormalColored vertices and the necessary three extra constructor arguments.

Here, I’ve chosen to set all three normals the same because I want our rotating triangle to look flat. Remember that if I point them in different directions, it will cause the normal to be interpolated across the surface, giving the illusion of curvature. Take a look at the lighting tutorial in the C# Direct3D SDK – they render a cylinder in that one using code very similar to what we’re doing here, and they calculate the normals to make the surface look curved.
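If you don’t know the normals up front, one common way to compute a flat, per-face normal is the cross product of two triangle edges. Here’s a small Python sketch with a hypothetical `face_normal` helper; whether you need to negate the result depends on your winding order and coordinate handedness:

```python
import math

def face_normal(a, b, c):
    """Unit normal for the triangle a-b-c, computed via the cross
    product of the edges a->b and a->c."""
    u = tuple(q - p for p, q in zip(a, b))  # edge a->b
    v = tuple(q - p for p, q in zip(a, c))  # edge a->c
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    length = math.sqrt(sum(x * x for x in n))
    return tuple(x / length for x in n)

# The article's triangle, in the order it appears in the vertex buffer:
# this happens to reproduce the (0, 0, 1) normal stored at each vertex.
print(face_normal((0, 1, 0), (-0.5, 0, 0), (0.5, 0, 0)))  # (0.0, 0.0, 1.0)
```

Averaging the face normals of all triangles that share a vertex is one simple way to get the smooth-looking normals used for curved surfaces like the cylinder.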

If you run this code, you’ll see that the triangle gets lighter as it turns towards the light, and darker as it turns away, making for a much more realistic scene. But depending on how you’ve set Device.RenderState.CullMode, you’re also going to see one of two completely undesirable effects.

If you’ve left the CullMode set to its default value, what you’ll see is the same disappearing triangle we saw last time. And for the same reason – Direct3D by default does not render faces considered to be pointing away from the camera. The fix for this last time was to set the CullMode to Cull.None. But this has its own problems.

What you’ll see if you turn off backface culling might seem a bit weird if you don’t think about it. As the triangle spins, you’ll notice that one face reacts to the light, getting darker and lighter as it turns towards and away from the light source, but the other face just remains black! Here are a couple of screenshots showing the “front” and “back” of the triangle:

We actually have all the information we need to understand this behavior. Recall that the amount a face gets lit depends on the angle between the light source and the surface normal. Well, as the triangle rotates, our surface normals sometimes point towards the light source, and sometimes point away! When they point away, the angle between the two vectors is greater than 90 degrees, and Direct3D doesn’t light the surface at all.

The fix for this is weird but simple. We actually need to render two triangles in the same place – one with surface normals that point one way, one with surface normals that point the other. We also need to turn culling on so that Direct3D only draws the one currently facing the camera, and we need to make sure we get the order of the vertices right so that it culls the correct one at all times.
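The winding-order half of this trick can be sketched numerically: backface culling decides front versus back from the screen-space orientation of the projected vertices, and reversing the vertex order flips the sign. A minimal Python illustration (`signed_area_2d` is a made-up helper, not a Direct3D call):

```python
def signed_area_2d(a, b, c):
    """Twice the signed area of triangle a-b-c in 2D; the sign encodes
    the winding order, which is what backface culling tests."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

front = [(0, 1), (-0.5, 0), (0.5, 0)]  # first triangle's vertex order
back  = [(0, 1), (0.5, 0), (-0.5, 0)]  # second triangle: order reversed

# Opposite signs: at any instant one winding faces the camera and the
# other gets culled, so exactly one of the two triangles is drawn.
print(signed_area_2d(*front), signed_area_2d(*back))  # 1.0 -1.0
```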

The only change to CreateVertexBuffer is to make sure it’s big enough to hold six vertices instead of three:

protected VertexBuffer CreateVertexBuffer(Device device)
{
    VertexBuffer buf = new VertexBuffer(
        typeof(CustomVertex.PositionNormalColored),
        6, // NOTE!!!
        device, 0,
        CustomVertex.PositionNormalColored.Format, Pool.Default);
    PopulateVertexBuffer(buf);
    return buf;
}

In PopulateVertexBuffer, we just need to add the definition of the second triangle:

protected void PopulateVertexBuffer(VertexBuffer vertices)
{
    CustomVertex.PositionNormalColored[] verts =
        (CustomVertex.PositionNormalColored[]) vertices.Lock(0, 0);
    int i = 0;
    verts[i++] = new CustomVertex.PositionNormalColored(
        0, 1, 0, 0, 0, 1, Color.Red.ToArgb());
    verts[i++] = new CustomVertex.PositionNormalColored(
        -0.5F, 0, 0, 0, 0, 1, Color.Green.ToArgb());
    verts[i++] = new CustomVertex.PositionNormalColored(
        0.5F, 0, 0, 0, 0, 1, Color.Blue.ToArgb());
    verts[i++] = new CustomVertex.PositionNormalColored(
        0, 1, 0, 0, 0, -1, Color.Red.ToArgb());
    verts[i++] = new CustomVertex.PositionNormalColored(
        0.5F, 0, 0, 0, 0, -1, Color.Blue.ToArgb());
    verts[i++] = new CustomVertex.PositionNormalColored(
        -0.5F, 0, 0, 0, 0, -1, Color.Green.ToArgb());
    vertices.Unlock();
}

And we also need to change the call to DrawPrimitives in our Render method to draw two triangles instead of just one:

device.DrawPrimitives(PrimitiveType.TriangleList, 0, 2);

With those changes in place, our triangle will now render correctly, interacting realistically with the light as it spins.

I’ve included the complete code for this article below. To make it easy for you to play with, I’ve put two #define definitions at the top of the file. Commenting and uncommenting these will allow you to flip between culling and not culling and rendering one triangle and rendering two triangles.

Next time, we’ll talk about how to move beyond simple colored triangles. We’ll discuss textures, which let you map arbitrary images onto your objects.

The Code

// Comment this line out to have two triangles
//#define SINGLETRIANGLE
// Comment this line out for default culling
//#define CULLNONE

using System;
using System.Drawing;
using System.Collections;
using System.ComponentModel;
using System.Windows.Forms;
using System.Data;
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

namespace Craig.DirectX.Direct3D
{
    public class Game : System.Windows.Forms.Form
    {
        private Device device;
        private VertexBuffer vertices;

        static void Main()
        {
            Game app = new Game();
            app.Width = 400;
            app.Height = 400;
            app.InitializeGraphics();
            app.Show();

            while (app.Created)
            {
                app.Render();
                Application.DoEvents();
            }
        }

        protected bool InitializeGraphics()
        {
            PresentParameters pres = new PresentParameters();
            pres.Windowed = true;
            pres.SwapEffect = SwapEffect.Discard;

            device = new Device(0, DeviceType.Hardware, this,
                CreateFlags.SoftwareVertexProcessing, pres);
            vertices = CreateVertexBuffer(device);

#if CULLNONE
            device.RenderState.CullMode = Cull.None;
#endif
            device.VertexFormat = CustomVertex.PositionNormalColored.Format;

            return true;
        }

#if SINGLETRIANGLE
        protected void PopulateVertexBuffer(VertexBuffer vertices)
        {
            CustomVertex.PositionNormalColored[] verts =
                (CustomVertex.PositionNormalColored[]) vertices.Lock(0, 0);
            int i = 0;
            verts[i++] = new CustomVertex.PositionNormalColored(
                0, 1, 0,              // vertex position
                0, 0, 1,              // vertex normal
                Color.Red.ToArgb());  // vertex color
            verts[i++] = new CustomVertex.PositionNormalColored(
                -0.5F, 0, 0, 0, 0, 1, Color.Green.ToArgb());
            verts[i++] = new CustomVertex.PositionNormalColored(
                0.5F, 0, 0, 0, 0, 1, Color.Blue.ToArgb());
            vertices.Unlock();
        }

        protected VertexBuffer CreateVertexBuffer(Device device)
        {
            VertexBuffer buf = new VertexBuffer(
                typeof(CustomVertex.PositionNormalColored), 3, device, 0,
                CustomVertex.PositionNormalColored.Format, Pool.Default);
            PopulateVertexBuffer(buf);
            return buf;
        }
#else
        protected void PopulateVertexBuffer(VertexBuffer vertices)
        {
            CustomVertex.PositionNormalColored[] verts =
                (CustomVertex.PositionNormalColored[]) vertices.Lock(0, 0);
            int i = 0;
            verts[i++] = new CustomVertex.PositionNormalColored(
                0, 1, 0, 0, 0, 1, Color.Red.ToArgb());
            verts[i++] = new CustomVertex.PositionNormalColored(
                -0.5F, 0, 0, 0, 0, 1, Color.Green.ToArgb());
            verts[i++] = new CustomVertex.PositionNormalColored(
                0.5F, 0, 0, 0, 0, 1, Color.Blue.ToArgb());
            verts[i++] = new CustomVertex.PositionNormalColored(
                0, 1, 0, 0, 0, -1, Color.Red.ToArgb());
            verts[i++] = new CustomVertex.PositionNormalColored(
                0.5F, 0, 0, 0, 0, -1, Color.Blue.ToArgb());
            verts[i++] = new CustomVertex.PositionNormalColored(
                -0.5F, 0, 0, 0, 0, -1, Color.Green.ToArgb());
            vertices.Unlock();
        }

        protected VertexBuffer CreateVertexBuffer(Device device)
        {
            VertexBuffer buf = new VertexBuffer(
                typeof(CustomVertex.PositionNormalColored),
                6, // NOTE!!!
                device, 0,
                CustomVertex.PositionNormalColored.Format, Pool.Default);
            PopulateVertexBuffer(buf);
            return buf;
        }
#endif

        protected void SetupMatrices()
        {
            float angle = Environment.TickCount / 500.0F;
            device.Transform.World = Matrix.RotationY(angle);
            device.Transform.View = Matrix.LookAtLH(
                new Vector3(0, 0.5F, -3),
                new Vector3(0, 0.5F, 0),
                new Vector3(0, 1, 0));
            device.Transform.Projection = Matrix.PerspectiveFovLH(
                (float)Math.PI / 4.0F, 1.0F, 1.0F, 10.0F);
        }

        protected void SetupLights()
        {
            device.Lights[0].Diffuse = Color.White;
            device.Lights[0].Type = LightType.Directional;
            device.Lights[0].Direction = new Vector3(-1, -1, 3);
            device.Lights[0].Commit();
            // If using the October 2004 DirectX SDK, use Update instead of Commit, like so:
            // device.Lights[0].Update();
            device.Lights[0].Enabled = true;
            device.RenderState.Ambient = Color.FromArgb(0x40, 0x40, 0x40);
        }

        protected void Render()
        {
            device.Clear(ClearFlags.Target, Color.Bisque, 1.0F, 0);
            device.BeginScene();

            SetupMatrices();
            SetupLights();

            device.SetStreamSource(0, vertices, 0);
#if SINGLETRIANGLE
            device.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);
#else
            device.DrawPrimitives(PrimitiveType.TriangleList, 0, 2);
#endif

            device.EndScene();
            device.Present();
        }
    }
}