DIGM 512: Shader Writing and Programming

Winter 2012

Dr. Paul Diefenbach

pjdief at drexel dot edu

Office Hours: By Appointment


Course Description

Developing custom output shaders allows for the creation and manipulation of materials for use in production render engines. This course focuses on the basic components of shaders, including reflection, translucency, and illumination models.

Course Purpose

This course will introduce students to shaders for both real-time and non-real-time rendering. 

Expected Learning Outcomes

This course will teach the basics of both real-time and non-real-time rendering, and the lighting equations associated with each.  Students will learn to write programmable shaders and to use tools to create new shaders.


Classes will be a combination of:

  • instruction and tutorial
  • class discussions
  • individual and group assignments
  • lab and presentation period

Class participation is an important part of your evaluation and grade.  In addition, students will be required to work in teams outside of class in the computer labs, do research online and in books and journals, and get hands-on exposure to various video games.




The term project will be done individually or in small groups. Each week after the proposal, the group will give a brief 5-10 minute presentation on what they accomplished the previous week: showing off new features, highlighting what worked well and where they had trouble, and collecting feedback.

Project Proposal (5%) (Due week 6)

Project Status Reports (20%) (Due Week 7, 8, 9, 10)

Final Project Presentation (50%) (Due finals week)


            Please place your assignments in a folder labeled with your name (first initial, last name) or team name in the DIGM512 "Submission" folder, in the folder of the assignment/project name. The DIGM512 folder is located on DigmFiles, which can be accessed from the labs or by using an FTP program on your home computer.

            Projects are to be put in the folder before the start of class, ready for presentation. You must present what you have; there are no late submissions.



15% - Assignments (1-3)
10% - Participation/Attendance
25% - Project Assessments
50% - Project Demo Presentation


You are expected to attend all classes.  Class participation is an important part of your grade. Unexcused absences result in a deduction from your grade.  Missing 3 classes results in automatic failure.  If you must miss class, it is your responsibility to contact me by phone the day before the missed class.  You are also responsible for getting missed notes from other students.

Academic Integrity, Plagiarism, and Cheating Policy

Students with Disability Statement

Course Drop Policy

Course Change Policy

The instructor reserves the right to change the course during the term at his or her discretion. These changes will be communicated to students via the syllabus, website announcement, or email.


Week 1: History and Overview of Rendering

Overview of Rendering
Local vs. Global
Ray tracing
Rendering Math

History of Graphics



Ray Tracing


Interactive Cross Product Demo
Interactive Dot Product Demo

Week 2: Maya Renderers

Maya Renderers
Shader networks/hypershade

Assignment 1:
Create a Maya shader network combining 3 different materials that are not normally combined, e.g. Bob The Alien's rusty-furry-glass-car or her favorite breakfast, a metallic-appleskinned-frosted-donut.

Maya Renderers:


mental ray® for Maya® renderer

A general purpose renderer that includes exclusive, advanced rendering functionality, such as host and network parallel rendering, area light sources for soft shadows, global illumination, and caustics (light patterns).

Maya’s Software renderer

A general purpose renderer with broad capabilities. You can produce high-quality images with complex shading networks, including procedural textures and ramps. Software rendering is computed through your machine’s processor.

Interactive Photorealistic Rendering (IPR)

A feature of Maya’s software renderer and mental ray for Maya renderer, used to make interactive adjustments to the final rendered image. You can adjust shading and lighting attributes in real-time, and IPR automatically updates the rendered image to show the effects of your changes. IPR is useful for tweaking an image before rendering to disk.

Maya Vector renderer

A specialized renderer used to produce stylized renderings (for example, cartoon, tonal art, line art, hidden line, wireframe) in various bitmap image formats (IFF, TIFF) or in 2D vector formats (SWF, AI, EPS, SVG). The Maya Vector renderer is often used to render web-ready images.

Maya’s Hardware renderer

A general purpose renderer that uses your machine’s graphics card for computation. You can produce broadcast resolution images in less time than with software rendering, and in some cases, the quality may be good enough for final delivery.

Maya Rendering Reference:

Maya Rendering Reference

Renderer Comparison:

Renderer Comparison

Hypershade Tutorials:

Hypershade Tutorial

Hypershade Tutorial #2

Maya Layered Textures:

Layered Textures

Double-sided shader

Double-sided material

Maya Shader Networks (The One Ring):

The One Ring pt. 1

The One Ring pt. 2

More complex shader network (Apple Shader):

Maya Shader Utilities:


Maya Real-time Rendering:


Weeks 3 & 4: Rendering and Shaders

Pixel (Fragment)
Maya Mental Ray shaders
Renderman shaders

Note: Assignments use the files in //DIGMFILES/Faculty_Files/Diefenbach_Paul/DIGM512/
1) Complete the smiley face shader
2) Make smiley face bump or geometry shader
3) Compile and use 3 shaders from included weblinks below
    (see Mental Ray Shader Source Code Examples)

Extra credit:
Modify any of the 3 shaders.  The more the modification, the more the credit.

IMPORTANT NOTE on building the shaders:

The libraries supplied in the zip file were built for Maya 2011 on the Mac, so you need to rebuild the object files (.o) and the library files (.so and .dylib on the Mac, or .dll on Windows) if you are using a different OS or version.

Building using the Makefiles:
If you are having trouble with Cutter, just use the supplied makefiles that I gave you.  First remove the .o, .so, and .dylib files, for example from Terminal on the Mac: rm smiley_textures.o

Then you can build the .dylib files using the appropriate makefiles, i.e.:
make -f Makefile.smiley_textures
using Terminal on Mac, CMD on Windows.

On the Mac, the library that is built should end in .dylib, so the Makefile copies the .so file to one with the same name but with a .dylib extension.  On windows, it will have a .dll extension, and you might need to edit the Makefile accordingly since I didn't test it on Windows.

Building using Cutter:
If you want to use Cutter to compile, first copy the Makefile.shadeop_OSX that I supplied on the Mac to Cutter's template directory, which is at Cutter_Help/templates/MakeFile/, or edit its build rule so it references the .cpp file instead of the .c version:

    YOUR_SHADER_SO_PATH.so : YOUR_SHADER_NAME.cpp OTHER_OBJ_PATHS
        gcc ${CFLAGS} -c YOUR_SHADER_NAME.cpp
        cp YOUR_SHADER_SO_PATH.so YOUR_SHADER_SO_PATH.dylib

This will make the Makefile in Cutter look for the .cpp file instead of the .c version.
The Windows version would look the same but with .dll instead of .so

Then load smiley_textures.cpp into Cutter and choose Script->Compile C++ Source.

Rendering the .MI file:
As far as rendering the .mi file goes, you will have to do it through Maya instead of through Cutter, because "ray" (the standalone mental ray renderer) is not installed on those machines.  There is Maya's "Render" application, which is similar to ray, but I'm not sure how you pass in the command-line arguments.  All of that can be set up in Cutter's preferences under Languages->Mi if you want to try "Render" instead of "ray", but it is easier to just load the file into Maya the way I showed.

Mental Images Mental Ray

 Mental Ray Shader Networks:

Mental Ray Shaders:

Mental Ray Dielectric:

Maya Mental Ray Shader Network Tutorial:

Mental Ray SSS tutorials:

Writing Native Shaders

A vertex program must output a position and may return one or more colors, texture coordinate sets, and other per-vertex outputs. A fragment program, however, must reduce everything to a single color that will update the frame buffer. (In some advanced profiles, fragment programs can write additional data such as a depth value as well.)

An example of a surface shader that defines a metal surface is:

surface metal(float Ka = 1; float Ks = 1; float roughness = 0.1;)
{
  normal Nf = faceforward(normalize(N), I);
  vector V = -normalize(I);
  Oi = Os;
  Ci = Os * Cs * (Ka * ambient() + Ks * specular(Nf, V, roughness));
}
Shaders do their work by reading and writing special variables such as Cs (surface color), N (the normal at the given point), and Ci (the final surface color).

Mental Ray provides a shader language to do this.

Teaching slides
from Writing mental ray shaders: A perceptual introduction, the third mental ray handbook, available from SpringerWienNewYork.

Writing Mental Ray Shaders Book

Building and Loading Shaders in Maya:

NOTE for Mac Users:  Beginning with Autodesk Maya 2011, mental ray for Maya implements the default substitution rule .so > .dylib on the Mac OS X platform. Old shader libraries with the .so extension must be renamed. The consistent .so/.dll/.dylib substitution rules give .mi scenes cross-platform compatibility.

How to compile, load, and use the “Hello World” mental ray shader:

Steps for Custom Mental Ray shader:

Mental Ray Manual:

Mental Ray Shader Source Code Examples:

Many shaders from Writing mental ray shaders: A perceptual introduction, the third mental ray handbook, available from SpringerWienNewYork.

Many many shader examples:

Mental Ray shaders:

MKMISHADER (generating code from mi file):

The shader skeleton utility mkmishader is started as

   mkmishader [options] [scenefile...]

This utility is intended for shader writers only. It takes shader declaration files and generates C source code files that implement the shader. The scenefiles should only contain shader declarations. One source file per shader is created. If it already exists, it is overwritten. The sources include all necessary includes, declarations, local variables, parameter evaluation statements, and array loops, but the implementation of the actual algorithm is, of course, missing.

Maya, Mel, Python, Renderman & Mental Ray Scripting in Cutter:

Scripting for Maya and Renderers

Note: Makefile and other templates in:
The modified Cutter makefile for C++ compiling is in the zip file for this class.

Misc Mental Images Stuff:

Mental Ray Glossary & Tutorials

Implementing the mental images Phenomena Renderer on the GPU
Mental Mill:


Renderman for Maya Tutorials:
Virtual Statue Renderman for Maya:

Renderman Shader Tutorial:

Renderman Shaders (free):

Advanced Rendering Effects

Procedural Shaders:

Interactive Fractals:

Understanding Fractal Shaders- Building a Marble Pattern:

Maya Fractal Textures:

Fractal Noise:

Perlin Noise:

Perlin Noise

Simple Explanation:

The Importance of Being Noisy: Fast, High Quality Noise:

Advanced Noise Foundations: Fourier Transforms:
Nyquist info:

Pixar's Wavelet Noise Paper:

Matrix, Vector, and Quaternion Math:

Week 5: Real Time Rendering

Overview of Shaders:

There are two widely used shader languages: HLSL (or Cg, which is the same for practical purposes) and GLSL. Cg/HLSL is used by Direct3D, Xbox 360, and PS3. GLSL is used by mobile platforms (OpenGL ES 2.0), Mac OS X (OpenGL), and the upcoming WebGL.  Traditionally, Unity shaders are written in Cg/HLSL.

You can think of HLSL as a C-like language for GPU programming, except that there are no pointers, unions, bitwise operations, or function variables. There are no goto statements, switch statements, or recursive functions in HLSL either. However, HLSL adds vector data types, built-in constructors, and swizzling and masking operators. The HLSL standard library includes mathematical functions and texture-processing functions.

Great Overview of not just HLSL but shader principles in general:
Programming Vertex, Geometry, and Pixel Shaders, Second Edition, by Wolfgang Engel

ShaderX2: Intro & Tutorials (DirectX 9)

Other HLSL overview:

D3D Effects Files:
Higher-level shading languages like HLSL and Cg define rendering methods, and effect files define the rendering context (based on platform, etc.).  Effect files can contain different rendering techniques that can be chosen based on the hardware available, as well as HLSL code to define rendering. Direct3D provides a set of interfaces to allow the easy manipulation of effects.  This is similar to the use of .mi files in Mental Ray.

Effect File Structure

An effect file is a text file with a .fx extension. It is split into three main sections:
  1. Variable declarations - these are values that can be set before rendering and then used in this effect file. Examples include: textures, world matrix, lighting parameters
  2. Techniques & Passes - define how something is to be rendered. They include state information along with vertex and pixel shader declarations.
  3. Functions - the shader code written in HLSL
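A minimal .fx sketch of those three sections might look like this (the variable, function, and technique names are illustrative, not from any real project):

```
// 1. Variable declarations - set by the application before rendering
float4x4 WorldViewProj;
float4   DiffuseColor = float4(1, 1, 1, 1);

// 3. Functions - shader code written in HLSL
float4 VS(float4 pos : POSITION) : POSITION
{
    return mul(pos, WorldViewProj);
}

float4 PS() : COLOR
{
    return DiffuseColor;
}

// 2. Techniques & Passes - how to render, which shaders to bind
technique Basic
{
    pass P0
    {
        VertexShader = compile vs_2_0 VS();
        PixelShader  = compile ps_2_0 PS();
    }
}
```

At render time the application picks a technique and iterates its passes, which is how one effect file can carry both a high-end path and a fallback for weaker hardware.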



Shader Translators and Optimizers:
HLSL2GLSL, an existing open-source project that ATI created about four years ago and has seemingly abandoned.
Unity's fork of this translator, named “hlsl2glslfork”, is here:

“GLSL Optimizer”:

Unity3D Shaders:

Shaders in Unity can be written in one of three different ways:

  • Surface Shaders will probably be your best bet. Write your shader as a surface shader if it needs to interact properly with lighting, shadows, projectors, etc. Surface shaders also make it easy to write complex shaders in a compact way - it's a higher level of abstraction. Lighting for most surface shaders can be calculated in a deferred manner (the exception is some very custom lighting models), which allows your shader to efficiently interact with many realtime lights. You write surface shaders in a couple of lines of Cg/HLSL and a lot more code gets auto-generated from that.
  • Vertex and Fragment Shaders will be required, if you need some very exotic effects that the surface shaders can't handle, if your shader doesn't need to interact with lighting or if it's an image effect. Shader programs written this way are the most flexible way to create the effect you need (even surface shaders are automatically converted to a bunch of vertex and fragment shaders), but that comes at a price: you have to write more code and it's harder to make it interact with lighting. These shaders are written in Cg/HLSL as well. (NOTE: most other systems integrate lighting into the fragment shader, but because of Unity's mixed lighting models, you should not typically do this!)
  • Fixed Function Shaders need to be written for old hardware that doesn't support programmable shaders. You will probably want to write fixed function shaders as an n-th fallback to your fancy fragment or surface shaders, to make sure your game still renders something sensible when run on old hardware or simpler mobile platforms. Fixed function shaders are entirely written in a language called ShaderLab, which is similar to Microsoft's .FX files or NVIDIA's CgFX.
To create a new shader, either choose Assets->Create->Shader from the menubar, or duplicate an existing shader, and work from that. The new shader can be edited by double-clicking it in the Project View.

Unity3D Shader Lab:

Regardless of which type you choose, the actual meat of the shader code will always be wrapped in ShaderLab (file extension .shader), which is used to organize the shader structure, similar to an .mi file in Mental Ray. It looks like this:

Shader "MyShader" {
    Properties {
        _MyTexture ("My Texture", 2D) = "white" { }
        // other properties like colors or vectors go here as well
    SubShader {
        // here goes the 'meat' of your
        //  - surface shader or
        //  - vertex and fragment shader or
        //  - fixed function shader
    SubShader {
        // here goes a simpler version of the SubShader above that can run on older graphics cards
ShaderLab shaders encompass more than just "hardware shaders". They do many things. They describe properties that are displayed in the Material Inspector, contain multiple shader implementations for different graphics hardware, configure fixed function hardware state and so on. The actual programmable shaders - like vertex and fragment programs - are just a part of the whole ShaderLab's "shader" concept.

If you want to write shaders that interact with lighting, take a look at Surface Shaders documentation.

Surface Shaders in Unity are a code generation approach that makes it much easier to write lit shaders than using low-level vertex/pixel shader programs. Note that there are no custom languages, magic, or ninjas involved in Surface Shaders; the system just generates all the repetitive code that would otherwise have to be written by hand. You still write shader code in Cg/HLSL.

Unity3D Shader Design history, background, and theory:

Unity3D Shader Lab:

Example shader tutorials and source code:

Source Code for all built-in Unity Shaders:

Unity Built-in Shaders:

Unity Shader Performance:

Shader IDEs (Integrated Development Environment)

ATI RenderMonkey (discontinued; its last versions supported the DirectX® 9, OpenGL® 2.0, and OpenGL ES® 2.0 shading languages)

FX Composer 2.5 (Windows only) is a powerful integrated development environment for shader authoring.

mental mill® enables artists and other professionals to develop, test and maintain shaders and complex shader graphs for GPU and CPU rendering through an intuitive graphical user interface with real-time visual feedback without the need for programming skills.


Unity Strumpy Shader Editor
Available in Unity Asset Store for free

Strumpy Tutorial Videos:
Beta 4 -
Beta 1 (Still helpful!) -

Strumpy FAQ Post:
Some questions, answered with demos:

Misc Resources:

Many Realtime rendering resources:

Great 3D graphics tutorial applets:

Vector and 3D space:

Game Programming Wiki: