During the last lesson, we learned the basics of using Blender and simple 3D modelling techniques. This lesson will demonstrate how to turn these models into realistic-looking rendered images.
Rendering is the process of converting a 3D scene into a 2D image or animation. This process usually takes a long time, because it tries to recreate many real-world physical phenomena. A standard Blender install ships with two rendering engines, but many more can be added in the form of addons - both free and commercial (comparison). The engines normally provided with Blender are:
Blender Internal, aka Blender Render - a standard direct illumination raytracing engine
Cycles - a global illumination engine utilizing pathtracing, a technique based on a Monte Carlo simulation
In order to compare these two engines, create the following scene (a scripted version of these steps is sketched after the first render below):
Create a new empty project and delete the cube
Add a new "Plane" mesh to the middle of the scene and scale it 20 times in all directions
Add a "Monkey" mesh, raise it one unit in the Z axis and rotate it -30 degrees around the X axis
With the monkey selected, press CTRL-2 to automatically add a Subdivision Surface modifier at level two - also choose Smooth shading from the menu on the left
Enter the camera view and zoom in on the monkey a little
Select the light (probably easiest from the object tree at the top right part of the window)
Choose the light tab from the area right underneath the object list (third tab from the right) and change the light type from Point to Sun
To render such a scene, you simply need to go to the Render tab (the little camera icon in the menu on the right) and click the Render button, or simply press F12 on the keyboard. This will generate an image similar to the one below (you can save it using the Save As Image option from the Image menu at the bottom, or by pressing F3 on the keyboard):
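If you prefer scripting, the whole setup and the first render can also be reproduced from Blender's Python console. This is only a rough sketch of the steps above, assuming a fresh 2.7x scene that still contains the default Cube, Camera and Lamp:

```python
import bpy
import math

# delete the default cube
bpy.ops.object.select_all(action='DESELECT')
bpy.data.objects['Cube'].select = True
bpy.ops.object.delete()

# ground plane, scaled 20 times in all directions
bpy.ops.mesh.primitive_plane_add(location=(0, 0, 0))
bpy.context.object.scale = (20, 20, 20)

# monkey, raised 1 unit and rotated -30 degrees around the X axis
bpy.ops.mesh.primitive_monkey_add(location=(0, 0, 1))
bpy.context.object.rotation_euler[0] = math.radians(-30)

# Subdivision Surface at level 2 (like CTRL-2) plus smooth shading
bpy.ops.object.modifier_add(type='SUBSURF')
bpy.context.object.modifiers['Subsurf'].levels = 2
bpy.ops.object.shade_smooth()

# change the default lamp from Point to Sun
bpy.data.lamps['Lamp'].type = 'SUN'

# render to a file next to the .blend (like F12 followed by F3)
bpy.context.scene.render.filepath = '//render.png'
bpy.ops.render.render(write_still=True)
```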
Note, however, that the shadows in that image are perfectly black. This never occurs in nature - perfect blackness doesn't exist in real life. Light will always find a path by bouncing off other surfaces in the scene and illuminating each shadow, even if only a little. The shadows in the image above aren't illuminated because direct illumination methods follow rays from the light source, and if no such path exists for a given pixel (because it is occluded by an object), the result is a perfectly black color.
One way of dealing with this (also used in many games and rendering engines) is Ambient Occlusion. Switch to the World tab (the planet icon, fourth from the left), mark the Ambient Occlusion checkbox and set its Factor to 0.2. After rendering the image again, you should get something like this:
This image looks much better, but it is still largely a "hack" based on a simple approximation - the scene is first rendered with uniform illumination and the result is then added to the normal render shown above. Change the engine from Blender Render to Cycles Render using the drop-down menu at the top of the application. You can now disable the Ambient Occlusion option, and after rendering you should get something that looks like this:
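Both settings can also be changed from the Python console; a minimal sketch, using the 2.7x property names:

```python
import bpy

scene = bpy.context.scene

# Ambient Occlusion for the Blender Internal render
scene.world.light_settings.use_ambient_occlusion = True
scene.world.light_settings.ao_factor = 0.2

# switch the render engine to Cycles
scene.render.engine = 'CYCLES'
```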
Notice that the way this image renders is completely different from before and takes much more time and CPU power. Furthermore, Blender has a very neat feature that allows performing this render in real time, each time we modify something in the scene. Simply change the Viewport Shading to Rendered in the menu at the bottom of the 3D view:
After activating it, we can immediately see how the scene will look when rendered, whenever we change anything in it. You can now see how the Monte Carlo algorithm renders this scene - it starts from an almost random arrangement of pixels and iteratively refines the image, approaching the realistic result the longer we let it run. The quality of the viewport-rendered image is always slightly worse than a complete render, so it's worth pressing F12 once we are happy with how the scene looks. The quality of the rendering can be configured in the settings (Render tab, Sampling section, Samples parameters).
If you have a computer with a good graphics card, you can use CUDA or OpenCL to greatly increase the performance of the rendering. Simply open File/User Preferences and switch to the last tab, titled System - at the bottom left, choose the Cycles Compute Device. Then, in the Render tab (right side of the main window), you can switch the Device from CPU to GPU Compute.
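The sampling quality and the render device map to a couple of scene properties; a sketch assuming a CUDA- or OpenCL-capable card has already been selected in User Preferences:

```python
import bpy

scene = bpy.context.scene

# number of path-tracing samples for the final render and the viewport preview
scene.cycles.samples = 200
scene.cycles.preview_samples = 50

# render on the GPU instead of the CPU
scene.cycles.device = 'GPU'
```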
In order to change the color of the monkey above, we need to change the material assigned to it. We briefly mentioned materials while changing the colors of objects during the last exercise. This time we will talk about them in more detail, but from the point of view of the Cycles engine.
After selecting the monkey, let's switch to the Materials tab (the shiny ball icon, fourth from the right in the right-side menu) and add a new material by clicking New. By default, a white Diffuse type material is created. The color can be changed by clicking the field next to the Color parameter. This lets us change the uniform color of the whole object, but we would prefer to put an actual texture from an image on the object instead.
Before continuing, it's a good idea to make sure that the image is not too dark. If it is, go to the World tab (the planet icon, fourth from the left in the right-side menu) and change the color there to something lighter (best leave it grayscale - i.e. colorless - so it stays color neutral). This is only a temporary solution, however, as we will discuss lighting in an exercise further below.
Textures can be easily found online on pages like this one. For the examples below I used this and this image. The problem with textures is that it's not easy to stretch a 2D image on top of a 3D object. We need a specific procedure known as UV mapping to achieve this.
Let's assign a texture to the monkey head. By clicking on the little dot next to the color picker of the diffuse material, a menu will open. Choose the Image Texture item from there, then click Open and choose the texture file that you downloaded.
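The same material can be built in Python by creating an Image Texture node and plugging it into the Diffuse color. A sketch that builds the material from scratch; the file name monkey_texture.jpg is only a placeholder for whatever texture you downloaded:

```python
import bpy

obj = bpy.context.object  # the monkey

# create a node-based Cycles material and assign it to the object
mat = bpy.data.materials.new('MonkeyMaterial')
mat.use_nodes = True
obj.data.materials.append(mat)

nodes = mat.node_tree.nodes
links = mat.node_tree.links
diffuse = nodes['Diffuse BSDF']  # created automatically with use_nodes

# image texture plugged into the diffuse Color input
tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load('/path/to/monkey_texture.jpg')
links.new(tex.outputs['Color'], diffuse.inputs['Color'])
```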
We still cannot see the texture on the monkey, because we haven't defined a UV map (a scripted version of these steps follows the list):
split the 3D view into two parts (to see how, check the beginning of the previous lecture)
change the right view to the UV/Image Editor
click on the little file icon to the right of the Image menu and switch from Render Result to the texture you loaded
select the monkey and enter the Edit Mode - make sure everything is selected (press A a few times if not sure)
choose the Unwrap option from the Mesh menu (or simply press U on the keyboard) and pick the Unwrap option from the submenu
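The unwrap itself can also be triggered from a script; a minimal sketch, assuming the monkey is the active object:

```python
import bpy

# enter Edit Mode, select everything and unwrap
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.unwrap()
bpy.ops.object.mode_set(mode='OBJECT')
```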
The UV editor should display a grid of UV coordinates overlaid over the image. Now change the Viewport Shading of the 3D view to either Texture, Material, or Rendered and you should see the texture overlaid on the model of the monkey head. The key point here is the ability to manipulate the mapping using the UV editor:
while hovering with the mouse over the UV editor, press G and move the UV coordinates in any direction - observe the change in the 3D view
now press R and rotate the coordinates
press S and scale them - what happens to the texture when the scale of the UV coordinates increases?
notice that even if the coordinates leave the image area, the monkey will still have the texture drawn everywhere. This is because the texture has the Repeat property enabled in the Materials tab - imagine the image being repeated to infinity in every direction
move the mouse over the 3D view and press U again (make sure you are still in Edit Mode) and choose a different method of generating UV coordinates than Unwrap
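For reference, the G/R/S manipulations above simply transform the UV coordinates stored in the mesh. A minimal sketch that offsets every UV coordinate 0.1 to the right (run while the monkey is still in Edit Mode):

```python
import bpy
import bmesh

obj = bpy.context.object
bm = bmesh.from_edit_mesh(obj.data)
uv_layer = bm.loops.layers.uv.active

# shift all UVs to the right, like grabbing everything with G in the UV editor
for face in bm.faces:
    for loop in face.loops:
        loop[uv_layer].uv.x += 0.1

bmesh.update_edit_mesh(obj.data)
```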
As you may have noticed, UV mapping can be a pretty daunting task and take a lot of time during the graphics design process - especially when working with a complex model. If you are happy with how the texture looks, press F12 to generate an image and save it to disk (File/Save).
Make sure to save the previous solution to a separate file! Download the file from this link and open it in Blender. It contains a project with the model of a helicopter without any materials - or rather, all the materials are set to the same plain white material. Try rendering the scene to see how it looks.
The window is split into two parts. The left side is the so-called Node Editor, which is an alternative way of viewing the network of connections between the nodes that make up a material, as seen in the material editor. In reality, the Node Editor is a GUI element used for much more, but in this lesson we will only use it to modify materials.
The basic concept for defining the look of an object is the so-called Shader - a simple program responsible for shading (i.e. coloring in) the individual faces of an object. This program has various inputs, and the output is usually simply the color of the individual pixels. Shading can also be the result of a sequence of many such subroutines, each having its own inputs and outputs. The graph of nodes is therefore a very convenient way of manipulating such a process in a graphical user interface.
Going back to the task, we have several groups of materials in the scene:
the background should remain a white diffuse material - it is supposed to represent something that looks like a white piece of paper
the screws have a material named Metal and should look like a bright, shiny metal-like material
the rest of the objects use a material named Colored - they are also made of metal, but should look as though they were painted with a thick layer of oil paint
Let's start with the screws (a scripted version of these steps follows the list):
select any screw in the scene - the Node Editor should display a material named Metal with 5 users (5 objects using the material)
delete the Diffuse node and add Glossy BSDF in its stead (from the Add menu in the Shader category)
connect the output of the Glossy BSDF with the input of the Material Output (Surface)
the Roughness parameter determines the "shininess" of the metal - a low value will create a mirror-like effect, so a value of 0.15 should look much more realistic for this particular image (but try experimenting on your own, as well)
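The same node changes can be scripted; a rough sketch that rebuilds the Metal material, assuming the default node names used in the file:

```python
import bpy

mat = bpy.data.materials['Metal']
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# remove the default Diffuse node and add a Glossy one instead
nodes.remove(nodes['Diffuse BSDF'])
glossy = nodes.new('ShaderNodeBsdfGlossy')
glossy.inputs['Roughness'].default_value = 0.15

# plug the Glossy output into the material surface
output = nodes['Material Output']
links.new(glossy.outputs['BSDF'], output.inputs['Surface'])
```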
While doing the above steps, the effect may seem very subtle at first. This is because the color of the environment is too uniform. Select any part of the helicopter (e.g. the propeller) - you should see a material named Colored in the Node Editor with 14 users. Change the color of the diffuse to anything but white (e.g. blue, like below). If you look at the screws from up close now, you should notice a slight reflection from the rest of the helicopter. Now you can also test the Roughness parameter a bit better.
Now let's make a material for the rest of the helicopter (again, a scripted sketch follows the list):
select any part of the helicopter, e.g. its propeller - you should see the Colored material in the Node Editor
add a Glossy shader (but don't erase the Diffuse, like last time)
add a node named Mix Shader (from the Shader category)
connect the output of the Diffuse to the top Shader input of the Mix Shader node and the output of the Glossy to the bottom Shader input of the Mix Shader
add a Fresnel (Input category) and connect its Fac. output to the Fac. input of the Mix Shader
set the IOR parameter (index of refraction) to 2
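A scripted sketch of the same node graph for the Colored material (again assuming the default node names):

```python
import bpy

mat = bpy.data.materials['Colored']
nodes = mat.node_tree.nodes
links = mat.node_tree.links

diffuse = nodes['Diffuse BSDF']
output = nodes['Material Output']

# a glossy layer mixed with the diffuse one, weighted by a Fresnel term
glossy = nodes.new('ShaderNodeBsdfGlossy')
mix = nodes.new('ShaderNodeMixShader')
fresnel = nodes.new('ShaderNodeFresnel')
fresnel.inputs['IOR'].default_value = 2.0

links.new(fresnel.outputs['Fac'], mix.inputs['Fac'])
links.new(diffuse.outputs['BSDF'], mix.inputs[1])  # top Shader input
links.new(glossy.outputs['BSDF'], mix.inputs[2])   # bottom Shader input
links.new(mix.outputs['Shader'], output.inputs['Surface'])
```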
Fresnel refers to a set of formulas describing how much light is reflected and how much is refracted at a surface, depending on the material the light is bouncing off (or passing through) and the angle at which it hits.
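For example, at normal incidence the fraction of light reflected off a dielectric is ((n1 - n2) / (n1 + n2))^2; with an IOR of 2 against air (n = 1) this gives (1/3)^2, or roughly 11%, while the reflected fraction grows rapidly towards grazing angles - this angle-dependent weight is exactly what the Fresnel node feeds into the Mix Shader.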
We have now created a material that looks like painted metal, but we can still improve it a bit. When painting metal, we often get an uneven surface finish - due to impurities, bristles of the painting brush, or sometimes even on purpose. We can simulate this effect using the Displacement parameter in the Material Output node (a scripted version of this part of the node graph follows the list):
add a Noise Texture node to the material (Texture category)
set the Scale to 100, Detail to 5 and Distortion to 1
add a Math node (Converter category) and change its operation to Maximum
connect the Fac. output of the Noise Texture to the top Value input of the Maximum node and leave the bottom Value input as 0.5
add a MixRGB node (Color category), connect the output of the Maximum node to the Color1 input of the MixRGB node and set its Fac. to 0.98
connect the Mix output to the Displacement input of the Material Output node
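The displacement part of the graph, scripted under the same assumptions:

```python
import bpy

mat = bpy.data.materials['Colored']
nodes = mat.node_tree.nodes
links = mat.node_tree.links
output = nodes['Material Output']

# noise texture, limited from below by a Maximum node, mixed down and used as displacement
noise = nodes.new('ShaderNodeTexNoise')
noise.inputs['Scale'].default_value = 100
noise.inputs['Detail'].default_value = 5
noise.inputs['Distortion'].default_value = 1

maximum = nodes.new('ShaderNodeMath')
maximum.operation = 'MAXIMUM'
maximum.inputs[1].default_value = 0.5

mix_rgb = nodes.new('ShaderNodeMixRGB')
mix_rgb.inputs['Fac'].default_value = 0.98

links.new(noise.outputs['Fac'], maximum.inputs[0])
links.new(maximum.outputs['Value'], mix_rgb.inputs['Color1'])
links.new(mix_rgb.outputs['Color'], output.inputs['Displacement'])
```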
All together, it should look like this:
After rendering, the effect isn't noticeable from far away, but if we zoom in and look at some of the highlights of our paint, we can see it more clearly:
The scene above is still not perfect, because it is uniformly lit and contains no light sources that would generate realistic shadows and reflections in our materials. Let's turn off the background light by going to the World tab (the planet icon, fourth from the left in the right-side menu) and changing the background color to pure black.
Running the render now will generate a completely black image. We need to add some light sources to the scene. Blender contains different types of built-in lamps, but the easiest way of making a light source is creating a material that generates light:
press SHIFT-C to center the scene and add a Plane mesh
move the plane 3 units up, 1 unit in the X and -1 in the Y axis
scale it 2 times (in all directions)
rotate it 45 degrees in the Y and -45 in the Z axis
This will create a surface located behind the camera and directed towards the helicopter. Now go to the Materials tab and add a new material to this object. In the same tab, change the Surface from Diffuse to Emission (notice it will also change in the Node Editor). Set the color of the light to slightly cold (R:0.8, G:0.8, B:0.9) and increase its Strength to 5. This will make a scene that looks much more realistic than before:
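The key light can also be added with a few lines of Python; a sketch of the steps above:

```python
import bpy
import math

# plane behind the camera, pointed towards the helicopter
bpy.ops.mesh.primitive_plane_add(location=(1, -1, 3))
light = bpy.context.object
light.scale = (2, 2, 2)
light.rotation_euler = (0, math.radians(45), math.radians(-45))

# emission material acting as the light source
mat = bpy.data.materials.new('KeyLight')
mat.use_nodes = True
light.data.materials.append(mat)

nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.remove(nodes['Diffuse BSDF'])

emission = nodes.new('ShaderNodeEmission')
emission.inputs['Color'].default_value = (0.8, 0.8, 0.9, 1.0)
emission.inputs['Strength'].default_value = 5
links.new(emission.outputs['Emission'], nodes['Material Output'].inputs['Surface'])
```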
This is, however, only the beginning. When taking photos in a studio setting, we usually have several light sources. The light we just added above is known as the Key light and is the basic source of lighting in a scene, lighting the front of the object. Let's add another source of light, known as the Back light to the scene:
press SHIFT-C to center the scene and add a Plane mesh
set the new Plane so that it is -2 in the X axis, 2 in the Y axis and 2 in the Z axis
rotate it -15 degrees in the Y and -45 in the Z axis
add a material and change it to Emission
set it to a warm color (R:0.8, G:0.7, B:0.3)
set its Strength to 2
Make sure the camera is positioned in such a way that the lights aren't visible (photography/filmmaking 101). The scene looks even better now. You can compare the different lighting scenarios by enabling and disabling the visibility of individual lights with the eye and camera icons in the object tree at the top right part of the window:
Setting up lights manually can be a time-consuming task, and sometimes it's even impossible to get a realistic-looking scene this way. In an outdoor scene we have dozens of light sources, coming from practically all directions (remember all the reflections from other objects). We could try to approximate this using many lamps, but there is a much easier solution. If we take a 360-degree photo at the location we want to place our object in, we can tell Blender to use this image both as a background and as a source of natural lighting for our scene. To make the effect more accurate, we prefer an image in the HDR format, which has a greater dynamic range than a regular image.
We can find many sources of HDR environment images online, though usually not for free. Go to http://www.hdrlabs.com/sibl/archive.html, then download and unpack a chosen scene.
To apply the image to our helicopter scene above, do the following (a scripted sketch follows the list):
hide all the Plane objects in the scene by toggling off their eye and camera icons in the object tree at the top right part of the window
go to the World tab
click on the little dot icon next to the Color parameter and choose the Environment Texture from the menu
click the Open button and find the 360 degree background image you downloaded above - best choose one with the "hdr" extension (generally, the larger the file size, the better the quality)
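A scripted sketch of the same World setup; the file name environment.hdr is only a placeholder for the panorama you downloaded:

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

# environment texture plugged into the background color
env = nodes.new('ShaderNodeTexEnvironment')
env.image = bpy.data.images.load('/path/to/environment.hdr')
links.new(env.outputs['Color'], nodes['Background'].inputs['Color'])
```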
If you are in Rendered Mode in the 3D view, you can start rotating the scene and you should see the selected image in the background. If the image doesn't show up, make sure you are in perspective mode (press 5 on the numeric keypad). The lighting is also adjusted to the conditions present in the image. You can also tweak its intensity by changing the Strength parameter. If there are a lot of white pixels (the so-called fireflies) try setting the Clamp parameters (both Direct and Indirect) to 1 (in the Render tab, Sampling section, Settings subsection).
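The Strength and Clamp tweaks correspond to these properties:

```python
import bpy

scene = bpy.context.scene

# intensity of the HDR environment lighting
scene.world.node_tree.nodes['Background'].inputs['Strength'].default_value = 1.0

# clamp overly bright samples to reduce fireflies
scene.cycles.sample_clamp_direct = 1.0
scene.cycles.sample_clamp_indirect = 1.0
```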
Here are a few examples of the HDR lighting: