Lighting for visual effects purposes, especially when you are trying to simulate a real space in 3D, can be a minefield. Matching the lighting of a natural environment can involve many lights and a lot of experimentation to get right.

But there is a much simpler way: you can light a scene with an HDR image by setting it as the world background. This means you can light an object you are trying to blend into a scene with lighting from the actual scene.

It sounds complicated, and it is somewhat more technical than other Blender topics, but we’ll take it slow and explain it as fully as we can.

Environment Maps

HDRI stands for High Dynamic Range Imaging, and it’s received a bit of a bad name lately. HDR has come to mean those brightly coloured “tone-mapped” landscape images people post on the Internet. But that’s not what HDR is.

High Dynamic Range photos are photos which have a much higher range of luminance recorded in the file than is shown in the image. Usually they are made by taking three or more exposures and combining them into a special HDR file that records higher and lower levels of light than a camera can usually capture in a single exposure. This means you can more closely approximate the way human eyes see a scene.

That’s all very technical, but how does this help you light a scene? You can use a 360º HDR image to light your scene, with the tones in the picture shining light on your objects from all angles.

DIY HDR

Making HDR environment maps is quite labour-intensive. You have to make 360º images by putting your camera on a tripod, taking pictures in a circle, and stitching them together. That part is quite simple, but you also need to bracket the exposures (taking one overexposed and one underexposed frame as well as the normal one) for each position, and then use software to combine those three images into one. Some DSLR cameras allow you to shoot bracketed exposures with a single click. Then, having made your HDR images, you need to stitch them all together into a single 360º panorama.
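(If you want to handle the exposure-merging step yourself, libraries such as OpenCV can combine a bracketed set into a single HDR file. The following is only a minimal sketch: the file names and shutter speeds are placeholders, so substitute your own brackets.)

```python
import cv2
import numpy as np

# Placeholder file names and shutter speeds; substitute your own bracketed shots
files = ["bracket_under.jpg", "bracket_normal.jpg", "bracket_over.jpg"]
times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0], dtype=np.float32)

images = [cv2.imread(f) for f in files]

# Merge the three exposures into one floating-point HDR radiance map
merger = cv2.createMergeDebevec()
hdr = merger.process(images, times=times)

# Save as a Radiance .hdr file, which Blender can load as an environment texture
cv2.imwrite("merged.hdr", hdr)
```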

Luckily, Blender expert Greg Zaal over at Adaptive Samples has taken a lot of the pain out of this process for you. As well as explaining the long and tedious process of how it’s done, he kindly offers a free pack of high-quality Creative Commons HDR environment maps which you can use to light your own scenes.

Obviously, the best maps are ones you make yourself, but there are a ton of other pre-made ones out there which may fit the bill for your scenes. Google “HDR environment maps” or “360 HDR” for more examples.

A simpler and quicker way of getting a usable environment map is to take a bracketed set of exposures of a mirrored ball. For example, on the film Batman Begins, the environmental lighting for the CG was captured by shooting a 4″ chrome ball with a 300mm lens from a distance of around 12′. Another good way to capture HDR is to use a 180º fisheye lens and shoot in one direction, then turn the camera around 180º and shoot the other way.

Once you have shot your chrome ball or fisheye HDR images, you can unwrap them with software like Hugin.

Make a Scene

So having either made or obtained an HDR lighting image, how do you light a scene with a picture? First you need a scene which will show off the subtle lighting, so let’s make a quick one with a couple of spheres and a plane.

When you start Blender, delete the cube and the light source: select them with the right mouse button (hold Shift to add to the selection) and press the “Delete” key. Select “Cycles Render” from the drop-down at the top of the screen, and you are ready to start.

Next add a plane. Press “Shift + A” and choose “Mesh -> Plane.” Now stretch the plane out to fill the grid. Press Tab to enter edit mode. Click the edges button to select edges.

Click on the four edges in turn and stretch them out using the green and red X and Y axis arrows.
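(If you prefer to script these setup steps, the same thing can be done with Blender’s Python API. This is a rough sketch written against the 2.7x-era API used in this article; object names such as “Lamp” and the exact operator arguments may differ in other versions.)

```python
import bpy

# Select only the default cube and lamp, then delete them (keep the camera)
for obj in bpy.data.objects:
    obj.select = obj.name in ("Cube", "Lamp")
bpy.ops.object.delete()

# Switch the render engine to Cycles
bpy.context.scene.render.engine = 'CYCLES'

# Add a ground plane and scale it up instead of stretching the edges by hand
bpy.ops.mesh.primitive_plane_add(location=(0.0, 0.0, 0.0))
plane = bpy.context.object
plane.scale = (10.0, 10.0, 1.0)
```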

Now add a sphere with “Shift + A.” Duplicate it by selecting it and using the Object menu at the bottom of the screen, “Object -> Duplicate Objects.”

The duplicate object will be stuck to the mouse pointer. Press the Y key on the keyboard to restrict its movement to the Y axis, then slide it to one side and click to place it.

Make sure the two balls are sitting on the plane. Go into a side view by pressing Numpad 3 so you can see the two balls side by side, and press Numpad 5 to make sure you are in orthographic view. Select the plane and lower it until it sits at the very bottom of the spheres.
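(For those following along in Python, here is roughly the same thing; the positions are just examples, and the spheres are given names we will reuse in the later snippets.)

```python
import bpy

# Add a UV sphere resting on the plane (default radius 1, so its centre sits at z = 1)
bpy.ops.mesh.primitive_uv_sphere_add(location=(0.0, -1.5, 1.0))
ball_one = bpy.context.object

# Add a second sphere offset along the Y axis rather than duplicating by hand
bpy.ops.mesh.primitive_uv_sphere_add(location=(0.0, 1.5, 1.0))
ball_two = bpy.context.object
```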

To make the balls nice and smooth, add a Subdivision Surface (subsurf) modifier from the Modifier properties tab.
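(In Python the modifier is a couple of lines per object; ball_one and ball_two here are the spheres created in the previous sketch.)

```python
# Add a Subdivision Surface modifier to each ball to smooth it out
for ball in (ball_one, ball_two):
    mod = ball.modifiers.new(name="Subsurf", type='SUBSURF')
    mod.levels = 2          # subdivisions shown in the viewport
    mod.render_levels = 2   # subdivisions used at render time
```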

Frame up the shot so the two balls are nicely centred. (We will cover easy framing of shots in a forthcoming article on cinematography.)

Give the two spheres a colour and surface texture. Select ball one and click the Material properties button.

Click on Diffuse BSDF, and change it to “Mix Shader” in the pop-up. Then choose the two shaders you will mix, the first being Glossy BSDF and the second being Diffuse BSDF. Adjust the Fac slider to about 0.75. Give the diffuse shader a colour, but leave the glossy one as white.

Do the same for the other ball, but give the diffuse shader a different colour.
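(The same node setup can be built in Python. This sketch assumes the ball_one and ball_two objects from the earlier snippets; the two colours are arbitrary examples.)

```python
import bpy

def make_mix_material(name, diffuse_colour):
    # New Cycles node material: Glossy and Diffuse BSDFs blended by a Mix Shader
    mat = bpy.data.materials.new(name)
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    nodes.clear()

    output = nodes.new("ShaderNodeOutputMaterial")
    mix = nodes.new("ShaderNodeMixShader")
    glossy = nodes.new("ShaderNodeBsdfGlossy")   # first input: leave it white
    diffuse = nodes.new("ShaderNodeBsdfDiffuse") # second input: gets the colour

    # Fac 0.75 weights the mix towards the second (diffuse) shader
    mix.inputs["Fac"].default_value = 0.75
    diffuse.inputs["Color"].default_value = diffuse_colour

    links.new(glossy.outputs["BSDF"], mix.inputs[1])
    links.new(diffuse.outputs["BSDF"], mix.inputs[2])
    links.new(mix.outputs["Shader"], output.inputs["Surface"])
    return mat

# Example RGBA colours; pick whatever you like
ball_one.data.materials.append(make_mix_material("BallOne", (0.8, 0.2, 0.2, 1.0)))
ball_two.data.materials.append(make_mix_material("BallTwo", (0.2, 0.4, 0.8, 1.0)))
```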

Lighting with Pictures

Okay, now we can light the scene. Go to the “World properties” button in the Properties panel. The background colour defaults to a dark grey, so to give us a baseline to compare against, click on it and crank it all the way up to white.

This is an even white light which covers the scene. Render it with F12 and you will see it’s a good even light, but it’s not subtle or real; it looks like it was photographed in a very bright featureless CG photo studio.

Now click on the small dot button next to the colour field, and select “Environment Texture” from the pop-up.

Now load an HDR as the background image to provide the light. (We are using the gorgeous HDRs from Greg Zaal, but any you find on the Internet will work just as well.)
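(In Python terms, loading the HDR means adding an Environment Texture node to the world’s node tree and plugging it into the Background node. The file path below is a placeholder for whichever HDR you downloaded.)

```python
import bpy

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links

# Add an Environment Texture node and load the HDR (the path is a placeholder)
env = nodes.new("ShaderNodeTexEnvironment")
env.image = bpy.data.images.load("/path/to/environment.hdr")

# Feed the image colour into the default Background node
background = nodes["Background"]
links.new(env.outputs["Color"], background.inputs["Color"])
```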

Before rendering, make sure you check the “Multiple Importance” checkbox in the Settings panel at the bottom of the World properties, as this makes Blender sample the HDR environment map as a light and improves the render. (Always check this when lighting with an HDR.)

Hike the Map Resolution up to 1024. As another tip, to avoid the bright “firefly” speckles which start creeping into complex lit scenes, find the Filter Glossy setting in the Light Paths panel of the Render properties and set it to about 0.5.
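(These three settings also have Python equivalents. The property names below come from the 2.7x-era Cycles API, so treat them as assumptions if you are on a newer release.)

```python
import bpy

scene = bpy.context.scene
world = scene.world

# "Multiple Importance" for the world, so Cycles samples the HDR as a light
world.cycles.sample_as_light = True
world.cycles.sample_map_resolution = 1024

# "Filter Glossy" in the Light Paths panel, to tame the bright speckles
scene.cycles.blur_glossy = 0.5
```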

Now render using F12. Do you see the difference? It actually looks like it was shot outside.

With every different HDR image the tone of the scene changes; an HDR of an interior with the shades drawn, for example, gives the scene a completely different mood.

The light comes from different directions, and it is coloured by the image. It’s as if white light is shining through a huge transparent ball around the scene, and on that ball is the image of the original scene.

Scenes lit with an HDR shot outdoors look like they are outdoors, and ones lit with interior HDRs look like they are indoors. The difference is really quite vivid and obvious.

The subtlety and realism of light you can get with HDR environment maps is amazing, and once you’ve used them you will rarely go back to normal hand-built lighting. The light can be harsh and bright or soft and diffuse depending on the source image, and it comes from multiple directions, casting shadows and reflections on your objects which only enhance the sense of realism.

Why Use HDR?

In movie special effects, it is common to add CG elements to real-life scenes and use motion tracking to incorporate them into the shot. But the lighting must match exactly or the CG objects will look fake. So if you take all the photos needed to make an HDR at the location where you shoot your background plate, you can later use Blender to render the CG objects with the environmental illumination of the original scene. That’s very cool.

Thanks for joining us for this peek into CG visual effects lighting. If you have any questions or comments, please leave them in the comments section below.

Image Credit: Greg Zaal

Phil South has been writing about tech subjects for over 30 years, starting out with Your Sinclair magazine in the 80s and moving on to MacUser and Computer Shopper. He’s designed user interfaces for groundbreaking music software, been the technical editor on filmmaking and visual effects books for Elsevier, and helped create the MTE YouTube Channel. He lives and works in South Wales, UK.
