More distant elements in a scene typically have “baked in” lighting. That is, the lighting is already in the texture image because (for example) it’s a photo shot at a particular time of day.
Backdrop elements are typically at medium to long distance; examples are the faces of distant buildings, or terrain. The premier example is the background cyclorama bubble: an image mapped onto a sphere at great distance from the camera, e.g. 1 km. No lighting effects are required. The image should be high resolution, e.g. 4K, so that zooming in doesn't "break up" the background into large pixels. The image can be a movie texture or a still, but in either case it should map correctly onto the sphere.
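To see why a high-resolution image is needed, consider texel density versus screen pixel density. Below is a minimal sketch of that arithmetic; the 4096-pixel texture width, the 1920-pixel screen, and the field-of-view values are illustrative assumptions, and the pixels-per-degree figure is a rough center-of-screen approximation.

```python
def texels_per_degree(image_width_px):
    """An equirectangular image spans 360 degrees horizontally."""
    return image_width_px / 360.0

def screen_pixels_per_degree(screen_width_px, horizontal_fov_deg):
    """Rough on-screen pixel density, approximated as width / FOV."""
    return screen_width_px / horizontal_fov_deg

# Illustrative numbers: a 4K bubble texture on a 1920-px-wide screen.
bubble = texels_per_degree(4096)                 # ~11.4 texels per degree
wide   = screen_pixels_per_degree(1920, 60.0)    # 32 pixels per degree
zoomed = screen_pixels_per_degree(1920, 20.0)    # 96 pixels per degree

# Once the screen density exceeds the texture density (as in the zoomed
# case), each texel covers several screen pixels and the background
# starts to "break up" -- hence the need for a 4K or larger image.
```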
Environment images are typically shot as "spherical panoramas," which wrap 360 degrees around and 180 degrees up and down. These can be flattened into an image several ways; one of the most common is "lat/long" (latitude/longitude) mapping.
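Lat/long mapping can be sketched as a function from a view direction to a position in the flattened image. This is a minimal illustration, not the shader's actual code; the axis convention (+Y up, u wrapping longitude left to right, v spanning latitude bottom to top) is an assumption, and real engines differ in handedness and wrap direction.

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a unit view direction to lat/long (equirectangular) UVs.

    Assumed convention: +Y is up; u covers 360 degrees of longitude,
    v covers 180 degrees of latitude. Returns (u, v) in [0, 1].
    """
    longitude = math.atan2(x, z)                   # -pi..pi around vertical
    latitude = math.asin(max(-1.0, min(1.0, y)))   # -pi/2..pi/2
    u = longitude / (2.0 * math.pi) + 0.5          # 0..1 across the image
    v = latitude / math.pi + 0.5                   # 0 bottom, 1 top
    return u, v

# Looking straight ahead (+Z) lands at the image center;
# looking straight up (+Y) lands at the top edge.
```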
Here is the spherical panorama image from the example above flattened using “lat/long”:
Because the backdrop is only half a sphere, only half of the image above is mapped onto it.
The shader controls (as seen in Maya):
Sampling anisotropy can generally be left at 4x. It sets the number of samples taken to reduce aliasing, or "jaggies," on textures with high contrast (e.g. checkerboard floor patterns) as they recede from the camera.
The color sampler field maps the base color texture onto the object using the UV set named “UV”.
Because the surface lighting is "baked in," there are controls to adjust the coloring.
The MipMapping option can generally be left on. For a very large background bubble texture, GPU memory can be saved by turning it off.
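The memory saved by disabling mipmaps comes from dropping the mip chain, which adds roughly one third on top of the base level (the geometric series 1 + 1/4 + 1/16 + ...). The sketch below illustrates that arithmetic; the 4096x4096 size and 4 bytes per texel are illustrative assumptions, not the shader's actual storage format.

```python
def texture_memory_bytes(width, height, bytes_per_texel=4, mipmapped=False):
    """Memory for a 2D texture; the full mip chain adds about one third."""
    total = 0
    w, h = width, height
    while True:
        total += w * h * bytes_per_texel
        if not mipmapped or (w == 1 and h == 1):
            break
        # Each mip level halves both dimensions, down to 1x1.
        w, h = max(1, w // 2), max(1, h // 2)
    return total

base      = texture_memory_bytes(4096, 4096)                  # 64 MiB
with_mips = texture_memory_bytes(4096, 4096, mipmapped=True)  # ~85 MiB
# Turning MipMapping off on a bubble texture this size saves ~21 MiB.
```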
Download the Shader Library 2014 package from the Dashboard.