Material shaders are the primary type of shaders. All materials defined in the scene must at least define a material shader. Materials may also define other types of shaders, such as shadow, volume, and environment shaders, which are optional and of secondary importance.
When mental ray casts a visible ray, such as a ray cast by the camera (a primary ray) or a ray cast for reflections and refractions (collectively called secondary rays), it determines the next object in the scene that is hit by that ray. This process is called intersection testing. For example, when a primary ray cast from the camera through the viewing plane's pixel (100, 100) intersects a yellow sphere, pixel (100, 100) in the output image is painted yellow. (The actual process is slightly complicated by supersampling and filtering, which can cause more than one primary ray to contribute to a pixel.)
The core of mental ray has no concept of "yellow." This color is computed by the material shader attached to the sphere that was hit by the ray. mental ray records general information about the sphere object, such as the point of intersection, the normal vector, and the transformation matrix, in a data structure called the state, and calls the material shader attached to the object. More precisely, the material shader, along with its parameters (called shader parameters), is part of the material, which is attached to or inherited by the polygon or surface that forms the part of the object that was hit by the ray. Objects are usually built from multiple polygons and/or surfaces, each of which may have a different material.
The material shader uses the values provided by mental ray in the state and the variables provided by the .mi file in the shader parameters to calculate the color of the object, and returns that color. In the above example, the material shader would return the color yellow. mental ray stores this color in its internal sample list, which is later filtered to compute frame buffer pixels, and then casts the next primary ray. Note that if the material shader has a bug that causes it to return infinity or NaN (Not a Number) in the result color, the infinity or NaN is stored as 1.0 in color frame buffers, which results in white pixels in the rendered image. This also applies to subshaders such as texture shaders.
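To make the calling convention concrete, here is a minimal sketch of a material shader that returns a constant color. This is not the manual's example; the shader name and parameter struct are hypothetical, but the signature and version function follow the pattern used throughout this chapter.

#include <mi/shader.h>

/* hypothetical parameter struct; a matching declaration
 * would appear in the .mi file */
struct constant_material {
    miColor color;      /* the color to return for every hit point */
};

int constant_material_version(void) {return(1);}

miBoolean constant_material(
    miColor                  *result,  /* color handed back to mental ray */
    miState                  *state,   /* intersection data from mental ray */
    struct constant_material *paras)
{
    /* read the parameter with mi_eval so shader assignments work */
    *result = *mi_eval_color(&paras->color);
    result->a = 1;                     /* fully opaque */
    return(miTRUE);                    /* computation succeeded */
}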
With an appropriate output statement, mental ray computes depth, label, normal-vector, and motion-vector frame buffers in addition to the standard color frame buffer, and any user frame buffers defined with frame buffer statements in the options block. The color returned by the first-generation material shader is stored in the color frame buffer (unless a lens shader exists; lens shaders also have the option of modifying colors). The material shader can control what gets stored in the depth, label, normal-vector, and motion-vector frame buffers by storing appropriate values into state->point.z, state->label, state->normal, and state->motion, respectively. It can also store data in the user frame buffers with an appropriate call to mi_fb_put. Depth is the negative Z coordinate.
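For illustration, a material shader could override the label frame buffer and fill a user frame buffer as in the fragment below. This is a sketch: the label value, the buffer index 0, and the scalar type are assumptions, and the user buffer must have been declared with a frame buffer statement in the options block.

    miScalar coverage = 1.0f;   /* hypothetical value for user frame buffer 0 */

    state->label = 42;          /* hypothetical tag; lands in the label frame buffer */
    mi_fb_put(state, 0, &coverage); /* data type must match the declared buffer type */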
Material shaders normally perform quite complicated computations to arrive at the final color of a point on the object.
Note that the shader parameters of a material shader are under no obligation to define classical parameters such as ambient, diffuse, and specular colors, or reflection and refraction parameters. Here is a full-featured example of the C source for the shader declared in the previous section:
#include <stdio.h>
#include <mi/shader.h>

int my_material_version(void) {return(1);}

miBoolean my_material(
    miColor             *result,
    miState             *state,
    struct my_material  *paras)
{
    miColor     *diffuse, *specular;
    miVector    bump, dir;
    miColor     color;
    int         num, i_lights, n_lights;
    miTag       *lights;
    miScalar    factor;

    /*
     * bump map: look up an XY perturbation vector in the vector
     * texture and tilt the normal along the bump basis vectors
     */
    state->tex = state->tex_list[0];
    mi_call_shader((miColor *)&bump, miSHADER_TEXTURE, state,
                   *mi_eval_tag(&paras->bump));
    if (bump.x != 0 || bump.y != 0) {
        mi_vector_to_object(state, &state->normal, &state->normal);
        state->normal.x += bump.x * state->bump_x_list->x
                         + bump.y * state->bump_y_list->x;
        state->normal.y += bump.x * state->bump_x_list->y
                         + bump.y * state->bump_y_list->y;
        state->normal.z += bump.x * state->bump_x_list->z
                         + bump.y * state->bump_y_list->z;
        mi_vector_from_object(state, &state->normal, &state->normal);
        mi_vector_normalize(&state->normal);
        state->dot_nd = mi_vector_dot(&state->normal, &state->dir);
    }

    /*
     * illumination: ambient base color plus Phong diffuse and
     * specular contributions from every light in the light array
     */
    *result  = *mi_eval_color  (&paras->ambient);
    diffuse  =  mi_eval_color  (&paras->diffuse);
    specular =  mi_eval_color  (&paras->specular);
    i_lights = *mi_eval_integer(&paras->i_lights);
    n_lights = *mi_eval_integer(&paras->n_lights);
    lights   =  mi_eval_tag    (paras->lights);

    for (num=0; num < n_lights; num++) {
        miColor   color, sum;
        miInteger samples = 0;
        miScalar  dot_nl;

        sum.r = sum.g = sum.b = 0;
        while (mi_sample_light(&color, &dir, &dot_nl, state,
                               lights[i_lights + num], &samples)) {
            sum.r += dot_nl * diffuse->r * color.r;
            sum.g += dot_nl * diffuse->g * color.g;
            sum.b += dot_nl * diffuse->b * color.b;
            factor = mi_phong_specular(*mi_eval_scalar(&paras->shiny),
                                       state, &dir);
            sum.r += factor * specular->r * color.r;
            sum.g += factor * specular->g * color.g;
            sum.b += factor * specular->b * color.b;
        }
        if (samples) {              /* average over area light samples */
            result->r += sum.r / samples;
            result->g += sum.g / samples;
            result->b += sum.b / samples;
        }
    }
    result->a = 1;

    /*
     * reflections: blend in a reflection ray, falling back to the
     * environment if no object is hit
     */
    factor = *mi_eval_scalar(&paras->reflect);
    if (factor > 0) {
        miScalar f = 1 - factor;
        result->r *= f;
        result->g *= f;
        result->b *= f;
        mi_reflection_dir(&dir, state);
        if (mi_trace_reflection (&color, state, &dir) ||
            mi_trace_environment(&color, state, &dir)) {
            result->r += factor * color.r;
            result->g += factor * color.g;
            result->b += factor * color.b;
        }
    }

    /*
     * refractions: blend in a refraction ray, falling back to the
     * environment if no object is hit
     */
    factor = *mi_eval_scalar(&paras->transparency);
    if (factor > 0) {
        miScalar ior = *mi_eval_scalar(&paras->ior);
        miScalar f   = 1 - factor;
        result->r *= f;
        result->g *= f;
        result->b *= f;
        result->a  = f;
        if ((mi_refraction_dir(&dir, state, 1.0, ior) &&
             mi_trace_refraction (&color, state, &dir)) ||
             mi_trace_environment(&color, state, &dir)) {
            result->r += factor * color.r;
            result->g += factor * color.g;
            result->b += factor * color.b;
            result->a += factor * color.a;
        }
    }
    return(miTRUE);
}
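The code assumes a C parameter struct that mirrors the declaration from the previous section. Reconstructed from the parameter accesses above, it would look roughly like this (the field order is an assumption; the actual declaration is authoritative):

struct my_material {
    miColor     ambient;        /* ambient color */
    miColor     diffuse;        /* diffuse color */
    miColor     specular;       /* specular color */
    miScalar    shiny;          /* Phong specular exponent */
    miScalar    reflect;        /* reflectivity; 0 disables reflection rays */
    miScalar    transparency;   /* transparency; 0 disables refraction rays */
    miScalar    ior;            /* index of refraction */
    miTag       bump;           /* vector texture used for bump mapping */
    int         i_lights;       /* index of the first light in the array */
    int         n_lights;       /* number of lights in the array */
    miTag       lights[1];      /* light array */
};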
Four steps are required to compute the material color in this shader. First, the normal is perturbed by looking up a vector in the vector texture and using the bump basis vectors to determine the orientation of the perturbation (the lookup always returns an XY vector). The second step loops over all light sources in the light array parameter, adding the contribution of each light according to the Phong equation. In the case of area lights, a light is sampled more than once, until the light sampling function mi_sample_light is satisfied.
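Stated as a formula, the illumination loop accumulates, for each light with sampled color $C_l$ and direction $\vec L$, with $\vec N$ the shading normal and the specular term corresponding to what mi_phong_specular computes (a restatement of the code above, not a formula from the manual):

\[
\mathit{result} \mathrel{+}= \frac{1}{\mathit{samples}}
\sum_{\mathit{samples}} \Big( (\vec N \cdot \vec L)\,\mathit{diffuse}
+ (\vec R \cdot \vec L)^{\mathit{shiny}}\,\mathit{specular} \Big)\, C_l
\]

where $\vec R$ is the eye ray mirrored about the normal.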
Finally, reflection and refraction rays are cast if the corresponding parameters are nonzero. In both cases, the direction vector dir is first computed using a built-in function, and a ray is cast in that direction. If either trace function returns miFALSE, indicating that no object was hit, the material's environment map, which forms a sphere around the entire scene, is evaluated instead. When all computations are finished, the calculated color, including the alpha component, is returned in the result parameter. The shader returns miTRUE, indicating that the computation succeeded.
The mi_eval functions permit shader assignments, which let shaders base their computations on parameters that are driven by other shaders. For example, the diffuse component of the above example is a good candidate for attachment to a texture shader. This works only if every parameter access goes through mi_eval, as in the example above.
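For instance (a sketch based on the example above), if the scene assigns a texture shader to the diffuse parameter, the parameter slot no longer holds a literal color value, so it must be read through mi_eval_color:

    /* wrong: reads the raw parameter slot and bypasses an assigned shader */
    /* diffuse = &paras->diffuse; */

    /* right: runs the assigned texture shader, if any, and returns its
     * result; otherwise returns the constant value from the .mi file */
    diffuse = mi_eval_color(&paras->diffuse);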