Rough Diffuse

In our quest for an improved diffuse model we have already seen a principled implementation and a super smooth lambertian sphere model. Here we'll briefly look at a 'microfacet' lambertian model that comes with the DiffuseRough shading node. As before we don't really need to fully understand all of this, but a general understanding is still better than nothing. DiffuseRough has the same parameters as a classic diffuse model, so it will work fine even if we just experiment visually with it and refuse to understand the model behind it ;)

Recall from the principled implementation that we introduced the basic diffuse shading as the cosine of the angle between the surface normal and the light direction. Keep in mind, though, that we don't only have to evaluate that cosine, we also have to sample it.

Generally, for a given model, a renderer needs two routines to perform well. One is the evaluation routine, where we effectively compute that cosine between the normal and the light direction; this can also be seen as the direct lighting evaluation. We then have a sampling routine that actually samples the diffuse function, which drives the indirect lighting calculation. Both are then combined in a multiple importance sampling scheme for better convergence of the rendering equation.
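
To make this a bit more concrete, here is a tiny C++ sketch (not Arnold's actual API, just an illustration with made-up names) of the evaluation side and of the MIS weight that glues the two routines together:

#include <cmath>

struct Vec3 { float x, y, z; };
inline float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

const float PI = 3.14159265358979f;

// Evaluation routine: given the shading normal N and a direction L towards a
// light, return the Lambertian BRDF (albedo/pi) times the cosine term, which
// is what the integrator multiplies by the incoming radiance for direct
// lighting. It also reports the pdf the matching sampling routine would have
// assigned to L, so the two estimates can be combined with MIS.
float lambert_eval(const Vec3& N, const Vec3& L, float albedo, float* pdf)
{
    float cosNL = std::fmax(0.0f, dot(N, L));
    *pdf = cosNL / PI;              // pdf of cosine-weighted hemisphere sampling
    return (albedo / PI) * cosNL;
}

// Power heuristic (beta = 2): the usual way to weight the light-sampling and
// BRDF-sampling estimates in a multiple importance sampling scheme.
float mis_power_heuristic(float pdf_a, float pdf_b)
{
    float a2 = pdf_a * pdf_a, b2 = pdf_b * pdf_b;
    return a2 / (a2 + b2);
}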

Practically, while evaluating we just compute the magnitude of our lighting function, i.e. how much direct light will shine on our shading point. When sampling, instead, we need to generate an outgoing direction that will be used to sample the scene: for a diffuse model the sampling routine provides the direction along which to compute the actual indirect global illumination. Because not all of these directions contribute equally, we importance sample the outgoing direction. With a standard diffuse model we importance sample a hemispherical direction with a cosine weight: the full range of directions is the hemisphere, but we distribute them with a cosine shape. This is called cosine weighted hemispherical sampling. It simply means that our directions are not uniformly distributed over the hemisphere; there's instead a cosine-shaped distribution of those directions, i.e. some directions have a higher probability of being picked than others.
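
Here is a minimal C++ sketch of cosine weighted hemispherical sampling (again just an illustration, the function name is made up):

#include <cmath>

struct Dir { float x, y, z; };   // direction in the local frame, z = shading normal

// Map two uniform random numbers in [0,1) to a direction on the hemisphere
// around +Z, distributed with pdf = cos(theta)/pi. Directions near the normal
// are picked more often than grazing ones, matching the cosine falloff of a
// diffuse surface.
Dir sample_cosine_hemisphere(float u1, float u2)
{
    float r   = std::sqrt(u1);                       // radius on the unit disk
    float phi = 2.0f * 3.14159265358979f * u2;       // azimuth
    float x = r * std::cos(phi);
    float y = r * std::sin(phi);
    float z = std::sqrt(std::fmax(0.0f, 1.0f - u1)); // this is cos(theta)
    return { x, y, z };
}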

Let's see this with a specular function, which better highlights how, when sampling the specular direction, we take only a subset of all the hemispherical directions we could take.

So when a light ray strikes a surface it bounces off taking a certain direction. The full set of directions it can take is called a lobe. The orange shape is the lobe of directions a specular ray can take when bouncing off a surface. We can clearly see that it is a really small subset of the whole hemispherical set of directions. If we didn't take this into account while sampling we would end up with a solution with much more variance (aka noise). To complete the picture: the rougher the surface, the wider the lobe, approaching more and more a hemispherical distribution.

We have already seen, while inspecting microfacet models, that the underlying microfacet arrangement determines a certain lobe of directions; the orientation of those microfacets gives a certain direction to our outgoing rays. For a specular lobe, the more we increase the roughness parameter the closer we get to a hemispherical distribution of directions, which here also means the closer we get to a diffuse appearance.
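
If you want to see the shape of such a lobe in numbers, here is a small C++ sketch of the GGX microfacet distribution (assuming the common roughness-squared remapping of the slider value; the names are made up):

#include <cmath>

const float PI = 3.14159265358979f;

// GGX / Trowbridge-Reitz normal distribution function: the density of
// microfacets whose normal makes an angle with cosine cosNH to the shading
// normal. A small roughness gives a sharp peak (narrow specular lobe); as the
// roughness approaches 1 the distribution flattens and the lobe of reflected
// directions spreads towards the whole hemisphere.
float ggx_D(float cosNH, float roughness)
{
    float alpha  = roughness * roughness;  // a common remapping of the artist-facing parameter
    float alpha2 = alpha * alpha;
    float d = cosNH * cosNH * (alpha2 - 1.0f) + 1.0f;
    return alpha2 / (PI * d * d);
}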

So, to get a diffuse lobe, why not use a microfacet model instead of the usual cosine weighted sampling?

That's exactly the idea behind our DiffuseRough model. As for the why, we should remember that with the Oren-Nayar diffuse model (DiffuseGeneralized and the Arnold Std Material) we suffer from the single scattering limitation (it gets darker as roughness increases), and while the principled implementation is an approximation, here we have the real thing: (physical) multiscattered diffuse roughness.
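
For reference, here is a C++ sketch of the classic single-scattering Oren-Nayar evaluation (the qualitative approximation, not Arnold's implementation; the names and the scalar albedo are just for illustration). The energy lost to the ignored interreflections between microfacets is exactly what makes this kind of model darken at high roughness:

#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };
inline float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
inline Vec3  sub(const Vec3& a, const Vec3& b) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
inline Vec3  scale(const Vec3& a, float s)     { return { a.x*s, a.y*s, a.z*s }; }
inline Vec3  normalize(const Vec3& a)          { float l = std::sqrt(dot(a, a)); return scale(a, 1.0f/l); }

const float PI = 3.14159265358979f;

// N = shading normal, L = direction to the light, V = direction to the viewer,
// albedo = diffuse reflectance (scalar here), sigma = roughness in radians.
float oren_nayar_eval(const Vec3& N, const Vec3& L, const Vec3& V,
                      float albedo, float sigma)
{
    float cosNL = std::clamp(dot(N, L), 0.0f, 1.0f);
    float cosNV = std::clamp(dot(N, V), 0.0f, 1.0f);

    float s2 = sigma * sigma;
    float A = 1.0f - 0.5f * s2 / (s2 + 0.33f);
    float B = 0.45f * s2 / (s2 + 0.09f);

    // cosine of the azimuthal angle between L and V, measured in the tangent plane
    Vec3 Lt = sub(L, scale(N, cosNL));
    Vec3 Vt = sub(V, scale(N, cosNV));
    float cosPhiDiff = 0.0f;
    if (dot(Lt, Lt) > 1e-6f && dot(Vt, Vt) > 1e-6f)
        cosPhiDiff = std::max(0.0f, dot(normalize(Lt), normalize(Vt)));

    float thetaL = std::acos(cosNL);
    float thetaV = std::acos(cosNV);
    float alpha  = std::max(thetaL, thetaV);
    float beta   = std::min(thetaL, thetaV);

    return (albedo / PI) * cosNL *
           (A + B * cosPhiDiff * std::sin(alpha) * std::tan(beta));
}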

Effectively, we get the best backscattering of all three models. This comes at the expense of slightly longer render times. Below, our DiffuseRough shading node set to Multiscatter-GGX with a roughness of respectively 0 and 1. Notice how the increased roughness introduces backscattering (the flattening of the contrasted dark/bright diffusion) without unrealistically darkening the whole shading.

 

Keywords :

diffuse, diffuse rough, rough, arnold, arnoldrender, arnoldrenderer, shader, material, reflect, reflection, microfacet, arnold shaders, arnold download, arnold materials, arnold renderer materials

 
