Advanced Lighting Techniques Dan Baker (Meltdown 2005)


Advanced Lighting Techniques
Dan Baker
Software Design Engineer
Microsoft Corporation

Review: The BRDF

• Bidirectional Reflectance Distribution Function
• Ratio of reflected radiance to incident irradiance

[Diagram: incident direction ωi, excitant direction ωe, and differential solid angle dωi at a surface point]

The BRDF

• Incident and excitant directions are defined in the surface's local frame (a 4D function)

[Diagram: local frame with normal N and tangent T, directions ωi and ωe, elevation angles θi, θe and azimuth angles φi, φe]

The BRDF

• Isotropic BRDFs are 3D functions

[Diagram: directions ωi and ωe with elevation angles θi, θe and a single azimuthal difference φ]

The Reflection Equation

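In its standard form, the reflection equation integrates the BRDF against the incident radiance over the hemisphere above the surface:

L_e(\omega_e) = \int_{\Omega} f(\omega_i, \omega_e)\, L_i(\omega_i)\, \cos\theta_i \, d\omega_i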

The BRDF

• The laws of physics impose certain constraints on a BRDF
• To be physically plausible, it must be (both constraints are written out below):
  • Reciprocal
  • Energy-conserving
• You are already using a BRDF, even if you don't know it:
  • Phong
  • Blinn-Phong
  • Diffuse
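For reference, the two plausibility constraints can be written as (this formulation is standard, not taken from the slide):

f(\omega_i, \omega_e) = f(\omega_e, \omega_i) \qquad \text{(reciprocity)}

\int_{\Omega} f(\omega_i, \omega_e)\, \cos\theta_e \, d\omega_e \le 1 \quad \text{for all } \omega_i \qquad \text{(energy conservation)}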

How Do We Evaluate a Lighting Model?

• Real-time graphics technology is based on rasterization
• Currently, our pipeline consists of triangles, which get processed by a vertex shader and then rasterized
• The rasterization step produces pixels (aka fragments), which are then processed and placed into a render target

A Modern Real Time Graphics System

[Diagram: Input Assembler → Vertex Shader → Setup/Rasterizer → Pixel Shader → Output Merger. Memory resources: vertex buffer, index buffer, texture (with sampler), constants, render target, depth/stencil. Vertex and pixel shader stages are programmable; the rest are fixed function]

Tomorrow's Graphics System

[Diagram: the same pipeline with a Geometry Shader stage between the Vertex Shader and the Setup/Rasterizer, able to stream out to a buffer in memory. Each programmable stage has its own texture, sampler, and constant bindings]

Where We Have Control

• A shader controls each programmable stage
• With some restrictions, we can perform any computation we want
• A reflectance model can be evaluated at any of these stages, or any combination of them
• Historically, computations were done at the vertex level, but they have steadily moved into pixels
• Soon, we will be able to shade per triangle

Frequency of Evaluation

• Generally, vertex evaluation is the lowest frequency, while pixel evaluation is at a higher frequency
• Not the case at low resolutions
• Small (sliver) triangles may not be rendered by real-time hardware
• Geometry aliasing is avoided by not drawing sliver triangles

An Example BRDF

• In HLSL, the Blinn-Phong model looks like:

float3 BlinnPhong(float3 L, float3 V)
{
    // N, Ks, Kd and Power are shader constants; in tangent space N = float3(0,0,1)
    float3 H = normalize(V + L);
    float3 C = Ks * pow(saturate(dot(H, N)), Power) + Kd * saturate(dot(N, L));
    return C;
}

Gamma Space

• BRDFs operate in linear space, not gamma space
• Most albedo textures are authored in gamma space, so they must be converted into linear space in the shader (and the result converted back into gamma space); a sketch follows below
• Can use sRGB sampling to convert gamma albedo textures into linear space; this gives more precision where it is needed and acts as a compression scheme
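A minimal sketch of the round trip, assuming the common 2.2 approximation and hypothetical AlbedoSampler, Ks, Kd and Power declarations (hardware sRGB sampling would replace the first pow):

sampler2D AlbedoSampler;   // assumed: gamma-space albedo texture
float3 Ks, Kd;             // assumed material constants
float  Power;

float3 ShadeLinear(float2 uv, float3 N, float3 L, float3 V)
{
    float3 albedo = pow(tex2D(AlbedoSampler, uv).rgb, 2.2);    // gamma -> linear
    float3 H = normalize(V + L);
    float3 C = Ks * pow(saturate(dot(N, H)), Power)
             + albedo * Kd * saturate(dot(N, L));               // evaluate the BRDF in linear space
    return pow(C, 1.0 / 2.2);                                   // linear -> gamma for display
}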

Gamma Correcting

• If we render straight to the screen and the backbuffer isn't linear (the usual case), we need to go back into gamma space
• sqrt is a close approximation

float3 BlinnPhong(float3 L, float3 V)
{
    float3 H = normalize(V + L);
    float3 C = Ks * pow(saturate(dot(H, N)), Power) + Kd * saturate(dot(N, L));
    C = sqrt(C);   // cheap linear -> gamma approximation
    return C;
}

Dynamic Range

• BRDFs can have a wide range of color intensities
• An 8-bit-per-channel backbuffer does a poor job of capturing that range
• It is easy for information to be lost, e.g. clamped to the maximum value, or stored with too little precision

Blooming/Tone Mapping

• Bright pixels bleed into their neighbors
• Exposure control
• If these are used, the BRDF result is stored in a render target and an image-space filter is applied (see the sketch below)
• Likely the de facto standard in the future
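A minimal sketch of such an image-space pass, assuming a hypothetical SceneSampler bound to an HDR render target and an Exposure constant; the exact curve is a placeholder, not the talk's:

sampler2D SceneSampler;   // assumed: HDR scene render target
float Exposure;           // assumed exposure constant

float4 ToneMapPS(float2 uv : TEXCOORD0) : COLOR0
{
    float3 hdr    = tex2D(SceneSampler, uv).rgb;
    float3 mapped = 1.0 - exp(-hdr * Exposure);   // simple exposure curve
    return float4(sqrt(mapped), 1.0);             // cheap linear -> gamma, as on the earlier slides
}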

How Do We Evaluate a BRDF?

• Direct evaluation: make an ALU program on the GPU
• Texture evaluation: perform a texture lookup
• A combination of the two: factor some things into textures, do the rest with ALU

How Do We Represent Material Variation?

• Microvariation is important
• We want more than just a single BRDF for a surface; the BRDF must change across the surface
• Bump mapping is a coarse variation

Which Reflectance Model to Use?

• Different reflectance models have different costs
  • Cost: data
  • Cost: ALU
  • Cost: time to make content
  • Cost: accuracy
• Consider: there is a strong trend toward more ALU, less data

BRDF Costs, Minimal Surface Variation

Model                           Texture cost   ALU cost
Blinn-Phong direct              0              7
Blinn-Phong factored            1              2
Banks direct                    0              12
Banks factored                  1              5
Ashikhmin-Shirley               0              40
Ashikhmin-Shirley factored      4              10
Lafortune direct                0              10 + 5*n (n lobes)
Cook-Torrance                   0              35

BRDF Costs With Surface Variation

Model                           Texture cost   ALU cost
Blinn-Phong direct              1              15
Blinn-Phong factored            2              10
Banks direct                    1              25
Banks factored                  2              18
Ashikhmin-Shirley               2              50 (60)*
Ashikhmin-Shirley factored      6              30
Lafortune direct                2              30 + 5*lobes
Cook-Torrance                   1              40

chart courtesy of Addy Ngan, Frédo Durand, and Wojciech Matusik

Accuracy of Blinn-Phong

chart courtesy of Addy Ngan, Frédo Durand, and Wojciech Matusik

Accuracy of Ashikhmin-Shirley

Getting Variation

• Could sample real materials to generate a BRDF map [McAllister SBRDF]
• But variation is usually authored by an artist
• Sometimes we get variation by using an understanding of the mesogeometry
• Sometimes it is done by changing the assumptions about the microgeometry

Bump Mapping

Pixel Level Evaluation

Pixel Level Evaluation, Shift

Shifting Samples

• The sample points change drastically as the triangle moves and changes size
• If using micro variation, e.g. loading data elements from a texture, the evaluation can be significantly different
• Texture filters solve this problem for linear data elements
• Linear filtering handles temporal changes, MIP mapping handles resolution changes

But…

• Texture filters are linear
• BRDFs are usually non-linear
• We get some image stability, but not great results
• Must at least prevent radical shifts in the image
• Often, variation must be mitigated
• Would be nice to shade texels instead

What Does a Sample Point Mean?

What happens when this pixel is evaluated? Given a set of parameters interpolated from the triangle's 3 vertices, how do we go about generating a color?

Texture Filtering Review

[Diagram: bilinear filtering. With fractional weights A and B,
Sample Color = A*B*c00 + A*(1-B)*c01 + (1-A)*B*c10 + (1-A)*(1-B)*c11,
where c00…c11 are the four neighboring texels]

Texture swimming

What about lower resolutions?

A common hack: Level of Detail

[Figure: MIP level decreasing with distance]

By lowering the amplitude of the displacement, we effectively decrease to a less complex model as the model moves further away. This will prevent aliasing, but isn't accurate.

Scale Independent Lighting

• Ideally, the size in pixels of an object on the screen should not affect its overall appearance
• High-frequency detail should disappear
• Global effects should stay the same
• This is the reasoning behind MIP mapping
• We shouldn't see drastic changes in the image as sample points change

A low-resolution rendering of an object should look like a scaled-down version of the high-resolution one.

Scale independent lighting

MIP Mapping

[Diagram: MIP chain]

MIP Mapping for Diffuse

• For a simple diffuse case, the lighting calculation can be approximately refactored
• The per-texel normal integration can then be replaced by a MIP-map-aware texture filter (a shader sketch follows below)

\sum_i w_i \,(N_i \cdot L)\, A_i \;\approx\; \Big(\big(\textstyle\sum_i w_i N_i\big)\cdot L\Big)\Big(\textstyle\sum_i w_i A_i\Big)

L(N,T) \;\approx\; \mathrm{dot}\big(L,\ \mathrm{tex2D}(\mathrm{normalmap},\, t)\big)\ \cdot\ \mathrm{tex2D}(\mathrm{colormap},\, t)
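A minimal sketch of the refactored diffuse term, with hypothetical sampler names; the single fetches stand in for the per-texel sums because the texture filter is MIP-aware:

sampler2D NormalMapSampler;   // assumed: MIP-filtered normal map
sampler2D ColorMapSampler;    // assumed: MIP-filtered albedo map

float3 FilteredDiffuse(float2 t, float3 L)
{
    // the filtered normal is deliberately left unnormalized,
    // matching the linear approximation above
    float3 n = tex2D(NormalMapSampler, t).xyz * 2 - 1;
    float3 a = tex2D(ColorMapSampler, t).rgb;
    return saturate(dot(L, n)) * a;
}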

Non-Linear Lighting Models

\sum_i w_i \,(N_i \cdot H)^{p} \;\neq\; \Big(\big(\textstyle\sum_i w_i N_i\big)\cdot H\Big)^{p}

\sum_i w_i \,(N_i \cdot H)^{p} \;\neq\; \big(\mathrm{tex2D}(\mathrm{normalmap},\, t)\cdot H\big)^{p}

Blinn-Phong isn't linear!

Texture Space Lighting

• Rasterize using texture coordinates as positions (see the vertex shader sketch below)
• Equivalent to explicitly evaluating the summation
• The object needs a texture atlas
• The values in texture space are now colors, and colors are linear
• Create a MIP chain explicitly or through the automatic MIP generation option
• Render the object with this MIP chain
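A minimal sketch of the texture-space pass, with hypothetical names: the vertex shader simply remaps the mesh's UVs into clip space, so the pixel shader (the lighting shader shown later) runs once per texel of the atlas:

void TextureSpaceVS(float2 uv        : TEXCOORD0,
                    float3 normal    : NORMAL,
                    out float4 pos   : POSITION,
                    out float2 outUV : TEXCOORD0,
                    out float3 outN  : TEXCOORD1)
{
    // remap [0,1] texture coordinates to [-1,1] clip space (flip Y)
    pos   = float4(uv * float2(2, -2) + float2(-1, 1), 0, 1);
    outUV = uv;
    outN  = normal;
}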

Texture Space Lighting

Rasterize triangles using texture coordinates as positions: the left image is the normal sampled at each point, and the right image is the computed lighting.

Texture Space Lighting

We paste the texture onto the object.

TSL: A Case Study

• Texture Space Lighting (TSL) allows us to have high-frequency lighting on fast-moving objects
• We will only have the aliasing problems associated with standard filtering techniques
• A roadway is a perfect candidate: subtle specular reflections with a high degree of motion

A roadway, snapshots

We can see Moiré patterns on any filtered non-linear function. Additionally, temporal aliasing becomes problematic.

Using TSL

The artifacts are largely mitigated if we render the non-linear function and MIP-reduce it. We can also see more high-frequency detail.

How This Demo Works

• Each road segment is rendered into a 1024x1024 render target in texture space; alpha is set to 0
• AUTOMIPGEN is then used to do a simple MIP filter of the highest-level rendering
• The model is then lit with the prelit, MIP-filtered texture and rendered with full anisotropic filtering
• Art content is simple: just an albedo texture and a normal map with a gloss term
• Everything is done in linear space

The Shader

void lightPositional( … )
{
    // norm, light and the half vector vHalf are in tangent space
    // power, Kd, Ks and norm come from textures
    …

    // normalized Blinn-Phong
    float fS = pow(saturate(dot(norm, vHalf)), power) * power;
    float fD = saturate(dot(norm, light)) * diffuseIntensity;

    vColorOut.xyz = fS * Ks + fD * Kd;
    vColorOut.a   = 1;

    // put us in gamma space for the direct rendering only
    if (bTextureView)
        vColorOut = sqrt(vColorOut);
}

Pasting the Texture onto the Scene

void light_from_texture( float2 vTexCoord      : TEXCOORD0,
                         out float4 vColorOut  : COLOR0 )
{
    // this is done with full anisotropy turned on
    // and with SRGBTexture set to true
    vColorOut = tex2D(LightMapSampler, vTexCoord);

    // for atlased objects, we do not want to blend in
    // unrendered pixels
    if (vColorOut.a > 1e-5)
        vColorOut.xyz /= vColorOut.a;

    // put us in gamma space
    vColorOut = sqrt(vColorOut);
}

Drawing Polygons on the Backside

// preliminary (2005) geometry shader syntax; project() is a helper
// that maps a vertex into 2D texture space
[emittype[triangle MyVType]]
[maxvertexcount[3]]
void ClipGeometryShader(triangle MyVType TextureTri[3])
{
    float2 coord1 = project(TextureTri[0]);
    float2 coord2 = project(TextureTri[1]);
    float2 coord3 = project(TextureTri[2]);

    float3 Vec1 = float3(coord3 - coord1, 0);
    float3 Vec2 = float3(coord3 - coord2, 0);

    // winding in texture space tells us which side of the object we are on
    float Sign = cross(Vec1, Vec2).z;
    if (Sign > -EPSILON)
    {
        emit(TextureTri[0]);
        emit(TextureTri[1]);
        emit(TextureTri[2]);
        cut;
    }
}

Problems with TSL

• The main problem is performance
• We can't render every object on the screen at a fixed resolution and scale down
• We need a way to render the object at a reduced resolution that looks close to the higher-detail one when zoomed out
• Basically, we need to know how to create non-linear MIP chains

MIP-Mapping Reflectance Models

MIP-Mapping Reflectance Models

• Commonly thought of as the normal map filtering problem
• Some common techniques involve first MIP mapping the height map and then creating a normal map from it, or fading the normal to (0,0,1)
• All of these techniques are grossly inaccurate; in fact, the normal is the least important aspect of a correct MIP filter
• Only NDM (Olano and North) filters linearly

A Little Bit of Microfacet Theory

• The surface is modeled as flat microfacets
• Each microfacet is a Fresnel mirror
• Most BRDFs are microfacet models
• Normal Distribution Function (NDF) p(ω)
• p(ω)dω is the fraction of facets whose normals lie in the solid angle dω around ω
• ω is defined in the surface's local frame
• There are various different options for p(ω)
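Though the slides do not write it out, the standard microfacet (Cook-Torrance style) combination of the NDF p(ω_h), a shadowing/masking term G, and the Fresnel reflectance R_F is:

f(\omega_i, \omega_e) \;=\; \frac{p(\omega_h)\, G(\omega_i, \omega_e)\, R_F(\alpha_h)}{4\,\cos\theta_i\,\cos\theta_e}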

Microfacet Theory

• For given ωi and ωe, only the facets oriented to reflect ωi into ωe are active
• These facets have normals at ωh

[Diagram: microfacets whose normals ωh reflect ωi into ωe]

Microfacet Theory

• The fraction of active microfacets is p(ωh)dω
• Active facets have reflectance RF(αh)
• αh is the angle between ωi (or ωe) and ωh

[Diagram: microfacets with normals ωh, incident directions ωi and excitant directions ωe]

So What Does a MIP Map Mean Anyway?

• Non-linear lighting functions fundamentally represent microfacets
• Specifically, they represent a distribution of microfacets
• A MIP map is a lower-resolution image which best represents the higher-resolution one
• But how does this work if we have a bunch of little mirrors?

A Simple Non-Linear Function

The Blinn-Phong lighting model is one of the most common half-angle based lighting functions. The version below is normalized, that is, the total emitted energy is close to constant for any power. The model is parameterized by N, Kd, Ks, and p; changing these values changes the properties of the underlying surface.

f(L,V) \;=\; k_d\,(N\cdot L) \;+\; k_s\, p\,(N\cdot H)^{p}, \qquad H = \mathrm{normalize}(L+V)

A common approach

The image on the left is lit with MIP maps fading to flat, while the image on the right is what the image would look like if rendered at a higher resolution and scaled down. The objects do not physically look the same.

A more correct MIP map

The images in the middle were rendered directly to the screen using a BRDF approximating MIP map. The more we zoom out, the larger the difference between the correct and incorrect approaches

Examining a MIP Level – Simple

[Figure: a 2x2 patch of texels, all with power 32]

Next MIP Level – same thing: 32

Examining a MIP Level

[Figure: a 2x2 patch of texels with powers 8, 45, 16, 25]

Next MIP Level: 25 – the average power?

Examining a MIP Level

[Figure: a 2x2 patch of texels, all with power 32, with their normals drawn]

Next MIP Level – average power, average normal?

What about a larger patch and a lower MIP?

[Figure: a larger patch collapsing to a single texel, power 5]

Setting Up the Problem

• Forgetting for a second about texture filtering, each MIP level should approximate the signal in the higher-level texture
• For a BRDF model, this means that the lower MIP levels should approximate the more detailed version
• In other words, we want the BRDF at a different scale

A Simple BRDF

• To start, consider the specular part of the normalized Blinn-Phong BRDF
• For each texel there is a function F(V,L) which, given a view direction and a light direction, returns a color
• Each texel has a different N, Ks, and Power to get surface variation
• N, Ks, and Power are the parameters of a Blinn-Phong lighting model

Reflectance at a Single Texel

[Figure: MIP level 0 is a 2x2 patch of texels with (Ks = .02, P = 30), (Ks = .02, P = 30), (Ks = .02, P = 24), (Ks = .01, P = 30); MIP level 1 is a single texel with Ks = ???, P = ???, N = (?,?,?)]

For a given light and view direction, how do we pick a Ks, a Power and a Normal so that we get the same final color?

Solving for a Single Patch of Texels

F(L,V) \;=\; \frac{F_{11}(L,V) + F_{12}(L,V) + F_{21}(L,V) + F_{22}(L,V)}{4}

For a given patch of texels w wide and h tall:

F(L,V) \;=\; \frac{1}{wh}\sum_{i=0}^{w}\sum_{j=0}^{h} F_{ij}(L,V)

where

F_{ij}(L,V) \;=\; k_{ij}\, p_{ij}\,\big(N_{ij}\cdot H\big)^{p_{ij}}, \qquad H = \mathrm{normalize}(L+V)

We Care About All Pairs of L and V

[Diagram: light direction L and view direction V over the surface patch]

Creating an error function for a single pair of L and V:

e(N,k,p) \;=\; \Big( F(L,V) \;-\; \frac{1}{wh}\sum_{i=0}^{w}\sum_{j=0}^{h} F_{ij}(L,V) \Big)^{2}

where F(L,V) = k\,p\,(N\cdot H)^{p} is evaluated with the candidate MIP-level parameters N, k, p. We need to find the minimum error across all possible V and L:

E(N,k,p) \;=\; \int\!\!\int \Big( F(L,V) \;-\; \frac{1}{wh}\sum_{i=0}^{w}\sum_{j=0}^{h} F_{ij}(L,V) \Big)^{2} \, dL\, dV

Creating a MIP Filter

• We want to find values of N, k, and p such that E(N, k, p) is as small as possible
• This is a non-linear optimization problem
• There are many ways to fit non-linear problems; for this problem we chose BFGS, which is a quasi-Newton algorithm (sketched below)
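For reference (not spelled out in the talk), a quasi-Newton iteration on x = (N, k, p) takes steps of the form

x_{n+1} = x_n - \alpha_n\, B_n^{-1}\, \nabla E(x_n)

with BFGS updating the Hessian approximation B from the observed gradients:

B_{n+1} = B_n \;-\; \frac{B_n s_n s_n^{T} B_n}{s_n^{T} B_n s_n} \;+\; \frac{y_n y_n^{T}}{y_n^{T} s_n}, \qquad s_n = x_{n+1} - x_n, \quad y_n = \nabla E(x_{n+1}) - \nabla E(x_n)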

Non-Linear Fitting

• BFGS requires the partial derivatives of the error function with respect to each parameter we want to fit
• The derivatives must be continuous over the space we are interested in
• The double integral cannot be solved analytically, so it is evaluated numerically
• A good way to express the normal is as an X, Y pair with an implied Z (written out below), but the error function must take care to keep the normal normalized
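With that parameterization (a standard choice, not unique to the talk), the normal is reconstructed as

N(x,y) = \big(x,\; y,\; \sqrt{1 - x^2 - y^2}\,\big), \qquad x^2 + y^2 \le 1

which keeps N unit length by construction as long as the optimizer keeps (x, y) inside the unit disk.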

Giving a Good First Guess

• Non-linear algorithms work well when we don't have many local minima and we have a reasonable starting guess
• For Blinn-Phong, a seed of the averaged normal, the averaged Ks, and a low power (less than half of the previous level's) works well
• But the power should not be near or less than 1!

Non-Linear Fitting with Blinn-Phong

With the Blinn-Phong BRDF, the error plot shows that the function is mostly smooth. As long as we take a guess in the dark region, we should be fine. The graph is log(1 + error).

[Plot: error as a function of power and Ks]

Speeding It Up

• Running BFGS with that error function took a few hours on a 512x512 texture
• It gives a good fit, but is still too slow
• However, the function can be refactored in terms of the half angle
• This reduces the dimensionality from 4 to 2!
• Additionally, the summation in the middle of the integral is constant for a given H, so it can be precomputed and reused, assuming enough memory

Speeding It Up

E(N,k,p) \;=\; \int \Big( F(H) \;-\; \frac{1}{wh}\sum_{i=0}^{w}\sum_{j=0}^{h} F_{ij}(H) \Big)^{2} dH

But we must use the half-angle distribution.

Results

• The resulting MIP level looks closer to the original detail
• It turns out that the optimal normal is usually pretty close to the average normal; you can get away with minimizing just P and Ks
• The power decreases rapidly for a rough surface: each MIP level has about 50-70% of the exponent of the previous level!
• An object with a shiny, rough surface will appear dull as the object gets further away

Dealing with Anisotropy

• Using a simple Blinn-Phong BRDF doesn't allow anisotropic effects to be captured at lower MIP levels
• A simplified, half-angle version of the Ashikhmin-Shirley BRDF can do this
• We now have E(N, T, Ks, Pu, Pv); that is, there are two powers over the tangent frame, and the tangent frame might vary per pixel
• Anisotropic objects, e.g. a record with grooves in it, should stay anisotropic

Why Ashikhmin-Shirley?

• Convergence is generally better compared to other BRDFs
• It is half-angle based, so MIP evaluation is still fast
• It is data-light and computation-heavy
• It can exactly represent Blinn-Phong

Anisotropy

• For more complex BRDFs, one should take care to ensure the function is well behaved at limit points

F(H) \;=\; K_s \sqrt{P_u P_v}\; (N\cdot H)^{e(H)}, \qquad
e(H) \;=\; \frac{P_u\,(T_u\cdot H)^2 + P_v\,(T_v\cdot H)^2}{1 - (N\cdot H)^2}

\lim_{(N\cdot H)\to 1} \big[ P_u\,(T_u\cdot H)^2 + P_v\,(T_v\cdot H)^2 \big] \;=\; 0

Since (T_u\cdot H)^2 + (T_v\cdot H)^2 = 1 - (N\cdot H)^2 in an orthonormal frame, both the numerator and the denominator of the exponent vanish as H approaches N, so that limit point must be handled explicitly.

A Side Note

• Artists can create extremely high-resolution height maps, modeling the micro-geometry aspects of a surface
• Brush strokes, grooves, scratches, etc. would just appear on the high-res height map
• A fitting algorithm could then capture the various aspects: roughness, anisotropy, etc.

Solution for High Quality Rendering

1) Determine the object's maximum MIP level
2) Render from that MIP level downward, culling off unseen polygons
3) Compute the lower MIP levels
4) Draw the object with the lit texture, using the best filter possible*
5) At lower MIP levels, once texel variation is minimal, disable TSL

*Optionally blend MIP transitions to eliminate any popping

System Considerations

• We don't need a texture for each object, just a set of textures which we recycle
• We can reduce the render resolution to adjust the framerate

Final Thoughts on Fitting

• More research to be done on BRDF fitting
• Untackled issues: more anisotropic fitting, and the need to look at more data
• Optimizing for bilinear reconstruction would also help direct rasterization, at the expense of removing some detail

More Thoughts

• Even if you don't use texture space lighting, fitting non-linear functions to create MIP maps should be done
• It's free at runtime!
• Texture Space Lighting is an easily scalable feature; it can be enabled on newer hardware
• Because TSL is computed in point-sampling mode and pasted once, we can invest resources in a much better reconstruction filter, e.g. bicubic with better anisotropy

References - Background

• Introduction To Modern Optics, 2nd ed., Fowles (Dover)
• Principles of Digital Image Synthesis, Glassner (Morgan Kaufmann)
• The Physics and Chemistry of Color, 2nd ed., Nassau (Wiley)
• Geometric considerations and nomenclature for reflectance, Nicodemus et al. (NBS 1977)
• A practitioners' assessment of light reflection models, Shirley et al. (Pacific Graphics 1997)
• A survey of shading and reflectance models for computer graphics, Schlick (Computer Graphics Forum)
• Illumination for computer generated pictures, Phong (Communications of the ACM)
• Models of light reflection for computer synthesized pictures, Blinn (SIGGRAPH 1977)
• A reflectance model for computer graphics, Cook and Torrance (ACM ToG)
• Measuring and modeling anisotropic reflection, Ward (SIGGRAPH 1992)
• An anisotropic Phong BRDF model, Ashikhmin and Shirley (JGT)
• Non-linear approximation of reflectance functions, Lafortune et al. (SIGGRAPH 1997)
• Generalization of Lambert's reflectance model, Oren and Nayar (SIGGRAPH 1994)
• Predicting reflectance functions from complex surfaces, Westin et al. (SIGGRAPH 1992)
• A microfacet-based BRDF generator, Ashikhmin et al. (SIGGRAPH 2000)
• Illumination in diverse codimensions, Banks (SIGGRAPH 1994)
• An accurate illumination model for objects coated with multilayer films, Hirayama et al. (Eurographics 2000)
• Diffraction shaders, Stam (SIGGRAPH 1999)
• Simulating diffraction, Stam, in GPU Gems (Addison-Wesley)

References - Techniques

• Experimental validation of analytical BRDF models, Ngan et al. (SIGGRAPH 2004 Sketch)
• Normal distribution functions and multiple surfaces, Fournier (Graphics Interface 1992)
• From structure to reflectance, Fournier and Lalonde, in "Cloth Modeling and Animation" (AK Peters)
• Efficient rendering of spatial bi-directional reflectance distribution functions, McAllister et al. (Graphics Hardware 2002)
• An efficient representation for irradiance environment maps, Ramamoorthi and Hanrahan (SIGGRAPH 2001)
• Precomputed radiance transfer for real-time rendering in dynamic, low-frequency lighting environments, Sloan et al. (SIGGRAPH 2002)
• The plenoptic function and the elements of early vision, Adelson and Bergen, in Computational Models of Visual Processing (MIT Press)
• Steerable illumination textures, Ashikhmin and Shirley (ToG)
• Polynomial Texture Maps, Malzbender et al. (SIGGRAPH 2001)
• Interactive subsurface scattering for translucent meshes, Hao et al. (Si3D 2003)
• Clustered principal components for precomputed radiance transfer, Sloan et al. (SIGGRAPH 2003)
• All-frequency precomputed radiance transfer for glossy objects, Liu et al. (EGSR 2004)
• Triple product wavelet integrals for all-frequency relighting, Ng et al. (SIGGRAPH 2004)
• All-frequency shadows using non-linear wavelet lighting approximation, Ng et al. (SIGGRAPH 2003)
• The irradiance volume, Greger et al. (IEEE CG&A, Mar. 1998)
• Spherical harmonic gradients for mid-range illumination, Annen et al. (EGSR 2004)
• Ambient occlusion fields, Kontkanen and Laine (Si3D 2005)
• Normal Distribution Mapping, Olano and North (Tech Report, 1996)

Acknowledgements

• Shanon Drone, John Rapp, Jason Sandlin and John Steed for demo help
• Michael Ashikhmin, Henrik Wann Jensen, Kazufumi Kaneda, Eric Lafortune, David McAllister, Addy Ngan, Michael Oren, Kenneth Torrance, Gregory Ward, and Stephen Westin for permission to use images
• Naty Hoffman for use of his background slides

© 2005 Microsoft Corporation. All rights reserved. Microsoft is a registered trademark of Microsoft Corporation in the United States and/or other countries. The names of actual companies and products mentioned herein may be the trademarks of their respective owners.
