# Computer Graphics Module Review (CO2409 Computer Graphics)

Date posted: 02-Jan-2016



### Lecture Contents

- 2D Graphics & Geometry
- 3D Geometry & Maths
- Rendering Pipeline Key Concepts
- Programmable Pipeline / Shaders
- Depth / Stencil Buffers & Shadows
- Animation

### Pixels & Colour

- A computer display is made of a grid of small rectangular areas called pixels
- Pixel colour is usually specified as red, green and blue components:
  - Integers (0-255) or floats (0.0 to 1.0)
  - The RGB colour space is a cube
- Another colour space is HLS:
  - Hue = colour from the spectrum, Lightness = brightness of the colour, Saturation = intensity of the colour
  - Can be pictured as a double cone
  - More intuitive for artists

### Bitmaps / Sprites / Alpha Channels

- A bitmap is a rectangle of pixels stored off-screen for use in an image
- A sprite describes a particular use of bitmaps:
  - When they are used as distinct elements in a larger scene

- As well as RGB colours, we can store per-pixel values specifying transparency:
  - Alpha data, or the alpha channel, making RGBA
- Can use alpha to blend pixels onto the viewport:
  - FinalColour = Alpha * SourceColour + (1 - Alpha) * ViewportColour
- Alpha can also be used for alpha testing:
  - Used for cutout sprites
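The alpha-blend equation above can be sketched in C++ (a minimal illustration; the `Colour` struct and function name are my own, not from the module):

```cpp
struct Colour { float r, g, b; };   // components in the 0.0 to 1.0 range

// FinalColour = Alpha * SourceColour + (1 - Alpha) * ViewportColour
Colour alphaBlend(const Colour& source, const Colour& viewport, float alpha)
{
    return { alpha * source.r + (1.0f - alpha) * viewport.r,
             alpha * source.g + (1.0f - alpha) * viewport.g,
             alpha * source.b + (1.0f - alpha) * viewport.b };
}
```

With alpha = 1 the source pixel fully replaces the viewport pixel; with alpha = 0 the viewport pixel is unchanged; values in between mix the two.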

### Further Blending

Other ways of blending pixels onto the viewport:

- The multiplicative blending equation is:
  - FinalColour = SourceColour * ViewportColour
  - A darkening effect, suitable for representing glass, shadows, smoke etc.
- The additive blending equation is:
  - FinalColour = SourceColour + ViewportColour
  - A lightening effect, mainly used for representing lights
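The two blend equations can be sketched the same way, per colour component (one flagged assumption: the additive result is clamped to 1.0 here, mirroring the saturation that happens in an 8-bit render target; the slide itself does not mention clamping):

```cpp
#include <algorithm>

// FinalColour = SourceColour * ViewportColour (darkening)
float multiplicativeBlend(float source, float viewport)
{
    return source * viewport;
}

// FinalColour = SourceColour + ViewportColour (lightening)
// Clamping to 1.0 is an added assumption, not part of the slide equation
float additiveBlend(float source, float viewport)
{
    return std::min(source + viewport, 1.0f);
}
```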

### Basic Geometry Definitions

In both 2D and 3D we have used these definitions:

- A vertex: a single point defined by coordinates on the axes of a coordinate system, e.g. A(10, 20)
- An edge: a straight line joining two vertices, e.g. AB
- A vector: a movement within a coordinate system, e.g. AB (from A to B) or V(5, 10)
- A normal: any vector whose length is 1
- A polygon: a single closed loop of edges, e.g. ABC (the edge from C to A is implied)
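A quick C++ sketch of these definitions (the struct and function names are mine):

```cpp
#include <cmath>

struct Vec2 { float x, y; };   // a vertex or a vector, depending on use

// The vector AB: the movement from vertex a to vertex b
Vec2 vectorBetween(const Vec2& a, const Vec2& b)
{
    return { b.x - a.x, b.y - a.y };
}

float length(const Vec2& v)
{
    return std::sqrt(v.x * v.x + v.y * v.y);
}

// Normalising scales a vector to length 1 (a "normal" in the sense above)
Vec2 normalise(const Vec2& v)
{
    float len = length(v);
    return { v.x / len, v.y / len };
}
```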

### Coordinate Systems

- A coordinate system is a particular set of choices for:
  - The location of the origin
  - The orientation and scale of the axes
- We later called this a space, e.g. model space, world space
- A vertex will have different coordinates in two different coordinate systems
- The viewport is a particular coordinate system that corresponds to the visible display area

### Rendering

- Rendering is converting geometry into pixels
- In general rendering is a two-stage process:
  1. Convert the geometry into 2D viewport space (geometry transformation / vertex shaders)
  2. Set the colour of the pixels corresponding to this converted geometry (rasterising / pixel shaders)
- Looked briefly at 2D rendering:
  - Render 2D lines by stepping through the pixels
  - Render polygons with multiple lines
  - Render circles with an equation + many lines
  - Detail about filling polygon pixels is beyond the scope of the module
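"Stepping through the pixels" of a 2D line can be sketched with the classic DDA approach (my own minimal version, not code from the module):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdlib>
#include <utility>
#include <vector>

// Walk from (x0,y0) to (x1,y1) in equal steps, rounding to the nearest pixel
std::vector<std::pair<int, int>> ddaLine(int x0, int y0, int x1, int y1)
{
    std::vector<std::pair<int, int>> pixels;
    int steps = std::max(std::abs(x1 - x0), std::abs(y1 - y0));
    if (steps == 0) { pixels.push_back({x0, y0}); return pixels; }

    float x = float(x0), y = float(y0);
    float dx = (x1 - x0) / float(steps);
    float dy = (y1 - y0) / float(steps);
    for (int i = 0; i <= steps; ++i)
    {
        pixels.push_back({ int(std::round(x)), int(std::round(y)) });
        x += dx;
        y += dy;
    }
    return pixels;
}
```

The longer axis advances one pixel per step while the shorter axis advances by a fraction, so the line has no gaps.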

### Maths/C++ for Graphics Apps

- Be aware of numeric limitations in C++, e.g.:
  - int limits can be exceeded
  - float / double have limited precision
  - Repeated calculations with float can build up errors
  - Other languages have similar limitations
- C++ automatically converts between numeric types, issuing warnings when it does:
  - Don't ignore these warnings; the conversion may not be what is required
- Several maths functions are used for graphics:
  - Max, min, remainders, modulus / absolute value, powers, cos, sin
  - Know the library functions used

### 3D Geometry: Meshes / Normals

- A mesh is a set of polygons making up a 3D object
- A mesh is usually defined together with a set of face and/or vertex normals
- A face normal is a normalised vector at right angles to a polygon:
  - Used to specify the plane of the polygon
  - But not especially common
- A vertex normal can be the average of all the face normals of the polygons containing that vertex:
  - Or a vertex can have multiple normals, for sharp edges
  - Used frequently, most notably for lighting
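Averaging face normals into a vertex normal might be sketched like this (struct and function names are mine):

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Average the face normals of the polygons sharing a vertex, then
// re-normalise, since the average is usually shorter than length 1
Vec3 vertexNormal(const std::vector<Vec3>& faceNormals)
{
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (const Vec3& n : faceNormals)
    {
        sum.x += n.x;
        sum.y += n.y;
        sum.z += n.z;
    }
    float len = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
    return { sum.x / len, sum.y / len, sum.z / len };
}
```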

### Matrices

- A matrix (plural matrices) is a rectangular table of numbers:
  - They have special rules of arithmetic
- A coordinate system matrix is used to represent a model's position/orientation
- Transformation matrices are used to convert between spaces, or to move/orient models:
  - E.g. the world matrix converts from model space to world space
  - Basic transforms: translation, rotation, scale
- Matrices are of central importance to 3D graphics:
  - There will always be exam questions on matrices
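As a sketch of the idea, here is a 4x4 translation matrix transforming a point, using the row-vector convention DirectX uses (the helper names are mine):

```cpp
#include <array>

struct Vec3 { float x, y, z; };
using Mat4 = std::array<std::array<float, 4>, 4>;

// Row-vector (DirectX-style) translation matrix: the offset sits in the bottom row
Mat4 translationMatrix(float tx, float ty, float tz)
{
    return {{ {{1, 0, 0, 0}},
              {{0, 1, 0, 0}},
              {{0, 0, 1, 0}},
              {{tx, ty, tz, 1}} }};
}

// Treat the point as the row vector (x, y, z, 1) and multiply: p' = p * M
Vec3 transformPoint(const Vec3& p, const Mat4& m)
{
    return { p.x * m[0][0] + p.y * m[1][0] + p.z * m[2][0] + m[3][0],
             p.x * m[0][1] + p.y * m[1][1] + p.z * m[2][1] + m[3][1],
             p.x * m[0][2] + p.y * m[1][2] + p.z * m[2][2] + m[3][2] };
}
```

The implicit fourth coordinate of 1 is what lets a matrix multiplication express a translation as well as rotation and scale.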

### DirectX / Rendering Pipeline

- Graphics APIs perform a pipeline of operations (this is the DirectX 10 pipeline):
  - Input is 3D model geometry data
  - Geometry is stored as lists of vertex data, customised for different techniques
  - Output is viewport pixels
- Pipeline process:
  1. Convert to world then viewport space, applying lighting
  2. Scan through the resultant 2D polygons, one pixel at a time
  3. Render pixels using light colours, textures and blending

### World Matrix

- Mesh vertices are stored in model space:
  - The local space for the mesh, with a convenient origin and orientation
- For each model we store a world matrix that transforms the model geometry into world space:
  - This defines the position and orientation of the model
  - It has a special form containing the local axes of the model

### View Matrix

- For each camera we store a view matrix that transforms world space geometry into camera space:
  - Camera space defines the world as seen by the camera
  - X right, Y up and Z in the direction the camera is facing
- For technical reasons this matrix is actually the inverse of a normal world matrix:
  - But it has a similar form and can be used in a similar way

### Projection Matrix

- Cameras also have a second matrix, the projection matrix:
  - It defines how camera space geometry is projected into 2D
  - It includes the field of view and viewport distance of the camera
  - The viewport distance is also called the near clip distance
- This matrix flattens camera space geometry into 2D viewport space
- The projected 2D geometry is then scaled/offset into pixel coordinates
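The flatten-then-scale idea can be sketched with a simple pinhole projection (this illustrates the principle only, not the actual DirectX projection matrix; the names are mine):

```cpp
struct Vec2 { float x, y; };

// Flatten a camera-space point into 2D: similar triangles give
// x' = d * x / z, where d is the viewport (near clip) distance
Vec2 projectToViewport(float x, float y, float z, float viewportDistance)
{
    return { viewportDistance * x / z, viewportDistance * y / z };
}

// Scale/offset a projected point in the -1..1 range into pixel coordinates,
// flipping Y because pixel coordinates grow downwards
Vec2 toPixelCoords(const Vec2& p, int width, int height)
{
    return { (p.x + 1.0f) * 0.5f * width,
             (1.0f - p.y) * 0.5f * height };
}
```

The division by z is what makes distant geometry appear smaller.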

### Lighting / Shading

- Geometry colour can come from:
  - Vertex or face colours and/or dynamic lighting
- Two shading modes are used: hard or smooth edges
- 3 basic light types: directional, point and spot
- Lighting is calculated by combining:
  - Ambient light: global background illumination
  - Diffuse light: direct lighting
  - Specular light: reflection of the light source (highlights)
- The lighting equations are often examined
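The diffuse term of the lighting equations can be sketched as the standard Lambert dot product (a minimal version showing the shape of the calculation; specular is omitted here):

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Diffuse lighting: brightness depends on the angle between the surface
// normal and the direction to the light; clamped to 0 for surfaces
// facing away from the light (both vectors assumed unit length)
float diffuse(const Vec3& normal, const Vec3& toLight)
{
    return std::max(0.0f, dot(normal, toLight));
}

// Combine with a constant ambient term: brightness = ambient + diffuse
float lit(float ambient, const Vec3& normal, const Vec3& toLight)
{
    return ambient + diffuse(normal, toLight);
}
```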

### Textures

- A texture is a bitmap wrapped around a model:
  - The wrapping is specified by assigning a texture coordinate (UV) to each vertex in the geometry
  - This is texture mapping
- The UVs for the texture range from 0 to 1:
  - UVs outside this range will be wrapped, mirrored, etc., depending on the texture addressing mode
- Each pixel in the bitmap appears as a square on the geometry, called a texel:
  - Textures and texels can be smoothed using texture filtering and mip-mapping
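Two of the addressing modes can be sketched directly (my own helper names; real hardware applies these per sample):

```cpp
#include <cmath>

// Wrap mode: the texture repeats, so only the fractional part of the UV matters
float wrapUV(float u)
{
    return u - std::floor(u);           // 1.25 -> 0.25, -0.25 -> 0.75
}

// Mirror mode: the texture alternates direction every repeat (period of 2)
float mirrorUV(float u)
{
    float m = std::fmod(u, 2.0f);
    if (m < 0.0f) m += 2.0f;            // bring negatives into the 0..2 range
    return m <= 1.0f ? m : 2.0f - m;    // the second half runs backwards
}
```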

### Vertex / Index Data

- Customised vertex data is stored in vertex buffers:
  - Coordinate (always), normal, UVs, colour etc.
- Can use vertex data alone to store geometry:
  - Each triplet of vertices is a triangle (triangle list)
- More efficient to use an additional index buffer:
  - Store the set of unique vertices only
  - Define the triangles using triplets of indices
- Can also use triangle strips:
  - The first triplet defines the first triangle
  - Each further vertex/index is used with the previous two to form a further triangle
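The strip rule above can be sketched by expanding a strip into a plain triangle list (one flagged assumption: alternate triangles have two indices swapped, which is how strips keep a consistent winding; the slide does not cover winding):

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Expand a triangle strip into a triangle list.
// Every index after the first two forms a triangle with the previous two;
// odd-numbered triangles swap two indices to keep the winding consistent
// (an assumption about the convention, not stated in the slides)
std::vector<std::array<int, 3>> stripToTriangleList(const std::vector<int>& strip)
{
    std::vector<std::array<int, 3>> triangles;
    for (std::size_t i = 2; i < strip.size(); ++i)
    {
        if ((i - 2) % 2 == 0)
            triangles.push_back({ strip[i - 2], strip[i - 1], strip[i] });
        else
            triangles.push_back({ strip[i - 1], strip[i - 2], strip[i] });
    }
    return triangles;
}
```

Note the efficiency gain: n strip indices yield n - 2 triangles, versus n / 3 for a plain triangle list.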

### Programmable Pipeline / Shaders

- Three pipeline stages can be programmed directly:
  - The vertex, geometry and pixel processing stages
  - We did not look at programmable geometry processing
- Programs are usually written in a high-level language (HLSL):
  - Called shaders: the vertex shader and the pixel shader
- We write a shader for every rendering technique needed:
  - Shaders can be compiled at runtime and are loaded as needed

### Vertex Shaders

- We define shaders in terms of:
  - Inputs, from previous stages
  - Outputs, to later stages
  - Shaders have a typical usage, but are actually very flexible
- Vertex shaders operate on each vertex in the original 3D geometry. Their typical usage is to:
  - Transform and project the vertex into viewport space
  - Perhaps apply animation or lighting to the vertex
- At a minimum, a vertex shader expects vertex coordinates as input, but may have other input too:
  - Normals, UVs, vertex colours, etc.
- A vertex shader must at the very least output a viewport position for the current vertex, although they often output much more

### Pixel Shaders

- Pixel shaders operate on each pixel in the final 2D polygons. Their typical usage is to:
  - Sample (and filter) any textures applied to the polygon
  - Combine the texture colour with the existing polygon colour (from lighting and/or geometry colours)
- Input for a pixel shader is usually the output from the vertex shader, after rasterisation
- A pixel shader must at least output a final pixel colour to be drawn/blended with the viewport

### Advanced Shaders

- Advanced shaders can be used to implement high-quality lighting and rendering techniques
- A key technique is per-pixel lighting:
  - Vertex lighting exhibits problems on large polygons
  - Instead, have the vertex shader pass the vertex position and normal on to the pixel shader
  - These are interpolated for the pixel shader, which then uses the normal lighting equations on them
- Have covered several other shader techniques:
  - Specular mapping, normal mapping, parallax mapping, cel shading

### Graphics Architecture

- The basic graphics architecture for all modern PCs and game consoles is similar
- Comprised of a main system and a graphics unit:
  - With one processor each (CPU & GPU)
  - Fast local RAM for each processor
  - The interface between these two systems is often slow
- The GPU is a dedicated graphics microprocessor:
  - Much faster than a CPU for graphics-related algorithms
  - The GPU runs at the same time as the CPU; these are concurrent systems

### Depth Buffers / Z-Buffers

- A depth buffer is a rectangular array of depth values that matches the back-buffer
- Used to sort rendering by depth
