Textures
Prof. João Madeiras Pereira
Prof. Alfredo Ferreira
IST 2009/2010
Objectives
• Introduce mapping methods
  • Texture mapping
  • Environment mapping
  • Bump mapping
  • Light mapping
• Consider two basic strategies
  • Manual coordinate specification
  • Two-stage automated mapping
Motivation:
(1) adding realism
• Objects rendered using the Phong reflection model and Gouraud or Phong interpolated shading often appear rather 'plastic' and 'floating in air'
• Texture effects can be added to give a more realistic-looking surface appearance
• Texture mapping
  • Texture mapping uses a pattern to be put on the surface of an object
• Light maps
  • Light maps combine texture and lighting through a modulation process
• Bump mapping
  • A smooth surface is distorted to get variation of the surface
• Environment mapping
  • Environment mapping (reflection maps) enables ray-tracing-like output
Motivation:
(2) adding surface detail
• The most obvious solution is not the best
  • Breaking the scene into smaller and smaller polygons increases the detail
  • But it is very hard to model and very time-consuming to render
• The preferred solution is texture mapping
• Examples
  • Model a t-shirt with a logo (1)
    • No need to model the letters and engine with triangles
    • Use a large base polygon
    • Color it with the photo
  • Subtle wall lighting (2)
    • No need to compute it at every frame
    • No need to model it with a lot of constant-color triangles
    • Paste a photograph on a large polygon
  • Non-planar surfaces also work (3)
    • Subdivide the surface into planar patches
    • Assign photo subregions to each individual patch
  • Examples of modulating color, bumpiness, shininess, and transparency with identical sphere geometry (4)
Textures: at what point do
things start to look real?
• Surfaces "in the wild" are very complex
• We cannot model all the fine variations
• We need to find ways to add surface detail. How?
[Figure: geometric model → geometric model + shading → geometric model + shading + textures]
Texture mapping, texture
pattern, and texels
• Developed by Catmull (1974), Blinn and Newell (1976), and others.
• Texture mapping adds surface detail by mapping texture patterns onto the surface.
• The pattern can be repeated. For example, the texture pattern for the cube shown here is the following:
• Texel: short for "texture element".
  • A texel is a pixel of a texture. For example, a 128x128 texture has 128x128 texels. On screen this may cover more or fewer pixels, depending on how far away the object using the texture is and on how the texture is scaled on the object.
Mapping Techniques
• Texture mapping
• Environment mapping
• Bump mapping
• Light mapping
Texture Mapping: is it
simple?
• Although the idea is simple---map an image to a surface---there are 3 or 4 coordinate systems involved
[Figure: 2D image mapped onto a 3D surface]
Coordinate Systems
• Parametric coordinates (u, v)
  • May be used to model curves and surfaces - to map the 3D surface with 2D parameters
• Texture coordinates (s, t)
  • Parameterize points in the texture with 2 coordinates: (s, t)
  • So the texture is simply an image, with a 2D coordinate system (s, t) used to identify the points in the image to be mapped
• Object or world coordinates (x, y, z)
  • Conceptually, where the mapping takes place
• Window coordinates
  • Where the final image is really produced (viewport coordinates + depth info)
Texture Mapping
[Figure: parametric coordinates → texture coordinates → world coordinates → window coordinates]
Texture to Surface
Coordinate Mapping
• The basic problem is how to find the texture-to-surface mapping
• Consider mapping from texture coordinates to a point on a surface: given a texture position (s, t), what is the position (x, y, z) on the surface?
• We appear to need three functions:
  • x = X(s, t)
  • y = Y(s, t)
  • z = Z(s, t)
• But we really want to go the other way: since rendering proceeds fragment by fragment, the inverse mapping from window coordinates to texture coordinates is needed
Where does mapping take
place?
• Most mapping techniques are implemented at the end of the rendering pipeline
  • Texture mapping is part of the shading process, but a part that is done on a fragment-by-fragment basis
  • Very efficient, because few polygons make it past the clipper
Backward Mapping
• We really want to go backwards
  • Given a fragment, we want to know to which point on an object it corresponds
  • Given a point on an object, we want to know to which point in the texture it corresponds
• Need a map of the form
  • s = s(x, y, z), t = t(x, y, z)
  or
  • s = s(u, v), t = t(u, v)
• Such functions are difficult to find in general
• With polygons:
  • Specify (s, t) coordinates at the vertices
  • Interpolate (s, t) for other points from the given vertices
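The per-polygon approach above can be sketched in a few lines: given (s, t) pinned at the three vertices of a triangle, every interior point gets its texture coordinates by barycentric interpolation. This is a minimal Python illustration of the idea, not the rasterizer's actual code (function names are my own).

```python
# Sketch: interpolate per-vertex (s, t) texture coordinates across a
# triangle using barycentric weights (alpha, beta, gamma), which sum to 1.
def interp_tex_coords(bary, st_a, st_b, st_c):
    a, b, g = bary
    s = a * st_a[0] + b * st_b[0] + g * st_c[0]
    t = a * st_a[1] + b * st_b[1] + g * st_c[1]
    return (s, t)

# At a vertex (weight 1 on that vertex) we recover its pinned coordinates;
# at the centroid we get the average of the three.
print(interp_tex_coords((1, 0, 0), (0.2, 0.8), (0.4, 0.2), (0.8, 0.4)))
print(interp_tex_coords((1/3, 1/3, 1/3), (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)))
```

In a real pipeline this interpolation happens during scan conversion (and is perspective-corrected, as discussed later under the perspective correction hint).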
How to set
(s,t) texture coordinates?
• Set the coordinates manually
  • Set the texture coordinates for each vertex ourselves
• Automatically compute the coordinates
  • Use an algorithm that sets the texture coordinates for us
Manually specifying the
coordinates
• We can manually specify the texture coordinates at each vertex
• We can choose alternate texture coordinates
Mapping Texture to Polygons
• For polygon texture mapping, we explicitly define the (s, t) coordinates at the polygon vertices
  • That is, we pin the texture at the vertices
• We interpolate within the triangle at the time of scan conversion into window space

….
glTexCoord2f(0.5, 0.5);
glVertex3f(10.2, 3.4, 4.5);
….
What about complex 3D
objects?
• It is easy to set texture coordinates for a single 2D polygon, but it can be difficult to set them for complex 3D regions or objects.
• Two-stage mapping: an automatic solution to the mapping problem is to first map the texture to a simple intermediate surface, then map the intermediate surface to the target surface
Two-part mapping
• One solution to the mapping problem is to first map the texture to a simple intermediate surface
• Example: map to a cylinder
Cylindrical Mapping
Parametric cylinder:
x = r cos 2πu
y = r sin 2πu
z = v/h
maps the rectangle in (u, v) space to a cylinder of radius r and height h in world coordinates.
s = u, t = v maps to texture space, since u and v vary over (0, 1).
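The parameterization above is easy to sketch directly. This is a hedged Python transcription of the slide's formulas (function names are mine); s = u and t = v give the texture coordinates.

```python
import math

# Parametric cylinder from the slide: (u, v) in (0,1) x (0,1) maps to a
# cylinder of radius r; texture coordinates are simply s = u, t = v.
def cylinder_point(u, v, r=1.0, h=1.0):
    x = r * math.cos(2 * math.pi * u)
    y = r * math.sin(2 * math.pi * u)
    z = v / h  # z as written in the slide's parameterization
    return (x, y, z)

def cylinder_tex(u, v):
    return (u, v)  # s = u, t = v

# A quarter of the way around (u = 0.25), halfway up (v = 0.5):
print(cylinder_point(0.25, 0.5))
```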
Spherical Mapping
We can use a parametric sphere
x = r cos 2πu
y = r sin 2πu cos 2πv
z = r sin 2πu sin 2πv
in a similar manner to the cylinder, but we have to decide where to put the distortion – e.g., the Mercator projection puts the most distortion at the poles.
Spheres are used in environment maps.
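As with the cylinder, the sphere's parameterization can be transcribed directly. A minimal Python sketch of the slide's formulas (naming is mine):

```python
import math

# Parametric sphere from the slide, with u, v in (0, 1).
def sphere_point(u, v, r=1.0):
    x = r * math.cos(2 * math.pi * u)
    y = r * math.sin(2 * math.pi * u) * math.cos(2 * math.pi * v)
    z = r * math.sin(2 * math.pi * u) * math.sin(2 * math.pi * v)
    return (x, y, z)

# u = 0 pins (r, 0, 0); u = 0.25, v = 0 gives (0, r, 0).
print(sphere_point(0.0, 0.0))
print(sphere_point(0.25, 0.0))
```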
Box mapping
• Map the texture to a box that can be unraveled, like a cardboard packing box
• Often used with environment maps
Second Mapping
• Map from the intermediate object to the actual object
  • Intersect the normals from the intermediate surface with the actual surface
  • Follow normals from the actual surface to the intermediate surface
  • Follow vectors from the center of the actual object
[Figure: actual vs. intermediate surface]
Determination of colors in
the texture
• Point sampling of the texture can lead to aliasing errors
[Figure: point samples in texture space miss the blue stripes]
Area Averaging
A better but slower option is to use area averaging.
Example: curved object
• Note that the preimage of the pixel is curved
• Assign a texture value by averaging the texture map over the preimage
• In the case of a polygon, the color of the pixel area will be an average of a rectangular array of texels – e.g., 2x2 texels (the GL_LINEAR filter in OpenGL)
[Figure: pixel and its preimage]
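The 2x2 averaging that GL_LINEAR performs is bilinear interpolation: the four texels surrounding the sample point are mixed with weights given by the fractional position. A small Python sketch of that arithmetic on a grayscale texture (this illustrates the principle, not OpenGL's internal implementation):

```python
# Sketch of a GL_LINEAR-style lookup: blend a 2x2 block of texels
# around the sample point with bilinear weights.
def bilinear_sample(texture, s, t):
    """texture: 2D list of scalar texel values; (s, t) in texel units."""
    h, w = len(texture), len(texture[0])
    x0 = min(int(s), w - 2)          # top-left texel of the 2x2 block
    y0 = min(int(t), h - 2)
    fx, fy = s - x0, t - y0          # fractional position inside the block
    top = texture[y0][x0] * (1 - fx) + texture[y0][x0 + 1] * fx
    bot = texture[y0 + 1][x0] * (1 - fx) + texture[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
# Sampling the exact center averages all four texels equally.
print(bilinear_sample(tex, 0.5, 0.5))
```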
Filtering
• In area averaging, each pixel is associated with a small region of the surface and with a small area of the texture.
• There are 3 possibilities for the association:
  1. One texel to one pixel (rare)
  2. Magnification: one texel to many pixels
  3. Minification: many texels to one pixel
Magnification and
Minification
More than one texel can cover a pixel (minification) or more than one pixel can cover a texel (magnification).
[Figure: texture vs. polygon footprints for magnification and minification]
Can use point sampling (nearest texel) or linear filtering (2x2 filter) to obtain texture values.
Zoom In: Magnification Filter
• A pixel maps to a small portion of one texel
• Results in many pixels mapping to the same texel
• Without a filtering method, aliasing is common
• Magnification filter: smooths the transition between pixels
Zoom Out: Minification Filter
• One pixel maps to many texels
• Common with perspective foreshortening
[Figure: perspective foreshortening and poor texture mapping cause the checkerboard to deform; mipmaps improve the mapping, returning more form to the checkerboard]
Better Min Filter: Mipmaps
• "mip" stands for multum in parvo, or "many things in a small place"
• Basic idea: create many textures of decreasing size and use one of these subtextures when appropriate
• Pre-filtered textures = mipmaps
Mipmaps: Storage
Optimization
• Must provide all sizes of the texture, from the input size down to 1x1, in powers of 2
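The "all sizes down to 1x1" requirement is cheap in storage: each level has a quarter of the texels of the previous one, so the whole pyramid costs at most 4/3 of the base image. A quick Python sketch of the level chain for a 512x512 texture:

```python
# Sketch: enumerate mipmap level sizes, halving each dimension
# (and flooring at 1) until we reach 1x1.
def mip_sizes(w, h):
    sizes = [(w, h)]
    while w > 1 or h > 1:
        w, h = max(w // 2, 1), max(h // 2, 1)
        sizes.append((w, h))
    return sizes

levels = mip_sizes(512, 512)
print(len(levels))                         # 512 needs 10 levels: 512 ... 1
print(sum(w * h for w, h in levels) / (512 * 512))  # total ≈ 4/3 of level 0
```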
Filtering in Summary
• Zoom-in calls for the mag filter
• Zoom-out calls for the min filter
• More advanced filters require more time/computation but produce better results
• Mipmapping is an advanced min filter
• Caution: requesting mipmapping without pre-defining the mipmaps will turn off texturing (see Filtering in OpenGL)
Wrapping Modes
• We can assign texture coordinates outside of [0,1] and have them either clamp or repeat the texture map
• Wrapping modes:
  • Repeat: start the entire texture over
    • Repeat issue: making the texture borders match up
  • Mirror: flip each successive copy of the texture in each direction
    • Gives continuity of the pattern
  • Clamp to edge: extend the texture's edge pixels
  • Clamp to border: surround with the border color
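The coordinate arithmetic behind these modes is simple. A hedged Python sketch of how a single coordinate s outside [0,1] is remapped under repeat, mirror, and clamp (this mirrors the idea, not OpenGL's internal code):

```python
# Sketch of wrap-mode arithmetic on one texture coordinate s.
def repeat(s):
    return s % 1.0                 # keep only the fractional part

def mirror(s):
    period = s % 2.0               # every other copy is flipped
    return period if period <= 1.0 else 2.0 - period

def clamp(s):
    return min(max(s, 0.0), 1.0)   # pin to the [0, 1] edge

# s = 1.25 lands at 0.25 when repeating, 0.75 when mirroring,
# and sticks at the edge (1.0) when clamping.
print(repeat(1.25), mirror(1.25), clamp(1.25))
```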
Repetitive texture tiling
• As seen, a texture can be repeatedly tiled across a surface by repeating the (s, t) parameterization over the surface
• But the best results are obtained when the texture is seamlessly tilable
  • This means that the right side of the texture joins seamlessly with the left side (same with the bottom and top)
• Seams will appear for most textures when tiled
• But we can edit or re-synthesize textures to be seamlessly repeatable (this is another topic in itself)
Texturing in OpenGL
Basic steps
• Create a texture object and specify a texture for that object (glGenTextures() and glBindTexture())
• Set up the texture parameters (glTexParameter(), glTexImage() and glTexEnv())
• Enable texturing
• Draw the scene, supplying both texture and geometric coordinates
Texture Mapping
[Figure: image + geometry combined into the final display]
Texture Example
• The texture is a 256x256 image that has been mapped to a rectangular polygon which is viewed in perspective
Texture Mapping and the
OpenGL Pipeline
• Images and geometry flow through separate pipelines that join at the rasterizer
  • "Complex" textures do not affect geometric complexity
[Figure: vertices → geometry pipeline, image → pixel pipeline, both joining at the rasterizer]
Creating one Texture Object
• Define a texture image from an array of texels (texture elements) in CPU memory
  • GLubyte my_texels[512][512][4];
• Define it as any other pixel map
  • Scanned image
  • Generated by application code
• Enable texture mapping
  • glEnable(GL_TEXTURE_2D)
  • OpenGL supports 1- to 4-dimensional texture maps
Specifying a Texture Image
Define the image as a texture:
glTexImage2D(target, level, components, w, h, border, format, type, texels);
• target: type of texture, e.g. GL_TEXTURE_2D
• level: used for mipmapping (discussed later)
• components: elements per texel in internal OpenGL texture memory
• w, h: width and height of the texture, in texels
• border: used for smoothing the tiling of textures in repeat mode
• format and type: describe the texels in CPU memory
• texels: pointer to the texel array
Example: set the current texture image:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, my_texels);
Calling glTexImage2D uploads the texture from CPU memory to internal OpenGL video memory, where it is ready for use in our programs.
Converting A Texture Image
• OpenGL requires texture dimensions to be powers of 2
  • Implementations must support dimensions of at least 64x64
• If the dimensions of the image are not powers of 2:
  • gluScaleImage(format, w_in, h_in, type_in, *data_in, w_out, h_out, type_out, *data_out);
  • data_in is the source image
  • data_out is the destination image
• The image is interpolated and filtered during scaling
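Before calling gluScaleImage we need target dimensions that satisfy the power-of-2 rule. A tiny Python sketch of picking the next power-of-2 size for each dimension (the 64-texel floor reflects the minimum noted above; helper name is mine):

```python
# Sketch: round a dimension up to the next power of 2,
# never going below a minimum supported size.
def next_pow2(n, minimum=64):
    p = minimum
    while p < n:
        p *= 2
    return p

print(next_pow2(300))  # a 300-texel dimension becomes 512
print(next_pow2(40))   # small images are padded up to the minimum
```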
Mapping a Texture
• Based on parametric texture coordinates
• glTexCoord*() is specified at each vertex
[Figure: triangle (a, b, c) in texture space, with (s, t) = (0.2, 0.8), (0.4, 0.2), (0.8, 0.4), mapped to triangle (A, B, C) in object space]
Typical Code
Note that we can use vertex arrays to increase efficiency
Interpolation
• OpenGL uses interpolation to find the proper texels from the specified texture coordinates
• Distortions can be produced
[Figure: texture stretched over a trapezoid showing the effects of bilinear interpolation — poor vs. good selection of texture coordinates]
Specifying Texture
Parameters
• OpenGL has a variety of parameters that determine the behavior and appearance of textures when they are rendered
  • Wrapping parameters determine what happens if s and t are outside the (0,1) range
  • Filter modes allow us to use area averaging instead of point samples
  • Mipmapping allows us to use textures at multiple resolutions
  • Environment parameters determine how texture mapping interacts with shading
• The glTexParameter() function is a crucial part of OpenGL texture parameterization, covering wrapping and filtering. glTexEnv() defines how texture values interact with fragment colors.
• Note that each uploaded texture can have its own separate properties; texture properties are not global.
Wrapping Modes
• Clamping: if s, t > 1 use the color at 1; if s, t < 0 use the color at 0
  • glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
• Repeating: use s, t modulo 1
  • glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
Recall Magnification and
Minification
More than one texel can cover a pixel (minification) or more than one pixel can cover a texel (magnification).
Can use point sampling (nearest texel) or linear filtering (2x2 filter) to obtain texture values.
[Figure: texture vs. polygon footprints for magnification and minification]
Filter Modes
Modes are set with glTexParameteri(target, type, mode):
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
Note that linear filtering requires a border of an extra texel for filtering at the edges (border = 1).
Mipmapped Textures
• Mipmapping allows for prefiltered texture maps of decreasing resolution
• Lessens interpolation errors for smaller textured objects
• Declare the mipmap level during texture definition
  • glTexImage2D(GL_TEXTURE_*D, level, … )
• The GLU mipmap builder routines will build all the textures from a given image
  • gluBuild*DMipmaps( … )
glTexParameter() function
glTexParameter() function
(cont)
Mipmapping:
GL_*_MIPMAP_LINEAR example
Mipmapping: ratio
calculation
Mipmaps choice
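The ratio calculation and level choice above boil down to this: the mip level grows roughly as log2 of the number of texels covered per pixel along one axis. A hedged Python sketch of that selection rule (a simplification of what hardware actually computes from screen-space derivatives; names are mine):

```python
import math

# Sketch: pick a mip level from the texel-to-pixel ratio.
# ratio <= 1 means magnification, so the base level (0) is used;
# otherwise the level is roughly log2 of the ratio, capped at the
# coarsest available level.
def mip_level(texels_per_pixel, max_level):
    if texels_per_pixel <= 1.0:
        return 0
    return min(math.log2(texels_per_pixel), max_level)

print(mip_level(4.0, 9))  # a 4:1 footprint selects level 2
print(mip_level(0.5, 9))  # magnification stays at level 0
```

With a GL_*_MIPMAP_LINEAR filter, the fractional part of this level blends the two nearest mipmaps.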
Texture Functions
• Control how the texture is applied
  • glTexEnv{fi}[v](GL_TEXTURE_ENV, prop, param)
• prop can be:
  • GL_TEXTURE_ENV_MODE
  • GL_TEXTURE_ENV_COLOR
• GL_TEXTURE_ENV_MODE modes:
  • GL_MODULATE: modulates with the computed shade
  • GL_BLEND: blends with an environment color
  • GL_REPLACE: use only the texture color
• Example:
  • glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
• Set the blend color with GL_TEXTURE_ENV_COLOR, in conjunction with GL_BLEND
GL_TEXTURE_ENV_MODE
modes
f = fragment, t = texture, c = GL_TEXTURE_ENV_COLOR
[Table: per-mode RGBA equations]
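Using the slide's notation (f = fragment, t = texture, c = environment color), the RGB arithmetic of the three modes can be sketched as follows. This illustrates the standard per-channel formulas for RGB textures; it is a simplification, since the full OpenGL table also covers alpha and other internal formats.

```python
# Sketch of the per-channel RGB math behind the three env modes.
def modulate(f, t):                      # GL_MODULATE: component-wise f * t
    return tuple(fi * ti for fi, ti in zip(f, t))

def replace(f, t):                       # GL_REPLACE: texture color only
    return t

def blend(f, t, c):                      # GL_BLEND: f*(1 - t) + c*t
    return tuple(fi * (1 - ti) + ci * ti for fi, ti, ci in zip(f, t, c))

frag, tex_c, env = (1.0, 0.5, 0.0), (0.5, 0.5, 0.5), (1.0, 1.0, 1.0)
print(modulate(frag, tex_c))  # shaded color darkened by the texture
```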
Perspective Correction Hint
• Texture coordinate and color interpolation can be done
  • either linearly in screen space
  • or using depth/perspective values (slower)
• Noticeable for polygons "on edge"
• glHint(GL_PERSPECTIVE_CORRECTION_HINT, hint)
  where hint is one of:
  • GL_DONT_CARE
  • GL_NICEST
  • GL_FASTEST
Generating Texture
Coordinates
• OpenGL can generate texture coordinates automatically
  • glTexGen{ifd}[v]()
• Specify a plane
  • Texture coordinates are generated based upon distance from the plane
• Generation modes:
  • GL_OBJECT_LINEAR
  • GL_EYE_LINEAR
  • GL_SPHERE_MAP (used for environment maps)
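For the linear modes, "distance from the plane" means each generated coordinate is the dot product of the vertex with the user-specified plane coefficients (in object space for GL_OBJECT_LINEAR, eye space for GL_EYE_LINEAR). A minimal Python sketch of that computation (function name is mine):

```python
# Sketch: GL_OBJECT_LINEAR-style generation of one texture coordinate
# as the dot product of the homogeneous vertex (x, y, z, w) with the
# plane coefficients (p1, p2, p3, p4).
def object_linear(vertex, plane):
    return sum(v * p for v, p in zip(vertex, plane))

# With plane (1, 0, 0, 0), s is simply the vertex's x coordinate.
print(object_linear((2.0, 3.0, 4.0, 1.0), (1.0, 0.0, 0.0, 0.0)))
```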
Applying Textures II
1. Specify textures in texture objects
2. Set the texture filter
3. Set the texture wrap mode
4. Set the texture function
5. Set the optional perspective correction hint
6. Bind the texture object
7. Enable texturing
8. Supply texture coordinates for the vertices
   • Coordinates can also be generated
Other Texture Features
• Environment maps
  • Start with an image of the environment taken through a wide-angle lens
    • Can be either a real scanned image or an image created in OpenGL
  • Use this texture to generate a spherical map
  • Use automatic texture coordinate generation
• Multitexturing
  • Apply a sequence of textures through cascaded texture units