Texture Mapping

Texture mapping was one of the major innovations in CG in the 1990s. It allows us to add a lot of surface detail without adding a lot of geometric primitives (lines, vertices, faces). Think of how interesting Caroline's ``loadedDemo'' is with all the texture-mapping.

[Screenshot: Caroline's loadedDemo with textures]
[Screenshot: the same demo without textures]

Other Reading

As with everything in computer graphics, most of what I know about texture mapping I learned from an early edition of Angel's book, so check that first. Unfortunately, the edition I read had one of his weakest chapters, because it didn't do a very good job of connecting the theory with the OpenGL code. A more practical presentation is the texture-mapping chapter of the Red Book (the OpenGL Programming Guide). You're encouraged to look at both.

In this reading, we'll start with some conceptual overview, then quickly look at practical examples, then we'll tour through the many settings, parameters and situations there are to consider.

Conceptual View

Texture mapping paints a picture onto a polygon. Although the name is texture-mapping, the general approach simply takes an array of pixels and paints them onto the surface. An array of pixels is just a picture, which might be a texture like cloth or brick or grass, or it could be a picture of Homer Simpson. It might be something your program computes and uses. More likely, it will be something that you load from an ordinary image file.

Demos: these are all part of the 307 demos list. You need not worry about the code yet; we'll look at it a bit later.

Conceptually, to use textures, you must do the following:

  1. define a texture (a rectangular array of pixels, called texels)
  2. specify a pair of texture coordinates (s,t) for each vertex of your geometry

The graphics system ``paints'' the texture onto the polygon.
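Since the texture can be something your program computes (as noted above), here is a minimal sketch of these two steps in plain Python, with no OpenGL involved; all the names here are invented for illustration.

```python
# Step 1: define a texture -- here, a computed 8x8 checkerboard of RGB texels.
def make_checkerboard(size=8):
    """Return a size x size array of RGB texels (tuples of 0-255 values)."""
    texels = []
    for row in range(size):
        texels.append([(255, 255, 255) if (row + col) % 2 == 0 else (0, 0, 0)
                       for col in range(size)])
    return texels

texture = make_checkerboard()

# Step 2: attach a pair of texture coordinates (s, t) to each vertex of the
# geometry -- here, a quad that uses the whole texture.
quad = [
    ((0.0, 0.0, 0.0), (0.0, 0.0)),  # (x, y, z), (s, t)
    ((1.0, 0.0, 0.0), (1.0, 0.0)),
    ((1.0, 1.0, 0.0), (1.0, 1.0)),
    ((0.0, 1.0, 0.0), (0.0, 1.0)),
]
```

The graphics system's job is then exactly the "painting": deciding, for every pixel the quad covers, which texel to use.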

How it Works

Texture mapping is a raster operation, unlike any of the other things we've looked at. Nevertheless, we apply textures to 2D surfaces in our 3D model, and the graphics system has to figure out how to modify the pixels during rasterizing (AKA scan conversion).

Since texture-mapping happens as part of the rasterizing process, let's start there.

Rasterizing

When the graphics card renders a polygon, it (conceptually) determines which pixels the polygon covers and fills each one in, interpolating quantities such as color and depth from the polygon's vertices across the interior.

Note: standard terminology is that the polygon being rendered is called a fragment (since it might be a fragment of a Bézier surface or some such). Thus, the graphics card applies a texture to a fragment.

This all happens either in the framebuffer or in an array just like it.
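As a rough illustration of that per-pixel work, here is a simplified software sketch (plain Python, invented names, nothing like real hardware): interpolate (s,t) along one scanline between its two edge values, and fetch the nearest texel for each pixel.

```python
def lerp(a, b, f):
    """Linear interpolation between a and b, with f in [0,1]."""
    return a + (b - a) * f

def shade_scanline(st_left, st_right, num_pixels, texture):
    """For each pixel on a scanline, interpolate (s,t) between the edge
    values and fetch the nearest texel (nearest-texel filtering)."""
    height = len(texture)
    width = len(texture[0])
    pixels = []
    for i in range(num_pixels):
        f = i / (num_pixels - 1) if num_pixels > 1 else 0.0
        s = lerp(st_left[0], st_right[0], f)
        t = lerp(st_left[1], st_right[1], f)
        # Map (s,t) in [0,1] to texel indices; round to the nearest texel.
        col = min(round(s * (width - 1)), width - 1)
        row = min(round(t * (height - 1)), height - 1)
        pixels.append(texture[row][col])
    return pixels

# A 2x2 texture: black and white texels on the t=0 row.
tex = [[(0, 0, 0), (255, 255, 255)],
       [(255, 0, 0), (0, 255, 0)]]
span = shade_scanline((0.0, 0.0), (1.0, 0.0), 4, tex)
```

Real rasterizers do this with perspective correction and better filtering, but the structure is the same: per-pixel interpolation, then a texel lookup.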

Implementing Texture Mapping

To do texture mapping, the graphics card must, for each pixel it fills in, figure out which part of the texture corresponds to that pixel and then use the texel data there to determine the pixel's color.

Texture Space

We can have 1D or 2D textures. The texture parameters will be in the range [0,1] in each dimension. Note that if your texture array isn't square and your polygon isn't square, you may have to deal with changes in aspect ratio.

Your texture is always an array and therefore is always a rectangle. Mapping a texture to rectangles (as OpenGL objects) is fairly easy; mapping it to other shapes is likely to cause distortion. We'll need to be careful in those cases.

We associate each vertex of our polygon with a pair of texture coordinates, just as we associate it with a normal, a color, and so forth. Three.js has properties of a Geometry object devoted to representing the texture coordinates for each vertex of a triangular face.

Texture Coordinates

How do the texture coordinates relate to the 2D array of texels? This is easiest to explain with a picture such as the one above.

Conventionally, the texture coordinates are called (s,t), just as spatial coordinates are called (x,y,z). (You will also see them called (u,v).) Thus, we can say that s goes along the rows of the texture (along the ``fly'' of the flag). The t coordinate goes along the columns of the texture (along the ``hoist'' of the flag).

Although you will often use the entire texture, so that all your texture coordinates are 0 or 1, that is not necessary. In fact, because the dimensions of texture arrays are required to be powers of two, the actual image that you want is often only a portion of the whole array.

The computed US flag array has that property. The array is 256 pixels wide by 128 pixels high, but the flag itself is 198 pixels wide by 104 pixels high. Thus, the maximum texture coordinates are:

fly = 198/256 = 0.7734
hoist = 104/128 = 0.8125
Texture Coordinates inside an Image

The result might look like the image above.

Of course, we also need to ensure that the rectangle we are putting the flag on has the same aspect ratio as the US flag, namely: 1.9. See the official US flag specification.
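The arithmetic above is easy to check in a few lines of Python; it also confirms that the flag region inside the array has (nearly) the official 1.9 aspect ratio.

```python
# Sizes taken from the US flag texture example in the text.
ARRAY_W, ARRAY_H = 256, 128    # texture array dimensions (powers of two)
FLAG_W, FLAG_H = 198, 104      # the flag image inside the array

fly = FLAG_W / ARRAY_W     # maximum s coordinate, about 0.7734
hoist = FLAG_H / ARRAY_H   # maximum t coordinate, 0.8125
aspect = FLAG_W / FLAG_H   # aspect ratio of the flag image itself
```

Using (fly, hoist) instead of (1, 1) as the maximum texture coordinates selects just the flag portion of the array.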

The texture coordinates can also be greater than 1, in which case, if we use \url{GL_REPEAT}, we get repetitions of the texture. If s is some coordinate with 0 < s < 1, specifying a location partway along the texture, then 1+s, 2+s and so on refer to that same location in the texture.
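The effect of repeating versus clamping on an out-of-range coordinate can be sketched as two small functions (a simplification of what GL_REPEAT and GL_CLAMP actually do; real clamping samples the edge or border texels):

```python
import math

def wrap_repeat(s):
    """GL_REPEAT-style wrapping: keep only the fractional part,
    so 0.25, 1.25, and 2.25 all land on the same spot in the texture."""
    return s - math.floor(s)

def wrap_clamp(s):
    """GL_CLAMP-style wrapping: clamp into [0,1], so out-of-range
    coordinates stick at the texture's edge."""
    return min(max(s, 0.0), 1.0)
```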

Basic Demos

It's now time to look at the code for our basic demos.

\subsection{Texture Mapping in OpenGL} \begin{figure}[tbp] \begin{center} \includegraphics{TextureParameters.eps} \caption{Demo of how textures can interact with fragment color: TextureParameters} \label{fig:TextureParameters} \end{center} \end{figure} Conceptually, to actually do texture mapping in OpenGL, you have to do all the following steps. \begin{enumerate} \item Create or load a 1D or 2D array of texels. All dimensions must be a power of two! Different kinds of data are possible: \begin{itemize} \item RGB values \item RGBA values \item Luminance (grayscale) \item ... \end{itemize} Also, the data in the array can be in different formats (unsigned bytes, short floats, etc.). You must tell OpenGL what it is. \item Set various modes. These have default values (see the man pages), so they can be skipped in some cases, but I tend to set them all. I copy/paste the code from some working example of texture-mapping, then change the modes as necessary. \item Send the texel data to the graphics card. \item Enable texture-mapping \item Specify a texture coordinate for each vertex. Sometimes this is done automatically (as for the teapot) or is calculated (as for B\'ezier surfaces). We'll get into B\'ezier stuff later. \end{enumerate} For coding, that means the following steps. We'll go through these functions in detail. \begin{alltt} glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, \textit{GL_DECAL}); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, \textit{GL_REPEAT}); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, \textit{GL_CLAMP}); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, \textit{GL_NEAREST}); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, \textit{GL_LINEAR}); glTexImage2D(GL_TEXTURE_2D, 0, 3, \textit{width}, \textit{height}, 0, \textit{GL_RGB}, \textit{GL_UNSIGNED_BYTE}, \textit{texelArray}); glEnable(GL_TEXTURE_2D); glBegin(...); glTexCoord2f(0,1); glVertex3f(0,0,0); \end{alltt} The \url{glTexEnvf} has several settings. 
The settings can mean different things depending on the format of the texture (such as luminance versus RGB) and the color model in OpenGL. Ignoring those nuances, here's a basic summary. \begin{itemize} \item \url{GL_DECAL} and \url{GL_REPLACE}. These do the same thing except when there's transparency. The intrinsic color of the fragment (the polygon in the model) is ignored and the color of the texel is used instead. \item \url{GL_MODULATE}. The color of the pixel is the product (plain old multiplication) of the color of the fragment and the color of the texel. The texel is often a luminance value and you use the texture to darken or lighten the color of the fragment. \item \url{GL_BLEND}. The color of the pixel is a mixture (weighted average) of the color of the fragment and the texture ``environment'' color, with the texel determining the weighting. \end{itemize} The texture can either replace the scene colors, like a decal, or it can blend with the scene colors, as with wood-grain finishes, or even adding surface smudges and dirt, to make things look more realistic. The parameters settings in \url{glTexEnvf} help to set this interaction between the color of the fragment (whether direct RGB color or material and lighting) and the texture. We'll look at the \url{TextureParameters.py} demo and code (I'll port that to Python from C before class). You can see a screen-shot in figure~\ref{fig:TextureParameters}. We'll also look at \url{~cs307/pub/tw/Tutors/texture}. We'll talk about the \url{glTexParameteri} function calls later. The \url{glTexImage2D} function has a lot of parameters. Most are fixed, though: \begin{enumerate} \item \url{GL_TEXTURE_2D} or \url{GL_TEXTURE_1D}. Those are the values we'll use \item level. It's possible to give OpenGL several images at different resolutions, called ``mipmaps,'' but it seems to be flakey. The OpenGL examples I've downloaded don't work. So, always use the base level, which is 0. 
\item internal format: specifies the number of color components in the texture. Typically, this is 3, meaning RGB data. \item width of the texture. Must be a power of 2, or two more than a power of 2 if you have a one-pixel border (the border adds a pixel on each side). \item height. Same as width. \item border. Zero or one. \item format. The kind of pixel data. Typically \url{GL_RGB} or \url{GL_LUMINANCE}. \item type. The datatype of the array. Typically \url{GL_UNSIGNED_BYTE}. \item pixels. A pointer to the image data in memory. \end{enumerate} We'll look at the \url{USflag.cc} file for an example of this. \section{Issues} Here are some issues to face and choices to make: \begin{itemize} \item \textbf{Aspect Ratio:} Your texture is always a rectangle. Even if your polygon is one, too, you'll have to deal with matching aspect ratios if you want the image to be undistorted. With a plain texture (such as grass or wood) this may not matter, but for pictures it may. \item \textbf{Wrapping:} What happens when your texture coordinates fall outside the [0,1] range? We'll try this with the tutor. \begin{itemize} \item You can ``wrap'' around (essentially removing the integer part and using only the fractional part). This repeats the texture. For real textures, you often want to do this. \item You can ``clamp'' the value at the edge pixel. If your texture has a border of some sort, that can work out well. \end{itemize} \item \textbf{Filter:} What to do when the pixel doesn't exactly match a texel? You get to specify this for both magnification (pixel smaller than texel) and minification (pixel larger than texel), but in practice, I think they are usually set to the same value. \begin{itemize} \item Use the nearest (in Manhattan distance) texel to the center of the pixel. \item Use a weighted average of the four texels nearest to the center of the pixel. \end{itemize} We'll look at the \url{LinearNearest} demo to understand this.
\textbf{Note:} the functions to set the filters appear not to have adequate defaults: if you don't set them, you won't get a texture! \item \textbf{density} of the texture repetition. Too little and it looks badly ``stretched.'' Too much can squeeze the texture too much. Look at \url{Grass.cc}. Try the three different textures. Use the ``r'' callback to reveal the vertices that are created. Look at the texture ``from above,'' by using the ``Y'' callback. \end{itemize} \subsection{More Demos} \begin{figure}[tbp] \resizebox{0.5\linewidth}{!}{\includegraphics{LinearNearest-L.eps}}\hfil \resizebox{0.5\linewidth}{!}{\includegraphics{LinearNearest-N.eps}} \caption{Both figures are checkerboard textures, stretched over a large number of pixels. Consequently, the texture coordinate values for many pixels fall ``between'' texel values. In the picture on the left, we use a ``linear'' interpolation between the texel values. In the picture on the right, we use the ``nearest'' texel value.} \label{fig:LinearNearest} \end{figure} \sloppy Please look at the code for the following demos. All of them are in \url{~cs307/public_html/pytw/demos/texture-mapping/}. \begin{itemize} \item \url{Rainbow.py} This is lovely example of a 1D texture. Use the ``R'' keyboard callback to turn the rainbow on/off. The illusion is much better if you switch to ``immerse'' mode. Note that another version of this demo that doesn't use TW, called \url{RainbowSweet}, looks better because the illusion is much better if you're inside the scene. Or use TW's ``immerse'' mode. Original code from Michael Sweet. \item \url{TextureParameters.cc} This demonstrates uses of texture parameters, such as decal vs blending. It generates figure~\ref{fig:TextureParameters}. \item \url{LinearNearest.cc} This demonstrates the difference between \url{LINEAR} and \url{NEAREST} for magnification/minification. You can see screen shots in figure~\ref{fig:LinearNearest}. 
\item \url{LitUSFlag.cc} This demonstrates how to combine B\'ezier surfaces and texture mapping. Lighting, too. \item \url{~/pub/tw/Tutors/texture}. Nate Robins' tutor. Pretty slick, but he uses textures that are upside down, so it can be confusing, too. \iffalse \item \url{~/pub/glui_v2_1_beta/us-flag}. A tutor I created using a GL UI toolkit. \fi \end{itemize} \section{Images and File Formats} Images come in dozens of formats, with different kinds of compression techniques and so forth. We will look at the following kinds: \begin{itemize} \item Compressed Formats. These are supported by all reasonable web browsers, and the file sizes are not excessive. The different formats have tradeoffs, though, and are complex, because of the compression algorithms. There are common libraries to read/write these. \begin{itemize} \item GIF (Graphic Interchange Format): a compressed (loss-less) format limited to 256 colors. It was encumbered with a patent, but that has now expired. Allows index transparency, meaning chosen pixels can be in the ``clear'' color instead of a normal RGB color. \item JPG (Joint Photographic Experts Group): a compressed (lossy) format that can handle full RGB color (millions of different colors in an image). No transparency. Tends to be best for pictures of realistic natural scenes. \item PNG (Portable Network Graphics): an open-source, compressed (loss-less) format that removes some restrictions of GIF. The file format also stores ``vector'' information, if the image is produced by a drawing program. Fireworks uses this format as its native format. \end{itemize} \item Uncompressed formats. These formats have a simple structure but the file sizes are large because there is no compression: it's just a 2D array of RGB values. These files are typically not supported by web browsers and shouldn't be used on the web anyhow because the file sizes are so large. \begin{itemize} \item BMP (MS-Windows Bitmap format): this is an uncompressed Windows format. 
\item TIFF (Tag Image File Format): an industry standard pixmap file format, common on Macs. Some digital cameras produce this. \item PPM (Portable Pixmap): an open-source, uncompressed format. The format is: \begin{itemize} \item P6: two ASCII characters identifying the file type \item w, h: two decimal numbers with a space after them giving the width and height of the image \item 255: the largest possible value of a color component \item a single whitespace character, usually a newline (ASCII 10) \item data: w × h × 3 bytes giving the R, G, B values for each pixel. \end{itemize} \raggedright It's standard to store the image in top-to-bottom, left-to-right order. I got this info from \url{http://astronomy.swin.edu.au/~pbourke/dataformats/ppm/} \end{itemize} \end{itemize} For TW, we will always use PPM format. You can convert images between PPM and other formats using Windows, Mac or Linux graphics programs or various Linux commands, such as: \begin{itemize} \item ppmtogif \item ppmtojpeg \item bmptoppm \item *topnm \item pnmto* \end{itemize} PNM is a ``portable anymap'' file; the programs seem to be able to guess whether it's black and white (PBM), grayscale (PGM) or color (PPM). \subsection{Demo} \begin{itemize} \item Start Fireworks \item Draw something \item Save (default is PNG, so that's fine) \item Copy it to Puma, say with Fetch or WinSCP. \item Convert to PPM: \begin{verbatim} % display foo.png % pngtopnm -verbose foo.png > foo.ppm % display foo.ppm \end{verbatim} \item Run QuadPPM.py foo.ppm \end{itemize} \subsection{Loading Images} We'll explore the code of \url{QuadPPM.py}. \begin{itemize} \item You can read in an image from a file and use it as a texture. \item You should read the file in just once, so don't call \url{twTex2D} from your display function. \item Note that most glut objects don't have pre-defined texture coordinates. Only the teapot does. You can generate them for the others, using a fairly incomprehensible interface.
We'll try to learn more about this as the semester goes on. \end{itemize} \section{Binding Textures} For additional speed when using several textures, you can load all the textures, associating each with an integer identifier (just like display lists) and then referring to them later. \textbf{Setup steps:} \begin{itemize} \item Ask for a bunch of identifier numbers: \begin{alltt} glGenTextures(num_wanted,result_array); \end{alltt} \item Then, for each texture you want, get one of the numbers out of the array and: \begin{alltt} glBindTexture(GL_TEXTURE_2D, textureNumber); glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, something); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, something); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, something); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, something); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, something); glPixelStorei(GL_UNPACK_ALIGNMENT,1); glTex2D(...); or twTex2D(...); \end{alltt} \end{itemize} \textbf{Reference step:} When you want a texture, just: \begin{alltt} glBindTexture(GL_TEXTURE_2D, textureNumber); \end{alltt} As a convenience, you can replace each of the texture steps with \begin{alltt} twLoadTexture(textureIDs[n], filename); \end{alltt} However, this function uses \url{GL_MODULATE}, \url{GL_REPEAT} and \url{GL_LINEAR}, which may not be what you want. Demos: \url{TextureBinding.cc} and \url{USflag-binding.cc}. Try spinning either of these. Notice how relatively quick they are. This is because the texture is already loaded into memory on the graphics card, so almost nothing needs to be sent down the pipeline to draw the next frame of animation. \subsection{Saving Images} You can also save the contents of the framebuffer as a PPM file. Just hit the ``S'' key. 
This is accomplished thanks to an interesting function \begin{alltt} void glReadPixels( GLint x, // raster location of first pixel GLint y, GLsizei width, // dimensions of pixel rectangle GLsizei height, GLenum format, // GL_RGB GLenum type, // GL_UNSIGNED_BYTE GLvoid *pixels ); glReadPixels(0,y,width,1,GL_RGB,GL_UNSIGNED_BYTE, (GLvoid *) pixels); \end{alltt} The file is saved as \url{saved_image01.ppm} in the current directory. If you hit ``s'' again, you get \url{saved_image02.ppm} and so forth. In honor of family and friends weekend, convert these to PNG and put them on your web page! Or email them! Note that PPM files are big. In many of our examples, the framebuffer is 500 by 500. The file size is therefore \[ 500\times500\times3+14 = 750014 \] where the extra 14 bytes are the header (``P6,'' the width and height, the maximum value 255, and their separators). \begin{verbatim} % ppmtojpeg -v saved-frame01.ppm > saved-frame01.jpg ppmtojpeg: Input file has format P6. It has 500 rows of 500 columns of pixels with max sample value of 255. ppmtojpeg: No scan script is being used % ls -l saved-frame01.* -rw-rw-r-- 1 cs307 cs307 25290 Nov 7 00:06 saved-frame01.jpg -rw-r--r-- 1 cs307 cs307 750014 Oct 31 14:33 saved-frame01.ppm \end{verbatim} The JPG is quite a bit smaller: 1/30th the size in this case, but your mileage may vary. Since you have a finite filespace quota, manage your space carefully. Once you save a frame, you might convert it to a compressed format (probably PNG or JPEG) and then discard the PPM file. \section{Texture Mapping using Modulate} When you texture-map using \url{GL_MODULATE}, you have to think about the color of the underlying surface. In particular, if you're using material and lighting, you have to think about how the material and the texture combine. Caroline's texture tutor can help: \url{~cs307/public_html/demos/textureTutor} \section{Texture Mapping Onto Odd Shapes} \subsection{Triangles} There are actually two choices here. If you want a triangular region of your texture, there's no problem: just use the texture coordinates as usual.
If you want to squeeze one edge of the texture down to a point, it would seem that all you have to do is use the same texture coordinates for both vertices, but that yields odd results. Instead, you can use linear B\'ezier surfaces to make a triangular region. Demo: \url{TexturemapTriangles.cc} \iffalse \subsection{Circles} It might not seem that you'd want to do this, but consider texture-mapping onto the ends of a cylinder. The key issue is, of course, the mapping from the rectangle to the circle. There are infinitely many. Here are just two: \resizebox{0.9\linewidth}{!}{\includegraphics{circle-square.eps}} I have some demos of these (\url{~cs307/pub/distrib/cylinder-flag.c}), but the code is incomprehensible. \fi \subsection{Cylinders} If mapping onto a curved surface, we usually represent the surface with parametric equations and map texture parameters to curve parameters. For example, a cylinder: \begin{eqnarray*} x &=& r\cos(2\pi u)\\ y &=& r\sin(2\pi u)\\ z &=& v/h \end{eqnarray*} With the easy mapping: \begin{eqnarray*} s=u\\ t=v \end{eqnarray*} Demo: \url{CylinderFlag.cc} This shows how to put a 2D texture onto a non-planar figure. It uses the US flag, since it's easy to see the orientation of the texture. Essentially, we have to build the figure ourselves out of vertices, so that we can define texture coordinates for each vertex. There are two ways to put a flag onto a cylinder: with the stripes going around the cylinder or along its length. This demo does either; the ``l'' keyboard callback switches the orientation. Understanding this code is not easy, but it really only requires understanding polar/cylindrical coordinates. The texture coordinates are relatively straightforward. \subsection{Bezier Surfaces} We've already seen this, and we got another dose of it when we looked at mapping onto triangles, but let's look at it again. 
To map onto a surface with material and lighting, consider: Demo: \url{LitUSFlag.cc} \subsection{Globes} In general, mapping a flat surface onto a globe (sphere) is bound to produce odd distortions. It's essentially a 3D version of the problem of mapping a rectangle onto a circle. The reverse mapping is interesting to contemplate: namely, a flat rectangle that shows the surface of the globe. This is a problem that cartographers have wrestled with for years. Indeed, both of the examples I gave above for circles and squares have equivalents in cartography. The distortion problem presents several tradeoffs, the most important of which is shape distortion versus area distortion. \begin{itemize} \item area: To preserve the equal-area property, you have to compress the lines of latitude, particularly those far from the equator. A famous current example is the \textbf{Peters} projection. \item shape: To preserve shape, you end up expanding the lines of latitude, particularly those far from the equator. One important side effect of preserving shape is that a straight line on the map is a line of constant compass bearing (a rhumb line), which makes these maps better for navigation. A famous current example is the \textbf{Mercator} projection. \end{itemize} Let's spend a few minutes discussing the pros and cons of these. There are some good web pages linked from the course home page. To texture-map a globe, I created a globe by hand, iterating latitude from the north pole ($\pi/2$) to the south pole ($-\pi/2$) and longitude from $0$ around to $2\pi$. I converted each (longitude,latitude) pair into (x,y,z) values but also made an (s,t) texture-map pair. This works pretty well except possibly at the poles.
Math: \begin{eqnarray*} x &=& \cos(\mathrm{latitude})\cos(\mathrm{longitude})\\ y &=& \sin(\mathrm{latitude})\\ z &=& \cos(\mathrm{latitude})\sin(\mathrm{longitude})\\ s &=& 1-\mathrm{longitude}/(2\pi) \\ t &=& 1-(\mathrm{latitude}+\pi/2)/\pi \end{eqnarray*} Demo: \url{GlobeTexture.cc} \iffalse \section{Additional Demos} In the \url{~cs307/pub/demos} directory, we have: \begin{itemize} \iffalse \item \url{texture-squares} which shows some effects of different uses of the texture coordinates, clamping and repeating and the like. \fi \item \url{11-USFlag-binding} which loads several BMP files and maps them onto the sides of some cubes, which it then scatters about \end{itemize} \fi
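That globe construction can be sketched in plain Python. The function name is invented, the radius is fixed at 1, and the t mapping is my reconstruction of the equations above, normalized so the two poles map to t = 0 and t = 1.

```python
import math

def globe_vertex(latitude, longitude):
    """Position and texture coordinates for one globe vertex (radius 1).
    latitude is in [-pi/2, pi/2]; longitude is in [0, 2*pi]."""
    x = math.cos(latitude) * math.cos(longitude)
    y = math.sin(latitude)
    z = math.cos(latitude) * math.sin(longitude)
    # Flip both coordinates so the texture image isn't mirrored on the globe.
    s = 1 - longitude / (2 * math.pi)
    t = 1 - (latitude + math.pi / 2) / math.pi
    return (x, y, z), (s, t)
```

Iterating this over a grid of (latitude, longitude) values gives both the geometry and the texture coordinates in one pass, which is exactly why building the globe by hand is worth the trouble.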

Other Innovations

The idea is to map a buffer of information onto a region of the framebuffer, thereby affecting the pixels.

These topics and techniques are covered in the Dirksen book; we'll address them later.

Summary