For this tutorial we will implement an accordion billboard: a
billboard whose surface is a series of angled sections, so that
depending on the incident angle, the viewer sees a different image. An
example of just such a billboard can be found on Halsted a few blocks
north of UIC!
This tutorial is broken down into a number of stages, with functional
example code that demonstrates certain features at each stage. The stages
include: basic texturing of the billboard and sky (stage 1), lighting in
tangent space (stage 2), the procedural accordion shader (stage 3), and a
few creative tweaks to the sky map (stage 4).
This tutorial assumes you already know how to set up and run a "hello
world" GLSL application. It is written to be cross-platform, but is
geared toward Linux users. The demo uses a set of OBJ models built with
Wings 3D as the canvas. It uses the OBJ parsing library generously
contributed by Robert Kooima to access the geometry of the models, with
a few modifications to permit application of shaders to the OBJ
geometry. These modifications are available, along with an image file
manipulation utility, in the common folder of the demo directory. All
source code included in this demo is under the GNU GPL.
The main object in the scene is a model of the Sears Tower. This model
has been designed with four named surfaces that are the targets for
shader application. They are named "billboard1" through "billboard4",
one for each side of the sign wrapped around the tower. Before drawing
each surface, the name is checked, and shaders are applied or removed
and uniforms are assigned accordingly.
void drawSears()
{
    obj_init_gl();
    obj_init_file(obj_sears);
    obj_draw_vert(obj_sears);

    int num_srfs = obj_num_surf(obj_sears);
    for (int i = 0; i < num_srfs; i++) {
        int mi = obj_get_surf(obj_sears, i);
        const char *mn = obj_get_mtrl_name(obj_sears, mi);

        if (strstr(mn, "billboard")) {
            /* "billboard" is 9 characters long, so mn[9] holds the digit */
            int offset = 0;
            switch (mn[9]) {
                case '1': break;
                case '2': offset = 1; break;
                case '3': offset = 2; break;
                case '4': offset = 3; break;
            }
            /* draw with our shader, bypassing the OBJ material state */
            glBindTexture(GL_TEXTURE_2D, tex_bill[offset]);
            glUseProgramObjectARB(shad_bill);
            glUniform1i(ULOC(shad_bill, "tex"), 0);
            obj_draw_surf_polys(obj_sears, i);
        } else {
            /* no shader: let the fixed pipeline apply the material */
            glUseProgramObjectARB(0);
            glBindTexture(GL_TEXTURE_2D, 0);
            obj_draw_surf(obj_sears, i);
        }
    }
}
The obj_* functions are defined in obj.h. They are used
to access and draw different parts of the OBJ file. The most important
of these functions are obj_draw_surf() and obj_draw_surf_polys().
Both draw the specified surface; however, obj_draw_surf() applies
material properties such as diffuse lighting constants and
texture maps. obj_draw_surf_polys(), on the other hand, does
not apply any material properties to the polygons; it just draws
them. This is appropriate if we intend to override the fixed
functionality of the GL by applying a shader.
The above code block uses the ULOC() macro to look up the location of
a uniform so that it can be set by the appropriate glUniform*()
function. This macro is defined as:
#define ULOC(p, x) glGetUniformLocationARB(p, (const GLcharARB*) x)
OK, enough setup; let's look at some shaders.
This first step is quite simple. All we need to do is draw the surface
using a texture that we passed in as a uniform. Because we are using a
TEXTURE_2D object, and because the surface we are drawing was
created with normalized texture coordinates (this was done during the
modeling of the sears.obj file), we don't have much work to do. In the
vertex shader we transform the vertices from object coordinates to
clip coordinates using the ftransform() function. We also pass the
texture coordinates along to the fragment shader by filling
gl_TexCoord[0].
bill.vert
void main(void)
{
    gl_TexCoord[0] = gl_MultiTexCoord0;  // pass texture coordinates through
    gl_Position = ftransform();          // fixed-function equivalent transform
}
The fragment shader now simply takes the texture coordinate and uses
it to determine the color for a particular fragment.
bill.frag
uniform sampler2D tex;
void main(void)
{
    vec3 col = texture2D(tex, gl_TexCoord[0].st).rgb;  // sample the billboard texture
    gl_FragColor = vec4(col, 1.0);
}
For the sky sphere, texture access is slightly different because we
are using a cube map. Here, instead of passing a predetermined texture
coordinate, we pass the normalized object-space position of the current
vertex as the texture lookup; since the sky sphere is modeled around the
origin, this is the direction from the sphere's center out through the
vertex.
sky.vert
varying vec3 E;
void main()
{
    E = vec3(normalize(gl_Vertex));  // direction from the sphere's center
    gl_Position = ftransform();
}
Now in the fragment program we can use this interpolated vector to
access a color from the cube map based on the direction of the current
fragment from the eye.
sky.frag
varying vec3 E;
uniform samplerCube cubemap;
void main(void)
{
    vec3 color = vec3(textureCube(cubemap, E));  // look up sky color by direction
    gl_FragColor = vec4(color, 1.0);
}
That's it. Try this code by compiling and running main.cpp in the
stage1 folder. Next we'll look at applying some lights.
Before applying light, we must adopt a common coordinate system for
the eye, light and normal vectors that we will be
manipulating. Typically these calculations take place in tangent
space, which is a coordinate system centered at the current vertex,
with x, y and z axes aligned to the vertex's tangent, bitangent and
normal vectors respectively. The tangent and bitangent vectors are
only meaningful in that they are used to calculate the relative
directions of the eye and lights from the current vertex.
In this shader, we are working only with the X and Z faces of a cube,
so we can safely set the bitangent vector to positive Y and calculate
the tangent vector by taking the cross product of the bitangent and
normal vectors. This is performed in the vertex shader.
bill.vert
vec3 biTan = vec3(0.0, 1.0, 0.0);                // safe for the X and Z faces of a cube
vec3 n = normalize(gl_NormalMatrix * gl_Normal); // normal in eye space
vec3 b = normalize(gl_NormalMatrix * biTan);     // bitangent in eye space
vec3 t = normalize(cross(b, n));                 // tangent completes the basis
Once we have the tangent, bitangent and normal
vectors, we can convert the eye and light vectors into tangent space
by taking the dot product of each vector with each of the three axes.
bill.vert
varying vec3 E, L;
...
vec3 eye = vec3(gl_ModelViewMatrix * gl_Vertex);   // vertex position in eye space
vec3 lgt = vec3(gl_LightSource[0].position) - eye; // vector from vertex to light
...
L.x = dot(lgt, t);
L.y = dot(lgt, b);
L.z = dot(lgt, n);
L = normalize(L);
E.x = dot(-eye, t);
E.y = dot(-eye, b);
E.z = dot(-eye, n);
E = normalize(E);
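As an aside, the six dot products above are just a change of basis. The
same result can be written more compactly by multiplying through a matrix
whose columns are the tangent-space axes (an equivalent sketch, not how
the demo's shader is written):
// the columns of tbn are t, b and n; a vector-times-matrix product
// dots the vector with each column, which is exactly the three
// dot products performed above
mat3 tbn = mat3(t, b, n);
L = normalize(lgt * tbn);
E = normalize(-eye * tbn);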
Now in the fragment shader we can compute the diffuse and specular
lighting based on the positions of the light and eye with respect to
the fragment. Based on our previous definition of the tangent space,
the normal vector of the surface will always be aligned with the Z
axis.
Specular light reflects off the surface in only one
direction. Therefore, the specular component is determined by reflecting
the light vector across the normal and taking its dot product with the
eye vector. This value is then raised to a power, which controls the size
and intensity of the specular "spot".
Diffuse light scatters off the surface in all directions. The diffuse
component is the dot product of the surface normal with the light
vector. Finally, ambient light is simply a constant contribution to the
fragment's light value. These three components are weighted and mixed
with the color value extracted from the texture to produce the final
fragment value. The lighting weights are specified as constant factors
at the top of the example fragment program.
bill_night.frag
varying vec3 E, L;
uniform sampler2D tex;

const float specFactor = 0.75;
const float diffFactor = 1.0 - specFactor;
const float ambiFactor = 1.2 - diffFactor - specFactor;
const float specPower = 32.0;

void calc_lighting(in vec3 norm, out float spec, out float diff) {
    // note: reflect() is unchanged by negating the normal, so this
    // matches the usual reflect(-L, norm) form
    spec = clamp(dot(reflect(-L, -norm), E), 0.0, 1.0);
    spec = pow(spec, specPower) * specFactor;
    diff = max(dot(L, norm), 0.0) * diffFactor;
}

void main(void)
{
    vec3 col = texture2D(tex, gl_TexCoord[0].st).rgb;
    vec3 norm = vec3(0.0, 0.0, 1.0);  // in tangent space the surface normal is +Z
    float spec, diff;
    calc_lighting(norm, spec, diff);
    col *= spec + diff + ambiFactor;
    gl_FragColor = vec4(col, 1.0);
}
One additional measure is necessary: care must be taken to prevent light
from behind a face from contributing to that face's illumination. For
this reason, the dot products are clamped to the range 0 to 1, which has
the effect of discarding light contributed by sources on the wrong side
of a polygon.
To try out the lighting examples listed here, compile and execute the
stage2 demo. You can use the tab key to switch between day and night.
In order to create a procedural accordion shader, we need a
mathematical representation of the surface. For this demo I've
selected an equilateral triangle as the basic unit. The length of a side
of the triangle is determined by the number of subdivisions of the
texture, which can be changed at run time. Rather than computing the
side length directly, I perform the computations on an equilateral
triangle with unit side length, and scale the results appropriately at
the end.
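To make that final scaling step concrete: if the billboard's textured
width is w (a stand-in symbol, not a name from the demo) and it is split
into divs sections, each triangle side has length

s = w / divs

and, because all equilateral triangles are similar, any length computed
on the unit triangle can simply be multiplied by s afterwards.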
The first step is to calculate the difference between the threshold
vector T and the eye vector E. This tells us on which side of T the
viewpoint lies, left or right, and therefore which inner triangle face
should be projected to the current fragment.
bill.frag
// calculate which side of the triangle
vec2 X = vec2(fract(gl_TexCoord[0].s * divs), 0.0);
vec2 T = normalize(X - V);
vec2 D = E.xz - T;
float side = step(0.0, D.x);
Above, divs is the number of divisions of the billboard along its
width. X then varies from 0 to 1 as the fragment travels along the
surface between two triangle peaks.
D is the difference between the eye vector and T; its x component
will be positive if the left triangle is in view for the current
fragment, and negative if the right triangle is in view.
bill.frag
// calculate the intersection point between the eye vector and triangle
vec2 B = vec2(acos(E.x));
B.y = PI - B.y;
vec2 A = PI - 1.0471976 - B;   // 1.0471976 = pi/3, the equilateral interior angle
vec2 b = vec2(X.x, 1.0-X.x) * sin(B) / sin(A);
b.y = 1.0 - b.y;
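These four lines are an application of the law of sines in the xz
plane. B = acos(E.x) is the angle between the eye ray and the billboard's
base, the triangle's face meets the base at an interior angle of pi/3
(the 1.0471976 constant), and since the angles of a triangle sum to pi,
the angle at the intersection point is A = pi - pi/3 - B. The law of
sines then gives

b / sin(B) = X.x / sin(A),  so  b = X.x * sin(B) / sin(A)

which is the expression computed above, with the mirror-image quantities
for the other face packed into the y components.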
Above, the y components of B and b are used to store the results for the
alternate side. Texture coordinates are generated for both
textures, and colors from both are extracted below. Later, one is
discarded depending on the value of side calculated previously.
bill.frag
// perform skewed texture lookup
vec2 skew = (floor(gl_TexCoord[0].s * divs) + b) / divs;
vec3 col1 = texture2D(tex1, vec2(skew.x, gl_TexCoord[0].t)).rgb;
vec3 col2 = texture2D(tex2, vec2(skew.y, gl_TexCoord[0].t)).rgb;
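The listing stops just before the selection itself. One branchless way
to finish it, reusing the side value computed earlier (a sketch with the
listing's variable names, assuming side = 1.0 means the left face and
tex1 are visible, as described above):
// side is 1.0 for the left face and 0.0 for the right,
// so mix() keeps one color and discards the other
vec3 col = mix(col2, col1, side);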
The next step is to apply lighting. Because the geometry is so
regular, we can easily apply an approximation of
ambient occlusion to our model, based on the depth of the
accessed texel beneath the surface. The ambient occlusion lighting
strategy involves precomputing the accessibility of each vertex to
light sources in the scene. This accessibility factor is based only on
the local topography of the model, not on light locations. I can
achieve good-enough results by simply scaling a constant by the depth
of the intersection point on the triangle. This depth value is
proportional to the length b computed above. Multiplying b by an
empirically determined scale and bias produces the following results:
bill.frag
// apply lighting and ambient occlusion
vec2 ambOc = vec2((1.0-b.x) * (1.0 - ambiOcBias) + ambiOcBias,
b.y * (1.0 - ambiOcBias) + ambiOcBias) * ambiFactor;
[Screenshots: 0 ambient occlusion (left) vs. 0.7 ambient occlusion (right)]
In order for the accordion shader to work correctly with the lighting
model developed in stage2, two additions are required. First, normals
must be generated for each procedurally created face of the
billboard. Second, extra care must be taken when applying light to
these normals, as it is possible for light from behind the actual
surface to still produce a positive dot product with the virtual
normal. Some combination of both actual and virtual normals must be
applied.
Computing the normals is trivial once we've decided which side, or
texture, a fragment must access. In tangent space, only two distinct
normals exist across the billboard, one for the left facing sides, and
one for the right. These normals are stored in constant values norm1
and norm2, which are passed to the lighting function.
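The constants themselves don't appear in the listings here; assuming an
equilateral cross-section, each face tilts 60 degrees away from the
billboard plane, so the two tangent-space normals would be something
like:
// hypothetical values, not copied from the demo: each normal tilts
// +/-60 degrees from the tangent-space Z axis within the XZ plane
const vec3 norm1 = vec3( 0.866, 0.0, 0.5);  // sin(60), 0, cos(60): left-facing sides
const vec3 norm2 = vec3(-0.866, 0.0, 0.5);  // right-facing sides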
bill_night.frag
calc_lighting(norm1, spec, diff);
col1 *= spec + diff + ambOc.x;
calc_lighting(norm2, spec, diff);
col2 *= spec + diff + ambOc.y;
When calculating the lighting values, we use the actual normal to
determine from which side of the polygon the light is incident. We
then attenuate the specular value by a smoothstep of this factor to
avoid the abrupt appearance or disappearance of highlights.
bill_night.frag
void calc_lighting(in vec3 norm, out float spec, out float diff) {
    // we must adjust the standard lighting computations to
    // avoid lighting from behind, given the discontinuous normals
    float side = dot(L, vec3(0.0, 0.0, 1.0));
    spec = clamp(dot(reflect(-L, -norm), E), 0.0, 1.0) *
           smoothstep(0.0, 0.2, side);
    spec = pow(spec, specPower) * specFactor;
    diff = max(dot(L, norm), 0.0) * max(side, 0.0) * diffFactor;
}
The functional code for this demo is in the stage3 folder.
stage3/main.cpp
stage3/shaders/bill.vert
stage3/shaders/bill.frag
stage3/shaders/bill_night.frag
stage3/shaders/sky.vert
stage3/shaders/sky.frag
Stage 4: A Little Creativity
For the last tweaks to the demo, we will focus on the sky map
again. In order to create the illusion that the Sears Tower is super
huge, we're going to skew the cube map (and, in the day scene, actually
invert it) and paint a picture of the earth on the negative-y tile. We
will also add a little fog effect to the day map that increases with
negative y, to enhance the feeling of being above the cloud layer.
sky.frag
varying vec3 E;
uniform samplerCube cubemap;
void main(void)
{
    // skew the cube map, pushing the earth and clouds towards negative y
    vec3 skew = -E * vec3(1.0, 0.7, 1.0) - vec3(0.0, 0.3, 0.0);
    vec3 color = vec3(textureCube(cubemap, skew));
    // add some fog to simulate clouds below
    gl_FragColor = vec4(mix(color, vec3(1.0, 1.0, 1.0),
                            max(-E.y - 0.2, 0.0)), 1.0);
}
That's it! Looks great, huh?! :)
stage4/main.cpp
stage4/shaders/bill.vert
stage4/shaders/bill.frag
stage4/shaders/bill_night.frag
stage4/shaders/sky.vert
stage4/shaders/sky.frag
stage4/shaders/sky_night.frag