Notes for February 9 class -- Introduction to Ray Tracing, part 1

WebGL Cheat Sheet

As I mentioned last week, there is a convenient compact guide to WebGL.

The last page in particular is useful for writing shaders in OpenGL ES.

Note that, as we discussed in class, OpenGL ES fragment shaders do not allow recursion.

 

Gamma correction

Displays are adjusted for human perception using a "gamma curve", since people can perceive a very large range of brightness.

For example, the image to the right shows the input values 0...255 on the horizontal axis and the resulting actual displayed brightness on the vertical axis.

Various displays differ, but this adjustment is generally approximately x².

Since all of our calculations are really summing actual photons of light, we need to do all our math in linear brightness, and then do a gamma correction when we are all finished:

Roughly: c → sqrt(c)
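
For example, in a fragment shader the gamma step might look roughly like this (the particular color value below is just a placeholder for whatever your lighting math produces):

    vec3 color = vec3(0.5, 0.2, 0.1); // placeholder: result of all lighting math, in linear brightness
    color = sqrt(color);              // rough gamma correction: c -> sqrt(c), applied per component
    gl_FragColor = vec4(color, 1.0);  // write the gamma-corrected color to this pixel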

Ray tracing: Forming a ray

At each pixel, a ray from the origin V = (0,0,0) shoots into the scene, hitting a virtual screen which is on the z = -f plane.

We refer to f as the "focal length" of this virtual camera.

The ray toward any pixel aims at: (x, y, -f), where -1 ≤ x ≤ +1 and -1 ≤ y ≤ +1.

So the ray direction W = normalize( vec3(x, y, -f) ). Note that normalize() is a built-in function for OpenGL ES.

In order to render an image of a scene at any pixel, we need to follow the ray at that pixel and see what object (if any) the ray encounters first.

In other words, we want the nearest object at a point V + Wt, where t > 0.
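
As a rough GLSL sketch (assuming x and y have already been mapped into the -1 to +1 range from the pixel coordinates, and focalLength is just a name I am choosing here for f):

    float focalLength = 3.0;                        // f: pick whatever focal length you like
    vec3  V = vec3(0.0, 0.0, 0.0);                  // ray origin at the eye
    vec3  W = normalize(vec3(x, y, -focalLength));  // ray direction toward the pixel at (x, y, -f)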

 

Ray tracing: Defining a sphere


We can describe a sphere in a GLSL vector of length 4:
vec4 sphere = vec4(x,y,z,r);
where (x,y,z) is the center of the sphere, and r is the sphere radius.

As we discussed in class, the components of a vec4 in GLSL can be accessed in one of two ways:

v.x, v.y, v.z, v.w // when thought of as a 4D point
v.r, v.g, v.b, v.a // when thought of as a color + alpha
So to access the value of your sphere's radius in a fragment shader vec4, you will want to refer to it as sphere.w (not sphere.r).
 

Ray tracing: Finding where a ray hits a sphere

D = V - sph.xyz // vector from center of sphere to ray origin.

The point of doing this subtraction is that we are then effectively solving the much easier problem of ray tracing to a sphere at the origin (0,0,0).

So instead of solving the more complex problem of tracing a ray V+Wt to a sphere centered at sph.xyz, we are instead solving the equivalent, but much simpler, problem of tracing the ray D+Wt to a sphere centered at (0,0,0).

(D + Wt)² = sph.r² // find a point along the ray that is distance r from the sphere center.

(D + Wt)² - sph.r² = 0 // need to solve for t.

Generally, if a and b are vectors, then a • b = ( a.x * b.x + a.y * b.y + a.z * b.z )

This "inner product" also equals: |a| * |b| * cos(θ)

Multiplying the terms out, we need to solve the following quadratic equation:

(W • W) t² +
2 (W • D) t +
(D • D) - r² = 0

Since ray direction W is unit length, the first term in this equation (W • W) is just 1.
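
Putting this together, a minimal GLSL sketch of the intersection test might look like the following (the function name raySphere and the convention of returning -1.0 for a miss are my own choices, not anything required):

    // Returns the smaller root t for the ray V + Wt against sphere sph,
    // or -1.0 if the ray misses. The caller should still check that t > 0.
    // Assumes W is unit length, so the (W . W) coefficient is just 1.
    float raySphere(vec3 V, vec3 W, vec4 sph) {
       vec3  D = V - sph.xyz;                // shift so the sphere is centered at the origin
       float b = dot(W, D);                  // half of the middle coefficient 2 (W . D)
       float c = dot(D, D) - sph.w * sph.w;  // (D . D) - r^2 -- remember the radius is sph.w
       float discrim = b * b - c;            // discriminant (divided by 4)
       if (discrim < 0.0)
          return -1.0;                       // no real roots: the ray misses the sphere
       return -b - sqrt(discrim);            // the nearer of the two roots
    }

This simple version ignores the case where the ray origin is inside the sphere (where only the farther root is positive).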
 

Ray tracing: Finding the nearest intersection

Compute the intersection of the ray with every sphere in the scene.
The visible sphere at this pixel, if any, is the one with the smallest positive t.
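
For example, building on the raySphere sketch above and on an array of spheres like the one in the homework hint below, the selection might look like:

    float tMin = 10000.0;          // "no hit yet": any value larger than the whole scene works
    vec4  nearestSph;              // the sphere that produced tMin
    bool  hitSomething = false;
    for (int i = 0 ; i < 3 ; i++) {
       float t = raySphere(V, W, spheres[i]);
       if (t > 0.0 && t < tMin) {  // keep only the smallest positive t
          tMin = t;
          nearestSph = spheres[i];
          hitSomething = true;
       }
    }
    // If hitSomething is still false, this ray missed everything, and we can show a background color.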
 

Ray tracing: Computing the surface point

Once we know the value of t, we can just plug it into the ray equation to get the location of the surface point on the sphere that is visible along this ray, as shown in the equation below and in the figure to the right:

S = V + W t
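
In GLSL, using the names from the sketches above (where tMin was the smallest positive t we found):

    vec3 S = V + tMin * W;   // surface point where the ray meets the nearest sphere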
 

Ray tracing: Computing the surface normal

We now need to compute the surface normal in order to compute how the sphere interacts with light to produce a final color at this pixel.

The "surface normal" is the unit length vector that is perpendicular to the surface of the sphere -- the "up" direction if you are standing on the surface.

For a sphere, we can get this by subtracting the center of the sphere from the surface point S, and then dividing by the radius of the sphere:

N = (S - sph.xyz) / sph.r
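
Continuing the same sketch, and remembering from the earlier note that in the shader the radius lives in the .w component:

    vec3 N = (S - nearestSph.xyz) / nearestSph.w;   // unit-length surface normal at S

Dividing by the radius is what makes N unit length, so no call to normalize() is needed here.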
 

 

At the end of the class we saw a video:

The Academy Award-winning short Ryan by Chris Landreth.

 

Homework

  • Implement simple ray tracing for spheres in a fragment shader.

    As I said in class, I am going to give you some code to get you started.

    For now, you can render your sphere just by showing its surface normal vector N as a color: red for x component, green for y component and blue for z component. I've already done that for you in the starter code I've provided.

    If you get things basically working, you might get something that looks more or less like this:

  • Extra credit: multiple spheres
    Hint: You can create and use an array of spheres as:
          vec4 spheres[3]; // Note that I am explicitly setting the size of the array.
          ...
          for (int i = 0 ; i < 3 ; i++) { // and also the size of the loop.
             ... spheres[i] ...
          }
       
    If you try this extra credit, make sure that at each pixel you choose the sphere with the smallest value of t.

  • Try to create something interactive (using uCursor) and/or animated (using uTime).