
Notes for February 11 class: Introduction to Ray Tracing
WebGL Cheat Sheet
Here is a compact guide to WebGL.
The part that is relevant to GLSL for fragment shaders is on the last page.
Note that GLSL fragment shaders do not allow recursion.
 

Gamma correction
Displays are adjusted for human perception using a "gamma curve",
since people can perceive a very large range of brightness.
For example, the image to the right shows the values 0...255 on the horizontal
axis, and the resulting actual displayed brightness on the vertical axis.
Displays vary, but this adjustment is generally approximately x^2.
We need to do all our math in linear brightness, and then apply a gamma correction at the end.
Roughly: c → sqrt(c)
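As a quick sanity check on this idea, here is a small Python sketch (not part of the shader) using exponent 1/2 to match the sqrt(c) approximation above; real displays are closer to gamma 2.2:

```python
def gamma_correct(c, gamma=2.0):
    """Convert a linear brightness value in [0, 1] to display space.

    gamma=2.0 matches the sqrt(c) approximation in the notes;
    real displays are closer to gamma=2.2.
    """
    return c ** (1.0 / gamma)

def display_to_linear(v, gamma=2.0):
    """Inverse: the linear brightness a display value produces."""
    return v ** gamma

# A mid-gray display value of 0.5 corresponds to only 25% linear brightness:
linear = display_to_linear(0.5)    # 0.25
corrected = gamma_correct(linear)  # back to 0.5
```

This is why averaging or blending colors without converting to linear space first produces images that look too dark.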

Ray tracing: Forming a ray
At each pixel, a ray from the origin V = (0,0,0) shoots into the scene, hitting a virtual screen which is on the z = f plane.
We refer to f as the "focal length" of this virtual camera.
The ray toward any pixel aims at: (x, y, f), where −1 ≤ x ≤ +1 and −1 ≤ y ≤ +1.
So the ray direction W = normalize( vec3(x, y, f) ).
We need to see what gets hit along the ray: V + Wt, where t > 0
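The ray-forming step above can be sketched in Python (the function name and the sample focal length are illustrative, not from the notes):

```python
import math

def ray_direction(x, y, f):
    """Unit direction W from the eye V = (0,0,0) toward screen point (x, y, f)."""
    length = math.sqrt(x*x + y*y + f*f)
    return (x / length, y / length, f / length)

# The center pixel (x = y = 0): the ray points straight down the z axis.
W = ray_direction(0.0, 0.0, 3.0)  # (0.0, 0.0, 1.0)
```

Normalizing W here is what lets us treat t as actual distance along the ray, and it simplifies the sphere intersection later.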
 

Ray tracing: Defining a sphere
We can describe a sphere in a GLSL vector of length 4:
vec4 sphere = vec4(x,y,z,r);
where (x,y,z) is the center of the sphere, and r is the sphere radius.
As we discussed in class, the components of a vec4 in GLSL can be accessed in one of
two ways:
v.x, v.y, v.z, v.w // when thought of as a 4D point
v.r, v.g, v.b, v.a // when thought of as a color + alpha
So to access the value of your sphere's radius,
you will want to refer to it as sphere.w (not sphere.r).
 

Ray tracing: Finding where a ray hits a sphere
((V − s.xyz) + W t) • ((V − s.xyz) + W t) = r^2
(W t + (V − s.xyz)) • (W t + (V − s.xyz)) − r^2 = 0
where r = s.w is the sphere's radius.
Generally, if a and b are vectors, then:
a • b = a_x * b_x + a_y * b_y + a_z * b_z
This "inner product" also equals: |a| |b| cos(θ), where θ is the angle between a and b.
Multiplying the terms out, we need to solve the following quadratic equation in t:
(W • W) t^2 +
2 (W • (V − s.xyz)) t +
((V − s.xyz) • (V − s.xyz)) − r^2 = 0
Since W is unit length, the first term in this equation, W • W, is just 1.
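A minimal sketch of this solve, in Python rather than GLSL (function name is mine; with A = 1, the quadratic formula gives t = (−B ± sqrt(B^2 − 4C)) / 2):

```python
import math

def ray_sphere_t(V, W, sphere):
    """Smallest positive t where ray V + W*t hits sphere (x, y, z, r),
    or None if the ray misses. W must be unit length, so A = 1."""
    cx, cy, cz, r = sphere
    dx, dy, dz = V[0] - cx, V[1] - cy, V[2] - cz   # V - s.xyz
    B = 2.0 * (W[0]*dx + W[1]*dy + W[2]*dz)        # 2 (W . (V - s.xyz))
    C = dx*dx + dy*dy + dz*dz - r*r                # (V - s.xyz).(V - s.xyz) - r^2
    disc = B*B - 4.0*C
    if disc < 0.0:
        return None                                # ray misses the sphere
    sq = math.sqrt(disc)
    for t in ((-B - sq) / 2.0, (-B + sq) / 2.0):   # try the nearer root first
        if t > 0.0:
            return t
    return None

# Ray down +z from the origin, unit sphere centered at (0, 0, 5):
t = ray_sphere_t((0, 0, 0), (0, 0, 1), (0, 0, 5, 1))  # hits at t = 4
```

Note the translation into a GLSL fragment shader is direct, since vec3/vec4 arithmetic and dot() map one-to-one onto the sums above.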
 

Ray tracing: Finding the nearest intersection
Compute the ray's intersection with every sphere in the scene.
The visible sphere is the one hit at the smallest positive t, if any.
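That loop can be sketched as follows (Python, self-contained, with the same illustrative intersection helper as before):

```python
import math

def ray_sphere_t(V, W, sphere):
    """Smallest positive hit distance along V + W*t, or None on a miss."""
    cx, cy, cz, r = sphere
    dx, dy, dz = V[0] - cx, V[1] - cy, V[2] - cz
    B = 2.0 * (W[0]*dx + W[1]*dy + W[2]*dz)
    C = dx*dx + dy*dy + dz*dz - r*r
    disc = B*B - 4.0*C
    if disc < 0.0:
        return None
    sq = math.sqrt(disc)
    for t in ((-B - sq) / 2.0, (-B + sq) / 2.0):
        if t > 0.0:
            return t
    return None

def nearest_sphere(V, W, spheres):
    """(index, t) of the visible sphere — smallest positive t — or None."""
    best = None
    for i, s in enumerate(spheres):
        t = ray_sphere_t(V, W, s)
        if t is not None and (best is None or t < best[1]):
            best = (i, t)
    return best

spheres = [(0, 0, 5, 1), (0, 0, 3, 0.5)]
hit = nearest_sphere((0, 0, 0), (0, 0, 1), spheres)  # (1, 2.5): the nearer sphere wins
```

In GLSL this would be a loop over a uniform array of vec4 spheres, tracking the smallest positive t so far.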
 

Ray tracing: Computing the surface point
Once we know the value of t, we can plug it into the ray equation to get the surface point on the sphere that is visible along this ray, as shown in the equation below and in the figure to the right:
S = V + W t
 

Ray tracing: Computing the surface normal
We now need to compute the surface normal in order to compute how the sphere interacts with light to produce a final color at this pixel.
The "surface normal" is the unit length vector that is perpendicular to the surface of the sphere: the "up" direction if you are standing on the surface.
For a sphere, we can get this by subtracting the center of the sphere from the surface point S, and then dividing by the radius of the sphere:
N = (S − s.xyz) / s.w
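Both of these steps are one-liners; a Python sketch (helper names are mine):

```python
def surface_point(V, W, t):
    """S = V + W * t"""
    return tuple(v + w * t for v, w in zip(V, W))

def sphere_normal(S, sphere):
    """N = (S - center) / radius: unit normal at surface point S."""
    cx, cy, cz, r = sphere
    return ((S[0] - cx) / r, (S[1] - cy) / r, (S[2] - cz) / r)

sphere = (0, 0, 5, 1)
S = surface_point((0, 0, 0), (0, 0, 1), 4.0)  # (0, 0, 4): front of the sphere
N = sphere_normal(S, sphere)                  # (0, 0, -1): points back at the eye
```

Dividing by the radius works because S is exactly r away from the center, so no normalize() call is needed.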
 

Ray tracing: A simple shader
A simple light source at infinity can be defined as an rgb color, together with an xyz light direction:
vec3 lightColor; // rgb
vec3 lightDirection; // xyz
A diffuse simple surface material can be defined by an rgb "ambient" component,
which is independent of any particular light source, and an rgb "diffuse" component,
which is added for each light source:
vec3 ambient; // rgb
vec3 diffuse; // rgb
The figure to the right shows a diffuse sphere, where the color at
each pixel is computed as:
ambient + lightColor * diffuse * max(0, normal • lightDirection)
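Evaluated per channel, that formula might look like this in Python (function name and sample values are illustrative; lightDirection is assumed to point toward the light and be unit length):

```python
def shade_diffuse(ambient, diffuse, light_color, light_dir, normal):
    """Per channel: ambient + lightColor * diffuse * max(0, N . L)."""
    ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(a + lc * d * ndotl
                 for a, lc, d in zip(ambient, light_color, diffuse))

# A surface point facing the eye (normal (0,0,-1)), lit head-on:
color = shade_diffuse(
    ambient=(0.1, 0.1, 0.1),
    diffuse=(0.8, 0.2, 0.2),
    light_color=(1.0, 1.0, 1.0),
    light_dir=(0.0, 0.0, -1.0),
    normal=(0.0, 0.0, -1.0),
)
# color is approximately (0.9, 0.3, 0.3)
```

The max(0, ...) clamp is what keeps surfaces facing away from the light from being lit with "negative" brightness.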
 

Ray tracing: Multiple light sources
If the scene contains more than one light source, then pixel color
for points on the sphere can be computed by:
ambient + ∑_{n} (
lightColor_{n} * diffuse * max(0, normal • lightDirection_{n})
)
where n iterates over the lights in the scene.
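Extending the single-light shader to that sum is just a loop; a Python sketch (again with illustrative names):

```python
def shade_multi(ambient, diffuse, lights, normal):
    """ambient + sum over lights of lightColor_n * diffuse * max(0, N . L_n)."""
    r, g, b = ambient
    for light_color, light_dir in lights:
        ndotl = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
        r += light_color[0] * diffuse[0] * ndotl
        g += light_color[1] * diffuse[1] * ndotl
        b += light_color[2] * diffuse[2] * ndotl
    return (r, g, b)

lights = [((1.0, 1.0, 1.0), (0.0, 0.0, -1.0)),
          ((0.5, 0.5, 0.5), (0.0, 0.0, 1.0))]  # second light is behind: no effect
color = shade_multi((0.1, 0.1, 0.1), (0.4, 0.4, 0.4), lights, (0.0, 0.0, -1.0))
# approximately (0.5, 0.5, 0.5): ambient plus one visible light
```

Note that ambient is added once, outside the loop, since it is independent of any particular light source.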


 