Generating multiple sample positions per pixel

Anti-aliasing in ray tracing requires casting multiple rays per pixel, in order to sample the whole solid angle subtended by the imaginary surface of each pixel, if we consider it to be a small rectangular part of the view plane (see diagram).

framebuffer pixel area

Obviously, to generate multiple primary rays for each pixel, we need an algorithm that, given the sample number, calculates a sample position within the area of a pixel. Since it’s trivial to map points in the unit square onto the actual area of an arbitrary pixel, it makes sense to write this sample position generation function so that it calculates points in the unit square.
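Mapping such a unit-square sample onto an actual pixel is indeed straightforward. The following sketch illustrates one way to do it; the function and parameter names (map_sample, xres, yres, aspect) are my own assumptions for illustration, not part of any particular renderer:

```c
#include <math.h>

/* Illustration: map a sample offset in [-0.5, 0.5]^2 to a point on the
 * view plane for pixel (px, py). All names here are hypothetical. */
void map_sample(int px, int py, const float *spos, int xres, int yres,
		float aspect, float *vx, float *vy)
{
	/* pixel center plus sample offset, in [0, 1], then mapped to [-1, 1] */
	*vx = (2.0f * ((float)px + 0.5f + spos[0]) / (float)xres - 1.0f) * aspect;
	*vy = 1.0f - 2.0f * ((float)py + 0.5f + spos[1]) / (float)yres;
}
```

The resulting (vx, vy) point can then be used directly to construct a primary ray through the view plane.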

The easiest way to write such a function would be to generate random points in the unit square centered on the origin, like this:

void get_sample_pos(float *pos)
{
    /* rand() / RAND_MAX is in [0, 1], so shift it to [-0.5, 0.5] */
    pos[0] = (float)rand() / (float)RAND_MAX - 0.5f;
    pos[1] = (float)rand() / (float)RAND_MAX - 0.5f;
}

The problem with this approach is that, even if our random number generator really has a perfectly uniform distribution, any finite number of sample positions, especially if that number is in the low tens, will probably not cover the area of the pixel in anything resembling a uniform sampling. Clusters are very likely to occur, leaving large areas of the pixel space unexplored by our rays.
As the number of samples gets larger, this problem is somewhat mitigated, but unless we’re writing a path tracer, we’re usually dealing with anything between 4 and 20 rays per pixel, no more.

The following animation shows random sample positions generated by the code above. Even at about 40 samples, the left part of the pixel is inadequately sampled.
random sample generation

Another approach is to avoid randomness. The following function takes the sample number as input and calculates its position by recursively subdividing the pixel area, taking care to spread the samples of each recursion level around as much as possible, instead of focusing on one quadrant at a time.

static void get_sample_pos_rec(int sidx, float xsz, float ysz, float *pos);

void get_sample_pos(int sidx, float *pos)
{
    pos[0] = pos[1] = 0.0f;
    get_sample_pos_rec(sidx, 1.0f, 1.0f, pos);
}

static void get_sample_pos_rec(int sidx, float xsz, float ysz, float *pos)
{
    int quadrant;
    static const float subpt[4][2] = {
        {-0.25, -0.25}, {0.25, -0.25}, {-0.25, 0.25}, {0.25, 0.25}
    };

    /* base case: sample 0 is always in the middle, do nothing */
    if(!sidx) return;

    /* determine which quadrant to recurse into */
    quadrant = ((sidx - 1) % 4);
    pos[0] += subpt[quadrant][0] * xsz;
    pos[1] += subpt[quadrant][1] * ysz;

    get_sample_pos_rec((sidx - 1) / 4, xsz / 2, ysz / 2, pos);
}
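For illustration, the recursion above can also be unrolled into an equivalent loop, which makes it easy to verify the first few samples by hand: sample 0 lands at (0, 0), sample 1 at (-0.25, -0.25), and sample 5 at (-0.375, -0.375). The function name get_sample_pos_iter is mine, purely for this sketch:

```c
/* Iterative equivalent of the recursive subdivision sampler; the name
 * get_sample_pos_iter is hypothetical, used only for illustration. */
void get_sample_pos_iter(int sidx, float *pos)
{
	static const float subpt[4][2] = {
		{-0.25, -0.25}, {0.25, -0.25}, {-0.25, 0.25}, {0.25, 0.25}
	};
	float sz = 1.0f;

	pos[0] = pos[1] = 0.0f;

	while(sidx) {
		int quadrant = (sidx - 1) % 4;
		pos[0] += subpt[quadrant][0] * sz;
		pos[1] += subpt[quadrant][1] * sz;
		sidx = (sidx - 1) / 4;
		sz *= 0.5f;
	}
}
```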

And here’s the animation showing that code in action (colors denote the recursion depth):
regular subdivision sampling

This sampling is perfectly uniform, but it’s still not ideal. The problem is that whenever we’re sampling in a regular grid, no matter how fine that grid is, we will introduce aliasing. By breaking up each pixel into multiple subpixels like this we effectively increase the cutoff frequency after which aliasing occurs, but we do not eliminate it.

The best solution is to combine both techniques. We need randomness to convert aliasing into noise, which is much less perceptible by human brains trained by evolution to recognize patterns. But we also need uniform sampling to properly explore the whole area of each pixel.

So, we’ll employ a technique known as jittering: first we uniformly subdivide the pixel into subpixels, and then we randomly perturb the sample position of each subpixel inside the area of that subpixel. The following code implements this algorithm:

static void get_sample_pos_rec(int sidx, float xsz, float ysz, float *pos);

void get_sample_pos(int sidx, float *pos)
{
    pos[0] = pos[1] = 0.0f;
    get_sample_pos_rec(sidx, 1.0f, 1.0f, pos);
}

static void get_sample_pos_rec(int sidx, float xsz, float ysz, float *pos)
{
    int quadrant;
    static const float subpt[4][2] = {
        {-0.25, -0.25}, {0.25, -0.25}, {-0.25, 0.25}, {0.25, 0.25}
    };

    if(!sidx) {
        /* we're done, just add appropriate jitter */
        pos[0] += (float)rand() / (float)RAND_MAX * xsz - xsz / 2.0;
        pos[1] += (float)rand() / (float)RAND_MAX * ysz - ysz / 2.0;
        return;
    }

    /* determine which quadrant to recurse into */
    quadrant = ((sidx - 1) % 4);
    pos[0] += subpt[quadrant][0] * xsz;
    pos[1] += subpt[quadrant][1] * ysz;

    get_sample_pos_rec((sidx - 1) / 4, xsz / 2, ysz / 2, pos);
}

And here’s the animation showing the jittered sample position generator in action:
Jittered sample generator

S-ray goes public

S-ray is a photorealistic off-line renderer I’ve been writing on and off for the past few months. It’s basically a Monte Carlo ray tracer, using the photon mapping algorithm to simulate global illumination effects such as indirect diffuse lighting and caustics.

So, it’s time to open up the code and continue development in the open. The code is available only through subversion at the moment; no official “0.1” release will be made until at least the most important outstanding bugs are fixed. However, feel free to check out the code and play around all you want.

The project is hosted at googlecode, and you can find it at: http://code.google.com/p/sray
As always, ideas, comments, or bug fixes are more than welcome.

s-ray: caustics through photon mapping

The main feature set that I had planned for my new s-ray renderer is almost complete. The renderer is now capable of producing caustics, that is, light focused onto surfaces by curved mirrors or lenses (or, more generally, by light transmitted through refractive objects).

Standard recursive ray tracing is incapable of producing caustics; instead a two-pass algorithm called photon mapping is used. Photon mapping works by tracing random photons as they leave the light sources and bounce around the scene, finally ending up on various surfaces. The second pass works just like regular ray tracing, with the difference that on every ray hit, the density of stored photons around that point is used to estimate the amount of light that has arrived there after being reflected or refracted.
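The density estimate in the second pass can be sketched roughly as follows. This is a simplified illustration with my own struct layout and function name, not code from s-ray; a real implementation would also weigh each photon by the surface BRDF. It sums the power of the photons gathered near the shading point and divides by the area of the disc that contains them:

```c
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Hypothetical photon record: position and RGB power. */
struct photon {
	float pos[3];
	float power[3];
};

/* Simplified sketch of the photon-map density estimate: given the
 * photons found within `radius` of the shading point, approximate the
 * irradiance as the total photon power over the disc area pi * r^2. */
void photon_density_estimate(const struct photon *phot, int count,
		float radius, float *irrad)
{
	int i, j;
	float area = M_PI * radius * radius;

	irrad[0] = irrad[1] = irrad[2] = 0.0f;

	for(i = 0; i < count; i++) {
		for(j = 0; j < 3; j++) {
			irrad[j] += phot[i].power[j];
		}
	}
	for(j = 0; j < 3; j++) {
		irrad[j] /= area;
	}
}
```

The smaller the disc that contains a given amount of photon power, the higher the estimated irradiance, which is exactly why photons concentrated by a lens or mirror show up as bright caustics.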

Here are a couple of images produced by s-ray, showing both reflective and refractive caustics:

reflective caustics

reflective caustics

refractive caustics

refractive caustics

Raytracing Anamorphic Images

A long time ago, I stumbled upon a couple of strikingly odd images on Jim Arvo’s web site, which are apparently called “anamorphic”. The idea behind an anamorphic image is that it’s distorted in such a way that its true shape can be seen only when viewed in a particular manner. In the case of these images, you’re supposed to print the image and place a highly reflective cylindrical object, such as a chrome pipe, at a specific marked location in order to see the geometric shapes correctly.

I kept the images back then, with the purpose of either finding an appropriate cylindrical object, or raytracing them to see what they look like, but for some reason I forgot all about them until I accidentally found them again yesterday, in a dusty corner of my filesystem.

So I decided to hack some code to add perfect (x^2 + y^2 = r^2) cylinder primitives to my raytracer, and do a couple of renderings with those images texture-mapped onto a ground quad (I could just do the same thing with another raytracer such as pov-ray, but where’s the fun in that?).

So anyway, here are the anamorphic images along with the renderings (click on the images for the full rendering):