Ambient And Directional Lighting In Spherical Harmonics

Stupid SH Tricks by Peter-Pike Sloan, released in 2008, is an excellent reference on spherical harmonics (SH). It covers many techniques for representing lighting as SH, including:

  • Scaling SH for a directional light so that it matches when the normal exactly faces the light
  • Extracting an ambient and directional light from environment lighting represented as SH

Extracting lighting from SH has different solutions depending on how you choose to represent your directional lights and whether you are working with L1 (4-term) or L2 (9-term) SH. This post works through the derivations for all of these cases, hopefully improving clarity for anyone looking to implement them, and includes a shadertoy that demonstrates the results.
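To make the L1 case a little more concrete, here is a minimal sketch of the kind of extraction involved, assuming the coefficients are stored in the usual DC, y, z, x order; the signs, ordering and final scale factors are exactly the convention-dependent details the derivation pins down, so treat the values below as defined only up to those factors:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct ExtractedLights {
    float ambient;      // proportional to the band-0 (DC) coefficient
    float directional;  // proportional to the length of the band-1 vector
    Vec3  direction;    // dominant light direction (signs depend on convention)
};

// Sketch: the three linear (band-1) coefficients behave like a vector whose
// direction is the dominant light direction; its length sets the directional
// intensity and the DC term sets the ambient, up to basis-dependent scales.
ExtractedLights extractFromL1(const float sh[4])  // sh = { DC, y, z, x }
{
    Vec3  v   = { sh[3], sh[1], sh[2] };          // reassemble as (x, y, z)
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);

    ExtractedLights out;
    out.direction   = (len > 0.0f) ? Vec3{ v.x / len, v.y / len, v.z / len }
                                   : Vec3{ 0.0f, 0.0f, 1.0f };
    out.directional = len;    // placeholder scale: the correct factor is derived in the post
    out.ambient     = sh[0];  // placeholder scale: likewise
    return out;
}
```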

Polar plots of ambient and directional lighting as SH



Bidirectional Path Tracing In Participating Media

The framework for integration over path space described by Veach REF1 can be extended to integrate over volumes (REF3 and REF4), using the original rendering equation as a boundary condition. An alternative framework was also developed by Lafortune in REF2. In the approaches described by these papers, scattering events are sampled according to the cumulative scattering density along a ray.
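For a concrete picture of that sampling step, here is a minimal sketch for the special case of a homogeneous medium with extinction coefficient sigma_t (an assumption made purely to keep the example short; a heterogeneous medium needs the cumulative density along the ray). The scattering distance is drawn with pdf proportional to the transmittance, and in a renderer the sampled distance is then compared against the nearest surface hit, which is where the rendering equation acts as the boundary condition:

```cpp
#include <cmath>
#include <random>

struct DistanceSample {
    float t;    // sampled scattering distance along the ray
    float pdf;  // probability density of that distance
};

// Sample a free-flight distance in a homogeneous medium: invert the
// exponential CDF so that pdf(t) = sigma_t * exp(-sigma_t * t).
DistanceSample sampleScatterDistance(float sigma_t, std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float xi = u01(rng);

    DistanceSample s;
    s.t   = -std::log(1.0f - xi) / sigma_t;
    s.pdf = sigma_t * std::exp(-sigma_t * s.t);
    return s;
}
```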

Here is some programmer art rendered using a brute force bidirectional path tracer that implements this approach, stopped at around 1000 samples per pixel.

Cornell box with fog and volume caustics

Still noisy, but the volume caustics are working, so it achieves my goals. Since this is brute force sampling (no MLT or manifolds) there are still issues with SDS paths, which show up as unexpected bright spots on the glass sphere. If I manage to resurrect the MLT code for REF5 I'll post an update.

This post is an attempt to gather together the definitions and equations for the weighted path contribution and MIS probability ratios for this sampling technique, as I've not seen this written down all in one place (with all of the interior terms of the weighted path contribution cancelled, etc.). Where possible I've tried to use similar notation to REF1, REF3 and REF4. This post is unapologetically maths-heavy!
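For context, the MIS weights in question are the usual balance-heuristic weights, which are most conveniently evaluated in terms of probability ratios:

$$ w_s(\bar{x}) \;=\; \frac{p_s(\bar{x})}{\sum_i p_i(\bar{x})} \;=\; \frac{1}{\sum_i p_i(\bar{x}) / p_s(\bar{x})} $$

where $p_i$ is the density with which sampling technique $i$ generates the path $\bar{x}$. The ratios $p_i/p_s$ can be accumulated as products of adjacent ratios $p_{i+1}/p_i$, which is why the derivation is phrased in terms of probability ratios rather than full path densities.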



Multiple Scattering

Experimenting with volume rendering and multiple scattering has been on my todo list for a long time, so recently I started trying to understand how scattering events fit into a bidirectional path tracer. I found surprisingly few references that deal with volume scattering in the Veach framework (or more accurately, extend the Veach framework to deal with this). Most useful to me so far were:



Bidirectional Instant Radiosity

Bidirectional Instant Radiosity is the title of a paper by B. Segovia et al. which presents a new sampling strategy to find virtual point lights (VPLs) that are relevant to the camera. The algorithm given for generating VPLs is:

  • Generate $N/2$ "standard" VPLs by sampling light paths (i.e. vanilla instant radiosity)
  • Generate $N/2$ "reverse" VPLs by sampling eye paths (computing their radiance using the $N/2$ standard VPLs)

These $N$ VPLs are then resampled into $N^\prime$ VPLs by considering their estimated contribution to the camera. Finally the $N^\prime$ resampled VPLs are used to render an image of the scene.
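To make the resampling step concrete, here is a minimal sketch (my own illustration, not the paper's code), assuming each candidate VPL comes with a scalar estimate of its contribution to the camera: candidates are picked with probability proportional to that estimate and reweighted so the estimator stays unbiased.

```cpp
#include <cstddef>
#include <random>
#include <vector>

struct VPL {
    // position, normal, flux, etc. would live here in a real renderer
    float weight = 1.0f;  // reweighting factor applied when the VPL is used
};

// Resample N candidate VPLs down to nOut VPLs, picking with probability
// proportional to estimatedContribution[i] (same length as candidates).
std::vector<VPL> resampleVPLs(const std::vector<VPL>& candidates,
                              const std::vector<double>& estimatedContribution,
                              std::size_t nOut, std::mt19937& rng)
{
    double total = 0.0;
    for (double c : estimatedContribution) total += c;

    // Discrete distribution with p_i = estimatedContribution[i] / total.
    std::discrete_distribution<std::size_t> pick(estimatedContribution.begin(),
                                                 estimatedContribution.end());

    std::vector<VPL> out;
    out.reserve(nOut);
    for (std::size_t i = 0; i < nOut; ++i) {
        std::size_t idx = pick(rng);
        VPL v = candidates[idx];
        // Selecting VPL idx has probability p = estimatedContribution[idx] / total,
        // so dividing its weight by (nOut * p) keeps the sum over all candidates unbiased.
        double p = estimatedContribution[idx] / total;
        v.weight /= static_cast<float>(nOut * p);
        out.push_back(v);
    }
    return out;
}
```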

In this post I'll describe how I think this approach can be generalised to generate VPLs using all the path construction techniques of a bidirectional path tracer. As usual I'm going to assume the reader is familiar with bidirectional path tracing in the Veach framework.

I should state that this is an unfinished investigation into VPL sampling. I'm going to describe the core idea and formally define the VPL "virtual sensor", but proper analysis of the results will be part of a future post (and may well indicate that this approach is not advisable).



Sampling Sun And Sky

In this post I will briefly cover how I implemented sampling of external light sources in a path tracing framework, concluding with an observation about sampling multiple external light sources that are non-zero over very different solid angles. I'm going to assume the reader is familiar with path tracing in the Veach framework.



Sketchup Cities

Ray Tracey's latest blog post has Brigade 2 renders of a nice-looking walled city scene created using Google SketchUp. The model came from this gem of a collection by "LordGood" (who evidently is a big Assassin's Creed fan) hosted on Google 3D Warehouse.

Currently I only have a Blender exporter, and sadly the SketchUp-Collada-Blender path was producing garbage, but even the free version of SketchUp allows custom Ruby plugins. After a bit of hunting around I found this OBJ exporter Ruby plugin, which worked very well, and now I have much nicer test meshes than my bad Blender programmer art.

Walled City 1



Virtual Point Light Bias Compensation

Nothing new here, just highlighting the 2004 paper Illumination in the Presence of Weak Singularities by Kollig and Keller. This paper presents a simple and elegant solution to the problem of clamping the geometry term when using virtual point lights for illumination.
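For context (my summary, not the paper's notation): VPL rendering evaluates a geometry term between a shading point $\mathbf{x}$ and a VPL at $\mathbf{y}$,

$$ G(\mathbf{x}, \mathbf{y}) \;=\; V(\mathbf{x}, \mathbf{y}) \, \frac{\cos\theta_{\mathbf{x}} \cos\theta_{\mathbf{y}}}{\lVert \mathbf{x} - \mathbf{y} \rVert^{2}}, $$

which diverges as the two points approach each other, producing bright splotches. The usual fix is to clamp $G$ to some bound $b$, which loses energy near corners; roughly speaking, the paper's observation is that the clamped-away part $\max(G - b, 0)$ can be recovered with a small amount of additional local sampling, removing the bias.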

Consider the following scene with two area lights (in this case rendered using a reference path tracer):

Reference Image (Path Traced)



Projected Solid Angle Is Projected

There are two quantities flying around when writing a physically based renderer:

  • Irradiance, which is power per unit area
  • Radiance, which is power per unit area per unit projected solid angle

Any BSDF is the ratio between the two: a patch receives unit irradiance from some incident direction, and the BSDF gives the radiance emitted in each outgoing direction.

At no point is (non-projected) solid angle used as a measure for the patch in question, yet many references and implementations consider probability density relative to this measure. In this post I will argue that using projected solid angle directly is more natural.
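To make the two measures concrete: writing $\theta$ for the angle to the patch normal, the BSDF is the ratio of differential outgoing radiance to incident irradiance, projected solid angle differs from solid angle by exactly a factor of $\cos\theta$, and a pdf expressed per solid angle converts to one per projected solid angle by dividing by that cosine:

$$ f(\omega_i \to \omega_o) = \frac{\mathrm{d}L_o(\omega_o)}{\mathrm{d}E(\omega_i)}, \qquad \mathrm{d}\sigma^{\perp}(\omega) = |\cos\theta| \, \mathrm{d}\sigma(\omega), \qquad p_{\sigma^{\perp}}(\omega) = \frac{p_{\sigma}(\omega)}{|\cos\theta|}. $$

That cosine factor is exactly the bookkeeping that working in the projected measure from the start avoids.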



Hybrid Bidirectional Path Tracing

I'd like to share my results from converting a CPU-only bidirectional path tracer into a CPU/GPU hybrid (CPU used for shading and sampling, GPU used for ray intersections). These results are a bit old... I posted them a while ago as a thread on ompf. I found out later that this thread had been cited in Combinatorial Bidirectional Path-Tracing for Efficient Hybrid CPU/GPU Rendering, so let me summarise it here.



Now You're Lighting With Portals

I hate dome lights. You always waste a ton of rays that are occluded by geometry, and the situation gets even worse when lighting indoor scenes with exterior dome lights!

So why not help your renderer out and place portals that, when hit, teleport to the dome light? Then instead of sampling the whole skydome, we just sample the portals, avoiding sending rays where we know they will be occluded.
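To illustrate the sampling side of this (a minimal sketch of my own, not the renderer's actual code), assume a rectangular portal: pick a uniform point on the rectangle, use the direction towards it as the dome-light sample, and convert the uniform area pdf into a solid-angle pdf at the shading point so it can feed into multiple importance sampling as usual.

```cpp
#include <cmath>
#include <random>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return { x + o.x, y + o.y, z + o.z }; }
    Vec3 operator-(const Vec3& o) const { return { x - o.x, y - o.y, z - o.z }; }
    Vec3 operator*(float s)       const { return { x * s, y * s, z * s }; }
};
inline float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Portal {
    Vec3  corner;  // one corner of the rectangle
    Vec3  edge0;   // first edge vector
    Vec3  edge1;   // second edge vector
    Vec3  normal;  // unit normal
    float area;    // |edge0 x edge1|
};

struct PortalSample {
    Vec3  dir;   // direction from the shading point towards the sampled point
    float dist;  // distance to the sampled point (shadow ray length)
    float pdf;   // pdf with respect to solid angle at the shading point
};

PortalSample samplePortal(const Portal& portal, const Vec3& shadingPoint,
                          std::mt19937& rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);

    // Uniform point on the rectangle.
    Vec3 p = portal.corner + portal.edge0 * u01(rng) + portal.edge1 * u01(rng);

    Vec3  toPortal = p - shadingPoint;
    float dist     = std::sqrt(dot(toPortal, toPortal));
    Vec3  dir      = toPortal * (1.0f / dist);

    // Convert the uniform area pdf (1 / area) into a solid-angle pdf.
    float cosAtPortal = std::fabs(dot(dir, portal.normal));

    PortalSample s;
    s.dir  = dir;
    s.dist = dist;
    s.pdf  = (dist * dist) / (cosAtPortal * portal.area);
    return s;
}
```

Rays that make it through the portal are then shaded with the dome radiance in that direction, which is the "teleport" described above.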

As an example, here's the Sponza scene using an exterior (uniform) dome light, rendered using unidirectional path tracing with multiple importance sampling:

Dome Sampling (32spp)



All Posts...