Achieving realism in a computer-rendered scene means approximating the physics of how light propagates. Getting this right makes a big difference: convincing soft indirect light and darkened scene corners can compensate for a lack of realism in other areas, and can enhance even many non-photorealistic graphics styles.
In an ideal world, one would directly simulate photons travelling from the light source, bouncing off object surfaces in the scene and ultimately ending up in the virtual “eye” – for every displayed frame. For today’s real-time graphics, especially in the browser, this kind of lighting computation is still too slow to do dynamically. Instead, static objects in the scene are accompanied by a precomputed lightmap that stores the amount of light received by every part of the object surface, computed using some variation of the “radiosity” algorithm. This approach has many limitations, and newer pipelines emulate real-time global illumination quite closely, but lightmaps (and their cousin, ambient occlusion maps aka “AO maps”) will be a standard part of the graphics toolbox for a while.
This kind of computation (“lightmap baking”) is a built-in feature of many popular free 3D engines like Unity and Unreal Engine 4. However, for WebGL graphics workflows such as using ThreeJS and react-three-fiber, there is no “native” way of doing it – lightmaps have to be produced by one of those external engines or by tools like Blender. I wanted to attempt a minimal implementation of such a lightmap baker, one that could run right in the browser, to bridge that gap for WebGL development workflows.
My baker library is built on top of ThreeJS and react-three-fiber, and uses a very simple light probe “hack” while still taking advantage of GPU acceleration. The algorithm computes every lightmap texel by rendering the scene in five cardinal directions (away from the surface, up, down, left, right) – a very simple half-cubemap light probe. The probe pixels are then averaged into a single diffuse irradiance value.
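The averaging step can be sketched roughly like this (not the actual library code – a minimal stand-in that assumes each of the five probe renders has already been read back as a flat RGB array; a more physically careful version would also cosine-weight each pixel by its direction relative to the surface normal):

```javascript
// Collapse the five half-cubemap probe renders into one diffuse
// irradiance RGB value for a single lightmap texel.
// probeFaces: array of Float32Array, each a flat [r, g, b, r, g, b, ...]
function averageProbe(probeFaces) {
  const total = [0, 0, 0];
  let count = 0;
  for (const face of probeFaces) {
    for (let i = 0; i < face.length; i += 3) {
      total[0] += face[i];
      total[1] += face[i + 1];
      total[2] += face[i + 2];
      count += 1;
    }
  }
  // simple unweighted mean across all probe pixels
  return total.map((c) => c / count);
}
```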
The key step is to repeat this process over several passes – this is what creates the soft indirect bounced light effect. Before each pass we set the previous pass’s output as the lightmap for our scene meshes. That allows illumination on those meshes to influence more texels in the next pass. It’s very simple but it works surprisingly well! For the purposes of computing a quick “draft” lightmap in near-real-time, it produces great results. Emissive textures on surfaces are also trivially included in the “baking” process.
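To illustrate why repeated passes produce bounced light, here is a toy version of the idea on an abstract set of surface patches (purely illustrative, not the library's GPU implementation): each pass reads the previous pass's lightmap, so light reaches one bounce further per pass.

```javascript
// Toy multi-pass gather: light propagates one extra bounce per pass.
// emissive: light each patch emits on its own
// formFactors[i][j]: fraction of patch j's light that reaches patch i
function bakePasses(emissive, formFactors, passes) {
  let lightmap = emissive.slice(); // pass 0: only emissive patches glow
  for (let p = 0; p < passes; p++) {
    const next = emissive.slice();
    for (let i = 0; i < lightmap.length; i++) {
      for (let j = 0; j < lightmap.length; j++) {
        // gather light that patch j held at the end of the previous pass
        next[i] += formFactors[i][j] * lightmap[j];
      }
    }
    lightmap = next;
  }
  return lightmap;
}
```

With one pass, only patches directly facing an emissive patch light up; a second pass lets that received light illuminate patches further along the chain, which is exactly the soft indirect bounce effect.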
In order to enable the “everything in the browser” philosophy, I also had to add a quick-and-cheap UV unwrap implementation. Lightmaps usually can’t reuse meshes’ own UV coordinates, because different meshes share the same lightmap UV atlas space: this is why ThreeJS has a separate “uv2” attribute for storing lightmap and AO coordinates. My proof-of-concept simply finds coplanar islands of triangles (i.e. surface n-gons), computes their bounding boxes in tangent space (with some heuristics for how the contents are rotated) and then lays out all those bounding boxes using the existing Potpack library. There are far more space-efficient and flexible layout approaches, of course, but this works well enough for typical small in-browser WebGL scenes.
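For a feel of what the layout step does, here is a naive “shelf” packer standing in for Potpack (the real code uses that library; the `{w, h}` box fields follow its input convention, and the `x`/`y` offsets written back onto each box become the island's position in the uv2 atlas):

```javascript
// Naive shelf packing of per-island bounding boxes into an atlas of
// fixed width. Boxes are placed left to right; when a row fills up,
// a new "shelf" starts below the tallest box of the previous row.
function shelfPack(boxes, atlasWidth) {
  let x = 0;
  let y = 0;
  let shelfHeight = 0;
  for (const box of boxes) {
    if (x + box.w > atlasWidth) {
      x = 0;
      y += shelfHeight;
      shelfHeight = 0;
    }
    box.x = x;
    box.y = y;
    x += box.w;
    shelfHeight = Math.max(shelfHeight, box.h);
  }
  // total atlas extent actually used
  return { w: atlasWidth, h: y + shelfHeight };
}
```

Potpack improves on this by sorting boxes and aiming for a roughly square, tightly filled atlas, but the interface idea is the same: boxes in, per-box offsets out.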
One interesting caveat: the lightmap currently stores only the indirect radiance contributions from the scene. If e.g. a texel is lit by a single direct light and receives no indirect bounces, the lightmap value is zero: in other words, this baker expects all the lights to “stay on” after baking. This is not how typical lightmaps work – part of their point is to be able to turn off static point lights after baking is done, which allows a huge performance win on e.g. large game levels. My reason for this trade-off was to simplify the implementation (I don’t need to compute the direct light contribution) and also to focus on dynamic shadowing effects in small scenes, since that is the “bread and butter” of typical WebGL work.
Finally, the baker implementation theoretically supports separate lightmap layers per light source – so different lights can be turned on and off independently, or have dynamic intensity. Then, for each frame, the layers are modulated and composited at runtime into a single visible lightmap. Even basic scene animation can be supported the same way – for example, when window blinds open in a room, the scene can transition from being dimly lit to being flooded with sunlight.
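The compositing itself is just a weighted sum of layers. A CPU sketch of the idea (function and parameter names are hypothetical; in practice this kind of per-frame blend would likely live in a shader rather than in JavaScript):

```javascript
// Blend per-light baked layers into the single lightmap that materials
// sample each frame. Each layer is scaled by its light's current
// intensity (0 = light off, 1 = baked brightness) and accumulated.
function compositeLayers(layers, intensities, out) {
  out.fill(0);
  layers.forEach((layer, layerIndex) => {
    const k = intensities[layerIndex];
    for (let i = 0; i < layer.length; i++) {
      out[i] += layer[i] * k;
    }
  });
  return out;
}
```

Animating an intensity value over several frames then gives exactly the “blinds opening” transition described above, without any re-baking.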
I also want to add an ambient occlusion render mode. Arguably, AO maps are even more useful than lightmaps for typical WebGL usage (small scenes with dynamic lights), and this baker could produce them with a few simple tweaks.
This proof-of-concept was eventually packaged as the official @react-three/lightmap module and expanded to support Suspense and other workflow niceties. The original project code is open source and available on GitHub.