Planet Tech Art
Last update: April 26, 2017 08:59 AM
April 25, 2017

Real Time Global Illumination

In keeping with a lot of the older posts on this blog I thought I'd write about the realtime GI system I'm using in a project I'm working on. It's a complete fresh start from the GI stuff I've written about previously. Previous efforts were based on volume textures but dealing with the sampling issues is a pain in the ass so I've switched to good old fashioned lightmaps. This is all a lot of effort to go to so why bother? The short answer is I love the way it looks. As a bonus it simplifies the lighting process and there's a subtlety to the end results that is very hard to achieve without some sort of physically based light transport. A single light can illuminate an entire scene and the bounce light helps ground and bind all the elements together.
The process can be divided into five stages: lightmap UVs, surfel creation, surfel clustering, visibility sampling and realtime update. The clustering method was inspired by this JCGT article; however, I'm not using spherical harmonics and I generate surfels and form factor weights differently. The JCGT article is fantastic and well worth a read.

Before you run off, here it is in action.

Lightmap UVs

The lighting result is stored in a lightmap, so the first step is a good set of UVs. These lightmaps are small and every pixel counts, so you have to be pretty fussy about how the UVs are laid out. UV verts are snapped to pixel centers and there needs to be at least one pixel between all charts in order to prevent bilinear sampling from pulling in incorrect charts. The meshes are unwrapped in Blender then packed via a custom command line tool. This uses a brute force method that simply tests each potential chart position in turn; for simple scenes and pack regions up to 256x256 the performance is acceptable.
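The brute force packer can be sketched roughly like this (a minimal Python illustration, not the actual tool; all names are mine, and chart masks are assumed to already include the one-pixel gutter):

```python
# Brute force chart packing sketch: try every position in scanline order
# and keep the first spot where the chart's occupancy mask fits.

def fits(region, chart, ox, oy):
    """True if `chart` fits in `region` with its origin at (ox, oy)."""
    for y, row in enumerate(chart):
        for x, occupied in enumerate(row):
            if occupied and region[oy + y][ox + x]:
                return False
    return True

def place(region, chart, ox, oy):
    for y, row in enumerate(chart):
        for x, occupied in enumerate(row):
            if occupied:
                region[oy + y][ox + x] = True

def pack_charts(charts, size):
    """Pack charts (largest first) into a size x size boolean grid.

    Returns placements in packing order; a real tool would also track
    which input chart each placement belongs to.
    """
    region = [[False] * size for _ in range(size)]
    placements = []
    for chart in sorted(charts, key=lambda c: -sum(map(sum, c))):
        h, w = len(chart), len(chart[0])
        for oy in range(size - h + 1):
            for ox in range(size - w + 1):
                if fits(region, chart, ox, oy):
                    place(region, chart, ox, oy)
                    placements.append((ox, oy))
                    break
            else:
                continue
            break
        else:
            raise ValueError("chart does not fit in pack region")
    return placements
```

Even at O(charts x positions x pixels) this stays tolerable for the small regions involved, which matches the "acceptable up to 256x256" observation.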


Surfels and Clustering

Next up we have to divide the scene into surfels (surface elements) and then cluster those surfels into a hierarchy. At runtime these surfels are lit and the lighting results are propagated up the hierarchy. This lighting information is then used to update the lightmap.

Surfel placement plays a big part in the quality of the illumination and I've been through a few iterations. Initially I tried random placement with rejection if a surfel was too close to its neighbours, but this was hellishly slow. I also tried a 3D version of this which was much faster, but looking at the results I felt the coverage could be better. Particularly around edges and on thin objects, the neighbour rejection techniques would often leave gaps that I felt could be filled. This seemed like it could be addressed by relaxing the points, but I wanted to try something else.

I decided to try working in 2D using the UVs, which in this case are stretch-free, uniformly scaled and much easier to work with. The technique I settled on first generates a high density, evenly distributed set of points on each UV chart. N points are selected from this set and used as initial surfel locations, and these locations are then refined via k-means clustering.

This results in a set of well spaced surfels that accurately approximate the scene geometry, and it makes it easy to specify the desired number of surfels. For each chart, N is simply

(chart_area / total_area) * total_surfel_count
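A minimal sketch of the per-chart budget and the k-means refinement, assuming 2D UV points as input (function names and structure are my own illustration, not the author's code):

```python
import random

def surfel_count(chart_area, total_area, total_surfels):
    """Per-chart surfel budget: (chart_area / total_area) * total_surfel_count."""
    return round(chart_area / total_area * total_surfels)

def kmeans_surfels(points, n, iterations=10, seed=0):
    """Pick n seeds from a dense point set, then refine via Lloyd's algorithm."""
    rng = random.Random(seed)
    centers = rng.sample(points, n)
    for _ in range(iterations):
        clusters = [[] for _ in range(n)]
        for p in points:
            # Assign each dense point to its nearest surfel centre.
            i = min(range(n), key=lambda j: (p[0] - centers[j][0]) ** 2 +
                                            (p[1] - centers[j][1]) ** 2)
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # Move each centre to the mean of its cluster.
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers
```

Because the dense points already lie on the chart, the refined centres stay well distributed over the actual geometry rather than drifting into empty UV space.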

The initial high density point distribution.
Surfel creation via k-means clustering of the high density point distribution.

These surfels are then clustered via hierarchical agglomerative clustering which repeatedly pairs nearby surfels until the entire surfel set is contained in a binary tree. Distance, normal, UV chart and tree balancing metrics help tune how the hierarchy is constructed. I'm still experimenting with these factors.
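The agglomeration step above can be sketched as follows. This is a deliberately naive version: distance alone stands in for the combined distance/normal/chart/balance metric, and the node layout is hypothetical.

```python
# Hierarchical agglomerative clustering sketch: repeatedly merge the
# closest pair of nodes until a single binary tree remains.

def agglomerate(surfels):
    """surfels: list of (x, y, z) positions. Returns the root node.

    Each node is (position, left_child, right_child); leaves have None children.
    """
    nodes = [(p, None, None) for p in surfels]
    while len(nodes) > 1:
        best = None
        for i in range(len(nodes)):
            for j in range(i + 1, len(nodes)):
                d = sum((a - b) ** 2 for a, b in zip(nodes[i][0], nodes[j][0]))
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        a, b = nodes[i], nodes[j]
        # Parent sits at the midpoint of its children (a simplification).
        merged = (tuple((x + y) / 2 for x, y in zip(a[0], b[0])), a, b)
        nodes.pop(j)  # Remove j first so index i stays valid.
        nodes.pop(i)
        nodes.append(merged)
    return nodes[0]
```

The O(n^3) pairwise search is fine for a build-time step over a couple of thousand surfels; a priority queue would bring it down if it ever mattered.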

Hierarchical agglomerative clustering in action.

Lightmap visibility sampling

Influencing clusters for the highlighted lightmap texel.
Once the surfel hierarchy has been constructed, each lightmap texel needs to locate the surfels that most contribute to its illumination. Initially I used an analytic form factor, but this would sometimes cause lighting flare-outs if a texel and surfel were too close. Clamping the distance worked but felt like a bit of a hack, so I switched to simply casting a bunch of cosine weighted rays about the hemisphere. Each ray hit locates the nearest surfel, and the final form factor weight for each surfel is simply

 num_hits / total_rays

Once all rays have been cast, the form factor weights are propagated up the hierarchy. The hierarchy is then refined by successively selecting the children of the highest weighted cluster. At each iteration the highest weighted cluster is removed and its two children are selected in its place. This process repeats until a maximum number of clusters is selected or no further subdivision can take place. The texel then has a set of clusters and weights that best approximate its lighting environment.
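The refinement loop above maps naturally onto a max-heap. A hedged sketch, assuming nodes shaped as (id, left, right) and a separate id-to-weight table (both my own invention for illustration):

```python
import heapq

def select_clusters(root, weights, max_clusters):
    """Refine a cluster cut: split the heaviest cluster until the budget is hit.

    root: (id, left, right) binary tree; leaves have None children.
    weights: id -> propagated form factor weight for this texel.
    """
    # heapq is a min-heap, so negate weights; the id breaks ties safely.
    heap = [(-weights[root[0]], root[0], root)]
    done = []  # Leaves that can't be subdivided any further.
    while heap and len(heap) + len(done) < max_clusters:
        _, _, node = heapq.heappop(heap)
        _, left, right = node
        if left is None:
            done.append(node)
            continue
        # Replace the popped cluster with its two children.
        heapq.heappush(heap, (-weights[left[0]], left[0], left))
        heapq.heappush(heap, (-weights[right[0]], right[0], right))
    return done + [n for _, _, n in heap]
```

This spends the cluster budget where the form factor weights say the energy is, so a texel facing a bright wall refines that wall's subtree while distant geometry stays coarsely represented.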

Lighting update

The realtime lighting phase consists of several stages. First, each surfel's direct lighting is evaluated for each direct light source; visibility is accounted for by tracing a single ray from the surfel's position to the light source. The lighting result from the previous frame is also added to the current frame's direct lighting to simulate multiple bounces. There's a bit of a lag here but it's barely noticeable. Lighting values for each cluster are then updated by summing the lighting of its two children.

Each active texel in the lightmap is then updated by accumulating the lighting from its set of influencing clusters. The lightmap is then ready to be used.
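Put together, one frame of the update looks something like this (scalar radiance and all names are illustrative simplifications; the real system presumably works in RGB and on flat arrays):

```python
def update_lighting(surfels, clusters, texels, prev_light, direct_light):
    """One frame of the realtime GI update.

    surfels: list of surfel ids
    clusters: list of (cluster_id, child_a, child_b), bottom-up order
    texels: per-texel list of (influencer_id, weight) pairs
    prev_light: id -> radiance from last frame (feeds back extra bounces)
    direct_light: surfel id -> direct radiance this frame
    """
    light = {}
    for s in surfels:
        # Direct light plus last frame's result approximates multi-bounce,
        # at the cost of a one-frame lag per bounce.
        light[s] = direct_light[s] + prev_light.get(s, 0.0)
    for cid, a, b in clusters:
        # Each cluster is simply the sum of its two children.
        light[cid] = light[a] + light[b]
    lightmap = []
    for influences in texels:
        # Weighted sum over the texel's pre-selected clusters.
        lightmap.append(sum(light[i] * w for i, w in influences))
    return light, lightmap
```

Feeding `light` back in as next frame's `prev_light` is what accumulates the successive bounces over time.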

Direct light only.
Direct light with one light bounce.

Direct light with multiple light bounces.
Timings for each stage (i7-6700K @ 4.0GHz):

  Surfel illumination (1008 surfels):               0.36ms
  Sum Clusters (2015 clusters):                     0.08ms
  Sum Lightmap texels (6453 texels * 90 clusters):  0.64ms

Environmental Lighting

Environment lighting is provided by surfels positioned in a sphere around the scene. These are treated identically to geometry surfels except for the lighting update where a separate illumination function is used. Currently it's a simple two colour blend but could just as easily be a fancy sky illumination technique or an environment map.
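The two-colour blend could be as simple as a lerp on the sky surfel's direction (a guess at the shape of it, not the author's actual function):

```python
def env_light(direction_y, horizon=(0.8, 0.7, 0.6), zenith=(0.2, 0.4, 0.9)):
    """Two-colour sky: blend horizon to zenith by the direction's height.

    direction_y is the y component of the unit sky direction, in [-1, 1];
    1.0 points straight up. Colours here are arbitrary placeholder RGB.
    """
    t = max(0.0, min(1.0, direction_y * 0.5 + 0.5))
    return tuple(h + (z - h) * t for h, z in zip(horizon, zenith))
```

Since sky surfels go through exactly the same clustering and texel weighting as geometry surfels, swapping this for an environment map lookup wouldn't touch the rest of the pipeline.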

To finish up here are some more examples without any debug overlay. These were taken with an accumulation technique that allows for soft shadows and nice anti-aliasing.

by Stefan Kamoda at April 25, 2017 04:58 PM

Rust animation test

Haven't had much free time the last few months, but I did manage to let this rendered animation test grow slightly out of control. Big thanks to Stephan Schutze (Twitter: @stephanschutze) for the awesome audio work.

Concept and design work

This little guy started out as a bunch of thumbnail sketches (below left) well over a year ago, but the design also shares some similarities with an even older concept (below right).

Eventually I got around to modelling and, although the concepts don't really show it, I drew a lot of inspiration from the Apple IIe and Amiga 500 computers of my misspent youth. The 3D paint-over below shows an early version with only one antenna. The final version has a second antenna, which was an accident; I kept it when I realised they could work almost like ears and add a bit more personality.

And finally, a snippet from an old mock comic book panel, just for the hell of it :)

by Stefan Kamoda at April 25, 2017 01:19 PM

April 22, 2017

Ditching comments

Hi folks,

I just wanted to let you know that I'm ditching Disqus (the service powering comments) from this website in an effort to eliminate trackers. I silently removed Google Analytics some time ago for the same reason, but this time hurts a bit more because comments are the way we have to interact with each other, and I felt it deserved an explanation.

First things first: I love receiving your feedback. Every time I get a message/email from someone because of an article or some of my open source projects it totally makes my day; even "harsh" comments push me to do better by correcting some misconception or learning something new. As a self-taught developer I owe a lot to the community, and the whole purpose of having a website is to, in some way, pay back by sharing/helping newcomers and pushing myself by learning from your feedback. Big thanks to all of you for your support through the years.

That said, I also have strong concerns about online privacy and the state of the web. I certainly take measures to stay away from ads/trackers by using all sorts of privacy oriented plugins, extensions, VPNs and whatnot, but I feel it is totally unfair on my part to push trackers onto anyone in exchange for the ability to leave a comment on this website, or to feed my ego by checking stats.

There might be some of you thinking: what's wrong with ads/trackers? I have nothing to hide!

Well, most cloud-based services like Google Analytics or social widgets (Facebook likes and whatnot) require the inclusion of a little script to provide the service. Said script is also used to track the visitor, building a unique profile (which is a key piece of targeted ads... and who knows what else; you have no say in what that data is used for). This alone is horrifying, but consider that around 60% of the web uses Google Analytics, and social widgets are rapidly becoming omnipresent, allowing these companies to literally follow you from website to website, reconstructing your whole browser history without your knowledge or agreement.

This is wrong. We are not just talking about some script slowing down the website by adding some extra network requests; it's about respecting your freedom! Websites are literally trading your digital persona without your knowledge.

I know this website alone makes no difference to the Googles and Facebooks of the world, but it's about integrity and acting according to my beliefs... even if it means giving up on some convenient services along the way.



by Cesar Saez at April 22, 2017 02:00 PM

Blogger blues

Wherein our author uses blogger to post a blog post blogging about how much he dislikes blogger.
It's late on a Sunday night and I need to get this off my chest.

I really have come to loathe Blogger. The sluggish, overly complicated, JS-heavy theme; the sluggish, too-complex-for-speed-but-too-simple-for-interesting-stuff editor; and the way it stuffs stylesheet info into the RSS feed all come to mind. But overall... it's just gotten on my nerves.

So, I'm probably going to transition the blog over to something else. My current leading candidate for a site generator is Pelican, a Python-based static HTML site generator which seems powerful enough for my not-too-complex needs. Jekyll is another candidate, but all things being equal I'd rather stick with a Python-based setup, and the final output will be pretty much the same.

I'm a tad nervous about what happens to old links and traffic so I assume that I'll probably transition over gradually with duplicate postings for a while. If any of you have done something similar in the past I'd be curious to hear about how it went.

In the meantime, I'll just add that I've been dealing with the transition in typical TA fashion. I hacked up a script to download all of the existing blog posts as XML, then used the html2text module from the cheese shop to convert the HTML from the posts into markdown text. I'm still going to have to hand-finish every piece, cleaning up dead links and missing images and so on. I'm sure it'll be a TA-style spit'n'bailing-wire party for a while yet.

In the meantime, I'm all ears if anybody has more suggestions for site generators, or a reason to go with something other than a static site; please let me know in the comments!

update:  the new site is here

by Steve Theodore at April 22, 2017 12:53 AM

The New Hotness

I've finally completed rolling over to a new, self-hosted blog platform!

The process took a bit longer than I wanted, mostly because web development remains a messy, iterative process - at least for me. Since I ended up unifying the blog, the old Character Rigger's Cookbook, and an old markdown wiki, I had to write a lot of little scripts to groom the old content into a consistent format and linking strategy. Add in more than a modicum of CSS noodling and whatnot, and my 'couple of weekends' project turned into a couple of months.

However all that is behind me now, and all my future updates are going to be coming up at  (you can also use  If you're subscribed to the current feed, you should switch over to either or, depending on your reader; one of the nice side effects of the switch is that the new feeds are much cleaner than Blogger's -- no more CSS gibberish instead of article summaries, thank you very much!)

I'm going to leave this site intact, and I'll try to keep monitoring comments and so on here, but the new site is going to be where all the new stuff comes out. I'm also going to change the redirect so it goes to the new site, so if you're using that in a blogroll you won't have to update it.

PS. I had to touch a lot of content during the migration: there are about 150 blog posts, several dozen wiki pages, and a bunch of articles that all had to be repointed and I'd be pretty surprised if nothing odd slipped through.  Please let me know using the comments at the new site so I can fix up anything that's confusing or misleading.

So, here's one more link to the new site, just in case. Hope to see you there!

by Steve Theodore at April 22, 2017 12:42 AM

April 13, 2017

New Health Manager added to ProPack!

Read More Here

ProPack just got more powerful with the addition of a crucial new system to check the status of assets before they hit production. We’ve been using this internally for a while now to check our rigs before we ship them out to clients and we thought we’d expand it to all ProPack users…

by Mark Jackson at April 13, 2017 02:16 PM