This blog post is an in-depth explanation of part of Jane’s GDC talk about the Art of Firewatch.
Hello!
I am Paolo, the graphics programmer at Campo Santo. I have never written a blog post here before, mainly because a lot of the tech work that has gone into the game has only recently been finalized, and I preferred not to post temporary information that I knew would change after a few weeks. (I think there is going to be an interesting tech postmortem at the end of this game.)
Anyway, our procedural sky system is now feature-complete and well integrated into our graphics pipeline.
In this post, I’ll explain how the shader allows Jane and Olly to make the beautiful skies they have been making for Firewatch.
One premise before we start: most of what I am saying “tech-wise” in this post might not be entirely accurate, and each topic would probably deserve papers and papers of its own. The goal of this post is to summarize the information necessary for most people to understand the choices behind this system without stressing too much about the technical details.
Procedural: What?
Before tackling the details of our solution, and how and why we did it, some explanations are due. Many games have used textures (i.e., images) for their skies so far. Basically, you take a picture in real life—or you ask your artist to paint one—you put it on a sphere or a cube, and you make that geometry follow the camera position (not orientation) so that you’ll never reach the sky and it will always seem to be in the same position with respect to the viewer.
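To make that concrete, here is a minimal sketch of the “sky follows the camera position” trick, written as a Unity-style vertex shader. This is purely illustrative and not Firewatch code: the struct names are mine, and it assumes the sky mesh is authored around the origin. (`_WorldSpaceCameraPos` and `UNITY_MATRIX_VP` are Unity shader built-ins.)

```hlsl
struct appdata { float4 vertex : POSITION; };
struct v2f     { float4 pos : SV_POSITION; float3 viewDir : TEXCOORD0; };

v2f vert(appdata v)
{
    v2f o;
    // Re-center the sky mesh on the camera every frame: only the camera
    // *position* is copied, orientation still comes from the view-projection
    // matrix, so looking around works but the sky can never be reached.
    float3 worldPos = _WorldSpaceCameraPos + v.vertex.xyz;
    o.pos     = mul(UNITY_MATRIX_VP, float4(worldPos, 1.0));
    o.viewDir = normalize(v.vertex.xyz);
    return o;
}
```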
We used textures for a while and they have their own advantages:
- They are fast: the shader samples the texture, does some exposure calculation, and outputs it. Done, ship it.
- Artists are comfortable with textures.
But as a specific solution for skies, textures have their own problems, although you can work around them, as we did for a while:
- Texture memory: to have a good sky, you need high-resolution textures or you will see pixelation artifacts in your sky. Given that at any point we might need two of them loaded (for blending), even though memory is not a big problem nowadays, it’s still memory that is not going to other textures.
- Interpolation: you can blend between different skies when you transition from one to the other (we can change time and weather in real time in our game, so this was a requirement), but if the textures contain details, such as clouds or the sun sitting in two different positions in two different skies, interpolating between them will look wrong.
- Mathematical reasons: if you are generating your sky with “math”, you are working in floating point, so applying exposure or any other scaling to it works fine. A skybox is usually in RGBA8 format (8 bits per channel), which means limited precision: scale it too much and you will get artifacts (and if you switch to a full HDR skybox, you are taking even more memory away from other textures). See the short sketch after this list.
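As a quick illustration of that last point, here is a hedged sketch contrasting the two cases inside a Unity-style shader. All names here are invented and the gradient is a stand-in; this is not Firewatch code.

```hlsl
samplerCUBE _SkyCube;   // a baked RGBA8 skybox
float       _Exposure;  // exposure/scale applied at render time

float3 TexturedSky(float3 viewDir)
{
    // The texture was clamped to [0, 1] and quantized to 256 steps per
    // channel when it was authored; scaling it now only amplifies that
    // quantization, which shows up as banding.
    return texCUBE(_SkyCube, viewDir).rgb * _Exposure;
}

float3 ProceduralSky(float3 viewDir)
{
    // A procedural sky is evaluated in full floating point every frame, so
    // exposure (or any other scaling) can be applied without losing detail.
    // A simple two-color gradient stands in for the real sky math here.
    float3 sky = lerp(float3(0.9, 0.6, 0.3), float3(0.1, 0.3, 0.8), saturate(viewDir.y));
    return sky * _Exposure;
}
```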
We worked around most of these problems by not painting the sun or any other details directly into the skybox, and adding them later instead.
There is generally a solution to everything, but the better your tools, the better the product.
In terms of iteration time, there is nothing better than tweaking colors in-game and seeing the changes happen in real time. Being a small company, and having to create a lot of content, anything that makes our people happier and more productive is a win. So if there is a lesson here, this is it:
Make the tool that works the best for the people who have to use it.
Reasons for a procedural sky: Why a new solution was needed
If you Google “procedural skies,” you immediately see that there are countless solutions out there. Most of them simulate a simplification of the complex light interaction that happens when light enters the earth’s atmosphere. This light/atmosphere interaction generates what is called scattering, by which light changes and gives a certain color to the sky and to the haze and dust particles in it. To be clear, this is a very rough simplification of the complicated physical phenomena that happen in real life. A good read is the GPU Gems 2 article Accurate Atmospheric Scattering.
As it clearly states at the beginning:
“The equations that describe atmospheric scattering are so complex that entire books have been dedicated to the subject”
On top of all this good material, Unity 5 already provides a procedural sky tool which is automatically linked to the main directional light in the scene to generate a sun and a sky for it. The Unity 5 sky is heavily inspired by the GPU Gems 2 solution.
So why the need for a new solution? Why reinvent the wheel?
There are many advantages to physically based skies, but none of the solutions out there met our requirements, which were really simple. An artist is comfortable with Photoshop and similar tools; asking an artist to modify the Mie scattering factor, set the proper parameters for the phase function, and make sure those are in line with the Rayleigh scattering parameters is just not intuitive. Especially for an art-driven game like Firewatch, the key concepts are “color” and “tone.”
So this problem needed a more “artistic” approach than usual.
Olly gave me a couple of requirements to follow, and I asked him to paint something live while I watched what he was doing and how he achieved the result. I found this very useful, and it gave me a good head start on two things:
- What the shader should achieve look-wise
- What the shader should expose, what the user expects from it, and what he or she needs
On it, then!
Bringing it together
So, the final shader, which we arrived at after just a couple of iterations, achieves its look in the following way.
There are four parts that make up the final look:
- A gradient made out of three colors (top, middle, and bottom) which controls the main sky color
- Sun disc options to control the color, size, and falloff of the sun
- Sun halo options to control what would be the halo/haze/atmospheric scattering of the light in the air
- Horizon halo to specifically control the halo around the horizon independently from the other parameters
Therefore, the following sky:
is composed of (each shot is a top-to-bottom view of the sky):
a gradient:
a sun disc:
a sun halo:
and horizon halo:
The parts simply get added together at the end. The shader comes with a tool to visualize each individual step so that the artist can fully control all its values.
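For readers who think in code, here is a rough sketch of how those four parts might be combined in a fragment shader. This is my reconstruction for illustration only, not the actual Firewatch shader, and every parameter name (`_TopColor`, `_SunDir`, `_SunSize`, and so on) is invented.

```hlsl
// Invented material parameters standing in for the real ones.
float4 _TopColor, _MiddleColor, _BottomColor;          // three-color gradient
float4 _SunDir, _SunColor, _SunHaloColor, _HorizonColor;
float  _SunSize, _SunFalloff;                          // sun disc size and edge softness
float  _HaloExponent, _HaloIntensity;                  // sun halo shape and strength
float  _HorizonExponent, _HorizonIntensity;            // horizon halo shape and strength

float3 SkyColor(float3 dir)   // dir: normalized view direction for this pixel
{
    float h = saturate(dir.y); // 0 at the horizon, 1 straight up

    // 1) Gradient made out of three colors (bottom -> middle -> top)
    float3 gradient = h < 0.5
        ? lerp(_BottomColor.rgb, _MiddleColor.rgb, h * 2.0)
        : lerp(_MiddleColor.rgb, _TopColor.rgb, (h - 0.5) * 2.0);

    // 2) Sun disc: a tight highlight around the sun direction, with a
    //    controllable size and falloff at its edge
    float sunAmount = saturate(dot(dir, _SunDir.xyz));
    float sunDisc   = smoothstep(1.0 - _SunSize - _SunFalloff, 1.0 - _SunSize, sunAmount);

    // 3) Sun halo: a much wider, softer glow around the same direction
    float sunHalo = pow(sunAmount, _HaloExponent) * _HaloIntensity;

    // 4) Horizon halo: haze hugging the horizon, independent of the sun
    float horizonHalo = pow(1.0 - h, _HorizonExponent) * _HorizonIntensity;

    // The parts simply get added together at the end.
    return gradient
         + sunDisc     * _SunColor.rgb
         + sunHalo     * _SunHaloColor.rgb
         + horizonHalo * _HorizonColor.rgb;
}
```

A nice property of this kind of setup is that blending two skies (for the real-time time-of-day and weather changes mentioned earlier) comes down to interpolating a handful of colors and scalars instead of two big textures, which sidesteps the interpolation problems described above.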
Be sure to check out Jane’s talk on The Art of Firewatch if you’ll be at GDC for much more information about what goes into creating the game’s look and feel.
An excellent reminder that even if a game isn’t completely procedural, there are lots of applications for procedural generation that make the artists’ lives easier and produce more expressive output.