Are you interested in learning to work in the ACES workflow but feel overwhelmed by color management? We’ve got you covered!
Understanding basic concepts is part of building a successful career in CG animation. One of these concepts is the ACES workflow. In this guide, you learn what ACES is and how to use it.
Want to learn more? Read on.
(Much of this guide was researched and written by our student Diana Lee. Thank you, Diana!)
What is ACES?
ACES stands for Academy Color Encoding System.
ACES was established by the Academy of Motion Picture Arts and Sciences (yes, the one that does the Oscars!) and its partners (Technicolor, ARRI, RED, etc.) to create consistent, color-accurate, and high-quality motion picture images. Essentially, it’s a way to standardize how color is handled in the TV and movie industry.
It’s one of the many processes and workflows that AMPAS and its Board of Governors want to unify for production, post-production, and archiving.
For example, this is what a picture looks like before and after the ACES workflow.
Look at the highlights on the hands, light to dark falloff on the chest, the dark point behind the character, and specular hits on the cave:
As a lighting artist, it’s important that you know how to work with ACES to achieve this level of depth and quality in your images.
First, let’s take a brief look at the history of it.
The history of ACES
Traditionally, motion picture workflows were based on film negatives. But as the movie industry transitioned to digital motion pictures, the industry lacked a color management system.
In 2004, ACES was developed to manage different codecs and formats. The ACES workflow aims to manage “color throughout the life cycle of a motion picture or television production.”
ACES version 1.0 was first released in 2014. Since then, it has made its way into dozens of blockbuster movies, TV series, animated features, and indie films. It has been adopted into animation and VFX workflows, and it could soon become the standard color management system for all types of projects, with ACES 2.0 currently in development.
Why is ACES used?
Different productions use various digital cameras to capture images and save the images in their respective media formats. You see, the movie industry has dozens of color encoding systems and formats, most of which carry no metadata.
ACES, on the other hand, was created to establish a common standard so that motion pictures could be created and preserved efficiently and predictably.
That’s why ACES is extremely useful for large-scale productions. As a result, the workflow is popular among CG and VFX-heavy movies, including The Legend of Tarzan, The Lego Movie, and Guardians of the Galaxy Vol. 2.
But color management as such is nothing new.
In fact, you’re already using color management. However, there are a few drawbacks that ACES aims to solve.
You see, the default color management in Maya applies an sRGB 2.2 gamma curve in the View Transform. (For now, think of a view transform like a filter on a photo. Maya renders your image, and applies this “view transform” before showing it to you.)
While the sRGB 2.2 gamma curve is standard for most images, for our purposes in animation and film, it comes with some challenges.
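To make the math concrete, here is a minimal sketch of what a simple 2.2 gamma view transform does to a linear rendered value (the 0.18 mid-grey value is just an illustrative example):

```python
def gamma_encode(linear, gamma=2.2):
    """Apply a simple 2.2 gamma curve, like Maya's default sRGB view transform."""
    return linear ** (1.0 / gamma)

def gamma_decode(encoded, gamma=2.2):
    """Invert the curve, recovering the linear value."""
    return encoded ** gamma

# A mid-grey linear value gets brightened considerably for display:
print(round(gamma_encode(0.18), 3))  # ~0.459
```

This is the “filter” applied before the image reaches your screen: the pixel data stays linear, only the preview is curved.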
Here are two problems with this default curve (which are solved with the help of ACES):
1. The default gamma curve makes the image look washed out
The ACES sRGB transform is an S-shaped curve that emulates film, giving higher contrast in the midtones and making the image look much more appealing and contrasty without having to do a lot of post processing.
The images below are the same render with only different viewing transforms applied to them. There is lots more contrast in the bottom image.
2. The default view transform simply clips the rendered pixel values
Rendered pixel values are clipped at 1, meaning it takes any value above 1 and rewrites it as 1. That’s why information is easily lost in the highlights.
The ACES sRGB transform takes those pixel values over 1 (roughly between 1 and 16) and remaps them to sit between 0.8 and 1, so you can keep a lot more detail in your highlights before clipping.
This is called tone mapping.
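The difference can be sketched numerically. The rolloff below is a toy shoulder curve, not the actual ACES transform, but it shows the idea: values between 1 and 16 get compressed into the 0.8–1 range instead of being clipped:

```python
import math

def clip_transform(x):
    # Default behavior: anything above 1 is simply rewritten as 1
    return min(x, 1.0)

def toy_shoulder(x, shoulder=0.8, max_in=16.0):
    # Illustrative film-like rolloff (NOT the real ACES curve):
    # identity below the shoulder, logarithmic compression above it
    if x <= shoulder:
        return x
    t = math.log(x / shoulder) / math.log(max_in / shoulder)
    return shoulder + (1.0 - shoulder) * min(t, 1.0)

for value in (0.5, 1.0, 4.0, 16.0):
    print(value, clip_transform(value), round(toy_shoulder(value), 3))
```

Two distinct highlight values like 4 and 16 both become 1 under the default transform, but stay distinguishable under the rolloff. That preserved separation is exactly the highlight detail ACES keeps.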
Compare the two images below. All the highlights on the left have clipped, like on the yellow head, shirt, white sphere, and even the small highlight on the chrome ball.
Using the default color management, you would have to make unrealistic changes, like making your lights dimmer or adjusting the material’s color or specularity, just to avoid this kind of clipping.
With ACES tone mapping you can have far more powerful lights in your scene without worrying about your highlights blowing out.
How does ACES work?
The ACES Workflow covers the entire filmmaking process:
- Image capture
- Visual effects
- Future remastering
The system consists of guidelines and specifications about color management such as:
- Encoding specifications
- Metadata definitions and specifications
- Standard screen specifications
- Archive-ready image data specifications
Because ACES is a free and open color management system, it is not limited to a specific program or platform. It works with the latest technologies as well as future workflows.
You see, ACES is not an application that you can download and install. It’s not an update for your camera or computer, and it is not a part of a specific workflow. ACES is simply a collection of rules for encoding and transforming color data.
These tools, guidelines, and rules are based on standards by the Society of Motion Picture & Television Engineers (SMPTE) and the International Organization for Standardization (ISO).
First, let’s start with helpful terminology when you’re using ACES:
ACES Color Spaces: The five color spaces (ACES2065-1, ACEScg, ACEScc, ACEScct, and ACESproxy) that cover the ACES framework for the generation, transport, processing, and archiving of still and moving images. They are defined on two sets of RGB primaries:
- AP0, an ultra-wide gamut that encompasses the entire visible spectrum
- AP1, a smaller, more practical working gamut closer to that of real displays
ACES2065-1: The standard ACES color space, based on the AP0 RGB primaries. This color space is meant for mid- and long-term storage of image and video files.
ACEScg: A linear encoding in AP1 primaries, used for rendering and compositing.
Input Device Transform (IDT): The process of taking captured images from the source material and transforming them into the ACES color space and encoding specifications.
Academy Density Exchange (ADX): A densitometric encoding used to capture data from film scanners.
Reference Rendering Transform (RRT): The core transform that converts scene-referred ACES colorimetry to display-referred colorimetry.
Output Device Transform (ODT): The transform that maps the wide-gamut, wide-dynamic-range output of the RRT to a physically realized output device with limited gamut and dynamic range.
Input Transform: The term used for an IDT as of ACES version 1.0.
Look Modification Transform (LMT): A transform that applies a specific creative look, used in combination with the RRT and ODT.
Now that you have an understanding of the terminology, what are the ACES color management basics? Let’s take a look.
Color management basics
The main objective of color management is to maintain a linear workflow. That’s because real-life light behaves linearly. This means that illumination with an intensity of 2 is twice as bright as illumination with an intensity of 1.
However, not all images store light linearly. Early television signals, for example, were gamma-encoded to compensate for the non-linear response of CRT displays, so the data was stored non-linearly. A truly linear monitor would display all of that legacy footage incorrectly.
So, while it would be possible to build linear monitors today, none of them would correctly display the images and video that already exist in the world.
That’s why, for now, we are stuck with non-linear monitors. Moreover, the human eye does not perceive light in a linear way either. We are more sensitive to variations in darker values than lighter values, and a non-linear gamma curve lets us store more precision in the darker shades.
But for our renders to be physically correct, and to better simulate real light, we want to work linearly. (Imagine having to multiply a pixel value by roughly 1.37, instead of 2, just to make a light twice as bright…)
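A quick numeric check of why linear data keeps the math sane (the 0.18 grey value is just an example):

```python
# In linear space, doubling light intensity is a simple multiply by 2:
linear = 0.18
doubled = linear * 2  # 0.36 -- twice as bright, as expected

# In gamma-2.2-encoded space, the same physical doubling is NOT a factor of 2:
encoded = 0.18 ** (1 / 2.2)          # ~0.459
encoded_doubled = 0.36 ** (1 / 2.2)  # ~0.629
ratio = encoded_doubled / encoded    # ~1.37, i.e. 2 ** (1 / 2.2)
print(round(ratio, 3))
```

That awkward 1.37 factor is exactly what a linear working space lets you avoid.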
Since that is the case, any image saved on your computer (think .jpg, .png) has to be converted into a linear image to be used with our linear workflow.
Then lighting, rendering, and compositing are done in linear space (there is a reason EXR is recommended for renders—it’s linear!).
After all that is complete, the image has to be converted back to non-linear, suitable for viewing on everyone’s phones, monitors, and TVs. And “color space transform” is just a fancy term for these conversions.
That’s why these are the three main elements of color management — Input space, Working space, and Output space.
And this is where the rules come in:
What needs to be converted and what doesn’t? Which conversion do I choose?
The settings in Maya and Nuke support a linear workflow by default. So, let’s look at the Maya and Nuke defaults to identify where these settings are.
1. Input color space
Applied when you are reading in an image or texture to your scene.
You’re likely familiar with this:
sRGB color space:
- For any texture that affects the color of rendered pixels, like diffuse color, specular color, etc.
- Colored images you got from the internet, like an image of the sky that you intend to use as a backdrop in comp.
Raw, or linear, color space:
- For all textures that do not affect the color, like roughness, bump, displacement, emission factor, masks, etc. Raw means no conversion is applied.
- HDRI for Skydome lights — hdr files are high dynamic range, and are Raw by default.
- EXR files. Your renders! They’re already linear so no conversion is needed.
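For the sRGB case, the input transform is the standard piecewise sRGB decode (the published IEC 61966-2-1 formula, shown here as a minimal per-channel sketch):

```python
def srgb_to_linear(c):
    """Decode one sRGB channel value (0-1) to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear channel value (0-1) back to sRGB."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1 / 2.4) - 0.055

# A middle sRGB pixel is much darker than you might expect in linear light:
print(round(srgb_to_linear(0.5), 3))  # ~0.214
```

This is what happens under the hood when you tag a diffuse texture as sRGB: the renderer decodes it to linear before any lighting math touches it.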
2. Working or rendering space
This is where all the calculations of your render take place. This is always linear.
- Boom! Scene-linear is the default.
3. Viewing / Output space
Applied after the render is complete, for viewing or saving to disk.
It’s important to note how the view transform and output transform differ. Remember how Maya renders, then applies the view transform before showing it to you?
Well, the view transform is just that: a temporary conversion, or preview, so that you can see what it will look like while working on the image. It doesn’t actually alter the pixel values, like an output transform does.
Notice that by default, output color transform is turned off. Maya expects that you will write out your render in a linear file such as EXR.
You can view display settings in the Arnold Render View by clicking the little gear button on the top right.
View transform in Nuke Viewer — making this the same colorspace as your view transform in the Render view will make sure you’re seeing the same image in Maya and Nuke.
Output transform in Nuke Write node — when you write a png, jpg, etc, the output transform is then “baked into” the image.
For most cases, you’ll want the View Transform and the Output transform to be the same.
Take photography. A RAW image might look washed out, or dark, and not at all like what you want the final image to be. This is because the RAW format is not meant to be viewed, but rather to hold as much information as possible, to make artistic edits easy. In computer graphics now, that format is EXR.
Next: Learn how to set up ACES.
How to use ACES
But how do you implement ACES workflow in Maya and Nuke? Here’s what you need to know.
Implementing ACES workflow in Maya
To set up the OCIO Config in Maya, download the aces_1.1 config and unzip it to a location you can easily find.
Go to Windows > Settings/Preferences > Preferences > Color Management. Check on “Use OCIO Configuration,” then open the config.ocio file within the aces_1.1 folder.
- For existing projects, the conversion may take a while.
- For existing projects, it is also a good idea to click “Reapply Rules to Scene” to make sure everything is reset to the ACES default before continuing.
Then, save your preferences.
The rendering space and view transform have been set automatically. There’s no need to change or set these. Note also that the default Input color space is Raw.
Setting the right colorspace for your texture files
Utility – Raw
- For all textures that do not affect the color: roughness, bump, displacement, emission factor, masks, etc.
- This is the default input colorspace. For these maps you can just bring them in and not change anything.
Utility – sRGB – Texture
- For any texture or file that affects the color of your rendered pixels.
- Diffuse color, specular color, emission color, etc.
- This corresponds to “sRGB” in Maya’s default color management.
- Check on “Ignore Color Space File Rules.” This keeps the color space set to the one you designated.
Utility – Linear – sRGB
- For HDR Maps, for your skydome lights.
- Maya’s default color management used the Raw colorspace for HDRs; ACES has a dedicated colorspace for them.
- Check on “Ignore Color Space File Rules.”
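The assignments above can be summarized as a simple lookup table. This is a hypothetical helper for illustration: the map-type keys are names chosen for this sketch, and the colorspace strings follow the naming used in the ACES 1.1 config:

```python
# Hypothetical helper: which ACES input colorspace to pick per texture type
ACES_INPUT_SPACE = {
    "roughness":      "Utility - Raw",
    "bump":           "Utility - Raw",
    "displacement":   "Utility - Raw",
    "mask":           "Utility - Raw",
    "diffuse_color":  "Utility - sRGB - Texture",
    "specular_color": "Utility - sRGB - Texture",
    "emission_color": "Utility - sRGB - Texture",
    "hdri":           "Utility - Linear - sRGB",
}

def input_space_for(texture_type):
    # Default to Raw (no conversion), matching Maya's ACES default
    return ACES_INPUT_SPACE.get(texture_type, "Utility - Raw")
```

The rule of thumb it encodes: only maps that carry color get a conversion; everything else passes through untouched.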
Lighting and Rendering
Now light and render as you normally would! If you want to, check the Arnold Render view to make sure the view transform is set to the right one: sRGB (ACES).
- Render out an EXR file.
- One thing to note is that color values will work a little differently. What was previously a value of 1 will now be closer to 0.8 in ACES. If you are bringing ACES into an existing project and your lights or materials look very saturated or strong, try bringing the value down to 0.8 first, then adjust.
Implementing ACES workflow in Nuke
Open the Project Settings panel by pressing S in the Node Graph. Nuke (since at least 12.2) ships with ACES 1.1, so set the OCIO config to aces_1.1.
If you don’t see it there, set OCIO config to Custom, and open the same config.ocio file from before. Set color management to OCIO.
Note that the viewer now is showing that same color space — sRGB (ACES).
Reading and Writing images in Nuke
Nuke recognizes things pretty well by default. By this point you should effectively be done with the setup. Nuke will automatically read in your EXR render as linear, and write .png or .jpg as sRGB. The images below show you how it should look.
For some reason, the write node defaults to matte paint. If you are concerned, just set it to Output – sRGB (the same as the viewer!) to ensure that you are outputting what you were seeing.
And that’s it! Now, with all of this knowledge, your color-managed lighting workflow would look like this:
- Apply all the correct input color spaces to your texture files
- Light and render in linear space, viewing the render with a view transform applied
- Output a linear image (exr) from Maya
- Bring exr into Nuke, setting the read node colorspace to linear
- Set the Nuke viewer colorspace to the same view transform as in Maya
- Comp your heart out! (Or don’t. The first several times using this workflow, I used Nuke just to convert my EXR into a png.)
- Output a .png or .jpg, with appropriate output colorspace selected (most likely the same as viewer)
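The steps above can be sketched end to end for a single pixel. The tone-mapping shoulder here is a toy stand-in for the real ACES view transform, and the light intensity is an arbitrary example; the sRGB formulas are the standard ones:

```python
import math

def srgb_to_linear(c):
    # Standard piecewise sRGB decode
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Standard piecewise sRGB encode
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def toy_tonemap(x, shoulder=0.8, max_in=16.0):
    # Toy highlight rolloff standing in for the ACES view transform
    if x <= shoulder:
        return x
    t = math.log(x / shoulder) / math.log(max_in / shoulder)
    return shoulder + (1 - shoulder) * min(t, 1.0)

# 1. Input: an sRGB diffuse texture value is linearized on read
albedo = srgb_to_linear(0.5)

# 2. Working space: lighting math happens linearly
lit = albedo * 6.0  # a bright light pushes the value above 1

# 3. Output: tone map the over-range value, then encode for a standard display
display = linear_to_srgb(toy_tonemap(lit))
print(round(display, 3))
```

Note that the over-bright value survives: instead of clipping to pure white, it lands just below 1 as a bright but detailed highlight.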
Over to you!
There you have it! Now you know what ACES is and how to use it.
Apart from being a unifying standard, the ACES workflow helps create “future-proof” formats for handoffs and archival. You can capture and create footage at the highest quality, and adapt that footage to almost any screen and color space, right up until final delivery.
Now, we’d love to hear from you.
What is your #1 question about ACES?
Let us know in the comments below.