Behind the Camera: A Lab That Perfects Tegra’s Knack for Taking Snapshots

by Brian Caulfield

It’s one of the most sophisticated photo calibration labs in the world. It’s at NVIDIA. And it’s not easy to find.

To visit the labs where NVIDIA’s engineers make sure the smartphones and tablets that use Tegra take amazing photos you’ve got to leave NVIDIA’s distinctive – and crowded – main campus and wander through the bland 1980s-era office park across the street.

Step inside NVIDIA’s photo calibration lab for a tour, though, and it’s clear you’ve found something special. The floors and walls of the spotless lab’s four rooms are matte black – to absorb any stray light that could throw off the lab’s delicate instruments. The science-fiction-like equipment and dramatic black walls make the lab look like something you might see on ‘CSI.’

Mastering the Mix

Pick and choose: using these ‘tinker’ toys, NVIDIA engineers can recreate any mobile device’s camera.

You’ll notice the most telling detail the moment you walk through the door: a wall lined with black bins filled with scores of different image sensors, lenses and tiny flash modules. NVIDIA faces a unique challenge: our Tegra processors are used in devices from nearly a dozen companies. Each one relies on sensors and lenses from different suppliers, different displays and different software.

By reaching into these bins, NVIDIA’s engineers can replicate the combination of parts used in a boggling array of smartphones and tablets. NVIDIA engineering VP Brian Cabral calls these ‘tinker toys.’ “The first step in building a great camera is knowing what they do,” Brian says as he shows a visitor around the lab. “Every cellphone camera behaves differently.”

Brian knows how to tell this story well. A trim, quick-talking engineer with a precisely trimmed beard and 13 patents to his name, Brian has been studying photography since his childhood in California’s Central Valley. His mission: to put Tegra’s visual computing smarts to work in every photograph taken with a Tegra smartphone or tablet.

These efforts include technologies that deliver high-dynamic-range images with a single exposure and the ability to ‘tap’ on an object in a frame to keep it in focus throughout a series of shots. Such features rely on a cutting-edge computational photography technology NVIDIA calls “Chimera,” which taps the power of Tegra’s CPU and GPU to wring jaw-dropping images out of the tiny camera systems built into mobile devices. “It’s almost like a Photoshop plugin for hardware,” Brian explains.

Getting Light Right

To do that, NVIDIA needs to know how that hardware works, intimately. Brian’s tour of the photo calibration lab reveals the depth of NVIDIA’s commitment to great photographs. That starts with the monochromator, which helps NVIDIA’s engineers understand the way the tiny sensors built into today’s digital devices see light.

The device – which to the untrained eye looks like some kind of futuristic space rifle mounted on a tabletop – is used to measure how sensors detect light. That’s critical because the light-absorbing pixels that make up each sensor become less sensitive to light the farther they sit from the sensor’s center.

Not So Pretty In Pink

An array of instruments helps engineers ensure Tegra-powered phones get light right.

The result is easy to see, Brian explains. Pick up many smartphones and take a picture of a white surface, such as a whiteboard. Now look at the photo. You’ll notice that the board will appear “pinkish” around the edges, Brian says. That’s because the image hasn’t been corrected to account for the different ways each part of the smartphone’s sensor detects light.

Images taken by a device that uses a Tegra processor, by contrast, will show the entire surface of the whiteboard as white. That’s because Brian’s team has used the monochromator to shoot a carefully calibrated beam of light at a sensor so that each pixel gets light in exactly the same color and intensity.

This lets NVIDIA’s engineers take the measurements needed to build “spectral response curves” – measurements that allow them to adjust how each phone’s Tegra processor handles light from the camera’s sensor, correcting for the differences in the way pixels on different parts of the sensor detect light.
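Conceptually, this kind of correction boils down to multiplying each pixel by a calibrated gain so the sensor’s dimmer edges are boosted to match its center. Here’s a minimal sketch in Python of the general technique, assuming a flat-field calibration frame of a uniformly lit white target; the function names and the simulated falloff numbers are invented for illustration, not NVIDIA’s actual pipeline:

```python
import numpy as np

def build_gain_map(flat_field: np.ndarray) -> np.ndarray:
    """Build per-pixel, per-channel gains from a flat-field shot of a
    uniformly lit white target: the brightest response in each color
    channel becomes the reference, and dimmer edge pixels get
    proportionally larger gains."""
    reference = flat_field.max(axis=(0, 1), keepdims=True)
    return reference / np.maximum(flat_field, 1e-6)

def correct_shading(image: np.ndarray, gain_map: np.ndarray) -> np.ndarray:
    """Apply the calibration gains so a white wall photographs as
    uniformly white instead of fading pink toward the corners."""
    return np.clip(image * gain_map, 0.0, 1.0)

# Simulated sensor whose red and blue channels fall off faster than
# green toward the edges - exactly what produces "pinkish" corners.
h, w = 64, 64
y, x = np.mgrid[0:h, 0:w]
r = np.hypot(y - h / 2, x - w / 2) / np.hypot(h / 2, w / 2)
falloff = np.stack([1 - 0.3 * r**2, 1 - 0.1 * r**2, 1 - 0.3 * r**2], axis=-1)

flat = 0.8 * falloff              # calibration shot of a white target
gains = build_gain_map(flat)
photo = 0.8 * falloff             # real photo of a whiteboard
corrected = correct_shading(photo, gains)
print(np.allclose(corrected, corrected[h // 2, w // 2], atol=1e-4))  # → True
```

After correction, every pixel of the simulated whiteboard matches the center pixel – the pink tint at the edges is gone.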

Going the Distance

No detail is too small. Adjacent to the monochromator is another instrument – in a box about the size of a curbside recycling bin – that uses lasers to measure the distance to the edge of the tiny lenses used in modern camera phones.

As in a point-and-shoot camera, the lenses inside smartphones move back and forth as they focus on objects at different distances – the tiny lenses in smartphones just move incredibly small distances. There’s no room for error here.

Knowing precisely how each lens moves helps Brian and his colleagues tune software so that each smartphone knows exactly where the lens is when it moves back and forth within its housing, resulting in better focus.

That precision also helps determine the ‘settle time’ for each lens – the fraction of a second the lens needs to stop jittering after it moves – so that a smartphone can quickly capture a focused image.
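A settle-time measurement like this can be sketched as finding the moment after which the lens position stays within tolerance of its target for good. The snippet below is a simplified illustration of that idea; the damped-ringing actuator model and all of its numbers are invented, not measurements from NVIDIA’s lab:

```python
import numpy as np

def settle_time(t, positions, target, tolerance):
    """Return the first sampled time after which the lens position stays
    within `tolerance` of `target` - i.e. the ringing has died out."""
    inside = np.abs(np.asarray(positions) - target) <= tolerance
    outside = np.where(~inside)[0]   # indices still outside the band
    if len(outside) == 0:
        return t[0]                  # never left the tolerance band
    if outside[-1] + 1 >= len(t):
        raise ValueError("lens never settled within the sampled window")
    # The settle point is the first sample after the last excursion.
    return t[outside[-1] + 1]

# Simulated actuator step: damped ringing around a 50-micron target.
t = np.linspace(0.0, 0.02, 2001)     # 20 ms sampled every 10 microseconds
pos = 50.0 * (1 - np.exp(-t / 0.003) * np.cos(2 * np.pi * 400 * t))
print(f"settle time: {settle_time(t, pos, 50.0, 0.5) * 1e3:.2f} ms")
```

The same scan-backward-for-the-last-excursion logic works on real sampled data; only the tolerance band and sampling rate would change.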

Even the images displayed from the camera on each smartphone’s screen are studied, Brian explains, as he picks up a high-speed camera the size of a football. The camera can shoot 50,000 frames per second. That’s fast enough to study every detail of how pixels are painted on a smartphone’s display, so the team at the lab can know how fast an image on the screen can be refreshed and how the colors captured by the camera look on it.

The Still Life From Hell

The tour doesn’t end there, though. Leave the sleek black-walled lab and follow Brian as he strides briskly across the building, and you’ll find rooms that look very different from where the tour started.

In one room, a series of abstract black-and-white patterns is used to test the quality of the images taken by each freshly tuned Tegra device.

Next door: a room that could be the live-work space from hell, the opposite of the crisp, all-black lab where Brian began the tour.

Every possible indoor scene is jammed into a single tennis-court-sized space: an office, a kitchen, a living room and a dining room are all duplicated in a room whose walls alternate between warm orange and yellow tones and cool blues and greens.

The odd colors and strange combinations let NVIDIA’s engineers test a wide range of common – and hard-to-photograph – scenes.

It’s enough to give a visitor a headache. If the resulting photos don’t do the same, NVIDIA’s engineers know they’ve tuned their Tegra device to perfection.