“I dream my painting, then I paint my dream.”
—Vincent van Gogh
Through the ages, artists of all types have created beautiful, richly detailed oil paintings on canvas that have inspired us all. But these artists likely never dreamed that one day they would be able to choose any brush they like, pick from a limitless array of paint colors, and use the same natural twists and turns of the brush to create the colorful texture of oil, all on a digital canvas.
Technically speaking, Project Wetbrush is the world’s first real-time simulation-based 3D painting system with bristle-level interactions. The painting and drawing tools most of us have used are 2D. They’re simple and they’re fun. But Project Wetbrush is completely different. This is a full 3D simulation, complete with multiple levels of thickness, depth and texture. It feels real and it’s immersive.
Oil painting on an actual canvas is full of complex interactions within the paint, between the brush and the paint, and among the bristles themselves. Project Wetbrush simulates all of this in real time, including paint viscosity, variable brush speeds, color mixing and even the drying of paint. The bottom line is that it’s not easy to build a digital oil painting tool that lets artists paint so fluidly and naturally that they can ignore the technology and simply immerse themselves in their art.
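To give a rough feel for what simulating viscous paint and color mixing involves, here is a minimal toy sketch in Python: an explicit diffusion step on a 2D paint-thickness grid, plus a simple weighted RGB blend. The function names and parameters are illustrative assumptions, not Wetbrush’s API; the real system runs a far more sophisticated real-time 3D fluid solver with bristle-level dynamics on the GPU.

```python
import numpy as np

def viscous_diffusion_step(height, viscosity=0.1, dt=0.1):
    """One explicit diffusion step on a 2D paint-thickness field.

    A toy stand-in for viscosity handling: thick blobs of paint
    gradually spread out, while total paint volume is conserved.
    """
    # Pad with edge values so no paint flows off the canvas.
    padded = np.pad(height, 1, mode="edge")
    # 5-point Laplacian of the thickness field.
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1]
           + padded[1:-1, :-2] + padded[1:-1, 2:]
           - 4.0 * height)
    return height + viscosity * dt * lap

def mix_colors(c1, c2, w):
    """Blend two RGB colors by paint-amount weight w in [0, 1]."""
    return (1.0 - w) * np.asarray(c1, float) + w * np.asarray(c2, float)
```

In this toy model, a thick dab of paint flattens a little with each step (as wet oil slowly levels), and overlapping strokes blend linearly by how much paint each contributes; a physically faithful simulator would also model brush pressure, stroke speed and drying, as the article describes.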
So what’s new here? Digital painting tools certainly aren’t new, but realistic digital oil painting that dynamically simulates the motions and interactions of each bristle is absolutely a breakthrough development. Adobe Research first developed the core algorithms for this in 2015. It’s an ambitious project that demanded huge computing resources. That’s why Adobe targeted NVIDIA GPUs for their supercomputer-class parallel processing power. In collaboration with NVIDIA software experts, the entire system was tuned with key GPU optimizations to take Project Wetbrush even further.
Like most research, this is just the beginning. The future holds promise for more optimizations, better rendering and even deep learning, where NVIDIA GPUs play a major role. With GPU-accelerated deep learning, some of the most computationally difficult physical simulations could potentially be handled to create more responsive and realistic brush dynamics. Wetbrush could even learn from itself in the future: a database of realistic, high-quality paintings and brush strokes could be used to train a deep learning system to synthesize oil painting effects.
To see Project Wetbrush in action, visit the NVIDIA booth #509 at SIGGRAPH for a live demo. If you want to dig into the nuts and bolts of how this technology came to life, read the Project Wetbrush technical paper or hear Adobe’s Zhili Chen and NVIDIA’s Chris Hebert speak at the NVIDIA Theater on Wednesday, July 27. See the complete list of NVIDIA talks at SIGGRAPH here. Follow the latest happenings at #SIGGRAPH2016.
Thanks to Adobe Research principals Zhili Chen and Byungmoon Kim for their imagination and skill in developing Wetbrush. And special thanks to Chris Hebert of NVIDIA for his tireless efforts to guide the GPU optimization work. All images created with Wetbrush by artist Daniela Flamm Jackson.