AI Before You Buy: Israeli Startup Renders 3D Product Models for Top Retailers

by Isha Salian

Preparing a retailer’s online catalog once required expensive physical photoshoots to capture products from every angle. A Tel Aviv startup is saving brands time and money by transforming these camera clicks into mouse clicks.

Hexa uses GPU-accelerated computing to help companies turn their online inventory into 3D renders that shoppers can view in 360 degrees, animate or even try on virtually to inform their buying decisions. The company, which recently announced a $20.5 million funding round, is working with brands in fashion, furniture, consumer electronics and more.

“The world is going 3D,” said Yehiel Atias, CEO of Hexa. “Just a few years ago, the digital infrastructure to do this was still so expensive that it was more affordable to arrange a photographer, models and lighting. But with the advancements of AI and NVIDIA GPUs, it’s now feasible for retailers to use synthetic data to replace physical photoshoots.”

Hexa’s 3D renders are used on major retail websites such as Amazon, Crate & Barrel and Macy’s. The company creates thousands of renders each month, reducing the need for physical photoshoots of every product in a retailer’s catalog. Hexa estimates that it can save customers up to 300 pounds of carbon emissions for each product imaged digitally instead of physically.

From Physical Photoshoots to AI-Accelerated Renders

Hexa can reconstruct a single 2D image, or a set of low-quality 2D images, into a high-fidelity 3D asset. The company uses differing levels of automation for its renders depending on the complexity of the shape, the amount of visual data that needs to be reconstructed, and the similarity of the object to Hexa’s existing dataset.

To automate elements of its workflow, the team uses dozens of AI algorithms that were developed using the PyTorch deep learning framework and run on NVIDIA Tensor Core GPUs in the cloud. If one of Hexa’s artists is reconstructing a 3D toaster, for example, one algorithm can identify similar geometries the team has created in the past to give the creator a head start.
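The similar-geometry lookup described above can be sketched as a nearest-neighbor search over learned shape embeddings. The snippet below is a minimal, hypothetical illustration in PyTorch (the framework the article names); the function names, embedding sizes and data are illustrative assumptions, not Hexa's actual code.

```python
# Hypothetical sketch: finding previously built 3D assets similar to a new
# object by comparing embedding vectors. Names and data are illustrative.
import torch
import torch.nn.functional as F

def most_similar(query_emb: torch.Tensor, library: torch.Tensor, k: int = 3):
    """Return indices of the k library embeddings closest to the query,
    ranked by cosine similarity."""
    sims = F.cosine_similarity(library, query_emb.unsqueeze(0), dim=1)
    return torch.topk(sims, k).indices.tolist()

# Toy "library" of past asset embeddings (one row per asset) and a query
# embedding that is a near-duplicate of asset 42.
torch.manual_seed(0)
library = torch.randn(100, 128)                 # 100 past assets, 128-d each
query = library[42] + 0.01 * torch.randn(128)   # slightly perturbed copy

print(most_similar(query, library))             # asset 42 ranks first
```

In practice the embeddings would come from a model trained on the team's existing 3D assets, so the lookup gives an artist a head start from the closest prior reconstructions.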


Another neural network can scan a retailer’s website to identify how many of its products Hexa can support with 3D renders. The company’s entire rendering pipeline also runs on NVIDIA GPUs available through Amazon Web Services.

“Accessing compute resources through AWS gives us the option to use thousands of NVIDIA GPUs at a moment’s notice,” said Segev Nahari, lead technical artist at Hexa. “If I need 10,000 frames to be ready by a certain time, I can request the hardware I need to meet the deadline.”

Nahari estimates that rendering on NVIDIA GPUs is up to 3x faster than relying on CPUs.

Broadening Beyond Retail, Venturing Into Omniverse

Hexa developers are continually experimenting with new methods for 3D rendering — looking for workflow improvements in preprocessing, object reconstruction and post-processing. The team recently began working with NVIDIA GET3D, a generative AI model from NVIDIA Research that produces high-fidelity 3D shapes from a training dataset of 2D images.

Sneaker generated by GET3D: by training the model on Hexa’s dataset of shoes, the team was able to generate 3D models of novel shoes not part of the training data.

In addition to its work in ecommerce, Hexa’s research and development team is investigating new applications for the company’s AI software.

“It doesn’t stop at retail,” Atias said. “Industries from gaming to fashion and healthcare are finding out that synthetic data and 3D technology are a more efficient way to do things like digitize inventory, create digital twins and train robots.”

The team credits its membership in NVIDIA Inception, a global program that supports cutting-edge startups, as a “huge advantage” in leveling up the technology Hexa uses.

“Being part of Inception opens doors that outsiders don’t have,” Atias said. “For a small company trying to navigate the massive range of NVIDIA hardware and software offerings, it’s a door-opener to all the cool tools we wanted to experiment with and understand the potential they could bring to Hexa.”

Hexa is testing the NVIDIA Omniverse Enterprise platform — an end-to-end platform for building and operating metaverse applications — as a tool to unify its annotating and rendering workflows, which are used by dozens of 3D artists around the globe. Omniverse Enterprise enables geographically dispersed teams of creators to customize their rendering pipelines and collaborate to build 3D assets.

“Each of our 3D artists has a different software workflow that they’re used to — so it can be tough to get a unified output while still being flexible about the tools each artist uses,” said Jonathan Clark, Hexa’s CTO. “Omniverse is an ideal candidate in that respect, with huge potential for Hexa. The platform will allow our artists to use the rendering software they’re comfortable with, while also allowing our team to visualize the final product in one place.”

To learn more about NVIDIA Omniverse and next-generation content creation, register free for NVIDIA GTC, a global conference for the era of AI and the metaverse, taking place online March 20-23.

Images and videos courtesy of Hexa