New offerings from NetApp, NVIDIA and our partners announced at NetApp Insight are giving IT teams the opportunity to be the catalysts for their companies’ AI transformation.
Here are five of the most important AI-related announcements and themes from Insight 2019.
Making the Case for AI Infrastructure
Many organizations have witnessed the sprawl of discrete silos of AI development, led by business units with a mandate to build applications to address new revenue opportunities, reduce costs and improve customer relationships.
A lot of this work happens outside the view of corporate IT, often because there is no viable on-prem infrastructure to support model development. This gap can be addressed with easily accessed cloud compute.
As AI workloads scale from prototyping and experimentation to production-scale training, IT needs to reclaim the mantle of delivering the platform that can support the full development lifecycle. They can do this by deploying ONTAP AI.
The Multi-Cloud Approach to Building AI
Where a single hammer was previously the tool for every nail, we’re witnessing the rise of a hybridized approach to AI infrastructure that straddles cloud and on-prem to support an end-to-end, optimized development workflow.
Early prototyping and model experimentation in the cloud offer the fastest path to getting started, especially when combined with developer tools like pre-optimized AI software stacks on NGC and pre-built models and model scripts that let data scientists spend more time on creative experimentation.
As their work matures into models ready for production training at scale, the deterministic performance of on-prem infrastructure, combined with the cost-efficiency of co-resident data and compute, becomes the accelerant for AI applications with the best predictive accuracy and the fastest time to insights.
It’s not a case of cloud vs. on-prem — it’s both. And ONTAP AI-Ready Data Center is designed to help organizations that need to straddle both, in combination with NetApp’s Data Fabric architecture that enables effortless mobility of data sets across edge, core and cloud.
Kick the Tires, Buy It If You Love It
There’s nothing like putting a solution through its paces and getting as close to the post-purchase user experience as possible. The same is true for AI infrastructure, where developers want to be able to train their models on their own datasets on the gear they’ll eventually use full-time.
We’ve enabled exactly that with ONTAP AI Test Drive. Flexential recently launched this offering, and it’s a great way for data science teams to experience new levels of productivity while giving IT teams an opportunity to manage the platform that will soon be theirs. Test Drive appeals to both camps and helps take the risk out of infrastructure deployment.
The Utility Model for AI Compute
This is basically having your cake and eating it too. What if you need access to an AI training system but don’t have the capital to purchase it? What if you need something to support only a specific project? What if you already have on-prem infrastructure but need a way to access extra compute cycles and storage on demand?
ONTAP AI-as-a-Service supports these scenarios, available through select partners like Core Scientific. Whether you prefer a 100 percent op-ex approach for AI infrastructure or you follow the mantra of “Own the Base, Rent the Spike,” this offering can help you get there.
No Data Center, No Problem!
This has been a favorite tagline of ours when describing the value of our colocation services program for organizations whose facilities are not optimized for AI compute. But what if we showed you a rack of supercomputing infrastructure that could be deployed anywhere, in a completely self-contained environment, with integrated security, fire and noise suppression?
ScaleMatrix has this covered with a self-contained ONTAP AI data center that can be deployed almost anywhere you need it.