AI & ML News

NetApp Turbocharges AI Innovation with Intelligent Data Infrastructure

NetApp announces simplified, secure, high-performance infrastructure in partnership with NVIDIA

NetApp introduces enhanced capabilities to boost the effectiveness of generative artificial intelligence (Gen AI) projects, helping customers gain a competitive edge. By integrating NetApp’s intelligent data infrastructure with NVIDIA’s high-performance compute, networking, and software, customers can accelerate their AI initiatives.

Gen AI has captured global attention for its potential to automate tedious tasks, uncover new insights, and drive product innovation. Nearly three out of four companies are already using Gen AI, according to the NetApp 2023 Data Complexity Report. To unlock the potential of Gen AI, organisations need secure, high-performance access to data spread across complex hybrid and multicloud environments. NetApp has a long and successful track record of supporting AI with solutions that deliver management simplicity wherever data lives, provide high performance without requiring new infrastructure silos, and supply trusted, secure data to drive responsible AI.

“NetApp is the intelligent data infrastructure company, with solutions optimised to unleash the full potential of our customers’ investments in AI,” said Ravi Chhabria, Managing Director at NetApp India. “Our distinct approach to AI gives customers complete access to and control over their data across the entire pipeline, moving seamlessly between public cloud and on-prem environments.”

To support companies leveraging Gen AI to improve their operations and strategic decision-making, NetApp released updates to its intelligent data infrastructure capabilities including:

  • NetApp AIPod™ is NetApp’s AI-optimised converged infrastructure for organisations’ highest-priority AI projects, including training and inferencing. NetApp AIPod powered by NVIDIA DGX is now a certified NVIDIA DGX BasePOD solution, using NVIDIA DGX H100 systems integrated with NetApp AFF C-Series affordable capacity flash systems to drive a new level of cost/performance while optimising rack space and sustainability. NetApp AIPod powered by NVIDIA DGX also continues to support NVIDIA DGX A100 systems.
  • New FlexPod for AI reference architectures extend the leading converged infrastructure solution from NetApp and Cisco. FlexPod for AI now supports the NVIDIA AI Enterprise software platform and can be extended to leverage Red Hat OpenShift and SUSE Rancher. New scaling and benchmarking guidance has been added to support increasingly GPU-intensive applications. Customers can use these new FlexPod solutions as an end-to-end blueprint to efficiently design, deploy, and operate the FlexPod platform for AI use cases.
  • NetApp storage is now validated for NVIDIA OVX systems. Combined with NVIDIA OVX computing systems, NetApp storage can help streamline enterprise AI deployments, including model fine-tuning and inference workloads. Powered by NVIDIA L40S GPUs, validated NVIDIA OVX solutions are available from leading server vendors and include NVIDIA AI Enterprise software along with NVIDIA Quantum-2 InfiniBand or NVIDIA Spectrum-X Ethernet networking, and NVIDIA BlueField-3 DPUs. NetApp is one of the first partners to complete this new storage validation for NVIDIA OVX.

Masahiro Waki, AI Business Lead for NetApp Asia Pacific, said, “Our pivotal partnerships with AI trailblazers such as NVIDIA also allow us to create sophisticated data pipelines for enterprises embarking on innovative AI ventures.”
