
WEKA Unveils Powerful AI-Native Data Platform Appliance for NVIDIA DGX SuperPOD with NVIDIA DGX H100 Systems

PRNewswire March 19, 2024

CAMPBELL, Calif., March 19, 2024 /PRNewswire/ — From NVIDIA GTC: WekaIO (WEKA), the AI-native data platform company, today unveiled WEKApod™, a powerful new data platform appliance certified for NVIDIA DGX SuperPOD™ with NVIDIA DGX H100 systems. The appliance integrates WEKA’s AI-native data platform software with class-leading storage hardware to provide a ready-to-use, purpose-built environment for AI applications. Customers can deploy the WEKApod Data Platform Appliance as part of their NVIDIA DGX SuperPOD solution, helping them rapidly provision compute, storage, and networking resources and get their AI projects into production faster.

Introducing WEKApod™ Certified for NVIDIA DGX SuperPOD™ With NVIDIA DGX H100 Systems

The WEKApod Data Platform Appliance offers an exceptional, highly performant data management foundation for NVIDIA DGX SuperPOD deployments – a single cluster can deliver up to 18.3 million IOPS. It also provides efficient write performance for AI model checkpointing, delivering superior training efficiency, scalability, enterprise-grade resiliency, and support for real-time applications.

WEKApod delivers all the capabilities of WEKA’s Data Platform software in an easy-to-deploy appliance for enterprise AI, generative AI, and GPU cloud customers. Key benefits include: 

Class-Leading Performance
The WEKA Data Platform’s AI-native architecture delivers the world’s fastest AI storage and exceptionally high performance for AI data pipelines. It ensures low latency regardless of file size and delivers the high write throughput that checkpointing operations require to maintain business-critical continuity.

Maximum Efficiency for Improved Sustainability
The WEKA Data Platform provides increased performance density that lowers energy usage by maximizing space utilization, improving cooling efficiency, optimizing power distribution, and reducing idle energy consumption. This enables WEKA customers to decrease their carbon footprint by up to 260 tons of CO2e per petabyte stored.

Further, the WEKA Data Platform delivers efficient storage access, ensuring that computational tasks such as training and inference are executed with maximum performance, reducing idle time and overall energy consumption. By efficiently utilizing resources and minimizing data movement, the WEKA platform lowers the associated energy costs and carbon footprint of AI deployments, enabling organizations to achieve their sustainability and environmental goals while driving AI initiatives forward.

Ultimate Choice of Deployment Options: On-Premises and in Hyperscaler, Hybrid, and GPU Clouds with the WEKA Data Platform
WEKA pioneered the concept of a software data platform for AI and high-performance computing in 2013 and was the first vendor to deliver a subscription software solution that could run on hardware from the world’s leading server vendors and in all major hyperscale public clouds. Now, customers have a new way to consume WEKA Data Platform software on-premises for large-scale AI and GPU cloud deployments with WEKApod.

“WEKA is thrilled to achieve NVIDIA DGX SuperPOD certification and deliver a powerful new data platform option for enterprise AI customers,” said Nilesh Patel, chief product officer at WEKA. “Study after study has shown enterprises are still struggling to move their AI initiatives from pilot to production, and others are still struggling to get started. Through this latest certification, we are providing a high-performance data infrastructure foundation to fuel the next wave of AI innovation. Using the WEKApod Data Platform Appliance with DGX SuperPOD delivers the quantum leap in the speed, scale, simplicity, and sustainability needed for enterprises to support future-ready AI projects quickly, efficiently, and successfully.”

“As enterprise AI adoption grows, organizations are seeking high-performance infrastructure to get their AI projects into production faster,” said Charlie Boyle, vice president of DGX Systems at NVIDIA. “The WEKApod Data Platform Appliance certification for NVIDIA DGX SuperPOD gives enterprises a proven solution at data center scale for tackling the most complex AI workloads, powering their AI transformation.”

To learn more about the WEKApod Data Platform Appliance, certified for NVIDIA DGX SuperPOD, visit https://www.weka.io/wekapod.

About WEKA
WEKA is the AI-native data platform company setting the standard for AI infrastructure with the industry’s only cloud and hardware-agnostic software solution for performance-intensive applications. The WEKA® Data Platform delivers unprecedented performance at scale. It transforms stagnant data silos into dynamic data pipelines that help enterprise AI, ML, and GPU workloads run faster and more efficiently, providing seamless data access on-premises, in the cloud, at the edge, and in hybrid and multicloud environments. WEKA helps hundreds of the world’s leading enterprises and preeminent research organizations overcome complex data challenges to reach discoveries, insights, and outcomes faster – including 11 of the Fortune 50. Learn more at www.weka.io or connect on LinkedIn, X/Twitter, and Facebook.

See why WEKA was recognized as a Visionary for three consecutive years in the Gartner® Magic Quadrant™ for Distributed File Systems and Object Storage – get the report.

WEKA and the WEKA logo are registered trademarks of WekaIO, Inc. Other trade names used herein may be trademarks of their respective owners.

 

View original content: https://www.prnewswire.com/apac/news-releases/weka-unveils-powerful-ai-native-data-platform-appliance-for-nvidia-dgx-superpod-with-nvidia-dgx-h100-systems-302091921.html

SOURCE WekaIO
