Nvidia Introduces the New H200 for Generative AI


The Nvidia H200, an upgrade from the H100, is a graphics processing unit designed for training and running the artificial intelligence models powering the generative AI boom. The new chip includes 141 GB of next-generation HBM3e memory, which helps it perform inference, that is, use a large model after it is trained to generate text, images or predictions.

Nvidia said the H200 will generate output nearly twice as fast as its predecessor, the H100, and will compete with AMD's MI300X GPU. The H200 is also compatible with H100 systems, so AI companies will not need to change their server systems or software to use the new version.

The H200 will be available in four-GPU or eight-GPU server configurations on Nvidia’s HGX complete systems. It boasts faster, larger memory to fuel the acceleration of generative AI and large language models, while advancing scientific computing for HPC workloads.

Nvidia H200 Processes Vast Amounts of Data

Ian Buck, vice president of hyperscale and HPC at Nvidia, said that to create intelligence with generative AI and HPC applications, vast amounts of data must be processed efficiently at high speed using large, fast GPU memory. "With Nvidia H200, the industry leading end-to-end AI supercomputing platform just got faster to solve some of the world's most important challenges."

The H200 is slated to deliver further performance leaps, including nearly doubling inference speed on Llama 2, a 70-billion-parameter LLM, compared with the H100. The new chip will be available on Nvidia HGX H200 server boards in four- and eight-way configurations, which are compatible with the hardware and software of HGX H100 systems.
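For context, here is a minimal sketch of what such an inference workload looks like in practice, using the Hugging Face transformers library to run a Llama 2 chat model on GPU hardware. The model ID, prompt and generation settings are illustrative assumptions, not Nvidia's benchmark setup.

```python
# Illustrative sketch only: generic GPU inference with a Llama 2 model,
# not Nvidia's H200 benchmark code. Assumes the torch and transformers
# packages are installed and the model weights are accessible.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-70b-chat-hf"  # assumed model ID for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit large weights in GPU memory
    device_map="auto",          # spread the 70B parameters across available GPUs
)

# Inference: use the trained model to generate text from a prompt.
prompt = "Explain what GPU memory bandwidth means for large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Workloads like this are bound largely by how quickly model weights can be read from GPU memory, which is why larger, faster memory translates into higher inference throughput.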

Moreover, the H200 can be deployed in every type of data center, including on-premises, cloud, hybrid-cloud and edge. Partner server makers such as Lenovo, Dell Technologies, Hewlett Packard Enterprise, Wistron, Wiwynn and ASRock Rack, among others, can update their existing systems with the H200.


Nvidia Dominates the Market

Introducing innovative chips one after another, Nvidia dominates the market for AI chips. Its GPUs power OpenAI's ChatGPT and similar generative AI services that respond to queries with human-like writing.

Amid the excitement around its AI chips, Nvidia shares have more than tripled in value year to date. The H200 will be available starting in the second quarter of 2024. It will also ship within Nvidia's GH200 Grace Hopper Superchip, which powers more than 40 AI supercomputers globally and is set to be used by companies like Lenovo and Dell.


