CIOInsider India Magazine

Nvidia Adds New Features to Its Top AI Chip Line

CIO Insider Team | Tuesday, 14 November, 2023

Nvidia has added new features to its top-of-the-line chip for artificial intelligence, saying the new offering will start to roll out next year with Amazon.com, Alphabet's Google and Oracle.

Named the H200, the chip will succeed Nvidia's current flagship, the H100. The primary upgrade is more high-bandwidth memory, one of the costliest parts of the chip, which determines how much data it can process quickly.

Nvidia dominates the market for AI chips; its processors power OpenAI's ChatGPT service and numerous similar generative AI services that respond to queries with human-like writing.

The addition of more high-bandwidth memory and a faster connection to the chip's processing elements means such services will be able to generate an answer more quickly.

The H200 has 141 gigabytes of high-bandwidth memory, up from 80 gigabytes in the H100.
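
For a sense of why the extra capacity matters: a model's weights alone must fit in accelerator memory before inference can begin. A minimal back-of-envelope sketch in Python, where the 16-bit precision and per-model parameter counts are illustrative assumptions rather than Nvidia figures:

```python
# Rough weight-memory footprint: parameters x bytes per parameter.
# 16-bit (2-byte) weights are an assumption; deployments often quantize lower.

def weight_footprint_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return num_params * bytes_per_param / 1e9

for name, params in [("Llama 2 70B", 70e9), ("Falcon 40B", 40e9)]:
    print(f"{name}: ~{weight_footprint_gb(params):.0f} GB of weights at 16-bit")
```

At 16-bit precision, the 70-billion-parameter model Nvidia uses in its own benchmark needs roughly 140 GB for weights alone: beyond a single H100's 80 GB, but roughly within the H200's 141 GB.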

Nvidia didn't disclose its suppliers for the memory on the new chip, but Micron Technology said in September that it was working to become an Nvidia supplier.

Nvidia also buys memory from Korea's SK Hynix, which said last month that AI chips are helping to revive sales.

Nvidia said that Amazon Web Services, Google Cloud, Microsoft Azure and Oracle Cloud Infrastructure will be among the first cloud service providers to offer access to H200 chips, in addition to specialty AI cloud providers CoreWeave, Lambda and Vultr.

The chipmaker said the H200 would nearly double inference speed on Meta's 70-billion-parameter Llama 2 open-source large language model, compared with the H100.
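
That claim lines up with simple bandwidth arithmetic: generating each token of output typically requires streaming the full set of model weights from memory, so token-by-token decoding is usually memory-bound. A rough sketch, treating publicly reported bandwidth specs as assumptions (real throughput also depends on batching, quantization and software):

```python
# Crude upper bound for memory-bound decoding: each generated token streams
# all weights from memory once, so tokens/sec <= bandwidth / weight_bytes.
# Bandwidth numbers are publicly reported specs, used here as assumptions.

H100_BW = 3.35e12  # bytes/sec (~3.35 TB/s, H100 SXM)
H200_BW = 4.8e12   # bytes/sec (~4.8 TB/s)

weight_bytes = 70e9 * 2  # Llama 2 70B at 16-bit precision

for name, bw in [("H100", H100_BW), ("H200", H200_BW)]:
    print(f"{name}: <= {bw / weight_bytes:.0f} tokens/sec (single stream)")
```

Bandwidth alone accounts for only about a 1.4x gain in this estimate; the larger memory, which allows bigger batches on a single chip, and the software updates Nvidia points to below would plausibly account for the rest of the claimed near-doubling.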

Nvidia said the H200 will see additional advancements with software updates.

The news comes as Nvidia's rivals unveil chips that aim to challenge its GPU dominance, especially as high demand has left its GPUs in short supply.

AMD is expected to start shipping its MI300X chip, which offers up to 192 GB of memory, this year. Big AI models need a lot of memory because the full set of model weights must be held close to the processor during computation. AMD has demonstrated the MI300X running a 40-billion-parameter Falcon model.

Intel is coming out with AI-powered PC chips, called Meteor Lake, in December. Meteor Lake uses a chiplet-based SoC design: small dies built to work alongside other chiplets. It represents Intel's first dedicated AI engine integrated directly into an SoC to 'bring AI to the PC at scale', the company said.

SambaNova Systems has its SN40L chip, which it says can handle a 5-trillion-parameter model and support sequence lengths of more than 256k tokens on a single system, promising better quality and faster results at a lower price.
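
The 256k sequence-length figure is mainly a memory story: the key/value cache that transformer models keep during generation grows linearly with sequence length. A hedged sketch of that arithmetic, using Llama-2-70B-style model dimensions as stand-in assumptions rather than SambaNova's actual architecture:

```python
# KV-cache size grows linearly with sequence length:
#   2 (keys + values) x layers x kv_heads x head_dim x bytes x tokens.
# Dimensions below are Llama-2-70B-style assumptions, not SambaNova's.

LAYERS, KV_HEADS, HEAD_DIM, BYTES = 80, 8, 128, 2

def kv_cache_gb(seq_len: int) -> float:
    """Approximate KV-cache footprint for one sequence, in GB."""
    per_token = 2 * LAYERS * KV_HEADS * HEAD_DIM * BYTES
    return per_token * seq_len / 1e9

for seq_len in (4_096, 32_768, 262_144):  # 4k, 32k, 256k tokens
    print(f"{seq_len:>7} tokens -> ~{kv_cache_gb(seq_len):.1f} GB of KV cache")
```

At 256k tokens the cache alone runs to tens of gigabytes on top of the model weights, which is why supporting such lengths on a single system is a differentiator.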


