CIOInsider India Magazine

Meta Discloses Information About its Data Center Projects, a Proprietary Chip Family

CIO Insider Team | Friday, 19 May, 2023
To further support artificial intelligence development, Meta Platforms disclosed additional details about its data center projects, including a proprietary chip family being designed in-house.

The announcement was an attempt by Meta to project strength; the company has historically been slow to adopt AI-friendly hardware, hindering its ability to keep up with competitors such as Google and Microsoft.

The discovery engines, moderation filters, and ad recommenders found throughout Meta's apps and services are powered by AI built over roughly the past decade, with billions of dollars invested both in recruiting top data scientists and in developing new types of AI.

However, the company has had trouble commercializing many of its more ambitious AI research advances, particularly in the field of generative AI.

Until 2022, Meta ran its AI workloads mainly on a combination of CPUs, which are often less efficient for such jobs than GPUs, and a specialized chip designed to accelerate AI algorithms. A large-scale rollout of that bespoke processor, scheduled for 2022, was canceled by Meta in favor of Nvidia GPU orders worth billions of dollars, which necessitated a significant overhaul of several of its data centers.

In an effort to turn things around, Meta began work on a more ambitious internal chip, scheduled for release in 2025 and capable of both running and training AI models.

In a series of blog posts, the owner of Facebook and Instagram said that it designed a first-generation chip in 2020 as part of the Meta Training and Inference Accelerator (MTIA) program. The goal was to make the recommendation models that distribute ads and other content in news feeds more efficient.

According to the posts, the initial MTIA chip focused solely on inference, the AI technique in which models trained on vast quantities of data decide, for example, whether to display a dancing video or a cat meme as the next item in a user's feed.

Custom AI chips are increasingly common among major players in the technology industry. To train massive generative AI systems such as PaLM-2 and Imagen, Google developed a processor known as the TPU (short for tensor processing unit).

The MTIA is an ASIC (application-specific integrated circuit), a type of chip that combines various circuits on a single die and can be designed to perform one or more functions simultaneously.
