Amazon is building one of the world's most powerful AI supercomputers in partnership with Anthropic, the OpenAI rival working to push the boundaries of what artificial intelligence can do. When finished, it will be five times larger than the cluster that currently hosts Anthropic's most powerful model.
The supercomputer, which will house hundreds of thousands of Trainium 2 chips, Amazon's newest AI training silicon, is expected to be the largest AI machine in the world when it is completed, according to Amazon.
At the AWS re:Invent conference in Las Vegas, AWS CEO Matt Garman unveiled the supercomputer plans, known as Project Rainier, along with a number of other announcements that cement Amazon's position as a dark horse in generative artificial intelligence.
Garman also announced that Trainium 2 will be widely available in "Trn2 UltraServer" clusters dedicated to training frontier AI models. Many businesses already use Amazon's cloud to build and train custom AI models, often in conjunction with Nvidia GPUs. According to Garman, however, the new AWS clusters are 30 to 40 percent less expensive than those built on Nvidia GPUs.
The company has invested $8 billion in Anthropic this year and has quietly released a range of tools through its Bedrock platform on AWS to help businesses use and manage generative AI.
At re:Invent, Amazon also unveiled Trainium 3, a next-generation training chip that it says will deliver four times the performance of the current chip. It is expected to be available to customers in late 2025.
By relying on its own line of chips, Amazon can lower the cost of offering AI software and services.
According to Garman, many clients are much more interested in figuring out how to make generative AI more affordable and dependable than they are in pushing the technology’s boundaries.
For example, Model Distillation, a newly released AWS service, can take a large model and produce a smaller one with comparable capabilities that is faster and less costly to run. "Suppose you are an insurance company," Garman says. "You can train the smaller model to be an expert on those things by feeding a whole set of questions into a sophisticated model."
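The workflow Garman describes amounts to collecting answers from a large "teacher" model and using them to fine-tune a smaller "student" model on a narrow domain. The Python sketch below illustrates only that idea; the call_teacher_model and fine_tune_student functions are hypothetical placeholders, not the actual AWS Model Distillation API.

```python
# Minimal sketch of model distillation as described above: a large teacher
# model answers domain-specific questions, and the resulting question/answer
# pairs become training data for a smaller, cheaper student model.
# call_teacher_model and fine_tune_student are hypothetical placeholders.

from typing import Callable


def build_distillation_dataset(
    questions: list[str],
    call_teacher_model: Callable[[str], str],
) -> list[dict[str, str]]:
    """Collect teacher answers to form a fine-tuning dataset for the student."""
    return [{"prompt": q, "completion": call_teacher_model(q)} for q in questions]


def distill(
    questions: list[str],
    call_teacher_model: Callable[[str], str],
    fine_tune_student: Callable[[list[dict[str, str]]], str],
) -> str:
    """Two-step flow: generate data from the teacher, then fine-tune the student."""
    dataset = build_distillation_dataset(questions, call_teacher_model)
    return fine_tune_student(dataset)


if __name__ == "__main__":
    # Toy stand-ins so the sketch runs end to end (e.g. an insurance domain).
    insurance_questions = [
        "Is water damage from a burst pipe covered?",
        "How is a total-loss vehicle payout calculated?",
    ]
    teacher = lambda q: f"[detailed answer from the large model to: {q}]"
    student_trainer = lambda data: f"student model trained on {len(data)} examples"
    print(distill(insurance_questions, teacher, student_trainer))
```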
Bedrock Agents, another newly unveiled cloud offering, lets customers create and manage AI agents that automate practical tasks like order processing, analytics, and customer care. A master agent oversees a group of subordinate AI agents, coordinating changes and reporting on their performance.
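That description maps onto a simple supervisor/worker pattern: one coordinating agent routes tasks to specialized sub-agents and gathers their reports. The sketch below illustrates only that coordination idea; the class and agent names are invented for illustration and are not the actual Bedrock Agents API.

```python
# Minimal sketch of the master/subordinate agent pattern described above:
# a master agent routes tasks to specialized sub-agents and collects reports.
# All names here are hypothetical; this does not use the Bedrock Agents API.

from dataclasses import dataclass, field
from typing import Callable


@dataclass
class SubAgent:
    name: str
    handle: Callable[[str], str]  # takes a task description, returns a report


@dataclass
class MasterAgent:
    sub_agents: dict[str, SubAgent] = field(default_factory=dict)

    def register(self, category: str, agent: SubAgent) -> None:
        """Attach a sub-agent responsible for one task category."""
        self.sub_agents[category] = agent

    def dispatch(self, category: str, task: str) -> str:
        """Route a task to the matching sub-agent and return its report."""
        agent = self.sub_agents.get(category)
        if agent is None:
            return f"no agent registered for '{category}'"
        return f"{agent.name}: {agent.handle(task)}"


if __name__ == "__main__":
    master = MasterAgent()
    master.register("orders", SubAgent("order-agent", lambda t: f"processed {t}"))
    master.register("support", SubAgent("support-agent", lambda t: f"resolved {t}"))
    print(master.dispatch("orders", "order #1042"))
    print(master.dispatch("support", "ticket about a late delivery"))
```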