
Sarvam AI Launches Sarvam-1: India’s First Multilingual LLM Empowering AI in Indian Languages

Sarvam AI's launch of Sarvam-1, India's first indigenous multilingual LLM, is set to redefine AI for Indian languages: it supports ten native languages alongside English and was trained entirely on domestic infrastructure. Its improved token efficiency enables faster, more inclusive applications.

On October 24, 2024, Sarvam AI introduced Sarvam-1, a Large Language Model (LLM) built especially for Indian languages. The model stands out because it was trained entirely on domestic infrastructure and is described as India’s first indigenous multilingual LLM.

Sarvam-1, which has almost 2 billion parameters and supports ten major Indian languages in addition to English, promises to improve AI’s capabilities in a linguistically diverse nation like India.

What’s New:

Sarvam-1 is a significant development in artificial intelligence, especially for Indian languages. It supports ten Indian languages: Bengali, Tamil, Telugu, Gujarati, Kannada, Malayalam, Marathi, Oriya, Punjabi, and Hindi.

The model was trained on the Sarvam-2T dataset, which comprises around 2 trillion tokens and was curated specifically to raise the quality of training data for Indic languages. Training was carried out on cutting-edge domestic AI infrastructure powered by NVIDIA H100 GPUs.

Key Insight:

One of the most impressive features of Sarvam-1 is its token efficiency. In many existing models, a single word in an Indian language is broken into 4 to 8 tokens for processing; Sarvam-1’s tokenizer, by contrast, averages 1.4 to 2.1 tokens per word. Fewer tokens per word mean shorter sequences, so the model can process Indic text more quickly and cheaply than its predecessors. Sarvam AI also reports that the model outperforms larger models such as Meta’s Llama-3.2-3B on several benchmarks.
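As a rough illustration of what token fertility means in practice, the sketch below counts tokens per word for a short Hindi sentence using a Hugging Face tokenizer. It is a minimal sketch, assuming the transformers library is installed and that "sarvamai/sarvam-1" is the model’s repository id on the Hub (an assumption, not stated in this article).

# A minimal sketch of measuring token fertility (average tokens per word).
# Assumes the `transformers` library is installed and that "sarvamai/sarvam-1"
# is the model's repository id on the Hugging Face Hub (an assumption).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("sarvamai/sarvam-1")

sentence = "भारत एक भाषाई रूप से विविध देश है"  # "India is a linguistically diverse country"
words = sentence.split()
tokens = tokenizer.tokenize(sentence)

fertility = len(tokens) / len(words)
print(f"{len(tokens)} tokens / {len(words)} words = fertility of {fertility:.2f}")

Running the same measurement with a tokenizer that falls back to byte-level pieces for Devanagari will typically report a fertility several times higher, which is exactly the gap the article describes.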

The model can be downloaded from the Hugging Face 🤗 Hub.
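For readers who want to try it locally, here is a minimal loading-and-generation sketch. It assumes transformers and torch are installed and, again, that the repository id is "sarvamai/sarvam-1".

# A minimal sketch of loading Sarvam-1 and generating text locally.
# Assumes `transformers` and `torch` are installed and that "sarvamai/sarvam-1"
# is the repository id (an assumption, not stated in this article).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "भारत की राजधानी"  # "The capital of India"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))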

Figure: Comparison of token fertility between Sarvam-1 and other popular LLMs.

How This Works:

The development of Sarvam-1 involved addressing two major challenges: token inefficiency and poor data quality in Indic languages. Using synthetic data generation techniques, Sarvam AI built a robust training corpus that improves performance on tasks such as cross-lingual translation and question answering. The model’s compact architecture also makes it practical to deploy across a range of devices.

Result:

Sarvam-1 has demonstrated strong performance on industry benchmarks such as MMLU, ARC-Challenge, and IndicGenBench. It achieved an accuracy of 86.11 on the TriviaQA benchmark across Indic languages, significantly higher than larger models such as Llama-3.1 8B. Moreover, its inference speed is reported to be 4 to 6 times faster than that of larger models, making it particularly effective for real-time applications.
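Speed claims like this are easy to sanity-check on your own hardware. The sketch below times how long the model takes to produce a fixed number of new tokens, as a rough proxy for inference throughput; it makes the same assumptions about installed packages and repository id as the earlier sketches, and the numbers will vary widely with hardware.

# A minimal sketch of timing generation as a rough proxy for inference speed.
# Same assumptions as above: `transformers`/`torch` installed, repository id
# "sarvamai/sarvam-1" (an assumption). Results depend heavily on hardware.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sarvamai/sarvam-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "कृत्रिम बुद्धिमत्ता क्या है?"  # "What is artificial intelligence?"
inputs = tokenizer(prompt, return_tensors="pt")

start = time.perf_counter()
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=False)
elapsed = time.perf_counter() - start

new_tokens = outputs.shape[-1] - inputs["input_ids"].shape[-1]
print(f"{new_tokens} new tokens in {elapsed:.2f}s ({new_tokens / elapsed:.1f} tokens/s)")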

Figure: Sarvam-1 accuracy compared with other LLMs.

Why This Matters:

The launch of Sarvam-1 is crucial for several reasons:

  • Inclusivity: It makes advanced AI technology accessible to speakers of diverse Indian languages.
  • Efficiency: The improved token efficiency can lead to faster processing times in applications like chatbots and translation services.
  • Local Development: By developing this model domestically, Sarvam AI contributes to India’s growing tech ecosystem and reduces reliance on foreign technology.

This development aligns with India’s ambition to become a leader in AI innovation tailored to its unique linguistic landscape.

We’re Thinking:

The launch of Sarvam-1 may revolutionise the way AI interacts with Indian languages. Its open availability on platforms such as Hugging Face should encourage researchers and developers to explore applications and improve the model. Sarvam-1’s success may also inspire similar initiatives in other linguistically diverse regions of the world.


