Meta’s Llama models get 350 million downloads

The most recent 20 million downloads can likely be attributed to the company’s Llama 3.1 update, which added a 405-billion-parameter model alongside 70-billion- and 8-billion-parameter variants, all of which performed better on benchmarks such as MATH and HumanEval.

“Hosted Llama usage by token volume across our major cloud service provider partners more than doubled May through July 2024 when we released Llama 3.1,” Al-Dahle wrote, adding that the company’s largest model, the 405-billion-parameter variant, was also gaining traction.

Separately, Meta has been actively working to expand the number of partners that host or distribute the Llama family of models. These partners include AWS, Microsoft Azure, Google Cloud, Databricks, Dell, Groq, NVIDIA, IBM watsonx, Scale AI, and Snowflake, among others.
