Llama API Now Live as Meta Expands AI Ecosystem


Meta has introduced the Llama API, giving developers early access to build with its popular Llama AI models. This new API, announced at Meta’s first LlamaCon conference, is now in limited preview.

With the Llama API, developers can experiment with Llama-powered products: they can generate training data, fine-tune models, and evaluate how well those models perform using Meta's built-in evaluation tools.

The API starts with support for the Llama 3.3 8B model. Developers can train custom models and use Meta's tools to track their performance. Meta says it won't use customer data from the API to train its own systems, and developers can move their custom models to other platforms if needed.
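To give a sense of what building against the API might look like, here is a minimal sketch of a chat request, assuming an OpenAI-style REST endpoint. The base URL, model identifier, and payload fields are illustrative assumptions, not Meta's confirmed specification; developers in the preview should follow the official documentation.

```python
import os
import requests

# Hypothetical sketch of calling the Llama API's chat endpoint.
# The base URL, model name, and payload shape are assumptions for
# illustration only; check Meta's preview docs for the real values.
API_KEY = os.environ["LLAMA_API_KEY"]          # key issued during the preview
BASE_URL = "https://api.llama.com/v1"          # assumed endpoint

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "Llama-3.3-8B-Instruct",      # the model the preview starts with
        "messages": [
            {
                "role": "user",
                "content": "Summarize the Llama API preview in one sentence.",
            }
        ],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```

The same request pattern would presumably apply when routing to a partner provider for faster inference, with the provider chosen in the dashboard rather than in the request itself.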

Meta is also working with Cerebras and Groq to offer faster ways to test Llama 4 models. These early options are available by request: developers simply select a provider inside the API dashboard, and all usage data is kept in one place for easier tracking.

This launch comes as Meta races to stay ahead in the open-source AI space. The Llama models have already been downloaded more than a billion times. Still, rivals like DeepSeek and Alibaba’s Qwen are growing fast. By offering this API, Meta hopes to keep developers close and encourage them to build on Llama.

Meta plans to open access to the Llama API more widely in the coming weeks. For now, developers can apply to try it out during the preview phase.
