Meta’s Llama 3.3 70B model is now available in Amazon Bedrock. Llama 3.3 70B represents a significant advancement in model efficiency and performance optimization. This text-only, instruction-tuned model delivers strong capabilities across diverse tasks, including multilingual dialogue, text summarization, and complex reasoning. It provides enhanced performance relative to Llama 3.1 70B, and to Llama 3.2 90B when the latter is used for text-only applications.
The new model delivers performance comparable to Llama 3.1 405B while requiring only a fraction of the computational resources. Llama 3.3 demonstrates substantial improvements in reasoning, mathematical understanding, general knowledge, and instruction following. Its comprehensive training enables robust language understanding across multiple domains. You can use Llama 3.3 for enterprise applications, content creation, and advanced research initiatives. The model supports multiple languages and outperforms many existing conversational models on industry-standard benchmarks. Its outputs can also be used to improve other models, for example through synthetic data generation and distillation. Llama 3.3 provides an accessible and powerful generative AI solution for businesses seeking high-quality, efficient language model capabilities.
Meta’s Llama 3.3 70B model is available in Amazon Bedrock in the US East (Ohio) Region, and in the US East (N. Virginia) and US West (Oregon) Regions via cross-region inference. To learn more, visit the Llama product page and documentation. To get started with Llama 3.3 70B in Amazon Bedrock, visit the Amazon Bedrock console.
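As a minimal sketch of getting started programmatically, the following Python (boto3) snippet calls the model through the Amazon Bedrock Converse API. The inference profile ID us.meta.llama3-3-70b-instruct-v1:0 and the Region shown are assumptions for illustration; confirm the exact model or inference profile identifier for your account and Region in the Amazon Bedrock console.

```python
# Minimal sketch: invoking Llama 3.3 70B Instruct via the Amazon Bedrock Converse API.
# The model/inference-profile ID below is an assumption; verify it in the Bedrock console
# or with `aws bedrock list-inference-profiles` for your Region.
import boto3

# US East (Ohio); cross-region inference also serves requests from N. Virginia and Oregon.
client = boto3.client("bedrock-runtime", region_name="us-east-2")

MODEL_ID = "us.meta.llama3-3-70b-instruct-v1:0"  # assumed cross-region inference profile ID

response = client.converse(
    modelId=MODEL_ID,
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize the key benefits of model distillation in two sentences."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.5, "topP": 0.9},
)

# The Converse API returns the assistant message as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```

The Converse API is used here because it offers a consistent request and response shape across Bedrock models, so the same code can be pointed at a different model ID with minimal changes.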
Source: Amazon AWS