Mistral Small foundation model now available in Amazon Bedrock

The Mistral Small foundation model from Mistral AI is now generally available in Amazon Bedrock. This brings the number of high-performing Mistral AI models you can access in Amazon Bedrock to four: Mistral Small, Mistral Large, Mistral 7B, and Mixtral 8x7B, further expanding your model choice. Mistral Small is a highly efficient large language model optimized for high-volume, low-latency language tasks, providing outstanding performance at a cost-effective price point. Key features of Mistral Small include retrieval-augmented generation (RAG) specialization, coding proficiency, and multilingual capabilities.

Mistral Small is perfectly suited for straightforward tasks that can be performed in bulk, such as classification, customer support, or text generation. The model specializes in RAG, ensuring that important information is retained even across long context windows, which can extend up to 32K tokens. Mistral Small excels in code generation, review, and commenting, and supports all major coding languages. It is also multilingual, delivering top-tier performance in English, French, German, Spanish, and Italian while supporting dozens of other languages. The model also comes with efficient built-in guardrails for safety.
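To illustrate the RAG use case described above, here is a minimal sketch of how retrieved passages might be packed into Mistral Small's long context window ahead of a question. The prompt layout and the helper function name are illustrative assumptions, not an official template.

```python
# Minimal sketch: assemble retrieved passages into a single RAG-style prompt
# for Mistral Small. The retrieval step itself is out of scope here, and the
# [Document N] labeling is an illustrative convention, not a required format.
def build_rag_prompt(question: str, passages: list[str]) -> str:
    context = "\n\n".join(
        f"[Document {i + 1}]\n{passage}" for i, passage in enumerate(passages)
    )
    # Mistral instruction-tuned models use the [INST] ... [/INST] prompt format.
    return (
        "<s>[INST] Answer the question using only the documents below.\n\n"
        f"{context}\n\nQuestion: {question} [/INST]"
    )
```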

Mistral AI’s Mistral Small foundation model is now available in Amazon Bedrock in the US East (N. Virginia) AWS Region. To learn more, read the AWS News launch blog, the Mistral AI in Amazon Bedrock product page, and the documentation. To get started with Mistral Small in Amazon Bedrock, visit the Amazon Bedrock console.
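Beyond the console, you can invoke Mistral Small programmatically through the Amazon Bedrock runtime API. The snippet below is a minimal sketch using the AWS SDK for Python (boto3); treat the exact model ID ("mistral.mistral-small-2402-v1:0") and the request-body fields as assumptions to confirm against the Bedrock documentation for Mistral models.

```python
# Minimal sketch of invoking Mistral Small via the Amazon Bedrock runtime API.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Mistral models expect the instruction-tuned [INST] ... [/INST] prompt format.
request_body = {
    "prompt": "<s>[INST] Classify the sentiment of this review as positive or "
              "negative: 'The checkout process was quick and painless.' [/INST]",
    "max_tokens": 200,
    "temperature": 0.2,
}

response = bedrock.invoke_model(
    modelId="mistral.mistral-small-2402-v1:0",  # assumed ID; confirm in the Bedrock console
    body=json.dumps(request_body),
)

result = json.loads(response["body"].read())
print(result["outputs"][0]["text"])
```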
 

Source: Amazon AWS