
AWS announces Amazon Redshift integration with Amazon Bedrock for generative AI

AWS announces the integration of Amazon Redshift with Amazon Bedrock, a fully managed service that offers high-performing foundation models (FMs), making it simpler and faster for you to build generative AI applications. This integration lets you invoke large language models (LLMs) with simple SQL commands alongside your data in Amazon Redshift.

With this new feature, you can now easily perform generative AI tasks such as language translation, text generation, summarization, customer classification, and sentiment analysis on your Redshift data using popular FMs like Anthropic Claude, Amazon Titan, Llama 2, and Mistral AI. First, your Redshift administrator adds a policy that allows invoking Amazon Bedrock models to the IAM role attached to your Redshift Serverless namespace or provisioned cluster. Then, you simply use the CREATE EXTERNAL MODEL command to point to an LLM in Amazon Bedrock, without requiring any model training or provisioning. You invoke these models using familiar SQL commands, making it easier than ever to integrate generative AI capabilities into your data analytics workflows, as shown in the sketch below. You do not incur additional Amazon Redshift charges for using LLMs with Amazon Redshift ML beyond the standard Amazon Bedrock pricing.
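For illustration, a minimal sketch of the workflow might look like the following. The model name, function name, model ID, prompt, and table are hypothetical examples chosen for this sketch, not values from the announcement; consult the Amazon Redshift ML documentation for the exact syntax supported in your Region.

    -- Register a Bedrock foundation model as an external model in Redshift ML.
    -- MODEL_ID and PROMPT below are illustrative placeholders.
    CREATE EXTERNAL MODEL review_summarizer
    FUNCTION summarize_review
    IAM_ROLE DEFAULT
    MODEL_TYPE BEDROCK
    SETTINGS (
        MODEL_ID 'anthropic.claude-v2:1',
        PROMPT 'Summarize the following customer review:');

    -- Invoke the model from ordinary SQL over data already in Redshift.
    SELECT review_id,
           summarize_review(review_text) AS review_summary
    FROM customer_reviews
    LIMIT 10;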

Amazon Redshift integration with Amazon Bedrock is now generally available in all regions where Amazon Bedrock and Amazon Redshift ML are supported. To get started, visit the Amazon Redshift machine learning documentation and the Amazon Bedrock product page.

Source:: Amazon AWS
