Amazon MemoryDB for Redis now supports vector search in preview, a new capability that enables you to store, index, and search vectors. MemoryDB is a database that combines in-memory performance with Multi-AZ durability. With vector search for MemoryDB, you can develop real-time machine learning (ML) and generative AI applications with the most demanding performance requirements using the popular, open-source Redis API. Vector search for MemoryDB supports storing millions of vectors, with single-digit millisecond query and update response times and tens of thousands of queries per second (QPS) at greater than 99% recall. You can generate vector embeddings using AI/ML services such as Amazon Bedrock and Amazon SageMaker, and store them in MemoryDB.
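As a rough illustration of how this looks through the Redis API, the sketch below creates an HNSW vector index, stores an embedding, and runs a k-nearest-neighbor query using redis-py. It assumes the RediSearch-style FT.CREATE / FT.SEARCH commands; the endpoint, index name, key prefix, and 1536-dimension embedding size are placeholders, not values from the announcement.

```python
import numpy as np
import redis

# Connect to the MemoryDB cluster endpoint (placeholder host; MemoryDB requires TLS).
client = redis.Redis(
    host="my-memorydb-cluster.example.amazonaws.com",
    port=6379,
    ssl=True,
)

# Create an HNSW vector index over hashes whose keys start with "doc:".
# DIM 1536 is an assumed embedding size (e.g., from a Bedrock embedding model).
client.execute_command(
    "FT.CREATE", "idx:docs",
    "ON", "HASH", "PREFIX", "1", "doc:",
    "SCHEMA", "embedding", "VECTOR", "HNSW", "6",
    "TYPE", "FLOAT32", "DIM", "1536", "DISTANCE_METRIC", "COSINE",
)

# Store a vector embedding as raw FLOAT32 bytes alongside the original text.
embedding = np.random.rand(1536).astype(np.float32)  # stand-in for a real embedding
client.hset("doc:1", mapping={"embedding": embedding.tobytes(), "text": "hello world"})

# KNN query: return the 5 documents whose embeddings are closest to the query vector.
query_vec = np.random.rand(1536).astype(np.float32)
results = client.execute_command(
    "FT.SEARCH", "idx:docs",
    "*=>[KNN 5 @embedding $vec AS score]",
    "PARAMS", "2", "vec", query_vec.tobytes(),
    "DIALECT", "2",
)
print(results)
```

In a real application, the random vectors above would be replaced by embeddings returned from a model such as one hosted on Amazon Bedrock or Amazon SageMaker.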
Source:: Amazon AWS