Amazon OpenSearch Service adds multimodal support to neural search for OpenSearch 2.11 deployments. This empowers builders to create and operationalize multimodal search applications with significantly reduced undifferentiated heavy lifting. For years, customers have been building vector search applications on OpenSearch k-NN, but they have been burdened with building middleware to integrate text embedding models into search and ingest pipelines. OpenSearch builders can now power multimodal search through out-of-the-box integrations with the Amazon Bedrock text and image multimodal APIs, driving search pipelines that run on-cluster.
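As a rough sketch of what this looks like in practice, the snippet below builds the JSON bodies for an OpenSearch 2.11 ingest pipeline using the `text_image_embedding` processor and a matching `neural` query that accepts both text and image input. The model ID, index field names (`image_description`, `image_binary`, `vector_embedding`), and sample values are placeholders, not values from the announcement; in a real deployment the model ID would come from a model registered through the OpenSearch connector to Amazon Bedrock.

```python
import json

# Placeholder: the ID of a multimodal embedding model registered with the
# cluster (e.g. via a connector to Amazon Bedrock). Replace with your own.
MODEL_ID = "my-bedrock-multimodal-model"


def make_ingest_pipeline(model_id: str) -> dict:
    """Ingest pipeline body: the text_image_embedding processor generates
    one vector per document from its text and image fields."""
    return {
        "description": "Multimodal embedding pipeline (sketch)",
        "processors": [
            {
                "text_image_embedding": {
                    "model_id": model_id,
                    # Field that will hold the generated vector.
                    "embedding": "vector_embedding",
                    # Source fields feeding the embedding model
                    # (hypothetical names for this example).
                    "field_map": {
                        "text": "image_description",
                        "image": "image_binary",
                    },
                }
            }
        ],
    }


def make_multimodal_query(model_id: str, text: str,
                          image_b64: str, k: int = 5) -> dict:
    """Neural query body combining a text phrase and a base64-encoded
    image against the vector field populated at ingest time."""
    return {
        "query": {
            "neural": {
                "vector_embedding": {
                    "query_text": text,
                    "query_image": image_b64,
                    "model_id": model_id,
                    "k": k,
                }
            }
        }
    }


# These bodies would be sent with PUT /_ingest/pipeline/<name> and
# GET /<index>/_search respectively.
pipeline = make_ingest_pipeline(MODEL_ID)
query = make_multimodal_query(MODEL_ID, "red sneakers", "<base64-image>")
print(json.dumps(pipeline, indent=2))
```

Because the embedding model runs behind the pipeline, the application submits plain text and images and never handles vectors directly, which is the middleware work the integration removes.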
Source: Amazon AWS