New MongoDB Atlas Vector Search integration with Amazon Bedrock to help accelerate development of highly engaging applications powered by generative AI
MongoDB integrates Atlas Vector Search with Amazon Bedrock, streamlining generative AI and semantic search for next-gen AWS applications. Developers can leverage this integration to create engaging, customized user experiences with up-to-date responses across various use cases.
“Customers of all sizes, from startups to enterprises, tell us they want to take advantage of generative AI to build next-generation applications and future-proof their businesses. However, many customers are concerned about ensuring the accuracy of the outputs from AI-powered systems while protecting their proprietary data,” said Sahir Azam, Chief Product Officer at MongoDB.
Amazon Bedrock is a fully managed service from AWS that offers a choice of high-performing foundation models (FMs) via a single API, along with a broad set of capabilities to build generative AI applications with security and privacy. This new integration with Amazon Bedrock allows organizations to quickly and easily deploy generative AI applications on AWS that can act on data processed by MongoDB Atlas Vector Search and deliver more accurate and relevant responses. Unlike add-on solutions that only store vector data, MongoDB Atlas Vector Search powers generative AI applications by functioning as a highly performant and scalable vector database with the added benefits of being integrated with a globally distributed operational database that can store and process all of an organization’s data.
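To illustrate the point above, the following is a minimal sketch of how an application might query embeddings stored in MongoDB Atlas with the $vectorSearch aggregation stage, keeping vectors alongside the rest of its operational data. The connection string, database, collection, index, and field names here are hypothetical placeholders, not part of the announcement.

```python
# Minimal sketch: similarity search over embeddings stored in MongoDB Atlas.
# All names (cluster URI, database, collection, index, field) are assumed examples.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")
collection = client["retail"]["products"]  # hypothetical database and collection

def similar_products(query_embedding, k=5):
    """Return the k documents whose stored embeddings are closest to the query vector."""
    pipeline = [
        {
            "$vectorSearch": {
                "index": "product_embedding_index",  # hypothetical Atlas Vector Search index
                "path": "embedding",                 # field holding the stored vector
                "queryVector": query_embedding,
                "numCandidates": 100,
                "limit": k,
            }
        },
        # Return only the operational fields the application needs.
        {"$project": {"name": 1, "description": 1, "_id": 0}},
    ]
    return list(collection.aggregate(pipeline))
```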
Using the integration with Amazon Bedrock, customers can privately customize FMs—from AI21 Labs, Amazon, Anthropic, Cohere, Meta, and Stability AI—with their proprietary data, convert data into vector embeddings, and process these embeddings using MongoDB Atlas Vector Search. Leveraging Agents for Amazon Bedrock for retrieval augmented generation (RAG), customers can then build applications that respond to user queries with relevant, contextualized responses—without needing to manually code. For example, a retail apparel organization can more easily develop a generative AI application to help employees automate tasks like processing inventory requests in real time or to help personalize customer returns and exchanges by suggesting similar styles of in-stock merchandise. With fully managed capabilities, this new integration will enable joint AWS and MongoDB customers to securely use generative AI with their proprietary data to its full extent throughout an organization and realize business value more quickly—with less operational overhead.
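As a rough illustration of the retrieval-augmented generation flow described above, the sketch below calls Amazon Bedrock's retrieve-and-generate API against a knowledge base, assuming that knowledge base has been configured to use MongoDB Atlas Vector Search as its vector store. The knowledge base ID, region, model choice, and query are hypothetical examples.

```python
# Minimal sketch: answering a user query with RAG via a Bedrock knowledge base
# assumed to be backed by MongoDB Atlas Vector Search. IDs and ARNs are placeholders.
import boto3

bedrock_agent = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = bedrock_agent.retrieve_and_generate(
    input={"text": "Which in-stock jackets are similar to item SKU-1234?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "EXAMPLEKBID",  # hypothetical knowledge base ID
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
        },
    },
)

# The generated answer is grounded in documents retrieved from the vector store.
print(response["output"]["text"])
```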
“In this next wave of widespread AI adoption, organizations want to strengthen their data strategies to develop differentiating and competitive generative AI solutions,” said Vasi Philomin, Vice President of Generative AI at AWS.