Couchbase adds vector search to its database platform

Database company Couchbase has added vector search to its Capella database-as-a-service and to Couchbase Server.

According to the company, vector search allows similar objects to be discovered by a search query even when they are not a direct match, because it returns "nearest neighbor" results.

Vector search also supports text, images, audio and video by first converting them to mathematical representations (vector embeddings). This makes it suitable for AI applications that may use all of these formats.
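To make the "nearest neighbor" idea concrete, here is a minimal sketch of similarity search over embeddings. It is purely illustrative and not Couchbase's API: the toy vectors, document names and cosine-similarity ranking stand in for what an embedding model and a vector index would do at scale.

```python
import numpy as np

# Toy 4-dimensional "embeddings" standing in for vectors produced by an
# embedding model; real embeddings typically have hundreds of dimensions.
documents = {
    "running shoes":  np.array([0.9, 0.1, 0.0, 0.2]),
    "trail sneakers": np.array([0.8, 0.2, 0.1, 0.3]),
    "coffee grinder": np.array([0.1, 0.9, 0.7, 0.0]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two vectors; higher means more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest_neighbors(query_vec: np.ndarray, k: int = 2):
    """Return the k documents whose embeddings are closest to the query."""
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in documents.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# A query embedding similar, but not identical, to the shoe documents.
query = np.array([0.85, 0.15, 0.05, 0.25])
print(nearest_neighbors(query))  # the shoe items rank first despite no exact match
```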

Couchbase believes that semantic search powered by vector search and assisted by retrieval-augmented generation (RAG) will help reduce hallucinations and improve response accuracy in AI applications.
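The RAG pattern the company refers to can be sketched in a few lines. In this hedged example, `embed` and `generate` are hypothetical stand-ins for an embedding model and an LLM; the nearest-neighbor lookup plays the role a vector index in the database would play in practice.

```python
from typing import Callable, Sequence
import numpy as np

def retrieve(query: str,
             chunks: Sequence[str],
             embed: Callable[[str], np.ndarray],
             k: int = 3) -> list[str]:
    """Pick the k chunks whose embeddings are closest to the query embedding."""
    q = embed(query)
    scores = []
    for chunk in chunks:
        v = embed(chunk)
        scores.append(float(np.dot(q, v) /
                            (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9)))
    top = sorted(range(len(chunks)), key=lambda i: scores[i], reverse=True)[:k]
    return [chunks[i] for i in top]

def answer(query: str,
           chunks: Sequence[str],
           embed: Callable[[str], np.ndarray],
           generate: Callable[[str], str]) -> str:
    """Ground the model's answer in retrieved context rather than its weights alone."""
    context = "\n".join(retrieve(query, chunks, embed))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return generate(prompt)
```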

Couchbase believes that adding vector search to its database platform will help users create personalized, AI-powered applications.

“Couchbase is seizing this moment, bringing together vector search and real-time data analytics on the same platform,” said Scott Anderson, senior vice president of product management and business operations at Couchbase. “Our approach provides customers with a secure, fast and simplified database architecture that is multipurpose, real-time and AI-ready.”

The company also announced integrations with LangChain and LlamaIndex. LangChain provides a common API interface for interacting with LLMs, while LlamaIndex offers a range of choices for working with LLMs.

"Retrieval has become the dominant way to combine data with LLMs," said Harrison Chase, CEO and co-founder of LangChain. "Many LLM-driven applications require user-specific data beyond the model's training dataset, relying on robust databases to ingest additional data and context from various sources. Our integration with Couchbase provides users with another powerful vector database option to help them build AI applications."
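The announcement does not include code, but a rough sketch of how such a LangChain integration is typically used follows. The package, class and parameter names here (langchain_couchbase, CouchbaseVectorStore, the connection details, bucket, scope, collection and index names) are assumptions for illustration, not details confirmed by the article; consult the Couchbase and LangChain documentation for the actual API.

```python
# Hypothetical sketch of using Couchbase as a LangChain vector store.
# Names below are assumptions for illustration, not the confirmed API.
from datetime import timedelta

from couchbase.auth import PasswordAuthenticator
from couchbase.cluster import Cluster
from couchbase.options import ClusterOptions
from langchain_couchbase.vectorstores import CouchbaseVectorStore  # assumed package
from langchain_openai import OpenAIEmbeddings  # any embedding model could be used

# Connect to a Couchbase cluster (placeholder credentials and hostname).
auth = PasswordAuthenticator("username", "password")
cluster = Cluster("couchbase://localhost", ClusterOptions(auth))
cluster.wait_until_ready(timedelta(seconds=5))

# Wrap a collection plus a search index as a LangChain-compatible vector store.
store = CouchbaseVectorStore(
    cluster=cluster,
    bucket_name="travel-sample",     # assumed bucket
    scope_name="inventory",          # assumed scope
    collection_name="docs",          # assumed collection
    embedding=OpenAIEmbeddings(),
    index_name="docs-vector-index",  # assumed search index with a vector field
)

# Ingest user-specific text and run a nearest-neighbor similarity query.
store.add_texts(["Couchbase adds vector search to Capella and Couchbase Server."])
results = store.similarity_search("What did Couchbase add?", k=2)
print([doc.page_content for doc in results])
```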
