Data Collections
Include pertinent business data and documents related to the ongoing conversation directly in your prompts, without requiring a vector database.
Dataconvo organizes interactions around users and their persistent chat sessions, streamlining how you manage conversations with your Assistant.
Dataconvo delivers a fast user experience: memory recall, dialog classification, data extraction, and related functions run up to 80% faster than comparable features from leading LLM vendors.
Dataconvo's data model covers text, dates, numbers, phone numbers, emails, and zip codes. Its Regex field handles enums and CSV parsing, outperforming LLM providers such as OpenAI on these tasks.
Dataconvo’s Structured Data Extraction is a powerful tool for extracting data from chat histories stored in Dataconvo’s Long-term Memory service, and it runs up to 10 times faster than GPT-4.
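To make the idea concrete, here is a minimal sketch of what schema-driven extraction over a stored chat session could look like. Dataconvo's actual client interface is not documented in this section, so the client object, the field-type names, and the extract call below are illustrative assumptions, not the product's real API.

```python
# Hypothetical sketch: the client object, field-type names, and extract()
# signature are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class OrderDetails:
    # Fields mirror the data model described above: emails, zip codes,
    # numbers, dates, and regex-backed enum values.
    customer_email: str   # email field
    shipping_zip: str     # zip code field
    order_total: float    # number field
    delivery_date: str    # date field, e.g. "2024-05-01"
    status: str           # enum-like field constrained by a regex

def extract_order(client, session_id: str) -> OrderDetails:
    """Ask the (hypothetical) extraction endpoint to pull structured
    fields out of the stored chat history for one session."""
    raw = client.extract(
        session_id=session_id,
        schema={
            "customer_email": "email",
            "shipping_zip": "zipcode",
            "order_total": "number",
            "delivery_date": "date",
            "status": {"regex": r"^(pending|shipped|delivered)$"},
        },
    )
    return OrderDetails(**raw)
```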
Dataconvo now supports searching chat history summaries, enabling developers to provide rich, concise context to LLMs.
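As a rough illustration of how summary search can feed an LLM prompt, the snippet below assumes a hypothetical search_summaries method that returns ranked summary records; the method name and return shape are assumptions rather than a documented interface.

```python
# Hypothetical sketch: search_summaries() and its return shape are
# assumptions; the point is how ranked summaries become prompt context.
def build_context(client, session_id: str, query: str, limit: int = 3) -> str:
    """Search stored chat-history summaries and join the top hits into a
    compact context block for the LLM prompt."""
    results = client.search_summaries(
        session_id=session_id,
        text=query,
        limit=limit,
    )
    summaries = [r["summary"] for r in results]
    return "Relevant conversation history:\n" + "\n---\n".join(summaries)
```

The returned string can be prepended to the system or user prompt so the model sees concise, relevant context instead of the full raw transcript.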
Dataconvo simplifies compliance with data privacy regulations like GDPR and CCPA for LLM applications by providing robust data handling and management features.
Dataconvo now uses Hierarchical Navigable Small World (HNSW) indexes, which are faster, more accurate, and easier to maintain than the IVFFLAT indexes used previously, as they don't require a manual indexing step.
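For context on the difference, here is a sketch of creating an HNSW index with the pgvector extension in Postgres. The table, column, index name, and connection string are placeholders, not Dataconvo's actual schema; unlike IVFFLAT, the HNSW index can be created up front and stays usable as rows are inserted.

```python
# Sketch using psycopg and pgvector; table and column names are placeholders.
import psycopg

DDL = """
CREATE INDEX IF NOT EXISTS idx_message_embeddings_hnsw
ON message_embeddings
USING hnsw (embedding vector_cosine_ops)
WITH (m = 16, ef_construction = 64);
"""

with psycopg.connect("postgresql://localhost/dataconvo") as conn:
    conn.execute(DDL)
    conn.commit()
```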
Creating an intent router with LangChain helps your LLM app understand user intent, automatically selecting the most suitable prompt for each task.
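A minimal routing sketch using LangChain's LCEL primitives is shown below: a classifier chain labels the user's intent, and a RunnableBranch dispatches to a prompt specialized for that intent. The model name, intent labels, and prompt text are placeholder assumptions.

```python
# Sketch of an intent router with LangChain; model name, intent labels,
# and prompts are placeholder assumptions.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableBranch
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# Step 1: classify the user's intent into a small label set.
classifier = (
    ChatPromptTemplate.from_template(
        "Classify the user's intent as one of: billing, support, smalltalk.\n"
        "Reply with the single label only.\n\nUser message: {input}"
    )
    | llm
    | StrOutputParser()
)

# Step 2: a dedicated prompt per intent.
billing_chain = ChatPromptTemplate.from_template(
    "You are a billing assistant. Answer: {input}"
) | llm | StrOutputParser()
support_chain = ChatPromptTemplate.from_template(
    "You are a technical support assistant. Answer: {input}"
) | llm | StrOutputParser()
fallback_chain = ChatPromptTemplate.from_template(
    "Respond conversationally to: {input}"
) | llm | StrOutputParser()

# Step 3: route to the prompt that matches the classified intent.
router = RunnableBranch(
    (lambda x: "billing" in x["intent"].lower(), billing_chain),
    (lambda x: "support" in x["intent"].lower(), support_chain),
    fallback_chain,
)

chain = {"intent": classifier, "input": lambda x: x["input"]} | router
print(chain.invoke({"input": "Why was I charged twice this month?"}))
```

Routing with a cheap classification step keeps each downstream prompt short and focused, which typically improves both answer quality and latency compared with one catch-all prompt.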