About the AI Context Engine

The AI Context Engine™ is a powerful tool that helps generative AI better understand the context of your business and data. It uses your own business context and data to provide more accurate and insightful responses, leading to better decision-making.

Key capabilities of the AI Context Engine:

  • Natural Language Interface: It allows your team to ask complex data questions using natural language and get trustworthy answers at scale.

  • Tools Deployment: The engine provides APIs, SDKs, and plugins to incorporate its tools into your preferred AI applications and chat interfaces (see the request sketch after this list).

  • Audit and Traceability: It provides full audit and traceability, along with natural language explanations, for sensitive data retrieval.

  • Automated Agents: The engine uses agents for key tasks such as glossary lookups, query validation, and query explanation.
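
As a hedged illustration of the natural language and API capabilities above, the sketch below posts a plain-English question to a hypothetical /v1/ask endpoint and prints the answer together with its explanation. The base URL, payload fields, and response shape are assumptions for illustration, not the documented API contract.

```python
import os
import requests

# Hypothetical endpoint and payload shape -- consult the Developer docs
# for the actual AI Context Engine API contract.
BASE_URL = "https://api.example.com/ai-context-engine"
API_TOKEN = os.environ["DW_API_TOKEN"]  # assumed bearer-token auth

def ask(question: str) -> dict:
    """Send a natural-language question and return the engine's response."""
    resp = requests.post(
        f"{BASE_URL}/v1/ask",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"question": question},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = ask("What was total revenue by region last quarter?")
    # Assumed response fields: 'answer' and 'explanation'
    print(result.get("answer"))
    print(result.get("explanation"))  # natural-language audit trail
```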

The AI Context Engine connects the dots between your data and your business context, such as terms, definitions, metrics, and processes. This allows AI-powered applications to provide real analytics and business decision support. It also ensures governance, preventing generative AI hallucinations by providing explainable, governable responses. It helps everyone understand how the AI reached an answer, making AI responses auditable and reusable for future queries.

High-Level Overview of the AI Context Engine Architecture

The AI Context Engine is designed with an API-first philosophy, integrating seamlessly into a variety of applications and workflows. It can function as a custom LLM tool, act as a LangChain (or equivalent) tool, or be called directly, providing a standardized interface for using the engine's capabilities. The Data Catalog plays a pivotal role, acting as a central repository of metadata about the data sources in use.

Much like the data.world platform, the AI Context Engine derives its power from carefully structured input and processes. In data.world, metadata forms the core, offering details about data such as location, structure, quality, usage, relationships, meaning, and lineage. The AI Context Engine™ likewise benefits from a similarly structured collection of data, particularly in ensuring the delivery of accurate and governed responses.

Moreover, just as data connections in data.world interact with data through metadata collection, virtual queries, and data extraction, the AI Context Engine's capabilities depend critically on these various means of interacting with the data.

The AI Context Engine™'s architecture is structured into several primary layers, all of which can be accessed through the API:

  1. API Gateway: This is the access point for all interactions with the AI Context Engine. It handles tasks such as authentication, authorization, and rate limiting, similar to the API functions detailed in the Developer docs.

  2. Query Processing: This layer processes user queries received through the API. It mirrors the advanced query handling described in data.world’s API query guide, using natural language processing techniques to understand the query's intent and identify relevant information.

  3. Semantic Layer: This layer uses semantic web technologies for structured representation of knowledge and data, similar to how data.world uses RDF (Resource Description Framework) for data representation. The custom OWL ontology used here is based on your Data Catalog and defines the relevant concepts, relationships, and constraints for your domain (a minimal ontology sketch follows this list).

  4. Data Virtualization Layer: This layer integrates data from a variety of sources, such as databases, APIs, and cloud storage, reminiscent of the various sources supported by data.world. It uses R2RML mappings to convert structured data into RDF, ensuring seamless integration with the Semantic Layer (see the mapping sketch after this list). Your Data Catalog plays a vital role here by enabling the AI Context Engine to interpret and query data sources accurately.

  5. LLM Interaction: This layer interacts with the LLM (Mixtral) to generate responses. The interaction is grounded in the structured data and knowledge representation from the Semantic and Data Virtualization layers. The Data Catalog may also supply metadata to enrich the context and guide the LLM's response generation.

  6. Response Generation: The final response to user queries is created in this layer, with potential post-processing of the LLM's output to optimize accuracy, relevance, and policy compliance, akin to the custom SQL queries described by data.world. The Data Catalog can further enhance response quality by supplying additional information or validation (a validation sketch follows this list).
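
To make the Semantic Layer concrete, here is a minimal sketch of the kind of custom OWL ontology it relies on, built with rdflib. The example.com namespace and the Customer and Order classes are illustrative stand-ins for concepts that would actually be derived from your Data Catalog.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

# Illustrative namespace; in practice the ontology is derived from your Data Catalog.
EX = Namespace("http://example.com/ontology#")

g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Two domain concepts and a relationship between them.
g.add((EX.Customer, RDF.type, OWL.Class))
g.add((EX.Customer, RDFS.label, Literal("Customer")))
g.add((EX.Order, RDF.type, OWL.Class))
g.add((EX.Order, RDFS.label, Literal("Order")))

g.add((EX.placedBy, RDF.type, OWL.ObjectProperty))
g.add((EX.placedBy, RDFS.domain, EX.Order))
g.add((EX.placedBy, RDFS.range, EX.Customer))
g.add((EX.placedBy, RDFS.comment,
       Literal("Links an order to the customer who placed it.")))

print(g.serialize(format="turtle"))
```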
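
Similarly, the R2RML mappings used by the Data Virtualization Layer are themselves RDF documents. The hedged sketch below builds one such mapping with rdflib for a hypothetical orders table; a real deployment would generate these mappings from the catalogued source schemas.

```python
from rdflib import BNode, Graph, Literal, Namespace
from rdflib.namespace import RDF

RR = Namespace("http://www.w3.org/ns/r2rml#")
EX = Namespace("http://example.com/ontology#")
MAP = Namespace("http://example.com/mappings#")

g = Graph()
g.bind("rr", RR)
g.bind("ex", EX)

tmap, ltable, smap, pomap, omap = MAP.OrdersMap, BNode(), BNode(), BNode(), BNode()

# Map each row of a hypothetical "orders" table to an ex:Order resource.
g.add((tmap, RDF.type, RR.TriplesMap))
g.add((tmap, RR.logicalTable, ltable))
g.add((ltable, RR.tableName, Literal("orders")))

g.add((tmap, RR.subjectMap, smap))
g.add((smap, RR.template, Literal("http://example.com/order/{order_id}")))
g.add((smap, RR["class"], EX.Order))

# Link each order to the customer who placed it.
g.add((tmap, RR.predicateObjectMap, pomap))
g.add((pomap, RR.predicate, EX.placedBy))
g.add((pomap, RR.objectMap, omap))
g.add((omap, RR.template, Literal("http://example.com/customer/{customer_id}")))

print(g.serialize(format="turtle"))
```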
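
Post-processing in the Response Generation layer can take many forms. One hedged sketch, under the assumption that the layer emits SQL, validates a generated query against an allowlist of catalogued tables before the answer is returned; the allowlist and the sqlglot-based check are illustrative choices, not the engine's documented behavior.

```python
from sqlglot import exp, parse_one

# Illustrative allowlist; in practice this would come from the Data Catalog.
CATALOGUED_TABLES = {"orders", "customers", "products"}

def referenced_tables(sql: str) -> set[str]:
    """Return the set of table names referenced by a SQL statement."""
    return {table.name for table in parse_one(sql).find_all(exp.Table)}

def validate_generated_sql(sql: str) -> None:
    """Reject queries that touch tables outside the governed catalog."""
    unknown = referenced_tables(sql) - CATALOGUED_TABLES
    if unknown:
        raise ValueError(f"Query references non-catalogued tables: {sorted(unknown)}")

validate_generated_sql("SELECT region, SUM(total) FROM orders GROUP BY region")
```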

Integration Capabilities

The AI Context Engine's API-first design makes it highly adaptable to various integration scenarios:

  • Custom LLM Tool: You can easily integrate the AI Context Engine™ into your own custom LLM tool, allowing you to leverage its advanced capabilities for semantic understanding, data integration, and response generation (see the function-calling sketch after this list).

  • LangChain (or Equivalent) Tool: The API can be seamlessly integrated with LangChain or similar frameworks, enabling you to incorporate the AI Context Engine™ into complex LLM-powered workflows (see the LangChain sketch after this list).

  • Direct API Calls: For maximum flexibility, you can directly call the AI Context Engine™ API from your applications, giving you full control over the integration process.
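
As one possible shape of the custom LLM tool integration, the sketch below registers the engine as a function-calling tool in an OpenAI-style chat request. The tool name, parameter schema, and model are illustrative assumptions; the resulting tool call would be fulfilled by forwarding the question to the engine's API.

```python
from openai import OpenAI

# Function-calling schema describing the AI Context Engine as a tool.
# Name, description, and parameters are illustrative; adapt them to the
# endpoint you actually expose (e.g. the ask() helper sketched earlier).
context_engine_tool = {
    "type": "function",
    "function": {
        "name": "ask_ai_context_engine",
        "description": "Answer business data questions with governed, explainable results.",
        "parameters": {
            "type": "object",
            "properties": {
                "question": {
                    "type": "string",
                    "description": "A natural-language data question.",
                }
            },
            "required": ["question"],
        },
    },
}

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",  # example model
    messages=[{"role": "user", "content": "What was churn by segment last month?"}],
    tools=[context_engine_tool],
)
print(response.choices[0].message.tool_calls)
```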
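
For the LangChain path, a minimal sketch wraps a hypothetical /v1/ask endpoint as a tool using the langchain_core @tool decorator, so an agent or tool-calling chat model can invoke the engine. The endpoint and response field are assumptions carried over from the earlier request sketch.

```python
import os
import requests
from langchain_core.tools import tool

BASE_URL = "https://api.example.com/ai-context-engine"  # hypothetical endpoint

@tool
def ask_context_engine(question: str) -> str:
    """Answer a business data question via the AI Context Engine."""
    resp = requests.post(
        f"{BASE_URL}/v1/ask",
        headers={"Authorization": f"Bearer {os.environ['DW_API_TOKEN']}"},
        json={"question": question},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("answer", "")  # assumed response field

# The tool can be handed to an agent or bound to a tool-calling chat model,
# e.g. chat_model.bind_tools([ask_context_engine]).
print(ask_context_engine.invoke({"question": "Which products drove growth in Q2?"}))
```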