A graph‑aware exploration engine for AI agents. Neural Voyager connects models to your data layers (databases, files, APIs) via MCP‑compatible tools, builds a living map of knowledge, and plans safe read/write journeys with SoI‑aware guardrails.
MCP Protocols
Database‑native
Zero‑Trust
Observability
It interprets natural-language input and routes it to the right models, systems, or workflows, with no-code flexibility and multi-model intelligence. It is not just a model; it is an AI mission brain.
Overview
Bridge LLM agents to SQL/NoSQL, files, and internal APIs using MCP tools and adapters.
Traverse schemas and documents to build a knowledge graph of entities, relations, and constraints.
Plan queries and mutations with SoI‑aware policies, approvals, and least‑privilege scopes.
Whether you're handling mission-critical data, performing complex data analysis, or deploying LLMs across applications, Neural Voyager provides the infrastructure for scalable, secure, and intelligent data handling.
Access data with high precision and speed
Uncover hidden patterns within interconnected data
Streamline LLM interactions with balanced load distribution and effective parsing
Leverage embedded models for in-context data processing tailored to unique user requirements
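The read, transform, and plan capabilities above compose into what the overview calls a "journey". A minimal sketch of such a pipeline follows; the `Step` type and `runJourney` helper are illustrative assumptions, not the actual Neural Voyager SDK.

```typescript
// Illustrative read -> transform -> write journey. All names here
// (Step, runJourney, fetchRows) are hypothetical, not the real SDK.
type Step =
  | { kind: 'read'; query: string }
  | { kind: 'transform'; fn: (rows: unknown[]) => unknown[] }
  | { kind: 'write'; target: string };

function runJourney(
  steps: Step[],
  fetchRows: (query: string) => unknown[]
): unknown[] {
  let rows: unknown[] = [];
  for (const step of steps) {
    if (step.kind === 'read') rows = fetchRows(step.query);
    else if (step.kind === 'transform') rows = step.fn(rows);
    // 'write' steps would be policy-gated before execution (see Security).
  }
  return rows;
}
```

In the real engine each step would also carry its SoI scope and an audit trace; the sketch only shows the data flow.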
Architecture
Pluggable tools, typed plans, and policy‑gated execution.
Tool Bridge (MCP): Discovery & registration of tools
Schema Explorer: Samples DBs, infers FKs, annotates PII
Planner: Multi‑step typed plans (read→transform→write)
Executor: Policy checks, approval prompts, rollbacks
Telemetry: Step traces, spans, metrics for SLOs
PostgreSQL / MySQL
BigQuery / Snowflake
MongoDB / Redis
S3 / GCS (Parquet/CSV)
Notion / Confluence
HTTP/GraphQL APIs
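The Tool Bridge's discovery and registration step can be pictured as a registry keyed by adapter-qualified tool names. This is a hedged sketch under assumed shapes; the `ToolBridge` class and qualified-name convention are illustrative, not the engine's internals.

```typescript
// Hypothetical sketch of MCP tool discovery & registration in the Tool Bridge.
type ToolSpec = { name: string; readonly?: boolean };
type Manifest = { name: string; tools: ToolSpec[] };

class ToolBridge {
  private tools = new Map<string, ToolSpec>();

  register(manifest: Manifest): void {
    for (const tool of manifest.tools) {
      // Tools are namespaced by adapter, e.g. "db.explorer/query.run".
      this.tools.set(`${manifest.name}/${tool.name}`, tool);
    }
  }

  isReadonly(qualifiedName: string): boolean {
    return this.tools.get(qualifiedName)?.readonly === true;
  }
}
```

Namespacing by adapter lets multiple data sources expose tools with the same short name (e.g. every database adapter can offer `query.run`) without collisions.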
Capabilities
Catalog schemas, detect joins, and flag sensitive fields.
Generate queries, datasets, dashboards.
Dry‑run mutations with rollback plans.
Neural Voyager leverages graph-based algorithms to retrieve relevant nodes swiftly, allowing users to access precise information from complex, interconnected datasets.
Maps and visualizes relationships between nodes to reveal the connections and patterns within datasets. This is particularly valuable for understanding data context, uncovering trends, and supporting decision-making.
By connecting relevant data points, Neural Voyager highlights correlations and dependencies that would otherwise remain hidden, providing a comprehensive view of the data landscape.
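The node-retrieval idea above amounts to bounded expansion from a seed entity over the knowledge graph. A toy breadth-first version is sketched below; the adjacency-map representation is an assumption for illustration, not Neural Voyager's internal graph store.

```typescript
// Minimal sketch of graph-based retrieval: breadth-first expansion from a
// seed node over an adjacency map, collecting related entities up to a
// depth limit. Data structures are illustrative only.
type Graph = Map<string, string[]>;

function relatedNodes(graph: Graph, seed: string, maxDepth: number): string[] {
  const seen = new Set<string>([seed]);
  let frontier = [seed];
  for (let depth = 0; depth < maxDepth; depth++) {
    const next: string[] = [];
    for (const node of frontier) {
      for (const neighbor of graph.get(node) ?? []) {
        if (!seen.has(neighbor)) {
          seen.add(neighbor);
          next.push(neighbor);
        }
      }
    }
    frontier = next;
  }
  seen.delete(seed); // return only the neighborhood, not the seed itself
  return [...seen];
}
```

Raising `maxDepth` widens the context window around an entity, which is the trade-off behind "uncovering hidden patterns" versus retrieval precision.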
Protocols
Databases + MCP Protocols IV — DB exploration & ACID‑aware mutations.
{
  "name": "db.explorer",
  "tools": [
    { "name": "schemas.list" },
    { "name": "tables.describe" },
    { "name": "query.run", "readonly": true },
    { "name": "mutation.plan" }
  ]
}
type Plan = {
  id: string;
  soi: 'Private' | 'Social' | 'Public';
  steps: Array<'read' | 'transform' | 'write'>;
  approvals?: { required: boolean; roles: string[] };
};
Neural Voyager’s vector database indexes data based on contextual similarity, enabling rapid retrieval of relevant information. This indexing method ensures that related data is grouped and accessible, improving search efficiency and data relevance.
The vector database allows for a high level of accuracy in data retrieval, ensuring that similar data points are indexed together and surfaced based on user queries.
Optimized for high-speed, real-time access, the vector database supports both small-scale operations and extensive enterprise-level datasets.
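Similarity-based indexing can be illustrated with cosine similarity over embeddings. The sketch below is a deliberately naive linear scan; production vector databases use approximate nearest-neighbor indexes (e.g. HNSW) instead, and nothing here reflects Neural Voyager's actual index.

```typescript
// Toy sketch of similarity search: rank stored vectors by cosine similarity
// to a query embedding. Illustrative only; real engines use ANN indexes.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(
  query: number[],
  docs: Array<{ id: string; vec: number[] }>,
  k: number
): string[] {
  return [...docs]
    .sort((x, y) => cosine(query, y.vec) - cosine(query, x.vec))
    .slice(0, k)
    .map((d) => d.id);
}
```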
Security
Private/Social/Public controls applied per data source and action.
Readonly by default
Column & row‑level policies
Scoped credentials
Risk‑tiered with human‑in‑the‑loop
Dry‑run impact reports
SOAR playbooks
Trace steps with spans/metrics
Record provenance
Export to SIEM
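The controls above (readonly by default, risk-tiered human-in-the-loop) can be condensed into a single policy gate. This is a hypothetical rule set for illustration; the real engine's per-source, column-level, and row-level policies are richer.

```typescript
// Hypothetical policy gate: reads pass by default; every write requires a
// recorded human approval, matching "readonly by default" and
// "risk-tiered with human-in-the-loop". Illustrative rules only.
type SoI = 'Private' | 'Social' | 'Public';
type Action = { kind: 'read' | 'write'; soi: SoI; approved?: boolean };

function allowed(action: Action): boolean {
  if (action.kind === 'read') return true; // readonly by default
  return action.approved === true;         // writes gated on approval
}
```

A dry-run impact report would typically be generated before `approved` can be set at all, so approvers see the blast radius first.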
Neural Voyager powers real-time intelligence and optimized performance for organizations requiring reliable, contextual data solutions, enabling impactful insights across industries.
Explore how NV works with eVa to codify and secure data provenance.
Examples
goal: "Explain revenue dip"
plan:
  - read: SELECT * FROM revenue WHERE quarter='Q2'
  - transform(sql): WITH ...
  - synthesize: dashboard
soi: Social
approvals: none
goal: "Fix malformed emails"
plan:
  - discover: schema
  - simulate: UPDATE ...
  - approval: owner + SOC
  - write: UPDATE ...
soi: Private
approvals: required
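The two example plans can be expressed against a `Plan` type like the one in the Protocols section (repeated here so the snippet is self-contained). The `needsApproval` rule is an illustrative assumption, not the engine's actual policy.

```typescript
// Self-contained copy of a Plan type like the one in the Protocols section.
type Plan = {
  id: string;
  soi: 'Private' | 'Social' | 'Public';
  steps: Array<'read' | 'transform' | 'write'>;
  approvals?: { required: boolean; roles: string[] };
};

// Illustrative rule: Private plans that write require sign-off.
function needsApproval(plan: Plan): boolean {
  return plan.soi === 'Private' && plan.steps.includes('write');
}

// "Explain revenue dip": read-only analysis, no approvals.
const explainDip: Plan = { id: 'p1', soi: 'Social', steps: ['read', 'transform'] };

// "Fix malformed emails": a Private write, gated on owner + SOC approval.
const fixEmails: Plan = {
  id: 'p2',
  soi: 'Private',
  steps: ['read', 'transform', 'write'],
  approvals: { required: true, roles: ['owner', 'SOC'] },
};
```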
Embedded models within Neural Voyager enable the platform to perform in-context data processing, extracting meaningful insights and adapting to real-time user inputs.
The embedded models can be tailored to various data types and user needs, enhancing the platform’s adaptability across different applications.
These models process data within its unique context, ensuring that the information delivered is relevant, actionable, and reliable for users in both high-stakes and day-to-day scenarios.
API
HTTP+JSON; MCP manifests for tools.
POST /v1/voyager/plans { goal: "find churn", soi: "Social" }
POST /v1/voyager/plans/{id}/execute { dryRun: true }
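A client for these endpoints might build requests as below. The two paths come from this document; the request-builder names and any field beyond `goal`, `soi`, and `dryRun` are assumptions for illustration.

```typescript
// Illustrative request builders for the plan endpoints above. Only the
// endpoint paths and the goal/soi/dryRun fields come from the docs;
// everything else (HttpCall, function names) is hypothetical.
type HttpCall = { url: string; method: 'POST'; body: string };

function buildCreatePlan(baseUrl: string, goal: string, soi: string): HttpCall {
  return {
    url: `${baseUrl}/v1/voyager/plans`,
    method: 'POST',
    body: JSON.stringify({ goal, soi }),
  };
}

function buildExecute(baseUrl: string, planId: string, dryRun: boolean): HttpCall {
  return {
    url: `${baseUrl}/v1/voyager/plans/${encodeURIComponent(planId)}/execute`,
    method: 'POST',
    body: JSON.stringify({ dryRun }),
  };
}
```

Keeping `dryRun: true` as the default posture mirrors the Security section's readonly-by-default stance.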
Neural Voyager functions as a load balancer between large language models (LLMs), managing requests and distributing tasks to optimize performance and resource usage.
It also acts as a parser, breaking down complex queries and routing them to the appropriate LLM or model, ensuring accuracy and reducing processing time.
By balancing and parsing requests between LLMs, Neural Voyager prevents overload, reduces latency, and ensures that each request is handled by the best-suited model for the task.
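One simple realization of this balancing idea is to route each request to the least-loaded model that advertises the required capability. The sketch below is a hedged illustration; the model records, capability tags, and the in-flight counter are all hypothetical.

```typescript
// Sketch of capability-aware load balancing: pick the least-loaded model
// among those that can handle the request. All names are hypothetical.
type Model = { name: string; capabilities: string[]; inFlight: number };

function route(models: Model[], capability: string): Model | undefined {
  return models
    .filter((m) => m.capabilities.includes(capability))
    .sort((a, b) => a.inFlight - b.inFlight)[0]; // undefined if none qualify
}
```

A production balancer would also weigh latency, cost, and context-window limits, not just the in-flight count.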
Get started
Bring your database and a problem. We'll wire MCP tools, define policies, and run a safe agent pilot in days.