Air-gapped, on-premises SDC Studio for organizations that require complete data sovereignty. Local LLM inference, local cryptographic signing, zero cloud dependencies.
By requesting an evaluation, you agree to the Evaluation License Agreement.
- Deployment: Air-Gap Capable
- LLM Inference: 100% Local (Ollama)
- Services: 10-Service Stack
Every component runs inside your network. No telemetry, no cloud calls, no data exfiltration vectors.
Complete network isolation. Delivered as a self-contained package — no outbound connections required at any point. Operate on classified networks, SCIFs, or any environment where data must never leave the perimeter.
AI-powered semantic modeling with qwen2.5:7b-instruct running entirely on-premises via Ollama. All inference stays on your hardware. No prompts or data sent to external APIs.
Cryptographic integrity with local ECDSA P-256 keys. Sign and verify data models without relying on cloud HSMs. Your signing keys never leave your infrastructure.
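The same primitive can be exercised with stock OpenSSL; a minimal sketch of an offline P-256 sign-and-verify round trip (file names and workflow here are illustrative, not the product's internal mechanism):

```shell
# Generate a local ECDSA P-256 (prime256v1) key pair -- keys stay on disk
openssl ecparam -name prime256v1 -genkey -noout -out signing-key.pem
openssl pkey -in signing-key.pem -pubout -out signing-pub.pem

# Sign a data model export, then verify it entirely offline
printf '{"model": "example"}' > model.json
openssl dgst -sha256 -sign signing-key.pem -out model.sig model.json
openssl dgst -sha256 -verify signing-pub.pem -signature model.sig model.json
# prints: Verified OK
```

No network, no HSM: the private key exists only as a file under your control, which is the property the paragraph above describes.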
Graphwise GraphDB with OWL 2 RL reasoning as the enterprise triplestore. Production-grade semantic reasoning, SPARQL 1.1 support, and automatic inference over your knowledge graph.
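For a sense of what querying the triplestore looks like, here is a small SPARQL 1.1 query of the kind the endpoint accepts (the prefix, predicate, and graph vocabulary are made up for illustration, not the product's actual schema):

```sparql
# Hypothetical query: list components and their declared types
PREFIX sdc: <http://example.org/sdc#>
SELECT ?component ?type
WHERE {
  ?component a ?type ;
             sdc:partOf ?model .
}
LIMIT 10
```

With OWL 2 RL reasoning enabled, results include triples the reasoner inferred, not just those explicitly asserted.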
Enterprise directory integration out of the box. Authenticate users against your existing Active Directory or OpenLDAP infrastructure. No external identity providers required.
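As an illustration, an LDAP bind is typically configured with a handful of settings like the following (these variable names are hypothetical, not the product's documented configuration; consult the deployment guide for the real keys):

```
# .env excerpt -- hypothetical variable names, for illustration only
LDAP_SERVER_URI=ldap://ldap.corp.internal:389
LDAP_BIND_DN=cn=svc-sdcstudio,ou=services,dc=corp,dc=internal
LDAP_BIND_PASSWORD=********
LDAP_USER_SEARCH_BASE=ou=people,dc=corp,dc=internal
```

Everything points at hosts inside your own network; no external identity provider appears anywhere in the flow.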
Your sovereign AI assistant accumulates knowledge about your specific data architecture, components, workflows, and conventions over time, stored in human-readable files on your hardware. No model retraining required. No data leaves your building. You can read, edit, and audit everything it learns.
Single-command deployment with a 7-service Docker Compose stack. No Kubernetes required. Reproducible, auditable, and simple enough for any infrastructure team to operate.
Core platform plus XMI2SDC conversion agents, all in Docker Compose
- ASGI application server with WebSocket support. React SPA + Django templates.
- Async task processing for data parsing and agentic model generation.
- Periodic task scheduler for maintenance and background operations.
- Primary data store with vector embeddings for RAG-powered semantic search.
- Message broker for Celery tasks and caching layer.
- Enterprise triplestore with OWL 2 RL reasoning. SPARQL 1.1 endpoint for semantic queries.
- Local LLM inference server. AI-powered semantic modeling without external API calls.
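A stack of this shape reduces to a single compose file; a minimal sketch (service names, image names, and commands here are illustrative assumptions, not the shipped configuration):

```yaml
# docker-compose.yml excerpt -- illustrative only
services:
  web:
    image: sdcstudio/web:latest   # hypothetical image name
    ports: ["8000:8000"]
    depends_on: [db, redis]
  worker:
    image: sdcstudio/web:latest
    command: celery -A sdcstudio worker   # hypothetical module name
    depends_on: [db, redis]
  db:
    image: postgres:16
  redis:
    image: redis:7
  ollama:
    image: ollama/ollama
```

One `docker compose up -d` brings the whole stack up; because every image is delivered in the evaluation package, nothing is pulled from the internet.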
XMI2SDC Conversion Agents
- Parser: Reads UML models (XMI 2.1) from Sparx EA, MagicDraw, StarUML. Fully offline.
- Mapper: Maps UML classes and attributes to SDC component types. Deterministic, fully offline.
- Assembler: Submits mapping plans to the Assembly API to create data models. Calls SDCStudioSov at localhost:8000.
Convert UML models from enterprise modeling tools directly into SDC4-compliant data models. No manual re-entry, no cloud dependency.
XMI2SDC reads XMI 2.1 exports from Sparx Enterprise Architect, MagicDraw, StarUML, and other UML tools. A three-agent pipeline parses the model, maps UML types to SDC components, and assembles the result into a complete data model in SDCStudioSov.
- Delivery: Included in the sovereign evaluation tarball; a separate docker-compose.xmi2sdc.yml ships alongside the main stack.
- Services: Three persistent MCP servers (parser, mapper, assembler), plus a CLI for one-shot commands.
- Network: All containers use network_mode: host. The parser and mapper are fully offline; the assembler reaches SDCStudioSov at localhost:8000.
- MCP Integration: Each agent exposes its tools via the Model Context Protocol and works with Claude Desktop, Claude Code, and Cursor out of the box.
- Telemetry: All OpenTelemetry exporters disabled at the image level. Zero phone-home.
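Registering one of the agents with an MCP client is a small config entry. For example, in Claude Desktop's `claude_desktop_config.json` (the server name and docker invocation below are assumptions for illustration; check the shipped documentation for the exact command):

```json
{
  "mcpServers": {
    "xmi2sdc-parser": {
      "command": "docker",
      "args": ["compose", "-f", "docker-compose.xmi2sdc.yml",
               "exec", "-i", "parser", "mcp-server"]
    }
  }
}
```

Once registered, the client can invoke the agent's parsing tools directly, with all traffic staying on the local host.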
Quick Start
```shell
# Configure
cp xmi2sdc.example.yaml xmi2sdc.yaml
export SDC_API_KEY="eval-token-2026"

# Start MCP servers
docker compose -f docker-compose.xmi2sdc.yml up -d --build

# Convert a UML model
docker compose -f docker-compose.xmi2sdc.yml run --rm xmi2sdc convert /data/model.xmi

# Assemble into SDCStudioSov
docker compose -f docker-compose.xmi2sdc.yml run --rm xmi2sdc assemble /home/sdc/output/model_mapping.json --title "My Model"
```
Whether you're bridging from legacy systems or building greenfield infrastructure, the Sovereign Suite meets you where you are.
Legacy systems (SAP, Oracle, custom databases) continue operating. A pipeline feeds existing data into SDCStudioSov, which models it as SDC4-compliant structures and populates the sovereign knowledge graph. No rip-and-replace required.
SDCStudio generates new sovereign applications that ingest legacy data once during standup, after which the old systems are decommissioned. New systems are native SDC from day one — fully interoperable, self-describing, with no ongoing ETL pipelines or migration debt. This is the leapfrog path.
Same SDC compliance. Same output formats. Different deployment model.
| Feature | Cloud | Sovereign |
|---|---|---|
| LLM | Vertex AI (Gemini) | Ollama (qwen2.5:7b-instruct) |
| Triplestore | Apache Fuseki | Graphwise GraphDB |
| Authentication | Django accounts + Stripe | LDAP + local admin |
| Signing | Cloud ECDSA | Local ECDSA P-256 |
| Deployment | Google Cloud Run | Docker Compose |
| Network | Internet required | Air-gap capable |
| UML Import | — | XMI2SDC (3 agents) |
Contact us for a sovereign evaluation package. We will work with your team to deploy SDC Studio inside your network.
Subject to the Evaluation License Agreement.