Deterministic semantic coherence infrastructure.
26MB. CPU-only. Air-gapped. Built for environments where ambiguity is expensive.
Computing moved from hardware to software to models.
The next layer is meaning itself — measured, ranked, and made operational.
Each section below is a concrete result from a situation where ordinary software can surface options but cannot decide between them. ARBITER resolves the constraint geometry directly.
Live API · same input, same score · coherence as infrastructure
Vector arithmetic over 72-dimensional space. Not a metaphor — a measurement.
The Huth Lab mapped semantic space with a $3M fMRI scanner, where each voxel averages the activity of roughly one million neurons. ARBITER used 26 megabytes and language alone. The geometry matched.
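The vector-arithmetic claim can be illustrated with a toy construction. The 72-dimensional vectors below are synthetic stand-ins built so the classic analogy holds by design; they are not ARBITER's embeddings, and the component names are illustrative only:

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(42)
# Four hypothetical concept directions in a 72-dimensional space.
male, female, royal, person = rng.normal(size=(4, 72))

king  = royal + male
queen = royal + female
man   = person + male
woman = person + female

# king - man + woman recovers the "royal + female" direction exactly
# by construction, so its cosine with queen is 1.0.
predicted = king - man + woman
print(round(cosine(predicted, queen), 3))  # → 1.0
print(round(cosine(predicted, man), 3))    # near zero: unrelated direction
```

The point is the measurement: the analogy is not a metaphor here but a cosine score over explicit directions.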
| Program | Agency | Status | ARBITER maps to |
|---|---|---|---|
| DARPA AIDA (Active Interpretation of Disparate Alternatives) | DARPA I2O | 2026 Recompete | Multi-INT fusion. 0.746 coup vs 0.534 invasion in 0.563s. |
| CJADC2 (Combined Joint All-Domain C2) | CDAO / J6 | Active · $100M+ | +165% ranking reversal across command levels. JADC2 alignment made computable. |
| DoD Machine Common Sense (MCS) Program | DARPA I2O | Window Open | Deployment gate scored CrowdStrike CF291 at 0.449. No Falcon data required. |
| DARPA SemaFor (Semantic Forensics) | DARPA I2O | Option / Recompete | Absence detection. Normal activity scores LOW against an absence query. |
| NIWC Pacific CSO N6600126SC001 | NIWC Pacific | Open · Feb 2029 | 26MB. CPU-only. Air-gapped. SCIF-ready. SAM active. Ready now. |
ARBITER was built by someone who was in the room when JADC2 failed to coordinate — when semantic gaps between command levels produced operationally incoherent decisions at speed.
The foundational insight: meaning exists in geometric relationships, not token frequencies. Compress the geometry correctly and you preserve meaning. Preserve meaning and you can measure coherence. Measure coherence and you can gate decisions.
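The last step of that chain, measure coherence and then gate decisions, can be sketched as a threshold on cosine similarity. The 0.7 threshold, the candidate names, and the vectors are illustrative assumptions, not ARBITER's calibration:

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def gate(candidates, reference, threshold=0.7):
    """Pass only candidates whose coherence with the reference
    direction clears the threshold (threshold is illustrative)."""
    return [name for name, vec in candidates.items()
            if cosine(vec, reference) >= threshold]

rng = np.random.default_rng(7)
reference = rng.normal(size=72)  # stand-in for a doctrine/intent direction

candidates = {
    "aligned":   reference + rng.normal(scale=0.3, size=72),  # small perturbation
    "unrelated": rng.normal(size=72),                          # independent direction
}
print(gate(candidates, reference))  # → ['aligned']
```

The gate is deterministic: the same input vectors always produce the same score and the same pass/fail result.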
The doctrine implication: Capability × Intent = Hostility, not Threat. Threat is the analyst's label. Hostility is what the geometry actually measures. That distinction has operational consequences.
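A minimal sketch of the multiplicative form, with illustrative scores. The operational consequence is that zero measured intent drives hostility to zero regardless of capability, which an additive "threat" label would not:

```python
def hostility(capability: float, intent: float) -> float:
    """Capability x Intent, both in [0, 1]. The product form means
    either factor at zero zeroes the result (scores are illustrative)."""
    return capability * intent

# High capability with no measured intent is not hostility:
print(hostility(0.9, 0.0))                 # → 0.0
# Moderate capability with clear intent is:
print(round(hostility(0.6, 0.8), 2))       # → 0.48
```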
Proprietary compression architecture. 768D → 72D with 99.6% cosine retention. 11% better than PCA.
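The retention figure can be made concrete. Below is a minimal sketch of how cosine-structure retention for a 768D → 72D reduction can be measured, using a plain PCA projection on synthetic low-rank data as the baseline; ARBITER's proprietary compression is not reproduced here, and the data shape and rank are assumptions:

```python
import numpy as np

def pairwise_cosines(X):
    """Upper-triangle pairwise cosine similarities of the rows of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    sims = Xn @ Xn.T
    return sims[np.triu_indices(len(X), k=1)]

rng = np.random.default_rng(0)
# Synthetic 768-D embeddings with low intrinsic dimensionality plus
# noise, a common property of real embedding sets (illustrative data).
latent = rng.normal(size=(200, 40))
mixing = rng.normal(size=(40, 768))
X = latent @ mixing + rng.normal(scale=0.05, size=(200, 768))

# PCA baseline: project centered data onto the top 72 principal directions.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X72 = Xc @ Vt[:72].T

# Retention: how well pairwise cosine structure survives the reduction.
before, after = pairwise_cosines(Xc), pairwise_cosines(X72)
retention = float(np.corrcoef(before, after)[0, 1])
print(f"cosine-structure retention: {retention:.3f}")
```

Any competing projection can be dropped into the same harness and scored against the PCA baseline the same way.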
Run a query against a problem you already understand. If the geometry returns what you expect — and one thing you did not — that is the moment it makes sense.
Schedule a Briefing →

© 2025–2026 Actual General Intelligence Inc.
Not AI. Not probability. Geometry.