An intelligent knowledge server (MCP) for LLMs, bridging the gap between raw text and semantic understanding using SQL Server 2025 vectors.
Standard agents quickly hit token limits and miss conceptual links.
CherryPicker finds code by intent and serves compressed context.
Runs ONNX embeddings (all-MiniLM-L6-v2) locally on GPU. No code leaves your firewall. True data sovereignty.
Combines vector cosine distance (semantic) with lexical boosting (keywords) for high-recall hybrid search.
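The exact weighting CherryPicker uses is not documented here, but the hybrid idea can be sketched as a cosine-similarity score plus a per-keyword boost. The `boost=0.2` weight, the function names, and the toy snippets below are illustrative assumptions, not the actual implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def hybrid_score(query_vec, doc_vec, query_terms, doc_text, boost=0.2):
    """Semantic score plus a lexical boost for each matched keyword.
    The 0.2 boost weight is an illustrative assumption."""
    semantic = cosine_similarity(query_vec, doc_vec)
    matches = sum(1 for t in query_terms if t.lower() in doc_text.lower())
    return semantic + boost * matches

# Toy example: rank two candidate snippets against a query embedding.
query_vec = [0.1, 0.9, 0.0]
docs = [
    ([0.1, 0.8, 0.1], "def parse_invoice(xml): ..."),
    ([0.0, 0.2, 0.9], "def render_chart(data): ..."),
]
scores = [hybrid_score(query_vec, v, ["invoice"], t) for v, t in docs]
best = max(range(len(docs)), key=lambda i: scores[i])
```

Lexical boosting keeps exact-identifier matches from being drowned out when the embedding alone is ambiguous.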
Agents perform "dry run" refactors in memory. Changes are committed atomically via file swaps, eliminating the risk of partially written files.
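The commit step is not spelled out, but a common way to get an atomic file swap is to write the new source to a temp file in the same directory and then rename it over the original. This is a sketch of that pattern under those assumptions; CherryPicker's actual commit mechanism may differ.

```python
import os
import tempfile

def commit_refactor(path, new_source):
    """Commit an in-memory ('dry run') edit atomically.

    Writes to a temp file in the target's directory, then swaps it into
    place with os.replace, which is atomic on both POSIX and Windows.
    Illustrative pattern only, not CherryPicker's actual code.
    """
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            f.write(new_source)
        # Atomic swap: readers see either the old file or the new one,
        # never a half-written mixture.
        os.replace(tmp_path, path)
    except BaseException:
        os.unlink(tmp_path)
        raise
```

Keeping the temp file on the same filesystem as the target is what makes the final rename atomic.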
Feeds deterministic Roslyn diagnostics (Compiler Errors) into the probabilistic LLM context. Facts meet creativity.
Leverages native VECTOR(384) columns for sub-millisecond similarity search across millions of lines of code.
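SQL Server 2025's native `VECTOR` type makes this a plain T-SQL query. The sketch below assumes the documented `VECTOR(n)` column type and `VECTOR_DISTANCE` function; the table, column names, and literal vector are hypothetical, not CherryPicker's schema.

```sql
-- Hypothetical schema: table and column names are illustrative.
CREATE TABLE dbo.CodeChunks (
    ChunkId   INT IDENTITY PRIMARY KEY,
    FilePath  NVARCHAR(400) NOT NULL,
    Snippet   NVARCHAR(MAX) NOT NULL,
    Embedding VECTOR(384)   NOT NULL  -- all-MiniLM-L6-v2 emits 384-dim vectors
);

-- Nearest-neighbor search: lower cosine distance = more similar.
DECLARE @query VECTOR(384) = CAST('[0.12, -0.03, 0.44]' AS VECTOR(384));  -- truncated for illustration
SELECT TOP (5)
       FilePath,
       Snippet,
       VECTOR_DISTANCE('cosine', Embedding, @query) AS Distance
FROM dbo.CodeChunks
ORDER BY Distance;
```

Storing embeddings next to the code they describe keeps retrieval a single query, with no separate vector database to operate.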
Fully compliant with the Model Context Protocol. Plug-and-play with Claude Desktop, Cursor, and Windsurf.
Token-Optimized Object Notation.
Standard JSON is wasteful. CherryPicker compresses architectural knowledge into a dense, high-signal format designed specifically for LLM consumption.
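The notation's actual grammar is not specified here, so the sketch below only illustrates the general idea: field names stated once as a header, values as delimited rows, so repeated keys, braces, and quotes stop costing tokens. The encoder name and pipe delimiter are invented for this example and are not CherryPicker's real format.

```python
import json

def toonish_encode(records):
    """Illustrative dense encoding in the spirit of the format described
    above: field names appear once in a header line, each record becomes
    one pipe-delimited row. A guess at the idea, not the real TOON grammar."""
    if not records:
        return ""
    fields = list(records[0])
    header = "|".join(fields)
    rows = ["|".join(str(r[f]) for f in fields) for r in records]
    return header + "\n" + "\n".join(rows)

records = [
    {"class": "OrderService", "method": "Submit", "loc": 120},
    {"class": "OrderService", "method": "Cancel", "loc": 45},
]
dense = toonish_encode(records)
verbose = json.dumps(records)
# The dense form drops the repeated keys and JSON punctuation,
# so the same facts fit in fewer tokens.
```

Even on two records the header-plus-rows form is shorter than the equivalent JSON, and the gap widens as record counts grow.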