API Reference
Every public class, function, and parameter in the Python and Polars/DuckDB I/O layer. With docstrings, type signatures, and runnable examples. Generated directly from the source — if it’s in the codebase, it’s in the docs.
Installation. Quickstart. API reference. Deployment recipes. Metric formulas with academic citations. Read it once. Run it forever.
Observatory ships with everything you need to install, run, and deploy it — no support contract, no onboarding workshop, no “reach out to your account manager.” The documentation is the entire manual. If something isn’t in here, it isn’t a feature yet.
One line. Standard PyPI package. No build tools required — the closed C calculation core ships as a precompiled wheel for Linux, macOS, and Windows. The CPAL-licensed I/O layer is also available on GitHub for inspection or fork.
Load an export, mine it, export the results. Four lines of Python. Works the same on every supported source platform — the schema-detection layer handles the differences.
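As an illustration of what a schema-detection layer can do, the sketch below guesses a source platform from an export's header row. This is a hypothetical reimplementation, not Observatory's actual detection logic; the column fingerprints and the `detect_platform` function here are assumptions for the example.

```python
import csv
from io import StringIO

# Hypothetical column fingerprints -- illustrative, not Observatory's
# real schema tables. A platform matches if all its columns appear.
FINGERPRINTS = {
    "jira": {"Issue key", "Status", "Sprint"},
    "servicenow": {"number", "sys_created_on", "incident_state"},
}

def detect_platform(csv_text: str) -> str:
    """Return the first platform whose fingerprint is a subset of the header."""
    header = set(next(csv.reader(StringIO(csv_text))))
    for platform, columns in FINGERPRINTS.items():
        if columns <= header:
            return platform
    return "unknown"

sample = "Issue key,Status,Sprint,Assignee\nPROJ-1,Done,Sprint 12,ana\n"
print(detect_platform(sample))  # → jira
```

The point of the pattern: downstream mining code never branches on the source platform, because detection normalises the differences up front.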
Laptop, server, container, CI pipeline, WASM browser bundle. The same library, the same code, the same outputs — wherever it runs. The closed core is identical across targets.
Every metric the closed C core computes, with its formula, its assumptions, and its peer-reviewed academic source. The binary stays closed; the math stays auditable. Five hundred metrics across Lean, Six Sigma, theory of constraints, and statistical process control.
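Because the formulas are public, any metric can be reproduced independently. As a worked example, flow efficiency is conventionally defined in Lean literature as active (value-adding) time divided by total lead time. The function below is a minimal reference sketch of that standard formula, not the closed C core; the event shape and the `in_progress` state name are assumptions for the example.

```python
from datetime import datetime

def flow_efficiency(events: list[tuple[str, str, str]]) -> float:
    """Active time / total lead time, from (state, start, end) intervals.

    Standard Lean definition; a reference sketch, not the closed C core.
    """
    fmt = "%Y-%m-%dT%H:%M"
    active = total = 0.0
    for state, start, end in events:
        seconds = (datetime.strptime(end, fmt)
                   - datetime.strptime(start, fmt)).total_seconds()
        total += seconds
        if state == "in_progress":  # value-adding state; naming is hypothetical
            active += seconds
    return active / total if total else 0.0

events = [
    ("in_progress", "2024-01-01T09:00", "2024-01-01T12:00"),  # 3h working
    ("waiting",     "2024-01-01T12:00", "2024-01-01T21:00"),  # 9h queued
]
print(flow_efficiency(events))  # → 0.25
```

Three hours of work inside a twelve-hour lead time gives 25% flow efficiency, which is exactly the kind of result you can cross-check against the published formula and its citation.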
Step-by-step recipes for laptop, server, Docker container, Kubernetes pod, GitHub Actions, GitLab CI, and WASM browser bundle. Every recipe is end-to-end — from blank machine to running pipeline — and tested on a clean environment.
Ready-to-run analyses by source platform. Jira sprint diagnostics. ServiceNow incident replay. SAP procurement flow. Each pattern is a complete script you can copy into your environment, point at your export, and run.
Everything you need to go from zero to a Power BI-ready Parquet file. Install, load, mine, export. This block is the entire end-to-end pipeline — nothing is hidden behind a paywall or an enterprise tier.
# 1. Install (shell):
#    $ pip install ghostcitadel

# 2. Load & auto-detect platform
import os

from ghostcitadel import Observatory

obs = Observatory("jira_export.csv")
obs.detect_platform()

# 3. (Optional) Connect your AI provider
obs.connect_ai("mistral", api_key=os.environ[...])

# 4. Mine the metrics
metrics = obs.mine(
    metrics=["value_yield", "flow_efficiency", "rework_rate", "end_loading"],
    lean_mode=True,
)

# 5. Export to Power BI
obs.export("power_bi_ready.parquet")

# Done. Total runtime: 0.42s on 1M events.
If the docs say it works, it works. Every example in the documentation is automatically tested against the live codebase on every release. When the code changes, the docs change. When the docs are wrong, the build fails. The manual you read tonight is the manual that ships tomorrow.
Every code block in the documentation is extracted and run against the latest build on every commit. If an example breaks, the release is blocked. Documentation drift is a build failure, not a wishlist item.
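The mechanics of such a gate can be sketched in a few lines: extract every fenced Python block from a documentation page and fail if any of them errors. This is a minimal sketch of the general docs-as-tests idea, not Observatory's actual CI; the `run_doc_examples` helper and the fence pattern are assumptions for the example.

```python
import re
import subprocess
import sys
import tempfile

# Match fenced python blocks in a markdown page (illustrative pattern).
FENCE = re.compile(r"```python\n(.*?)```", re.DOTALL)

def run_doc_examples(markdown: str) -> bool:
    """Run every fenced python example; False means 'block the release'."""
    for block in FENCE.findall(markdown):
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(block)
            path = f.name
        if subprocess.run([sys.executable, path]).returncode != 0:
            return False  # a broken example fails the build
    return True

doc = "Intro text.\n```python\nprint(2 + 2)\n```\nMore text.\n"
print(run_doc_examples(doc))  # → True
```

Wiring a function like this into the release pipeline is what turns documentation drift from a wishlist item into a hard build failure.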
Each release ships a frozen documentation snapshot. If you’re running v10.5.6, you can read the v10.5.6 docs — not whatever happens to be on the main branch today. Reproducibility goes both ways.
The metric catalogue, formulas, and academic citations are all public. The closed binary doesn’t hide the math — it just protects the implementation. You can verify any computation against the published reference.
Continue Exploring

- Open Polars + DuckDB I/O. Closed C calculation core. Academic backing. → Read the spec
- 06 · Licensing: CPAL 1.0 free tier, unlimited usage tier, and the partnership tier. → See the options
- ← Hub: All six paths into the engine, from one map. → Back to the hub