Scientific imaging — evidence-first pipelines for black-hole investigations
Telescopes give you data. Software gives you images. Between them sits a pipeline where every decision — which frames to stack, which stretch curve to apply, which flare candidate to flag, which color space to render — either preserves the evidence trail or destroys it. Most astronomical imaging software treats the evidence as disposable, and the result is an industry-wide culture where an image's provenance is folklore rather than artifact. AstroRoom treats the evidence as the product.

This is where my background in cost-and-change-management engineering meets my photography background. I spent years auditing capital expenditure across hundreds of millions of dollars; the artifact that mattered at the end of a project was not the final spreadsheet but the trail that let someone years later reconstruct exactly what was decided, by whom, under what assumptions, and against which constraints. Without that trail, the number on the spreadsheet was unfalsifiable. With it, every line was auditable end to end.

AstroRoom applies the same philosophy to scientific imaging. Every artifact it produces signs its own provenance: every render appends to a hash-chained event ledger, every evidence row carries a confidence score and a claimability indicator, and every export can be replayed from its own recipe. If the output of the pipeline ever becomes part of a scientific claim, the claim can be verified back to the original FITS frames without a single gap in the chain.

AstroRoom is an evidence-first scientific imaging workspace for Sagittarius A* and Galactic Center investigations, with a FastAPI backend and a React 19 + Vite frontend. The workflow is organized as a six-mode shell, traversed left to right like a darkroom, with each stage producing auditable artifacts rather than just passing pixels downstream.

Library is where an investigation begins: session entry with resumable state, an astronomical asset gallery, and working sets for staging FITS frames.
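The hash-chained event ledger described above can be sketched in a few lines. This is a minimal illustrative toy, not AstroRoom's implementation; the `EventLedger` name, the SHA-256 choice, and the canonical-JSON encoding are all assumptions:

```python
import hashlib
import json

def _entry_hash(prev_hash: str, payload: dict) -> str:
    # Hash the previous entry's hash together with a canonical payload encoding,
    # so each entry commits to everything that came before it.
    body = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256((prev_hash + body).encode()).hexdigest()

class EventLedger:
    """Append-only ledger: tampering with any entry breaks every later hash."""
    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []  # list of (hash, payload) tuples

    def append(self, payload: dict) -> str:
        prev = self.entries[-1][0] if self.entries else self.GENESIS
        h = _entry_hash(prev, payload)
        self.entries.append((h, payload))
        return h

    def verify(self) -> bool:
        # Recompute the whole chain from genesis; any edited payload
        # no longer matches its stored hash.
        prev = self.GENESIS
        for h, payload in self.entries:
            if _entry_hash(prev, payload) != h:
                return False
            prev = h
        return True
```

Because each hash commits to the previous one, editing any historical payload invalidates the chain from that point forward, which is what makes a trail like this verifiable end to end.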
You register a target, and the session becomes the unit of provenance for everything that follows.

Develop is the FITS viewer, with per-frame adjustments: stretch curves, clipping, colormap selection, cosmic-ray cleanup. It is analogous to a raw-image developer in a darkroom: every adjustment is reversible, and every adjustment is logged.

Build is the heavy-compute stage. Epoch construction aggregates frames into temporally consistent groups; deep-stack mega-exposures combine many frames into a single high-SNR image; timelapse generation produces video of structural change over time; volumetric 3D builds generate true volumetric data cubes from multi-epoch series; and zoom-ladder matrices produce multi-scale views for publication-grade presentation.

Analyze has four tabs, each producing auditable outputs rather than just visualizations: flare detection runs statistical anomaly scoring against baseline epochs; spectral index maps compute per-pixel spectral gradients; the Galactic Center Excess dashboard is built specifically for SgrA* investigations; and source extraction identifies and catalogs discrete sources against the background.

Evidence is the calibrated v2.95 evidence engine. You inspect ledger rows directly; each row carries a confidence score, a claimability label, and the full recipe needed to reproduce it. Export paths produce JSON, CSV, and schema artifacts that downstream tools (or collaborators, or reviewers, or future you) can consume without loss.

Export is where the pipeline meets the world. PDF reports bake the full recipe into document metadata, timelapse MP4s carry provenance sidecars, and zoom-ladder matrices export as annotated PNGs with embedded scale bars, grid overlays, and titles. This is the publication-ready terminus.

Two tools are available from every mode.
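The statistical anomaly scoring that drives flare detection can be illustrated with a simple z-score test against a baseline epoch. This is a sketch of the general technique, not AstroRoom's detector; the `flare_scores` helper and the 5-sigma default threshold are hypothetical:

```python
from statistics import mean, stdev

def flare_scores(baseline: list[float], epoch: list[float], threshold: float = 5.0):
    """Score each flux measurement in `epoch` against baseline statistics.

    Returns (z_scores, flagged_indices): a measurement is a flare candidate
    when it sits more than `threshold` standard deviations above the
    baseline mean.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    z = [(flux - mu) / sigma for flux in epoch]
    flagged = [i for i, score in enumerate(z) if score > threshold]
    return z, flagged
```

In an evidence-first pipeline, the point is that the baseline statistics, the threshold, and the resulting scores would all land in the ledger alongside the flagged candidates, so a reviewer can re-run the test rather than trust the flag.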
Lit, powered by KnowledgeSessions, harvests papers from arXiv and ADS, embeds them, and lets you query them semantically; results can be linked directly into evidence reports so a paper's argument and its empirical backing live in the same document. Agent is a natural-language command bar that accepts any language, normalizes it to English, routes it to a JSON action, and executes the matching pipeline phase.

Recent work has included flare hunts on Sagittarius A*; comparative black-hole quintet studies across SgrA*, M87, NGC 1275, NGC 6240, and Cygnus X-1; deep mega-exposures of M31 rendered as false-color composites with auto-annotation; and triple-product timelapses over Sagittarius A* that emit observed, residual, and confidence MP4s simultaneously from a single run.

Every render appends to the hash-chained event ledger; verification is a single API call. Public on GitHub. Licensed under PolyForm Noncommercial 1.0.0.
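The Agent's normalize-then-route step can be sketched as a lookup from keywords to JSON actions. The routing table and the action names below are invented for illustration; the real command vocabulary and routing logic are internal to AstroRoom:

```python
import json

# Hypothetical keyword-to-action table; AstroRoom's actual action names differ.
ROUTES = {
    "stack": {"action": "build.deep_stack"},
    "timelapse": {"action": "build.timelapse"},
    "flare": {"action": "analyze.flare_detection"},
    "export": {"action": "export.pdf_report"},
}

def route(command: str) -> str:
    """Map an already-English-normalized command to a JSON action payload."""
    text = command.lower()
    for keyword, action in ROUTES.items():
        if keyword in text:
            # Carry the raw command along so the ledger records what was asked.
            return json.dumps({**action, "raw": command})
    return json.dumps({"action": "noop", "raw": command})
```

Keeping the routed output as a plain JSON action, with the raw command attached, means the agent step itself leaves an auditable artifact rather than an opaque side effect.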