To “say what the law is” in the age of machine decision‑making is also to say what the model did—and whether it may lawfully do it again.
— Aditya Mohan, Founder, CEO & Philosopher-Scientist, Robometrics® Machines
The chamber was hushed, save for the sputter of candles fighting drafts that slipped through tall windows. Inkpots gleamed on crowded counsel tables, papers stacked in uneven towers. Young clerks leaned forward, quills poised, while advocates who had argued long into the night watched with sharpened anticipation. At the center sat Chief Justice John Marshall, his frame imposing in black robes, eyes steady beneath a furrowed brow. He bent over the bench, voice measured yet resonant, and began to read. The words seemed to press against the wood‑paneled walls: “It is emphatically the duty of the Judicial Department to say what the law is.” The line, deliberate and unhurried, shifted the air itself. Lawyers exchanged glances; a few spectators gasped. In that moment, Marshall claimed for the judiciary not a sword nor a purse, but the quiet authority of review—the power to bind the political branches to the Constitution.
The gallery sensed the magnitude though they could not yet name it. The decision denied William Marbury his commission, but the deeper declaration was unmistakable: courts could strike down laws and executive acts that violated the supreme law of the land. Marshall’s candlelit pronouncement transformed the judiciary from passive referee into an active guardian of the constitutional order.
Marbury v. Madison established judicial review—the authority of courts to nullify laws and executive actions that conflict with the Constitution. In practice, it meant that every exercise of public power must be reasoned, reviewable, and tied to lawful authority. Two centuries later, that duty reaches beyond statutes and signatures to the systems agencies use to decide. When public action is mediated by software, the Constitution still governs the decision.
Agencies now lean on automated tools to set benefits, flag fraud, assess risk, and allocate scarce resources. Those outputs can shape liberty and property as surely as any order signed in ink. Under Marbury’s logic—and the Administrative Procedure Act—courts scrutinize these decisions: Was the agency authorized to use such a system? Is the record complete? Is the reasoning intelligible and supported by evidence rather than deference to a “black box”? Judicial review demands a path the court can follow from data to decision.
If key elements of an automated system—model weights, training data and provenance, feature definitions, validation methods, and error rates—are withheld from the parties and the court, meaningful review collapses. That is Star‑Chamber logic in digital dress. Courts should require disclosures sufficient for adversarial testing or, where trade secrets are at stake, supervised access through protective orders, court‑appointed neutrals, or independent audits. Either way, the record must allow a judge to test reliability, bias, drift, and fit to statutory purpose.
Marbury’s promise endures when automated decisions meet constitutional standards: (1) lawful delegation and clear statutory footing; (2) explainability adequate for due process—human‑readable reasons, not just confidence scores; (3) auditability—logs, datasets (or samples with provenance), model documentation, and reproducible evaluations; (4) contestability—the affected person’s real chance to challenge inputs, assumptions, and outputs; and (5) non‑substitution—human adjudicators treat model outputs as evidence to be weighed, not verdicts to be rubber‑stamped.
Marshall’s candle lit a boundary that still holds. To “say what the law is” in the age of machine decision‑making is also to say what the model did—and whether it may lawfully do it again. Judicial review does not fear new tools; it requires that power, no matter how encoded, remains answerable to reason, record, and the Constitution.