Jordan Leahy

AI is Not a Doctor. It is a Detective.

Feb 2, 2026 · Philosophy · 6 min read

Ask a layman what AI in healthcare does, and they imagine a robot surgeon. Ask a SmarterDx engineer, and they see something far more practical: the world's most relentless detective.

We founded SmarterDx on a premise that cuts against the "AI hype" cycle: AI shouldn't make the diagnosis. It isn't there to replace the physician's judgment. It's there to ensure that judgment is captured, documented, and paid for.

The "Clinical Nuance" Gap

Healthcare data is messy. A patient isn't a row in a database; they are 14 days of progress notes, messy labs, and fragmented nursing shifts.

Generic LLMs struggle here. They see keywords. But clinical reality is nuance:

  • The Keyword: "Heart Failure"
  • The Nuance: Is it acute? Chronic? Systolic? Diastolic? Or is it just "Fluid Overload" from too much IV saline?

The difference between those words isn't semantics. It's thousands of dollars in reimbursement and a completely different care plan.
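The gap can be sketched in a few lines of code. This is an illustrative toy, not SmarterDx's pipeline; every function and string here is hypothetical, chosen only to show why a keyword match and a nuance-aware match diverge on the same note.

```python
# Toy sketch (hypothetical, not production code): keyword matching vs.
# qualifier-aware matching on a clinical note.

def keyword_match(note: str) -> list[str]:
    """Naive approach: flag any chart that mentions 'heart failure'."""
    return ["heart failure"] if "heart failure" in note.lower() else []

def nuanced_match(note: str) -> str:
    """Context-aware approach: the same phrase maps to different
    diagnoses (and reimbursements) depending on its qualifiers."""
    text = note.lower()
    if "heart failure" not in text:
        return "no finding"
    if "acute" in text and "systolic" in text:
        return "acute systolic heart failure"    # highest specificity
    if "chronic" in text:
        return "chronic heart failure"
    if "fluid overload" in text:
        return "possible fluid overload, not HF" # may not be HF at all
    return "heart failure, unspecified"          # leaves money on table

note = "Pt with acute systolic heart failure, started on Lasix."
print(keyword_match(note))   # ['heart failure']
print(nuanced_match(note))   # acute systolic heart failure
```

The keyword matcher sees one undifferentiated finding; the qualifier-aware version distinguishes the cases that carry different codes and different care plans. Real systems do this with trained models over the full chart, not string checks, but the failure mode being avoided is the same.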

Augmentation is the Strategy

My design philosophy at SmarterDx wasn't about "automating" the Clinical Documentation Integrity (CDI) nurse. It was about super-powering them.

We built the AI to read 30,000 data points per chart. It connects dots that a human, tired at the end of a shift, might miss.

The Feedback Loop

The AI says: "I see a pattern of rising Creatinine and a Doctor's note about Lasix. Is this Acute Kidney Injury?"

The Human says: "Yes. Good catch." OR "No, that's baseline for this patient."

This is the "Second-Level Review." The AI is the detective that combs the scene. The clinician is the judge who issues the verdict.
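The shape of that loop is easy to make concrete. Below is a minimal sketch, assuming nothing about SmarterDx's actual implementation; the `Finding` and `Verdict` types and the `clinician_review` function are hypothetical names invented for illustration.

```python
# Illustrative sketch of a second-level review loop: the AI proposes,
# the clinician disposes. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Finding:
    evidence: list          # the dots the AI connected
    suggested_dx: str       # phrased as a question, never a verdict

@dataclass
class Verdict:
    accepted: bool
    note: str

def clinician_review(finding: Finding, accept: bool, note: str) -> Verdict:
    """No suggestion becomes a diagnosis without explicit human sign-off."""
    return Verdict(accepted=accept, note=note)

aki = Finding(
    evidence=["rising creatinine", "Lasix ordered in progress note"],
    suggested_dx="Acute Kidney Injury?",
)

# Path 1: the catch is real.
v1 = clinician_review(aki, accept=True, note="Good catch.")
# Path 2: the pattern is baseline for this patient.
v2 = clinician_review(aki, accept=False, note="Baseline for this patient.")

print(v1.accepted, v2.accepted)  # True False
```

The design choice worth noting: the AI's output is typed as a question with attached evidence, and only the human-issued `Verdict` carries authority. That boundary is what keeps the detective from turning into the judge.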

Conclusion

When we stop trying to make AI a "Doctor" and start treating it as a "Revenue Integrity Engine," the conversation shifts. We stop worrying about it "hallucinating" a diagnosis and start using it to "hallucinate" opportunities for humans to verify.

That is the difference between a tech demo and a product that generates $2.5M in net impact per hospital.
