The Clinical Ops pack took longer than any other vertical we have built. It started as a simple taxonomy project and became a lesson in why Principle 5 — Verticals, not horizontals — is not an aspiration. It is a constraint. Generic outbound tools cannot do this work. We barely could, and we tried.
Here is what eleven weeks actually looked like.
Week one: what we thought we knew
Clinical operations sits at the intersection of healthcare administration, software procurement, and compliance. The buyers are typically VP-level operators managing care coordination, clinical documentation, revenue cycle management, or workforce scheduling. They are busy, suspicious of vendors, and — unlike, say, a devtools VP — have regulatory exposure that makes a bad purchase genuinely costly.
We knew this going in. We thought it meant we needed compliance-flavored angles and language that acknowledged HIPAA. That assumption was wrong in two directions.
First: the compliance angle alone is weak. Every vendor selling into clinical ops leads with compliance. It is the vertical equivalent of "we take security seriously." The signal has to be specific to the organization, not the regulation.
Second: the language requirement is stricter than we anticipated. Clinical ops buyers talk to a lot of vendors who do not understand what they actually do. They have developed a fast filter for the ones who do. If your email uses the wrong term — "patient engagement" when you mean "care coordination," or "EHR optimization" when you mean "documentation workflow" — it goes in the bin.
We spent week one learning how wrong our initial taxonomy was.
The advisors
We brought in three advisors. Their backgrounds determined what we built.
Advisor 1 spent nine years in revenue cycle management at a regional health system before moving to the vendor side. She could tell us, specifically, which pain signals a VP of RCM would find credible and which ones would flag us as outsiders. She reviewed all 14 signals in the RCM cluster and killed 7 of them in the first pass.
Advisor 2 ran clinical documentation improvement programs at two academic medical centers. His contribution was the CDI signal cluster: 11 signals around documentation gaps, coding accuracy, and denial rates that a generic outbound tool would never surface because they require understanding what a CDI query rate is and why it matters.
Advisor 3 worked in care coordination technology, most recently at a digital health company that sold into hospital systems. She was the most skeptical of the three. Her consistent feedback: "No one in this space is going to believe a cold email that cites this." She made us raise the evidence bar for every signal she reviewed.
We paid all three as consultants for the duration. It was the right call.
The signal taxonomy process
We started with 47 candidate signals, generated from a combination of the prior Devtools and Fintech pack methodology applied to clinical ops research, plus signals the advisors nominated directly.
The first advisor review cut the list from 47 to 31. Not by vote, but by applying one question to each signal: can you find this in a public data source without inside knowledge?
Seven signals failed because they required knowing things you could only know from a sales call — like whether a health system had recently changed its coding vendor. Useful for account planning. Useless for outbound research, because the point is to get to the sales call.
Three more failed because the observable evidence existed but was too noisy. One signal relied on CMS quality measure scores as a proxy for operational pain. The problem: a low score could mean operational failure, could mean a documentation gap with no operational consequence, or could mean the system genuinely serves a harder patient population. A signal that requires that much interpretation is not a signal. It is a hypothesis for the sales call.
The remaining 31 went into a second review. Advisors rated each on two dimensions: signal clarity (how unambiguous the evidence is) and signal urgency (how acute the pain typically is). Signals below a combined threshold were cut or merged.
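Mechanically, the second review reduces to a threshold filter over the two ratings. A minimal sketch of that filter, assuming a 1–5 scale per dimension and an illustrative combined threshold (the post specifies neither):

```python
from dataclasses import dataclass

# Hypothetical rubric: each signal is scored 1-5 on the two review dimensions.
COMBINED_THRESHOLD = 6  # assumption; the actual threshold is not published

@dataclass
class SignalRating:
    name: str
    clarity: int  # how unambiguous the evidence is (1-5)
    urgency: int  # how acute the pain typically is (1-5)

def second_review(ratings: list[SignalRating]) -> list[SignalRating]:
    """Keep signals at or above the combined threshold; the rest get cut or merged."""
    return [r for r in ratings if r.clarity + r.urgency >= COMBINED_THRESHOLD]
```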
Final count after two advisor reviews: 28 signals.
The 3 signal clusters that actually produce replies
After testing the 28-signal Clinical Ops taxonomy across 340 sends in beta, three clusters pulled ahead.
Cluster 1 — Denial rate exposure
When a health system's denial rate exceeds a visible threshold (observable through CMS quality data, state reporting requirements, or, in some cases, press coverage of financial audits), the pain is acute and time-bound. Denials cost money. They accumulate. They are also a known problem, which means the buyer already has budget context and does not need to be educated.
The signal fires when two conditions are met: a visible denial-rate indicator above 12% (the industry average is ~8%) and a recent job posting for a denial management role or revenue integrity function. When both are present, the pain is current and the organization is actively trying to solve it.
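Expressed as code, the firing rule is a simple conjunction. A sketch under stated assumptions: the 12% threshold comes from the pack, but the 60-day "recent posting" window and all field names are illustrative:

```python
from datetime import date, timedelta

DENIAL_RATE_THRESHOLD = 0.12                # from the pack; industry average is ~8%
RECENT_POSTING_WINDOW = timedelta(days=60)  # assumption; "recent" is not quantified

def denial_exposure_fires(denial_rate: float,
                          denial_role_posting_dates: list[date],
                          today: date) -> bool:
    """Fire only when a visible denial-rate indicator exceeds 12% AND a denial
    management or revenue integrity role was posted recently."""
    recent_posting = any(today - posted <= RECENT_POSTING_WINDOW
                         for posted in denial_role_posting_dates)
    return denial_rate > DENIAL_RATE_THRESHOLD and recent_posting
```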
Reply rate in beta testing across this cluster: ~22%, on a small sample of 67 sends, but the pattern held across both regional health systems and large academic medical centers.
Cluster 2 — Staffing velocity mismatch
Clinical ops hiring is a leading indicator of operational pressure. When a health system posts 6+ clinical documentation specialist roles in 60 days and their current vacancy rate (sometimes visible via state workforce data) is above 15%, they are in a staffing cycle that technology is often brought in to address.
This signal is time-sensitive. The window between "we are hiring aggressively" and "we gave up and bought software" is roughly 90 days. If you catch the organization during that window, the conversation is already about what you sell. If you are late, they have either filled the roles or moved to a different solution.
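This cluster reduces to the same two-condition shape, plus a freshness constraint. A hedged sketch (the 6-posting/60-day and 15% thresholds are from the pack; the field names and the handling of missing vacancy data are assumptions):

```python
from datetime import date, timedelta

POSTING_WINDOW = timedelta(days=60)  # from the pack: 6+ roles in 60 days
MIN_POSTINGS = 6
VACANCY_THRESHOLD = 0.15             # from the pack: vacancy rate above 15%

def staffing_mismatch_fires(cds_posting_dates: list[date],
                            vacancy_rate: float | None,
                            today: date) -> bool:
    """Fire when 6+ clinical documentation specialist roles were posted in the
    last 60 days AND the visible vacancy rate exceeds 15%. Vacancy data is only
    sometimes available; skipping when it is missing is an assumption.
    The ~90-day outreach window described above governs when to act, not
    whether the signal fires."""
    if vacancy_rate is None:
        return False
    recent = sum(1 for d in cds_posting_dates if today - d <= POSTING_WINDOW)
    return recent >= MIN_POSTINGS and vacancy_rate > VACANCY_THRESHOLD
```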
Cluster 3 — Regulatory compliance deadline
This is the most predictable signal in the pack. Joint Commission surveys, CMS Conditions of Participation reviews, and state licensure renewals follow known cycles. A health system with a survey due in the next 90 days is under preparation pressure. If you can find evidence that their last survey had findings in an area relevant to your product, the opening is specific and credible.
Observable via: state health department public data, CMS Certification & Survey data (public), Joint Commission accreditation public disclosure, local press coverage of survey results.
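Because survey cycles are known, this check is close to a calendar lookup plus a findings match. A minimal sketch, assuming findings and product-relevant areas are both represented as tag sets (the 90-day window is from the pack; everything else is illustrative):

```python
from datetime import date, timedelta

SURVEY_WINDOW = timedelta(days=90)  # from the pack: survey due in the next 90 days

def regulatory_deadline_fires(next_survey_due: date,
                              last_survey_finding_areas: set[str],
                              product_relevant_areas: set[str],
                              today: date) -> bool:
    """Fire when a survey is due within 90 days AND the last survey had
    findings in an area relevant to the product."""
    due_soon = today <= next_survey_due <= today + SURVEY_WINDOW
    return due_soon and bool(last_survey_finding_areas & product_relevant_areas)
```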
What did not make it in (and why)
Telehealth expansion signals. We built a telehealth cluster and killed it. The market moved too fast and in too many directions for a stable signal taxonomy. What was true about telehealth investment in early 2024 reversed in late 2024 as reimbursement policy shifted. Signals that decay faster than we can maintain them are not worth including.
EMR migration signals. These seem obvious. A health system migrating to Epic is under operational pressure. The problem is that EMR migrations are 18-to-36-month projects that are announced, delayed, and re-announced. The signal fires, but the timing window for a productive conversation is unclear. We tested it and got no reply-rate lift over baseline. Removed.
Leadership transition signals. New CIO, new CMO, new VP of Clinical Ops — we tested these. Reply rates were flat to baseline. The hypothesis was that a new leader would be receptive to new vendors. The actual pattern was that new leaders spend the first 90 days listening internally, not to cold email. We may revisit this for a tenure-aware variant (45–90 days post-start, not 0–30).
The tone profile
Every pack ships a tone profile alongside the signal taxonomy. The Clinical Ops tone profile is the strictest we have written.
Key rules:
- The email cannot reference "patient outcomes" unless the product directly affects patient care, not just administrative operations. Clinical ops buyers find this language condescending when it comes from a vendor whose product is about documentation workflow.
- The email must acknowledge regulatory familiarity without leading with it. One sentence that demonstrates contextual knowledge is enough. A paragraph about compliance is a tell.
- The email must be shorter than average. Clinical ops buyers read on mobile between meetings. The draft target for this vertical is 90–110 words, versus our standard 120–160.
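The length target and the banned-phrase rule are mechanical enough to check at draft time. A minimal lint sketch, assuming the rules are encoded as constants (the word targets come from the profile above; the function and parameter names are illustrative):

```python
WORD_TARGET = (90, 110)  # Clinical Ops draft target, vs. the standard 120-160
BANNED_UNLESS_CLINICAL = ("patient outcomes",)

def lint_draft(draft: str, product_affects_patient_care: bool) -> list[str]:
    """Return tone-profile violations for a Clinical Ops draft."""
    problems = []
    words = len(draft.split())
    if not (WORD_TARGET[0] <= words <= WORD_TARGET[1]):
        problems.append(f"{words} words; target is {WORD_TARGET[0]}-{WORD_TARGET[1]}")
    if not product_affects_patient_care:
        for phrase in BANNED_UNLESS_CLINICAL:
            if phrase in draft.lower():
                problems.append(f"references '{phrase}' for a non-clinical product")
    return problems
```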
Receipts
Illustrative figures from Paitho Clinical Ops pack beta testing, Q4 2025 – Q1 2026.
- Candidate signals entering first advisor review: 47
- Signals cut in first review: 16
- Signals cut in second review: 3
- Final taxonomy: 28 signals
- Beta sends: 340 across 4 operators
- Average reply rate across all clusters: ~14–16%
- Cluster 1 (Denial rate exposure) reply rate: ~22.1% (small sample)
- Cluster 3 (Regulatory deadline) reply rate: ~19.4% (small sample)
- Signals retired post-beta for insufficient lift: 4 (telehealth cluster fully removed, EMR migration removed)
- Advisor hours billed: 61 total across 11 weeks
Closing
Principle 5 — Verticals, not horizontals — means you are not selling generic outbound capability into clinical ops. You are selling a signal taxonomy that advisors with years inside revenue cycle management, clinical documentation, and care coordination reviewed, cut nearly in half, and approved. That distinction is the product.
A generic outbound tool cannot make that claim. Neither could we, before we did the eleven weeks.
Related:
- The Anatomy of a Pain Signal: Reverse-Engineering 28 of Them
- The 10-K Is a Pain-Signal Goldmine. Here's the Read Order.
- Vertical Packs — Docs
— Sam Park, Vertical Pack Lead
Principle 5 — Verticals, not horizontals.