TechFides — May 2026
Your front-desk coordinator opens ChatGPT. She pastes the morning's voicemail transcripts to summarize them into a triage list. Patient names. Reasons for calling. Insurance complaints. By 9 AM, she has a clean digest in her inbox. By 9:01 AM, OpenAI has the patient names, reasons for calling, and insurance complaints.
You have a Business Associate Agreement with your EHR vendor. You have one with your billing service. You do not have one with OpenAI. You cannot, because OpenAI does not sign them for consumer ChatGPT accounts — only for its enterprise and API products, which is not what your staff is using on their phones.
This is the most common HIPAA exposure I see in medical practices in 2026. It is not a hacker. It is not a stolen laptop. It is a competent staff member trying to be productive with the tools she already has on her phone.
Here is what private AI for a medical practice actually looks like — and why I think most practice administrators are about a year away from being asked the wrong question by an OCR investigator.
The HIPAA reality nobody put in the orientation packet
The Office for Civil Rights enforces HIPAA. They do not enforce intention. They enforce data flow.
If protected health information leaves your covered entity to a third party that has not signed a Business Associate Agreement, you are out of compliance. It does not matter how the data left. Email, fax, screenshot pasted into a chatbot — the data flow is the same.
OCR's enforcement budget went up in 2025. The settlement amounts are getting larger. The first published settlement involving a clinic that allowed staff to use a public AI tool with PHI is somewhere on the calendar. It will be a five- or six-figure settlement, plus a corrective action plan, plus the press release that no practice administrator wants to read.
You do not need to be the test case. You need a path that lets your staff use AI without the data flow that triggers the violation.
What private AI actually looks like in a 12-provider practice
Picture a multi-provider primary care practice in Plano. Twelve physicians, eight nurses, four front-desk staff, two billing coordinators. Their AI footprint today probably looks like:
- Two physicians who quietly use ChatGPT to draft patient education handouts ("explain hypertension at a sixth-grade reading level")
- A nurse who pastes encounter notes into Claude to "tighten" the SOAP format before they hit the chart
- A front-desk coordinator who summarizes voicemails into a triage list every morning
- A billing coordinator who runs denial letters through ChatGPT to draft appeals
None of these uses are malicious. All of them are productive. Every single one is technically a HIPAA violation, because patient-identifying information is leaving the covered entity to a third party without a BAA.
Now picture the same practice with private AI installed. A small server in their server closet. A medical-tuned open-source model running on it. Staff access it through the same kind of chat interface they already know — but the data never leaves the building.
The physician drafts the patient education sheet. The nurse cleans up the SOAP note. The coordinator summarizes voicemails. The billing coordinator drafts appeals. Same workflows. Same productivity gain. Zero data flow to a third party. The covered entity stays bounded.
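For the technically inclined office manager, "the data never leaves the building" is enforceable in software, not just policy. Here is a minimal sketch of the kind of guardrail a private AI deployment can include: before any prompt containing PHI is sent, check that the model endpoint is a private, in-building address. The URL and function names are illustrative, not a description of any specific product.

```python
import ipaddress
from urllib.parse import urlparse

# Illustrative in-building endpoint (an RFC 1918 private address).
PRIVATE_AI_URL = "http://10.0.12.5:8080/v1/chat/completions"

def is_private_endpoint(url: str) -> bool:
    """True only if the host is a private (in-building) IP address."""
    host = urlparse(url).hostname or ""
    try:
        return ipaddress.ip_address(host).is_private
    except ValueError:
        # Public hostnames (e.g. api.openai.com) don't parse as IPs,
        # so they are treated as external and rejected.
        return False

def send_phi_prompt(url: str, prompt: str) -> None:
    """Hypothetical send wrapper: refuse any external destination."""
    if not is_private_endpoint(url):
        raise PermissionError("Refusing to send PHI to an external endpoint")
    # ...POST the prompt to the local model here (omitted)...
```

The point of the sketch is the data-flow argument from above in one sentence of code: the only endpoint the tooling will talk to is one inside your own network.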
In our Private AI tier structure, a practice this size lands at Growth — $2,299/month, hardware loaned, monitoring included. That is roughly comparable to one EHR add-on module, except this one prevents a category of HIPAA exposure rather than creating new ones.
Why HIPAA-eligible cloud AI is not the same answer
The pushback I hear from sophisticated practice administrators is fair: "Microsoft offers HIPAA-eligible Azure OpenAI. AWS has HealthLake. Google Cloud has BAA-covered AI. Why not just use one of those?"
You can. Three reasons most SMB practices do not:
1. The BAA only covers what the BAA covers. A HIPAA-eligible cloud product has a long list of "covered services" and a longer list of "non-covered services." Staff who paste a prompt into the wrong product, or use a model variant that is not on the BAA list, are still creating a violation. The complexity of staying compliant inside the covered services is itself a compliance burden.
2. The data still leaves your building. Even with a BAA, the data crosses the public internet, sits in someone else's data center, and is processed by someone else's team. You have signed away the ability to audit it directly. Some practices are comfortable with that. Others, especially behavioral health and substance-use practices, are not.
3. The cost scales with use. Cloud AI is priced per call. The more your staff uses it, the more it costs. The exact opposite of what you want when you are trying to encourage adoption.
Private AI costs the same whether your team uses it ten times a day or ten thousand. That predictability is its own compliance benefit — your staff is not rationing access for budget reasons.
The objections, in order
"We don't have IT staff."
Most SMB medical practices do not have full-time IT. We loan the hardware. We install it on a Saturday so we don't disrupt clinic. We monitor it 24/7 from our network operations center. Your office manager is not learning Linux. Your providers click a bookmark in their browser, the same way they click into your EHR.
"What about the EHR integration?"
Most modern EHR systems (Epic, Athena, eClinicalWorks, NextGen) expose APIs — typically FHIR-based — that a private AI tool installed on your network can read from and write back to. We do that integration as part of the deployment. Your providers do not paste between systems; they click "summarize this encounter" inside the EHR and the private AI does the work behind the scenes.
For practices on older EHRs without APIs, the workflow is browser-based and works for everything except direct chart writes. We are honest about which workflows fit and which do not before you spend a dollar.
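To make the "summarize this encounter" workflow concrete, here is a simplified sketch of the glue between the EHR and the local model, assuming the EHR exposes FHIR R4 resources. It turns a fetched Encounter resource into a prompt for the in-house model; the field paths (`status`, `period`, `reasonCode`) follow the FHIR R4 spec, while the function name and prompt wording are illustrative.

```python
def build_encounter_summary_prompt(encounter: dict) -> str:
    """Turn a FHIR R4 Encounter resource (as a dict) into a
    chart-summary prompt for the in-house model."""
    # Encounter.reasonCode is a list of CodeableConcepts in FHIR R4.
    reasons = [
        coding.get("display", "")
        for rc in encounter.get("reasonCode", [])
        for coding in rc.get("coding", [])
    ]
    period = encounter.get("period", {})
    return (
        "Summarize this encounter for the chart.\n"
        f"Status: {encounter.get('status', 'unknown')}\n"
        f"Start: {period.get('start', 'n/a')}\n"
        f"Reasons: {', '.join(r for r in reasons if r) or 'not recorded'}"
    )

# Illustrative Encounter resource, as the EHR's FHIR API might return it.
example = {
    "resourceType": "Encounter",
    "status": "finished",
    "period": {"start": "2026-05-04T08:30:00Z"},
    "reasonCode": [{"coding": [{"display": "Hypertension follow-up"}]}],
}
prompt = build_encounter_summary_prompt(example)
```

The prompt never leaves the practice network; it goes straight to the local model, and the summary comes back into the chart the same way.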
"What if we get audited?"
The OCR auditor's first question is going to be: "Show us your data flow diagram for any tool that touches PHI." With private AI, that diagram is one box on your network with an arrow pointing in and an arrow pointing back to the same network. No third party. No external data flow. The auditor's checklist gets shorter, not longer.
We provide the documentation needed to add private AI to your HIPAA risk assessment as part of our deployment.
"What if our staff prefers ChatGPT?"
They preferred their personal Gmail before you set up the practice email system. They preferred their personal cell phone before you set up the practice phone tree. People prefer what they know until you replace it with something just as good that does not put their job at risk.
Private AI feels like ChatGPT. Same chat interface. Same speed. The difference is that when an OCR investigator shows up, your staff does not have to explain why they pasted patient names into a tool that does not have a BAA.
Where to start
The next move is not to buy a server. It is to do an honest 30-minute AI inventory.
This week, ask your office manager to make a list of every place AI has shown up in your practice. Personal accounts your staff uses. Subscriptions on the practice card. Browser extensions. Anything. You will find more than you expect.
Then ask the simple question: which of these have signed BAAs with you?
If the answer is "none," you have a HIPAA exposure that grew in the dark. Private AI is the way to bring it back into the light without taking the productivity gains away from your team.
We built TechFides Private AI for medical practices specifically because this is one of the verticals where the data-flow problem is sharpest and the consequences are clearest.
If you want a tailored view of what this looks like for your practice — including a HIPAA risk assessment for your current AI use — start with our 8-minute readiness assessment. It produces a real plan, not a sales pitch.
The practices that move on this in the next year will look back and treat private AI the same way they treat their EHR: a normal cost of practicing medicine in the 2020s. The practices that wait are betting that the first OCR settlement involving public AI doesn't reference a clinic that looks like theirs.
Like this? Get the next one Wednesday.
One email per week. No marketing filler. Unsubscribe anytime.