CleanRoom strips identifiable patient information locally — on the device, in Australia — before any AI model sees it. Every session generates a cryptographic audit record your compliance team can actually use.
Clinicians paste patient notes, names, MRNs and Medicare numbers into AI tools every day. Not from negligence — from invisibility. The data leaves the device the moment "send" is clicked, and there is no artefact to show what was sent, where it went, or whether it was retained.
BAAs and contractual assurances solve a paperwork problem. They do not solve an evidentiary one. When the OAIC asks what specific patient information was disclosed in a specific session, "we have a vendor agreement" is not an answer. APP 11 requires reasonable steps. Reasonable steps are now technical, not contractual.
Most healthcare AI products — including the Australian-built ones — transmit identifiable data to a cloud model and rely on contractual de-identification at the other end. That model is structurally incompatible with APP 8 cross-border disclosure rules and the ADM Framework's accountability obligations. It is also the model the regulator is now writing guidance against.
What's missing is not another AI product. What's missing is the layer that makes every other AI product safe to use under Australian law.
CleanRoom sits at the moment of disclosure — the paste, the send, the API call. It detects identifying information, strips it locally, and proves what it did.
The full CleanRoom round-trip in fifteen seconds — clinical text in, tokenised before it leaves the device, processed by the AI without ever seeing PHI, returned, re-identified locally for the clinician, and recorded in the Sentinel audit trail.
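The round-trip above can be sketched in a few lines. This is an illustrative toy, not CleanRoom's actual API: the function names, the token format, and the two regex patterns are all assumptions made for the example.

```python
# Hypothetical sketch of the strip -> send -> re-identify round-trip.
# Names and patterns are illustrative only, not CleanRoom's real code.
import re

def strip_phi(text: str, patterns: dict[str, str]) -> tuple[str, dict[str, str]]:
    """Replace each PHI match with an opaque token; keep the map locally."""
    mapping: dict[str, str] = {}
    counter = 0
    def make_repl(label: str):
        def _repl(m: re.Match) -> str:
            nonlocal counter
            token = f"[{label}_{counter}]"
            counter += 1
            mapping[token] = m.group(0)   # original value never leaves the device
            return token
        return _repl
    for label, pattern in patterns.items():
        text = re.sub(pattern, make_repl(label), text)
    return text, mapping

def reidentify(text: str, mapping: dict[str, str]) -> str:
    """Restore original values locally, after the model responds."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text

patterns = {"MEDICARE": r"\b\d{4} \d{5} \d\b", "NAME": r"\bJane Doe\b"}
note = "Pt Jane Doe, Medicare 2123 45670 1, presents with chest pain."
safe, mapping = strip_phi(note, patterns)
# `safe` contains only tokens -- this is all the AI model would ever see.
restored = reidentify(safe, mapping)
assert restored == note
```

The key design property is that `mapping` exists only on the device: the model sees stable placeholders it can reason around, and re-identification is a purely local string substitution on the way back.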
CleanRoom is a binary classifier at the token level — every word is either flagged as PHI or not. The right metrics are the same ones you use to evaluate a screening test.
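Treating each token as screened-positive or screened-negative, the familiar screening-test metrics fall out directly. A minimal sketch, with made-up labels (1 = PHI, 0 = not PHI):

```python
# Token-level evaluation framed as a screening test.
# y_true / y_pred values are illustrative, not real evaluation data.
def screening_metrics(y_true: list[int], y_pred: list[int]) -> dict[str, float]:
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return {
        "recall": tp / (tp + fn),       # sensitivity: PHI tokens we caught
        "precision": tp / (tp + fp),    # flagged tokens that really were PHI
        "specificity": tn / (tn + fp),  # clean tokens correctly left alone
    }

y_true = [1, 1, 1, 0, 0, 0, 0, 1]   # 4 PHI tokens in an 8-token sample
y_pred = [1, 1, 0, 0, 1, 0, 0, 1]   # one missed, one over-flagged
m = screening_metrics(y_true, y_pred)
# recall = 3/4 = 0.75 -- the single missed PHI token is the costly error
```

The asymmetry is the point: a false positive over-redacts a clean word, a false negative discloses a patient. That is why recall, not precision, is the headline number.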
Australia's privacy regime is principles-based. The "reasonable steps" tests in APPs 1 and 11 are what your compliance team has to defend. CleanRoom is the technical artefact that defends them.
Most healthcare privacy tools are built by engineers who have never written a clinical note, lodged a referral, taken a verbal handover, processed a claim, or filed a care plan at the end of a long shift. They build for a sanitised idea of how healthcare works — not the Friday-afternoon reality of a busy clinic, ward, pharmacy, or community service.
CleanRoom started inside the system, not outside it. Every entity type, every false-negative case, every workflow assumption was shaped by frontline reality — not abstract privacy theory. The taxonomy includes Medicare and IHI numbers, RACF identifiers, NDIS numbers, ACFI codes, hospital URNs, pathology accession numbers, allied health referral details, interpreter and witness fields, next-of-kin entries — because those are the identifiers that appear in the notes, letters, plans, scripts, claims and care records that Australian health professionals actually produce, every day.
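Structured Australian identifiers like these are also checkable, which is why they can be detected with far higher confidence than free-text names. A hedged sketch, using the commonly documented Medicare card checksum (weights 1, 3, 7, 9 repeated over the first eight digits, with the sum mod 10 equal to the ninth digit); treat the rule details as an assumption, not a specification:

```python
# Illustrative structural validator for an Australian Medicare card
# number, per the commonly documented checksum rule (an assumption here,
# not an official reference). Shows why structured identifiers are easy
# to flag with high recall: they self-verify, unlike names in prose.
def looks_like_medicare(number: str) -> bool:
    digits = [int(c) for c in number if c.isdigit()]
    if len(digits) not in (10, 11):      # 10 digits, plus optional 1-digit IRN
        return False
    if digits[0] not in range(2, 7):     # first digit is 2-6
        return False
    weights = [1, 3, 7, 9, 1, 3, 7, 9]
    checksum = sum(d * w for d, w in zip(digits[:8], weights)) % 10
    return checksum == digits[8]

print(looks_like_medicare("2123 45670 1"))   # structurally valid
print(looks_like_medicare("3123 45670 1"))   # fails the checksum
```

A detector built this way flags only strings that pass the structural test, which keeps false positives near zero on these entity types while free-text names remain the hard residual problem.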
Sovereignty is not a marketing line. CleanRoom processes patient data on the user's device. There is no central server in Australia, and no shadow server in the United States. The architecture makes it impossible for PHI to traverse a foreign jurisdiction during the de-identification step. That is what APP 8 actually requires — and what BAAs cannot deliver.
The market CleanRoom serves is the long tail of Australian healthcare: general practice, specialist clinics, allied health, pharmacy, nursing services, residential aged care, community health, disability services, dental, mental health, private practice — and the back-office functions (practice managers, billing, intake, referral coordinators) that handle PHI without making the headlines. None of these settings will ever deploy a private LLM in their own data centre. They will use commodity AI tools, and they need the layer that makes those tools safe under Australian law.
On the device. CleanRoom's processing layer runs locally — in the browser, in the application, or in a sidecar process inside your environment. Identifying patient information is not transmitted to any external service for de-identification. PHI does not leave Australian jurisdiction at any point in the strip. That is the architectural guarantee.
Standard application logs are mutable, contextual, and not designed as evidentiary artefacts. The Sentinel Record is a per-session, hash-chained audit trail using SHA-256: it records entity counts, timestamps, model destinations, and session integrity in a form that can be verified after the fact. It is built to satisfy the "reasonable steps" evidentiary standard under APPs 1.7–1.9 and 11, not the needs of a developer chasing a bug.
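The hash-chaining pattern described above can be sketched in a few lines. This is a minimal illustration of the general technique (each record's hash commits to the previous record's hash, so any retroactive edit breaks verification), not the actual Sentinel Record format:

```python
# Minimal hash-chained audit log, assuming the general pattern described
# in the text. Field names and event shapes are illustrative only.
import hashlib
import json

def append_record(chain: list[dict], event: dict) -> None:
    """Append an event whose hash commits to the previous record."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain: list[dict]) -> bool:
    """Recompute every hash; any edit anywhere breaks the chain."""
    prev_hash = "0" * 64
    for rec in chain:
        body = {"event": rec["event"], "prev_hash": rec["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev_hash"] != prev_hash or rec["hash"] != digest:
            return False
        prev_hash = rec["hash"]
    return True

chain: list[dict] = []
append_record(chain, {"session": 1, "entities_stripped": 7, "model": "gpt"})
append_record(chain, {"session": 2, "entities_stripped": 3, "model": "claude"})
assert verify(chain)
chain[0]["event"]["entities_stripped"] = 0   # tamper with history...
assert not verify(chain)                     # ...and verification fails
```

This is what makes the record evidentiary rather than merely logged: a compliance team can re-verify the whole chain after the fact, and tampering is detectable rather than silent.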
No. CleanRoom is the layer between your clinician and your AI vendor — whichever vendor that is. Heidi, GPT, Gemini, Claude, ambient scribes, structured data extractors. CleanRoom does not compete with them. It makes them defensible under the Privacy Act.
Recall (sensitivity) is the metric we optimise hardest because false negatives are the compliance risk. The current evaluation corpus shows 92.5% recall across all 28 entity types and 100% sensitivity on structured Australian identifiers. Known gaps — primarily non-Anglo names in unstructured prose — are tracked publicly in the evaluation summary and prioritised in every release. We will not market a number we cannot reproduce on request.
Tertiary public hospital networks running fully internal large-language-model deployments behind IRAP-certified infrastructure. If you have your own private LLM hosted in your own data centre, you have already solved the disclosure problem at the network layer. CleanRoom is built for the long tail — GP clinics, RACFs, specialist practices, allied health, regional services — that will use commodity AI and need the layer that makes commodity AI safe.
The OAIC does not certify products. It issues guidance and enforces principles. CleanRoom's strategy is to make architectural de-identification the recognised reasonable-steps standard for clinical AI — through OAIC consultation submissions, the APP Code mechanism, and engagement with indemnity insurers. The goal is not certification. It is becoming the architecture compliance is written around.
If you're responsible for clinical AI risk in an Australian healthcare organisation, an indemnity insurer, or a digital health vendor — book the call. The fastest way to find out whether CleanRoom is relevant is a 20-minute conversation with the founder.