Assistance, not decision
The AI suggests, you decide. For clinicians, AI is a copilot — never a substitute for clinical judgment.
AI is a tool, not an oracle. It assists, suggests, proposes. It never decides on behalf of a human — patient or clinician.
Every AI feature is designed, audited, and deployed against these red lines.
Every clinical suggestion goes through human validation. GDPR Article 22 is honoured natively, not patched in after the fact.
You choose which AI features apply, pathology by pathology and period by period, and can revoke that choice at any time.
Models are hosted in France/EU on HDS v2-certified infrastructure. No Big Tech dependencies. Open source where possible.
Every AI suggestion shows its sources: HAS guidelines, indexed papers, and its step-by-step reasoning.
You can challenge a suggestion, flag an error, or request a human second read, directly from the app.
Performance is measured in production, drift is monitored, and models are retrained and re-audited. Nothing is frozen.
No regulatory greenwashing. Each badge maps to a technical dossier reviewable by an auditor.
HDS v2
Certified Health Data Hosting — data and AI in French/EU datacentres.
GDPR art.25
Privacy by design & by default. Minimisation, explicit purpose, actionable rights.
AI Act high-risk
Classified high-risk (health) — risk management, traceability, transparency, human oversight.
LNE-GMED
French Notified Body for CE marking of medical software (MDR / IVDR).
Five non-negotiable principles and eight contractual commitments that make your passport your property, not the vendor's product.