AI Healthcare Regulations in California (2025 Guide)
California AI healthcare regulations explained: key laws, CDPH & DMHC requirements, bias audits, patient rights, and compliance steps for 2025.
California has no single AI-in-healthcare law. Compliance is assembled from multiple overlapping frameworks. If you deploy AI in clinical or administrative healthcare settings in California, you must address data privacy, utilization management, patient-communication disclosure, and algorithmic fairness simultaneously.
Quick Answer: What AI Healthcare Rules Apply in California?
California regulates healthcare AI through a patchwork of statutes and agency rules, not a single omnibus law.
The core frameworks:
- CMIA (Confidentiality of Medical Information Act, California Civil Code §§ 56–56.37): Governs the collection, use, and disclosure of medical information. Any AI system that processes or infers health data is in scope.
- Patient Communication Disclosure: Recent legislation may require healthcare providers to disclose when patient-facing communications are generated by AI and to offer a human-contact pathway. Consult the California Department of Public Health (CDPH) for current statutory requirements.
- Utilization Management Guardrails: Recent legislation may prohibit health plans from using AI as the sole basis for utilization management denials and may require licensed clinician review. Consult the Department of Managed Health Care (DMHC) for current rules.
- CPRA (California Civil Code §§ 1798.100 et seq.): Grants consumers rights regarding automated decision-making with significant effects; health data exemptions are narrow.
- Federal overlay: FDA Software as a Medical Device (SaMD) rules apply to AI tools meeting the device definition. The FTC Act's unfair-practices authority applies to deceptive AI health claims.
Who is covered: Health plans, hospitals, physician groups, digital health vendors, and health data brokers operating in California.
Penalties snapshot: CMIA violations can result in significant civil penalties. The DMHC has enforcement authority over health plans, which may include actions related to the misuse of AI. Consult the text of the CMIA and DMHC regulations for specific penalty amounts and enforcement procedures.
Data privacy, utilization management restrictions, patient-communication transparency, and emerging bias-audit obligations all apply at once.
Core California Statutes Governing AI in Healthcare
Patient Communication Disclosure
Legislation may require healthcare providers to disclose when patient communications are generated by AI. Key obligations could include:
- A clear disclosure on any written patient communication generated in whole or in part by generative AI.
- A pathway for the patient to request follow-up with a human provider.
- The rules may target after-visit summaries, care instructions, and similar patient-facing documents, requiring labeling and a human fallback rather than banning AI use.
Consult the California Health and Safety Code and CDPH for guidance on specific disclosure language and statutory requirements.
Utilization Management Restrictions
Legislation may impose guardrails on the use of AI in utilization management for health plans and insurers. Key obligations could include:
- A prohibition on using AI or algorithmic tools as the sole basis for denying, delaying, or modifying a utilization management decision.
- A requirement that a licensed, clinically appropriate physician or other qualified clinician review and approve any adverse determination.
- A mandate that AI systems be configured to meet existing utilization review timelines, such as 72 hours for urgent decisions and five business days for standard decisions.
Consult the California Health and Safety Code, Insurance Code, and regulations from the DMHC and California Department of Insurance (CDI) for current rules.
CMIA (California Civil Code §§ 56–56.37)
This is a foundational California health privacy law. Key provisions for AI deployments:
- Civil Code § 56.05: Defines "medical information" broadly to include any individually identifiable information regarding a patient's medical history, mental or physical condition, or treatment. AI-inferred health data can qualify.
- Civil Code § 56.10: Restricts disclosure of medical information without patient authorization. AI systems that share data with third-party model providers may trigger this.
- Civil Code § 56.06: Third-party vendors and contractors that receive medical information from a provider are treated as providers of health care for CMIA purposes. This means an AI vendor may carry direct liability.
- Civil Code § 56.36: Establishes civil penalties for negligent disclosure and criminal penalties for intentional violations.
CPRA (California Civil Code §§ 1798.100 et seq.)
The CPRA's automated decision-making provisions are relevant for healthcare AI. Civil Code § 1798.185(a)(16) directs the California Privacy Protection Agency (CPPA) to issue regulations on automated decision-making technology (ADMT). Draft regulations have included healthcare scenarios. Consult the CPPA regulatory docket at cppa.ca.gov for the current status of these rules.
The CPRA exemption for HIPAA-covered data applies to the data, not the business. Digital health companies that are not HIPAA-covered entities may face full CPRA obligations.
Insurance Code and DMHC Anti-Discrimination Rules
California law prohibits discriminatory insurance and health plan coverage practices. An AI-driven coverage tool that produces statistically disparate outcomes across protected classes is a potential violation under existing rules, regardless of intent.
What Changed Recently: 2024–2025 Legislative & Regulatory Activity
Recent Enactments
Multiple budget acts were chaptered in 2025 that include appropriations for state health agencies, potentially funding increased oversight and enforcement capacity. These include SB 101 (Chapter 4, Statutes of 2025) and SB 105 (Chapter 104, Statutes of 2025). Consult the text of these budget acts and communications from CDPH and DMHC for details on how these funds may affect AI-related oversight.
Pending Legislation
The California Legislature frequently considers bills related to artificial intelligence, data privacy, and healthcare. These may include proposals to require impact assessments or bias audits for high-risk AI systems. Verify the current status of any pending legislation at leginfo.legislature.ca.gov.
Agency Guidance
State agencies such as the DMHC, CDPH, and CPPA may issue guidance letters, FAQs, or formal regulations that clarify how existing rules apply to AI. Consult these agencies directly for the most current guidance, as formal rulemaking can be preceded by informal directives.
Federal Interaction
Federal rules, such as the CMS Prior Authorization Interoperability Rule (CMS-0057-F), create timelines and technical requirements that interact with state laws. Health plans subject to California's utilization management rules should map their AI workflows against both state and federal timelines, particularly where federal rules add API-based prior authorization requirements that affect how AI systems route and document decisions.
Algorithmic Bias, Transparency & Audit Requirements
As of mid-2025, California has not enacted a standalone algorithmic-bias audit mandate for healthcare AI. However, several existing provisions create de facto audit and documentation obligations.
Existing Anti-Discrimination Rules Create Audit Pressure
DMHC regulations prohibit health plan practices that result in disparate impact across protected classes. If an AI tool produces racially or ethnically skewed coverage decisions, the plan faces enforcement exposure under rules that predate AI-specific legislation. Defending against such a claim requires documented disparity-testing data.
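Disparity testing of this kind can be illustrated with a simple selection-rate comparison across groups. This is a minimal sketch: the group labels and data are hypothetical, and the four-fifths ratio is a common screening heuristic, not a legal threshold; real testing should follow a documented statistical methodology.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Compute the approval rate per demographic group.

    decisions: iterable of (group, approved) pairs, approved is a bool.
    """
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            approvals[group] += 1
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_ratios(rates):
    """Ratio of each group's approval rate to the highest-rate group.

    A ratio below 0.8 (the "four-fifths rule") is a common red flag
    warranting further statistical review.
    """
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Hypothetical coverage-decision data: group A approved 80/100, group B 60/100.
decisions = ([("A", True)] * 80 + [("A", False)] * 20
             + [("B", True)] * 60 + [("B", False)] * 40)
rates = selection_rates(decisions)
ratios = disparate_impact_ratios(rates)
```

Here group B's ratio (0.6 / 0.8 = 0.75) falls below the 0.8 screening threshold, which is the kind of result a plan would need to investigate and document before an examiner asks.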
CPPA Automated Decision-Making Regulations
California Civil Code § 1798.185(a)(16) requires the CPPA to issue regulations on automated decision-making technology. Draft ADMT regulations have included requirements for opt-out rights and impact assessments for certain AI uses. Monitor the CPPA docket at cppa.ca.gov. Once finalized, these rules will apply to any business subject to CPPA that uses ADMT for decisions with significant effects on consumers, which covers most clinical AI applications.
Pending Legislation
Proposed legislation, if enacted, could require documented impact assessments for high-risk AI systems, including healthcare applications. Documentation practices such as tracking training data provenance, validation results, and disparity testing represent the emerging standard that regulators will expect.
Federal Floor: FDA SaMD Requirements
For AI tools that meet the FDA's Software as a Medical Device definition, the FDA's AI/ML-Based SaMD Action Plan requires predetermined change control plans that include bias monitoring. California facilities deploying SaMD must confirm the device has received FDA clearance or authorization before clinical use.
Recommended Documentation
Align with the NIST AI Risk Management Framework 1.0 and ONC Health IT certification criteria for clinical decision support transparency. At minimum, maintain:
- Training data sources and demographic composition
- Model validation results, including performance across race, ethnicity, sex, age, and disability status
- Disparity testing methodology and results
- Version-change logs with re-validation records
- Incident response procedures for AI-related adverse events
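The documentation items above can be captured in a structured per-version record so they travel with each model release. This is a minimal sketch; the field names and example values are illustrative, not drawn from any statute or framework.

```python
from dataclasses import dataclass, field

@dataclass
class ModelComplianceRecord:
    """Illustrative per-version documentation record for a clinical AI model."""
    model_name: str
    version: str
    training_data_sources: list   # provenance and demographic composition notes
    validation_results: dict      # performance by race, ethnicity, sex, age, disability
    disparity_testing: dict       # methodology description and results
    change_log: list = field(default_factory=list)  # version changes + re-validation records
    incident_procedures: str = ""  # response procedure for AI-related adverse events

# Hypothetical record for an assumed "triage-assist" model.
record = ModelComplianceRecord(
    model_name="triage-assist",
    version="2.1.0",
    training_data_sources=["claims_2019_2023 (de-identified)"],
    validation_results={"auc_overall": 0.91},
    disparity_testing={"method": "selection-rate comparison by protected class"},
)
record.change_log.append({"version": "2.1.0", "revalidated": True})
```

Keeping one such record per deployed version gives a ready-made artifact to produce during a DMHC exam or CPPA inquiry.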
Compliance Requirements by Entity Type
| Entity Type | Key Statutes | Primary Regulator | Core Obligation |
|---|---|---|---|
| Hospitals & health systems | Patient communication disclosure rules; Civil Code §§ 56–56.37 (CMIA) | CDPH | Disclose AI-generated patient communications; ensure vendor contracts address CMIA § 56.06 liability |
| Health plans (DMHC-regulated) | Utilization management rules; anti-discrimination rules | DMHC | Clinician review of all AI-assisted UM denials; meet decision timelines; disparity testing |
| Health plans (CDI-regulated) | Insurance anti-discrimination laws; utilization management rules | CDI | Adhere to UM restrictions; avoid AI-driven discrimination subject to market conduct exams |
| Digital health / AI vendors (B2B) | Civil Code § 56.06 (CMIA vendor liability); Civil Code § 1798.140(ag) (CPRA service provider) | CPPA; CDPH | Direct CMIA liability if receiving PHI; CPRA service-provider contractual terms; flow-down obligations from covered-entity clients |
| Health data brokers | CPRA data broker registry rules | CPPA | Annual data broker registration; restrictions on selling inferred health data |
| Telehealth platforms using AI | Patient communication disclosure rules; Civil Code §§ 56–56.37 (CMIA) | CDPH; Medical Board of California | Disclose AI-generated communications; consult Medical Board guidance on AI-assisted diagnosis |
Under Civil Code § 56.06, if an AI vendor receives individually identifiable medical information from a provider, the vendor is treated as a provider of health care under CMIA. A Business Associate Agreement does not eliminate that liability; it allocates it.
Permit, Registration & Filing Requirements: Fees and Timelines
| Filing Type | Agency | Fee | Timeline | Trigger |
|---|---|---|---|---|
| Health plan license amendment (material UM change) | DMHC | Consult DMHC for current schedule | Consult DMHC | Adding or materially changing AI-assisted utilization management process |
| CDPH facility material-change notification | CDPH Licensing & Certification | Varies by facility type | Consult CDPH | Adding AI-based clinical decision support to licensed facility EHR workflows |
| CPPA data broker registration | CPPA | Verify current amount under CPRA | Annual renewal | AI vendor selling or sharing inferred health data |
| FDA 510(k) or De Novo submission | FDA | Federal fee schedule | Varies | AI tool meets SaMD definition; California facilities must confirm clearance before clinical deployment |
| CPRA service-provider contract update | N/A (contractual) | No filing fee | Before data sharing begins | Any vendor receiving personal information from a CPRA-covered entity |
No California-specific AI permit or license exists as of 2025. Compliance is achieved through existing health plan, facility, and privacy filings.
Decision Timelines That AI Systems Must Meet
State regulations establish strict timelines for utilization review decisions.
- Urgent utilization review decisions: Within 72 hours of receiving the request.
- Standard utilization review decisions: Within five business days.
An AI system's workflow must be designed so that routing, flagging, and clinician handoff occur within these windows. An AI tool that creates bottlenecks or delays clinician review is out of compliance even if its individual outputs are accurate.
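The timeline constraint can be made concrete with a deadline calculator that a routing workflow might check against. This is a minimal sketch: the 72-hour and five-business-day windows come from the rules above, but the function names are hypothetical, and the counting convention (excluding the day of receipt and weekends, ignoring holidays) is an assumption to verify against the actual regulation.

```python
from datetime import datetime, timedelta

def urgent_deadline(received: datetime) -> datetime:
    """Urgent utilization review: decision due within 72 hours of receipt."""
    return received + timedelta(hours=72)

def standard_deadline(received: datetime) -> datetime:
    """Standard utilization review: decision due within five business days.

    Counts weekdays after the day of receipt; holidays are not handled here.
    """
    day = received
    remaining = 5
    while remaining > 0:
        day += timedelta(days=1)
        if day.weekday() < 5:  # Monday=0 .. Friday=4; skip weekends
            remaining -= 1
    return day

received = datetime(2025, 3, 3, 9, 0)  # a Monday morning
print(urgent_deadline(received))    # 2025-03-06 09:00:00 (Thursday)
print(standard_deadline(received))  # 2025-03-10 09:00:00 (the following Monday)
```

A compliant workflow would compute both deadlines at intake and alarm well before either one, so a clinician review queue never silently runs past the window.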
DMHC License Amendment
If a DMHC-regulated health plan materially changes its utilization management process to incorporate AI, that change may require a license amendment filing. Consult DMHC directly at dmhc.ca.gov or call 1-888-466-2219 for current requirements and fees.
CDPH Facility Notification
Hospitals adding AI-based clinical decision support to EHR workflows may trigger a material-change notification under CDPH licensing conditions. Consult the CDPH Licensing and Certification Division at (916) 552-8700 or cdph.ca.gov for current requirements before deployment.
Next Steps & Who to Contact
Action Checklist
Step 1: Map your AI use cases. Use the entity-type table above. Identify every AI system touching patient data, utilization decisions, or patient-facing communications. Assign the applicable statutes to each use case.
Step 2: Audit patient-facing AI communications. Ensure you comply with any applicable disclosure rules, including labels and human-contact pathways for AI-generated after-visit summaries, care instructions, or similar documents.
Step 3: Review utilization management workflows. If you operate a health plan, document how licensed clinician review is integrated into every AI-assisted adverse determination. Confirm your systems meet the 72-hour urgent and five-business-day standard decision timelines.
Step 4: Update vendor contracts. Business Associate Agreements and service-provider agreements must address CMIA § 56.06 vendor liability and CPRA service-provider obligations under Civil Code § 1798.140(ag).
Step 5: Monitor CPPA rulemaking on ADMT. Subscribe to CPPA updates at cppa.ca.gov/regulations. When the automated decision-making technology regulations are finalized, they will likely require new compliance measures for healthcare AI applications.
Step 6: Track pending legislation. Monitor relevant bills at leginfo.legislature.ca.gov. If new laws pass, bias-audit and impact-assessment requirements could become mandatory.
Step 7: Schedule semi-annual compliance reviews. The legislative and regulatory pace in California requires a recurring review cadence.
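Step 1's use-case inventory can start as a simple mapping from each AI system to the frameworks it touches. This is a minimal sketch; the system names are hypothetical, and the framework assignments mirror the entity-type table above rather than any authoritative taxonomy.

```python
# Hypothetical inventory mapping each deployed AI system to applicable frameworks.
ai_inventory = {
    "discharge-summary-generator": {
        "touches": ["patient-facing communications", "medical information"],
        "frameworks": ["patient communication disclosure rules", "CMIA §§ 56-56.37"],
    },
    "prior-auth-triage": {
        "touches": ["utilization management decisions"],
        "frameworks": ["DMHC UM rules", "anti-discrimination rules"],
    },
}

def frameworks_for(system: str) -> list:
    """Return the compliance frameworks assigned to a system, or an empty list."""
    return ai_inventory.get(system, {}).get("frameworks", [])

print(frameworks_for("prior-auth-triage"))
```

Even a table this simple forces the question Step 1 asks: is there any AI system in production with an empty `frameworks` list that should not be empty?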
Agency Contacts
| Agency | Function | Contact |
|---|---|---|
| DMHC | Health plan licensing, UM enforcement, utilization review | 1-888-466-2219; dmhc.ca.gov |
| CDPH Licensing & Certification | Hospital and facility licensing, provider questions | (916) 552-8700; cdph.ca.gov |
| CPPA | CPRA enforcement, ADMT rulemaking, data broker registry | cppa.ca.gov/regulations |
| CDI | Insurance company AI oversight, market conduct exams | 1-800-927-4357; insurance.ca.gov |
| California Legislative Information | Bill tracking for pending legislation | leginfo.legislature.ca.gov |