
AI Healthcare Regulations in Colorado (2024–2025)

Colorado AI healthcare regulations explained: SB 24-205, consumer protections, provider obligations, enforcement, and compliance steps for 2024–2025.

Last updated April 21, 2026

Colorado has enforceable AI-in-healthcare rules. SB 24-205, signed May 17, 2024, covers high-risk AI systems, including clinical decision support and prior authorization tools. Most deployer obligations take effect February 1, 2026. The Colorado Division of Insurance also separately regulates algorithmic insurance decisions. Federal rules, including HIPAA, the ONC HTI-1 Final Rule, and FDA SaMD guidance, also apply.

Quick Answer: Does Colorado Regulate AI in Healthcare?

Yes.

Governor Jared Polis signed SB 24-205, the Colorado Artificial Intelligence Act, on May 17, 2024. The law applies to developers and deployers of "high-risk AI systems." This category includes clinical decision support tools, diagnostic AI, and prior authorization algorithms. Most obligations for deployers take effect February 1, 2026.

Separately, the Colorado Division of Insurance (DOI) issued Bulletin B-5.38. This bulletin addresses the use of external consumer data and algorithmic models in insurance decisions, including health plan underwriting and claims. That bulletin operates independently of SB 24-205 and is currently in effect.

In addition to state law, Colorado providers and developers must also comply with:

  • HIPAA Privacy and Security Rules (45 CFR Parts 160 and 164)
  • The ONC Health Data, Technology, and Interoperability (HTI-1) Final Rule (45 CFR Parts 170 and 171), which mandates transparency for AI-based clinical decision support
  • FDA's AI/ML Software as a Medical Device (SaMD) framework for products that meet the definition of a medical device

Who is covered under SB 24-205: Both developers (companies that design and sell high-risk AI systems) and deployers (entities that use those systems to make or substantially influence consequential decisions affecting Colorado consumers). For example, a hospital using a vendor's prior-authorization algorithm is a deployer. The vendor is a developer. Both have distinct obligations.


What Colorado Law Says: SB 24-205 and Healthcare AI

SB 24-205 is codified in the Colorado Revised Statutes. Based on the enrolled bill, the consumer-protection provisions are located at C.R.S. § 6-1-1701 through § 6-1-1711. Consult the Colorado General Assembly's official enrolled bill text to confirm final codification.

What Counts as a "High-Risk AI System" in Healthcare

The statute defines a high-risk AI system as one that makes or substantially influences a "consequential decision" affecting a Colorado resident. Consequential decisions in healthcare include those about the provision or denial of healthcare services, treatment recommendations, and health insurance coverage determinations (Colorado SB 24-205, § 6-1-1702 definitions). Clinical decision support tools, algorithms that generate prior authorization approvals or denials, and AI-driven triage systems meet this definition.

Developer Obligations

Companies that build high-risk healthcare AI systems must, under SB 24-205:

  • Maintain a risk management policy that addresses reasonably foreseeable risks of algorithmic discrimination.
  • Conduct bias testing and document the results.
  • Disclose known limitations of the system to any deployer that licenses or uses the tool.
  • Make available to deployers the information those deployers need to conduct their own impact assessments.

If a developer becomes aware of a material risk of algorithmic discrimination, the developer must notify deployers (Colorado SB 24-205, developer obligations provisions).
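SB 24-205 requires bias testing but does not prescribe a methodology. One common screen developers use is the "four-fifths rule" borrowed from employment law: compare each group's approval rate to a reference group's and flag ratios below 0.8. The sketch below is illustrative only; the group labels, decision data, and 0.8 threshold are assumptions drawn from that convention, not from the statute.

```python
# Illustrative bias-testing screen (four-fifths rule). SB 24-205 does not
# mandate this metric; groups, data, and the 0.8 threshold are hypothetical.

def selection_rates(outcomes):
    """Approval rate per group, from {group: [0/1 decisions]}."""
    return {g: sum(d) / len(d) for g, d in outcomes.items()}

def disparate_impact_ratios(outcomes, reference_group):
    """Each group's approval rate divided by the reference group's rate."""
    rates = selection_rates(outcomes)
    ref = rates[reference_group]
    return {g: rate / ref for g, rate in rates.items()}

# Hypothetical prior-authorization decisions (1 = approved, 0 = denied)
decisions = {
    "group_a": [1, 1, 1, 0, 1, 1, 1, 0, 1, 1],  # 80% approved
    "group_b": [1, 0, 1, 0, 0, 1, 0, 0, 1, 0],  # 40% approved
}

ratios = disparate_impact_ratios(decisions, reference_group="group_a")
flagged = {g: r for g, r in ratios.items() if r < 0.8}
print(flagged)  # group_b's ratio is 0.5, below the 0.8 threshold
```

A flagged ratio is a trigger for further investigation and documentation, not proof of a violation; the results and any mitigation steps would feed into the documentation developers must make available to deployers.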

Deployer Obligations

Entities deploying high-risk AI in healthcare settings must:

  • Complete an impact assessment before deploying the system and annually thereafter, evaluating the risk of algorithmic discrimination.
  • Notify consumers, in plain language, when a high-risk AI system makes or substantially influences a consequential decision affecting them.
  • Provide consumers with a meaningful opportunity to request human review of an adverse AI-driven decision.
  • Publish a public statement describing the types of high-risk AI systems in use and the nature of the consequential decisions they influence.
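The statute describes what an impact assessment must evaluate, not a required data format. As one way to operationalize the before-deployment-then-annually cadence above, a deployer might track assessments with a record like the following sketch; every field name here is illustrative, not statutory.

```python
# Hypothetical record structure for tracking SB 24-205 deployer impact
# assessments. Field names and the annual interval are illustrative
# assumptions; the statute does not prescribe a data format.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ImpactAssessment:
    system_name: str                 # e.g., a vendor prior-authorization tool
    decision_type: str               # the consequential decision influenced
    discrimination_risks: list[str]  # reasonably foreseeable risks evaluated
    mitigations: list[str]           # steps taken to address those risks
    assessed_on: date
    review_interval_days: int = 365  # annual reassessment

    def next_review_due(self) -> date:
        return self.assessed_on + timedelta(days=self.review_interval_days)

    def is_overdue(self, today: date) -> bool:
        return today > self.next_review_due()

ia = ImpactAssessment(
    system_name="prior-auth-model-v2",
    decision_type="health insurance prior authorization",
    discrimination_risks=["age proxy in claims-history features"],
    mitigations=["quarterly disparate-impact monitoring"],
    assessed_on=date(2026, 1, 15),
)
print(ia.next_review_due())  # 2027-01-15
```

Keeping records in a structured form like this also supports the separate obligation to demonstrate compliance on AG inquiry.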

Anti-Discrimination Requirement

SB 24-205 prohibits high-risk AI systems from using protected characteristics, such as race, color, national origin, disability, sex, religion, and age, as proxies in ways that result in algorithmic discrimination (Colorado SB 24-205, § 6-1-1703). This requirement is relevant to clinical AI tools trained on datasets with historical disparities.

Enforcement and Private Rights of Action

The Colorado Attorney General holds exclusive enforcement authority under SB 24-205. The statute does not create a private right of action. However, conduct that violates SB 24-205 may also constitute a deceptive trade practice under the Colorado Consumer Protection Act (C.R.S. § 6-1-101 et seq.), which allows private civil actions in some circumstances. The AG can seek civil penalties and injunctive relief.

Licensed Clinician Obligations

HB 19-1172 (2019) recodified and reorganized Colorado Title 12 (Professions and Occupations). The recodification restructured licensing statutes for physicians, nurses, and other health professionals but did not add AI-specific provisions. Clinician obligations around AI-assisted diagnosis currently flow from professional standard-of-care requirements under the recodified Title 12 statutes. Consult the Colorado Medical Board (dora.colorado.gov/medical) for any formal guidance.


What Changed Recently: 2023–2025 Regulatory Activity

SB 24-205 (Signed May 17, 2024)

Colorado became one of the first states to enact a comprehensive AI law covering high-risk systems across sectors, including healthcare. The effective date for most developer and deployer obligations is February 1, 2026. Confirm this date against any 2025 amendments via the Colorado General Assembly website (leg.colorado.gov).

Colorado DOI Bulletin B-5.38

The Division of Insurance issued Bulletin B-5.38 to address health insurers' use of external consumer data, algorithms, and predictive models in underwriting and claims decisions. The bulletin requires insurers to ensure that algorithmic tools do not produce discriminatory outcomes and to be able to demonstrate compliance on examination. Consult the DOI directly (doi.colorado.gov) to confirm whether any successor bulletin has been issued.

ONC HTI-1 Final Rule (Federal, Effective 2024)

The Office of the National Coordinator for Health Information Technology's HTI-1 Final Rule (45 CFR Parts 170 and 171) requires health IT developers to provide transparency about AI and predictive algorithms embedded in certified electronic health record systems. Colorado providers using certified EHR technology must comply with these federal transparency requirements.

FDA AI/ML SaMD Framework

The FDA's AI/ML Software as a Medical Device Action Plan governs diagnostic AI products that meet the statutory definition of a medical device. Colorado-based health tech developers building diagnostic tools should assess whether their products require 510(k) clearance or De Novo authorization. Consult the FDA's Digital Health Center of Excellence for current guidance.

Colorado AG Enforcement Posture

The Colorado Attorney General's office identified AI-related consumer harms as an active monitoring priority in 2024. The AG's consumer protection section has been engaged on algorithmic discrimination issues under existing consumer protection authority.

2025 Legislative Session

The Colorado General Assembly may consider amendments to SB 24-205 during the 2025 legislative session. Monitor leg.colorado.gov for introduced bills.


Compliance Requirements by Entity Type

AI Developers (Health Tech Companies)

If you build diagnostic tools, prior authorization algorithms, clinical decision support systems, or AI-driven triage products used by Colorado deployers, SB 24-205 applies to you as a developer. Your obligations include:

  • Conduct and document bias testing before deployment.
  • Prepare a risk management policy covering algorithmic discrimination.
  • Disclose known limitations to every deployer that licenses your system.
  • Provide deployers with the documentation they need for their own impact assessments.
  • Notify deployers if you discover a material risk of algorithmic discrimination post-deployment.

The compliance deadline is February 1, 2026.

Deployers: Hospitals and Health Systems

You are a deployer if you use a high-risk AI system to make or substantially influence consequential decisions about patients. Obligations include:

  • Complete an impact assessment before go-live and annually thereafter.
  • Implement patient notification protocols for AI-influenced decisions.
  • Build a human-review request process for adverse decisions.
  • Publish a public statement on your AI system use.
  • Maintain records sufficient to demonstrate compliance on AG inquiry.

Begin updating patient consent forms and vendor contracts to meet the February 1, 2026 deadline.

Deployers: Health Insurance Plans

Health insurers have a two-layer obligation. Under Colorado DOI Bulletin B-5.38, algorithmic underwriting and claims decision tools must not produce discriminatory outcomes, and insurers must be able to demonstrate this during DOI market conduct examinations. Under SB 24-205, health plans are also deployers when their AI systems make consequential coverage decisions affecting Colorado consumers.

Contact the DOI (303-894-7499 / doi.colorado.gov) to confirm current examination expectations under B-5.38. For SB 24-205 deployer obligations, the February 1, 2026 deadline applies.

Telehealth Platforms

AI-driven symptom checkers, triage tools, and care-routing algorithms used by telehealth platforms qualify as high-risk AI systems under SB 24-205 if they substantially influence a consequential healthcare decision. Consent and disclosure requirements apply. Platforms should also assess ONC HTI-1 obligations if they use certified health IT components.

Licensed Clinicians

Individual clinicians using AI-assisted diagnostic tools remain subject to standard-of-care obligations under Colorado's recodified Title 12 licensing statutes. Consult the Colorado Medical Board (dora.colorado.gov/medical) for any updated policy statements. Clinicians should document their review of AI-generated recommendations in the medical record.

Small Provider Exemption

SB 24-205 includes size-based thresholds that may exempt small practices from some obligations. Confirm the current employee count and revenue cutoffs against the enrolled bill text at leg.colorado.gov.


Permit Fees, Filing Timelines, and Compliance Deadlines Compared

SB 24-205 does not impose a permit fee or registration requirement on AI developers or deployers. There is no filing fee to register an AI system with the Colorado AG's office.

| Requirement | Governing Rule | Who It Applies To | Deadline | Cost/Fee | Enforcement Body |
|---|---|---|---|---|---|
| High-risk AI impact assessment | Colorado SB 24-205 (C.R.S. § 6-1-1701 et seq.) | Deployers (hospitals, health plans, telehealth) | Before deployment; annually thereafter; Feb 1, 2026 compliance date | None (no filing fee) | Colorado AG |
| Consumer notification for AI-influenced decisions | Colorado SB 24-205 | Deployers | Feb 1, 2026 | None | Colorado AG |
| Developer bias testing and documentation | Colorado SB 24-205 | AI developers | Feb 1, 2026 | None | Colorado AG |
| Algorithmic insurance decision compliance | Colorado DOI Bulletin B-5.38 | Health insurers | In effect now; ongoing | Varies (consult DOI for examination fee schedule) | Colorado DOI |
| AI/ML SaMD 510(k) or De Novo clearance | FDA SaMD framework | Health tech developers with device-classified AI | Prior to market | Varies by product class (consult FDA fee schedule) | FDA |
| Clinical decision support transparency | ONC HTI-1 Final Rule (45 CFR Parts 170 and 171) | Health IT developers; certified EHR users | In effect 2024 | None (compliance cost only) | ONC / HHS |
| HIPAA data governance for AI training data | 45 CFR Parts 160 and 164 | All covered entities and business associates | Ongoing | None (compliance cost only) | HHS OCR |

For DOI filing fees related to algorithmic decision-tool documentation, consult the DOI directly (doi.colorado.gov).

Colorado's approach is broader in scope than Illinois's AI Video Interview Act or New York City Local Law 144, which both cover only employment screening.


Patient Rights and Consumer Protections Under Colorado AI Rules

If a Colorado provider, health plan, or telehealth platform uses a high-risk AI system in a decision that affects your care or coverage, you have specific rights under SB 24-205.

Right to Notice

Deployers must notify you, in plain language, when a high-risk AI system makes or substantially influences a consequential decision about your healthcare or coverage (Colorado SB 24-205, consumer notification provisions). This notice must describe the AI's role in the decision.

Right to Human Review

If an AI system produces an adverse decision, such as a prior authorization denial, you have the right to request that a human being review that decision. Deployers must provide a meaningful process for this request (Colorado SB 24-205, human review provisions).

Right to Explanation

The notice describing the AI's role must be written in plain language a consumer can actually understand, not in technical or legal jargon.

Anti-Discrimination Protections

SB 24-205 prohibits algorithmic discrimination based on race, color, national origin, disability, sex, religion, and age (Colorado SB 24-205, § 6-1-1703). These protections operate alongside the Colorado Civil Rights Act (C.R.S. § 24-34-301 et seq.), which independently prohibits discrimination in places of public accommodation, including healthcare facilities.

HIPAA Intersection

Your existing HIPAA rights over your health data apply when that data is used to train or operate an AI system. A covered entity cannot use your protected health information to train an AI model without a compliant authorization or an applicable exception under 45 CFR Part 164.

How to File a Complaint

  • Colorado AG Consumer Protection Section: coag.gov / 720-508-6000. For SB 24-205 violations and deceptive trade practice claims under C.R.S. § 6-1-101 et seq.
  • Colorado Division of Insurance: doi.colorado.gov / 303-894-7499. For complaints about health insurer algorithmic decisions under Bulletin B-5.38.
  • HHS Office for Civil Rights: For HIPAA violations involving AI and health data, file at hhs.gov/ocr.

Next Steps and Who to Contact in Colorado

For AI Developers

Conduct a gap analysis against SB 24-205 developer obligations (C.R.S. § 6-1-1701 et seq.) before the February 1, 2026 deadline. Audit bias testing documentation, review deployer disclosure packages, and confirm your risk management policy covers algorithmic discrimination. If your product meets the FDA's definition of a medical device, begin the SaMD regulatory pathway assessment.

For Hospitals and Health Systems

Appoint an AI governance lead. Begin impact assessment documentation for every high-risk AI system. Update patient consent forms and intake materials to reflect notification obligations. Amend vendor contracts to require developer disclosures under SB 24-205.

For Health Insurers

Contact the Colorado Division of Insurance for current guidance on Bulletin B-5.38 compliance. Confirm your algorithmic underwriting and claims tools have been reviewed for discriminatory outcomes. Separately, map your SB 24-205 deployer obligations for any AI systems that make consequential coverage decisions.

For Patients

If you believe AI was used in a coverage or care decision without proper notice, or if you were denied the opportunity to request human review, file a complaint with:

  • Colorado AG Consumer Protection Section: coag.gov / 720-508-6000
  • Colorado Division of Insurance: doi.colorado.gov / 303-894-7499
  • Colorado Medical Board (for clinician-related AI concerns): dora.colorado.gov/medical

Monitor These Sources

  • Colorado General Assembly (leg.colorado.gov): Track 2025 session bills for SB 24-205 amendments.
  • Colorado DOI bulletin page (doi.colorado.gov): Monitor for successor guidance to B-5.38.
  • ONC and HHS: Monitor HTI-1 implementation guidance and any HHS AI-in-healthcare rulemaking.
  • FDA Digital Health Center of Excellence: Track SaMD guidance updates affecting AI diagnostic tools.
