StateReg.Reference

Oregon's AI Healthcare Regulations: A Comprehensive Guide

Understand Oregon's current and emerging regulations for AI in healthcare. Navigate state-specific laws, compliance requirements, and future trends impacting AI adoption in Oregon's medical sector.

Verified April 26, 2026
AI-drafted, human-reviewed

How we verify

Each guide is built from authoritative sources (state legislatures, FAA, IRS, DSIRE, OpenStates, etc.), drafted by AI, edited by a second AI pass, polished, then spot-reviewed by a human before publication.

Oregon AI in Healthcare

Oregon has no dedicated AI healthcare statutes. Federal law (FDA, HIPAA) sets the floor; existing Oregon statutes on privacy, medical licensing, and consumer protection address gaps. Compliance currently means layering federal requirements on top of Oregon's general healthcare law.

Quick Answer: Current State of AI Healthcare Regulation in Oregon

Oregon lacks a comprehensive, dedicated state statute governing artificial intelligence in healthcare. Providers, developers, and health systems deploying AI tools in Oregon primarily face compliance obligations from federal law (FDA device regulations and HIPAA), supplemented by Oregon's existing healthcare, privacy, and consumer protection statutes.

The Oregon Health Authority (OHA) has not issued AI-specific administrative rules. The Oregon Medical Board (OMB) has not published formal guidance on AI use by licensed physicians beyond existing standard-of-care obligations. This creates a patchwork of federal frameworks, general Oregon statutes (ORS Chapter 192 for privacy and public records, ORS Chapter 677 for medicine regulation), and professional licensing board rules that predate AI's clinical use. National discussions on algorithmic bias, data governance, and patient safety will likely drive future state-level action.

Federal Frameworks and Their Impact on Oregon Healthcare AI

FDA Oversight of AI as a Medical Device

The FDA treats AI and machine learning (AI/ML) tools that meet the definition of Software as a Medical Device (SaMD) as regulated products requiring pre-market review or clearance. Oregon providers and developers are fully subject to this framework, regardless of state law.

The FDA's guidance documents, including "Clinical Decision Support Software" and "Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence/Machine Learning-Enabled Device Software Functions," establish how AI tools are classified, reviewed, and monitored post-market. Tools qualifying as clinical decision support (CDS) but not meeting the device definition under 21 U.S.C. §321(h) fall outside FDA jurisdiction. However, the distinction is narrow and fact-specific. Oregon developers must carefully assess their AI tools against these criteria to determine regulatory pathways.

A cross-sectional analysis by Litt H et al. in the Journal of Cancer Policy (2026) examined FDA-authorized oncology AI/ML devices and their supporting clinical evidence. This study underscored that federal authorization processes already shape which tools reach clinical practice. A separate review by Bracken A et al. in Clinical Orthopaedics and Related Research (2025) found that few FDA-approved AI/ML orthopaedic devices have EU MDR equivalents or peer-reviewed validation. This highlights gaps in evidence standards even within the existing federal framework, meaning Oregon providers adopting these tools inherit those evidentiary limitations.

HIPAA and AI Data Handling

Any AI system processing protected health information (PHI) in Oregon must comply with the HIPAA Privacy Rule (45 CFR Part 164 Subpart E) and the Security Rule (45 CFR Part 164 Subpart C). This applies to covered entities and their business associates, including AI vendors with access to patient data. AI systems often handle diverse forms of PHI, from diagnostic images to clinical notes, so compliance programs must account for every data flow the system touches.

Practically, this means business associate agreements (BAAs) must be in place with AI vendors. Data minimization principles apply to training datasets, and any breach involving PHI triggers notification obligations under both HIPAA and Oregon's data breach statute (ORS 646A.600 et seq.).
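Data minimization can be made concrete in the data-preparation pipeline itself. The sketch below, with hypothetical field names, shows one way to strip direct identifiers from patient records before they enter a training dataset. It illustrates the principle only; real de-identification under HIPAA's Safe Harbor method (45 CFR 164.514(b)(2)) covers 18 identifier categories and requires legal review.

```python
# Illustrative data-minimization step: drop direct-identifier fields from a
# record before it is used to train a model. Field names are hypothetical;
# this is a sketch of the principle, not a complete Safe Harbor
# de-identification under 45 CFR 164.514(b)(2).

# Subset of direct-identifier categories (hypothetical field names).
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "mrn", "health_plan_id", "device_serial", "ip_address",
}

def minimize_record(record: dict) -> dict:
    """Return a copy of the record with direct-identifier fields removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

patient = {
    "name": "Jane Doe",
    "mrn": "A-1029",
    "age_band": "40-49",       # generalized value, retained for training
    "diagnosis_code": "E11.9", # clinical feature, retained for training
}
print(minimize_record(patient))  # → {'age_band': '40-49', 'diagnosis_code': 'E11.9'}
```

A filter like this would typically run inside the covered entity or business associate before any export to a vendor, so that only the minimum necessary fields ever leave the HIPAA-protected environment.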

HHS Guidance on Algorithmic Equity

The U.S. Department of Health and Human Services (HHS) has issued guidance and proposed rules addressing algorithmic discrimination in healthcare, including under Section 1557 of the Affordable Care Act. These federal equity frameworks apply in Oregon, particularly for AI tools used in Medicaid clinical decision-making. They require that AI systems not perpetuate or exacerbate health disparities based on race, ethnicity, national origin, sex, age, or disability.

Patient Rights and Standard of Care

Oregon's general healthcare statutes establish patient rights and standard-of-care obligations that apply regardless of whether a clinical decision is human-made or algorithm-assisted. Under ORS Chapter 677, licensed physicians remain accountable for clinical decisions. An AI-generated recommendation does not transfer liability to the software vendor if the physician adopts it without appropriate clinical judgment; the physician's duty of care persists.

Medical records generated or influenced by AI tools are subject to Oregon's medical records access and retention requirements. Patients retain rights to access their records under ORS Chapter 192, and those rights extend to records reflecting AI-assisted diagnoses or treatment plans. This requires clear documentation of AI's role.

Oregon Consumer Privacy Act and Data Breach Law

Oregon enacted the Oregon Consumer Privacy Act (OCPA) (ORS 646A.570 et seq.), which took effect July 1, 2024. The OCPA grants Oregon consumers rights over their personal data, including sensitive data categories that overlap with health information. While HIPAA-covered entities have a partial exemption for data governed by HIPAA, AI applications operating outside the HIPAA perimeter, such as consumer wellness apps or direct-to-consumer health AI tools, face direct OCPA compliance obligations. These applications must honor consumer rights to access and delete personal data and to opt out of its sale, targeted advertising, and certain profiling.

Oregon's data breach notification law (ORS 646A.600 et seq.) requires notification to affected individuals and the Oregon Attorney General when a breach of personal information occurs. This requirement applies to AI systems storing or processing patient data, necessitating robust security protocols and incident response plans.

Professional Licensing Board Rules

The Oregon Medical Board (OMB) governs physician conduct under ORS Chapter 677. The Oregon Board of Nursing governs nursing practice under ORS Chapter 678. Neither board has issued AI-specific rules, but both enforce existing standards of care, scope of practice, and delegation requirements. These rules constrain how AI tools can be used in clinical settings. For instance, AI cannot perform tasks outside a licensee's scope of practice, nor can it replace the independent judgment required for diagnosis or treatment.

Consult the Oregon Medical Board and Oregon Board of Nursing directly for current interpretive guidance on AI-assisted clinical tasks, as informal guidance from these bodies can precede formal rulemaking.

Consumer Protection

Oregon's Unlawful Trade Practices Act (ORS 646.605 et seq.) prohibits deceptive representations in trade or commerce. Overstated marketing claims about an AI health tool's accuracy, capabilities, or clinical validation could expose developers and vendors to enforcement by the Oregon Attorney General or to private actions under the Act.
