
AI Healthcare Regulations in North Dakota: A Comprehensive Guide

Navigate AI healthcare regulations in North Dakota. Understand state and federal compliance, data privacy, and ethical considerations for AI implementation in ND healthcare.

Verified April 26, 2026
AI-drafted, human-reviewed

How we verify

Each guide is built from authoritative sources (state legislatures, FAA, IRS, DSIRE, OpenStates, etc.), drafted by AI, edited by a second AI pass, polished, then spot-reviewed by a human before publication.


North Dakota has no AI-specific healthcare statute. Compliance today requires applying existing state healthcare and privacy laws (primarily North Dakota Century Code Titles 23 and 43) alongside federal requirements from the FDA, HIPAA, and emerging HHS guidance. This combination governs every AI tool a North Dakota provider uses.

Quick Answer: AI Healthcare Regulations in North Dakota

North Dakota healthcare providers using AI operate under a patchwork of laws, not a single unified statute. This framework includes:

State level: North Dakota's existing healthcare statutes, data privacy rules, and consumer protection laws apply to AI tools by extension, even though these laws were not written with AI in mind. No North Dakota law exclusively regulates AI in healthcare as of mid-2025.

Federal level: Federal frameworks carry most of the regulatory weight. The FDA classifies certain AI and machine learning (ML) tools as Software as a Medical Device (SaMD) and requires premarket authorization before clinical use. HIPAA's Privacy Rule, Security Rule, and Breach Notification Rule govern any AI system that handles Protected Health Information (PHI). The NIST AI Risk Management Framework and HHS initiatives add further requirements.

Providers must satisfy both state and federal requirements. A gap in either state licensing compliance or federal device/privacy compliance creates legal exposure.


North Dakota's Existing Regulatory Framework for Healthcare Technology

North Dakota Century Code: Healthcare Facilities and Medical Practice

North Dakota Century Code (NDCC) Title 23 (Health and Safety) governs hospitals, critical access facilities, and other licensed healthcare entities. NDCC ch. 23-16 covers hospital licensing requirements, including standards for patient care and facility operations. AI clinical decision support tools used in licensed facilities fall under operational standards enforced by the North Dakota Department of Health and Human Services (ND DHHS) per that title.

NDCC Title 43 (Professions and Occupations) governs licensed healthcare professionals, including physicians (Chapter 43-17), nurses (Chapter 43-12.1), and other clinical practitioners. These chapters establish the standard of care and scope of practice. When a licensed clinician uses an AI tool to inform a diagnosis or treatment decision, that clinician remains accountable under their professional license. The AI does not hold a license. The professional's board can act on conduct falling below the standard of care, regardless of algorithm involvement.

Data Privacy and Security Under State Law

North Dakota does not have a comprehensive consumer health data privacy statute like Washington's My Health My Data Act. However, several existing provisions apply:

  • NDCC § 23-12-13 addresses patient record confidentiality for facilities licensed under state law.
  • NDCC ch. 51-30 (Notice of Security Breach for Personal Information), including § 51-30-02, requires notification to affected individuals and the Attorney General when a breach of personal information occurs. Health data qualifies as personal information under that chapter's definition.
  • NDCC ch. 6-08.1 (Financial Privacy) does not directly apply to health AI, but the breach notification framework it references is instructive for understanding North Dakota's general approach to data security obligations.

These statutes establish a minimum standard for AI systems storing, processing, or transmitting patient data. Federal HIPAA requirements typically set a higher bar, but state breach notification timelines and Attorney General reporting obligations operate independently.

Licensing and Oversight of AI Tool Usage

ND DHHS has not issued AI-specific guidance for licensed facilities. Consult ND DHHS directly for any current advisories. Existing facility licensing standards under NDCC Title 23 require facilities to maintain policies and procedures for clinical operations. Regulators may interpret these requirements to cover AI-assisted clinical workflows during inspections.

Professional licensing boards (the North Dakota Board of Medicine, the North Dakota Board of Nursing, etc.) have not published formal AI-specific position statements. Practitioners should monitor these boards, as guidance can emerge faster than legislation.

Consumer Protection

The North Dakota Consumer Fraud Act (NDCC ch. 51-15) prohibits deceptive acts or practices in trade or commerce. A healthcare company marketing an AI diagnostic tool with unsubstantiated efficacy claims to North Dakota consumers could face action under this statute, enforced by the Attorney General's office.


Federal Oversight: FDA, HIPAA, and Other Key Agencies Impacting North Dakota AI Healthcare

FDA Classification of AI/ML as Software as a Medical Device

The FDA regulates AI and ML tools that meet the definition of a medical device under 21 U.S.C. § 321(h). The agency's "Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device Action Plan" (FDA, January 2021) and subsequent guidance describe a risk-tiered approach. High-risk SaMD requires premarket approval (PMA); novel moderate-risk tools may be authorized through the De Novo pathway; lower-risk tools may qualify for 510(k) clearance or fall under enforcement discretion.

A 2026 cross-sectional analysis of FDA-authorized oncology AI/ML devices found that clinical evidence supporting those authorizations varied substantially in study design, sample size, and generalizability (Litt H et al., Journal of Cancer Policy, 2026 Apr 21, PMID 42025919). This finding has direct operational relevance: FDA authorization alone does not guarantee a device performs adequately in a specific patient population. North Dakota providers, particularly those serving rural or Indigenous communities, should scrutinize whether AI tool training data reflects populations similar to their own.

North Dakota providers purchasing or deploying FDA-regulated AI/ML SaMD must verify the device's authorization status through the FDA's 510(k) database or De Novo database before clinical deployment. Using an unauthorized device that meets the SaMD definition is a federal violation.
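The authorization check can be folded into a procurement workflow, since FDA clearance records are queryable through the openFDA API. The sketch below is illustrative, not authoritative: the endpoint and field names (`k_number`, `decision_date`, `applicant`) follow openFDA's published device/510(k) schema, and the search term would be replaced with the actual device name under review.

```python
from urllib.parse import quote

# openFDA's public 510(k) clearance endpoint (no API key required
# for low-volume queries; field names per the openFDA device schema).
OPENFDA_510K = "https://api.fda.gov/device/510k.json"

def build_510k_query(device_name: str, limit: int = 5) -> str:
    """Build a search URL for 510(k) clearance records by device name."""
    return f'{OPENFDA_510K}?search=device_name:"{quote(device_name)}"&limit={limit}'

def summarize_clearances(payload: dict) -> list[dict]:
    """Reduce an openFDA response payload to the fields a compliance
    reviewer needs: clearance number, device name, decision date, applicant."""
    return [
        {
            "k_number": r.get("k_number"),
            "device_name": r.get("device_name"),
            "decision_date": r.get("decision_date"),
            "applicant": r.get("applicant"),
        }
        for r in payload.get("results", [])
    ]
```

Fetch the built URL with any HTTP client (for example `urllib.request.urlopen`) and pass the decoded JSON to `summarize_clearances`. An empty result list is itself a signal: the device may be De Novo-authorized, PMA-approved, or not authorized at all, so check the De Novo database or ask the vendor for the authorization number directly.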

HIPAA: Privacy, Security, and Breach Notification

HIPAA applies to covered entities and their business associates regardless of state. Every North Dakota hospital, clinic, and health plan that uses an AI tool handling PHI must:

  • Execute a Business Associate Agreement (BAA) with the AI vendor (HIPAA Privacy Rule, 45 C.F.R. § 164.502(e)).
  • Conduct a Security Risk Analysis covering AI systems that store or transmit electronic PHI (HIPAA Security Rule, 45 C.F.R. § 164.308(a)(1)).
  • Notify affected individuals of breaches of unsecured PHI without unreasonable delay and no later than 60 days after discovery (HIPAA Breach Notification Rule, 45 C.F.R. § 164.404); for breaches affecting 500 or more individuals, notify the HHS Office for Civil Rights contemporaneously (45 C.F.R. § 164.408).
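The federal individual-notification clock can be tracked mechanically from the discovery date. A minimal sketch, assuming the 60-calendar-day window of 45 C.F.R. § 164.404 (the dates used in any real incident log would come from the organization's breach-response records):

```python
from datetime import date, timedelta

# 45 C.F.R. § 164.404: notify affected individuals without unreasonable
# delay and no later than 60 calendar days after breach discovery.
NOTIFICATION_WINDOW = timedelta(days=60)

def individual_notice_deadline(discovered: date) -> date:
    """Latest permissible date for individual breach notification."""
    return discovered + NOTIFICATION_WINDOW

def days_remaining(discovered: date, today: date) -> int:
    """Calendar days left before the federal deadline (negative = overdue)."""
    return (individual_notice_deadline(discovered) - today).days
```

Note that North Dakota's state statute uses an expediency standard rather than a fixed day count, so the state obligation can effectively come due before the federal 60-day outer limit; treat the federal deadline as a ceiling, not a target.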

The HHS Office for Civil Rights (OCR) has signaled increased scrutiny of AI-related HIPAA compliance. OCR's December 2022 guidance on tracking technologies (which covers AI-powered analytics tools embedded in patient-facing platforms) clarified that many such tools trigger HIPAA obligations even when the vendor does not consider itself a healthcare company. North Dakota providers using third-party AI platforms for patient engagement, scheduling, or clinical documentation should review this OCR guidance directly.

NIST AI Risk Management Framework

The National Institute of Standards and Technology released the AI Risk Management Framework (AI RMF 1.0) in January 2023. It is voluntary at the federal level but is increasingly referenced in HHS guidance and procurement requirements. The framework organizes AI risk management around four functions: Govern, Map, Measure, and Manage. North Dakota providers building internal AI governance programs can use AI RMF 1.0 as a practical structure.


Key Considerations for Implementing AI in North Dakota Healthcare Settings

Bias, Fairness, and Algorithmic Transparency

AI models trained on non-representative datasets can produce systematically worse outcomes for underrepresented groups. North Dakota's patient population includes significant rural and American Indian communities. Research on health disparities in these populations (e.g., Yang M et al., Alzheimer's & Dementia, 2026 Apr, PMID 42002809, documenting care gaps among American Indian and Alaska Native Medicare beneficiaries) underscores the clinical, not just ethical, importance of demographic representativeness in training data. Before deploying any AI diagnostic or triage tool, request the vendor's model card or algorithmic impact assessment and evaluate subgroup performance data.
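One concrete way to act on vendor subgroup data is to recompute the tool's sensitivity per demographic group on a local validation sample. The sketch below assumes a simple record format of (subgroup label, true outcome, model prediction) with 0/1 labels; the group names in the usage note are hypothetical.

```python
from collections import defaultdict

def subgroup_sensitivity(records):
    """Per-subgroup true-positive rate for binary AI outputs.

    records: iterable of (subgroup, y_true, y_pred) tuples with 0/1 labels.
    Returns {subgroup: sensitivity}; groups with no positive cases are
    omitted, since sensitivity is undefined for them.
    """
    tp = defaultdict(int)   # true positives per group
    pos = defaultdict(int)  # positive (y_true == 1) cases per group
    for group, y_true, y_pred in records:
        if y_true == 1:
            pos[group] += 1
            tp[group] += y_pred
    return {g: tp[g] / pos[g] for g in pos}
```

A large gap between groups (say, 0.95 for one population and 0.70 for another) is the signal to escalate to the vendor and delay deployment for the underperforming group, regardless of the tool's aggregate accuracy.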

Data Governance

Build your data governance program around these requirements:

Requirement | Governing authority | Key obligation
PHI handling | HIPAA Security Rule, 45 C.F.R. Part 164 | Risk analysis, access controls, encryption
Breach notification (state) | NDCC § 51-30-02 | Notify the Attorney General and affected individuals
Breach notification (federal) | 45 C.F.R. § 164.408 | Notify HHS OCR (contemporaneously for breaches of 500 or more)
Facility data policies | NDCC Title 23 / ND DHHS | Documented policies and procedures

Data sharing with AI vendors for model training or improvement requires careful BAA drafting. Confirm that the vendor's use of your patient data for model retraining is explicitly addressed and limited.

Clinical Validation and Qualified Oversight

An AI tool authorized by the FDA was validated on a specific population under specific conditions. Facility responsibility does not end at purchase. Establish an internal validation process: run the tool in parallel with standard clinical workflows before full deployment, track performance metrics, and assign a qualified clinician to review outputs. Document that process. If an issue arises, this documentation serves as a defense.
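The parallel-run step can be supported by a simple shadow-mode log that records the AI output alongside the independent clinician decision for each case, then reports agreement and flags disagreements for review. A minimal sketch under that assumption (the field names and decision labels are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ShadowCase:
    case_id: str
    ai_output: str           # what the AI tool recommended
    clinician_decision: str  # what the reviewing clinician independently decided

def agreement_rate(cases: list[ShadowCase]) -> float:
    """Fraction of shadow-mode cases where AI and clinician agreed."""
    if not cases:
        return 0.0
    return sum(c.ai_output == c.clinician_decision for c in cases) / len(cases)

def disagreements(cases: list[ShadowCase]) -> list[ShadowCase]:
    """Cases to route to the designated reviewing clinician."""
    return [c for c in cases if c.ai_output != c.clinician_decision]
```

Retaining this log, with the agreement threshold your facility chose for go-live and the disposition of each disagreement, is exactly the kind of documentation that supports a defense later.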

Liability and Accountability

North Dakota medical malpractice law (NDCC ch. 32-42) applies to adverse outcomes regardless of whether an algorithm contributed. Courts have not yet produced a settled framework for AI-specific liability in healthcare; for now, the default assigns responsibility to the treating clinician and the facility. Contracts with AI vendors should address indemnification for device failures, but vendor indemnification may not cover all exposure.

No North Dakota statute currently mandates specific AI disclosure to patients, but the informed consent doctrine and professional standards require that patients receive material information about their care. Disclosing AI involvement in diagnosis or treatment planning is increasingly considered a best practice and may become a legal requirement. Incorporate disclosure language into consent forms now.


Recent Developments and Evolving Guidance in AI Healthcare Regulation

Federal Activity

The Biden Administration's Executive Order on Safe, Secure, and Trustworthy AI (October 2023) directed HHS to develop a strategy for responsible AI use in healthcare. HHS subsequently released a concept paper on AI and healthcare in early 2024 and continues issuing guidance through its Office of the National Coordinator for Health Information Technology (ONC). ONC's final rule on Health Data, Technology, and Interoperability (HTI-1), effective 2024, includes provisions touching on algorithmic transparency for clinical decision support tools (consult ONC for current implementation status).

The FDA has continued refining its SaMD framework and has proposed a lifecycle-based approach to AI/ML oversight that would allow for iterative model updates without full reauthorization under defined conditions. Consult FDA's Digital Health Center of Excellence for current guidance status.

North Dakota Legislative Activity

As of mid-2025, no North Dakota legislation specifically targeting AI in healthcare has been enacted. The state legislature has shown general interest in AI governance (a broad AI task force discussion occurred in recent sessions), but no bill specifically addressing clinical AI has advanced. Monitor the North Dakota Legislative Assembly's bill tracking system at legis.nd.gov for any introduced legislation.

Several states, including Colorado (SB 24-205 on algorithmic discrimination) and California (various AI transparency bills), have moved ahead of North Dakota on AI-specific regulation. Federal preemption questions remain unresolved, but the trend is toward greater transparency and accountability requirements. North Dakota providers who build AI governance, vendor diligence, and documentation practices now will be better positioned if similar requirements reach the state.
