StateReg.Reference

AI Healthcare Regulations in South Dakota: A Comprehensive Guide

Understand the current landscape of AI healthcare regulations in South Dakota. Learn about federal oversight, state-specific considerations, and compliance for AI in medical settings.

Verified April 26, 2026
AI-drafted, human-reviewed

How we verify

Each guide is built from authoritative sources (state legislatures, FAA, IRS, DSIRE, OpenStates, etc.), drafted by AI, edited by a second AI pass, polished, then spot-reviewed by a human before publication.


Quick Answer: AI Healthcare Regulation in South Dakota

No state statute specifically regulates artificial intelligence in South Dakota healthcare as of mid-2025. A review of the South Dakota Legislature's codified laws confirms no enacted chapter dedicated to AI in clinical or health data contexts.

Federal law fills this gap. This includes FDA oversight of software-based medical devices, HIPAA's privacy and security requirements, and ONC interoperability standards. South Dakota's existing medical practice statutes, professional licensing board rules, and general consumer protection law apply to AI tools that interact with licensed practitioners or patient data.

Practically, a South Dakota hospital using an AI diagnostic tool must comply with FDA's Software as a Medical Device framework, maintain HIPAA-compliant data handling, and ensure its clinicians remain accountable under SDCL Title 36 standards of care. No additional state AI permit or registration currently exists.

Federal Framework Governing AI in Healthcare

FDA Oversight of AI and Machine Learning as Medical Devices

The FDA regulates AI and machine learning tools that meet the definition of a medical device under 21 U.S.C. §321(h). Software that analyzes patient data to inform diagnosis or treatment typically qualifies as Software as a Medical Device (SaMD). The agency classifies these tools into three risk tiers:

Class | Risk Level | Regulatory Pathway
Class I | Low | General controls, often exempt from premarket review
Class II | Moderate | 510(k) premarket notification
Class III | High | Premarket Approval (PMA)

The FDA's regulatory framework for these products is primarily in 21 CFR Parts 800 through 892, with device-type-specific subparts. The agency's "Artificial Intelligence and Machine Learning (AI/ML)-Based Software as a Medical Device Action Plan" (published January 2021) outlined a risk-based approach. It also introduced predetermined change control plans, allowing adaptive algorithms to update within pre-approved boundaries without requiring a new submission.
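The predetermined change control idea can be illustrated with a short sketch: an adaptive model update ships only if its post-update metrics stay inside bounds that were fixed in advance. The metric names and thresholds below are hypothetical, not taken from any FDA submission.

```python
# Hypothetical sketch of a predetermined change control check.
# Bounds and metric names are illustrative, not from any FDA filing.

PRE_APPROVED_BOUNDS = {
    "sensitivity": (0.92, 1.00),   # model must stay at or above 0.92
    "specificity": (0.88, 1.00),
    "auc": (0.90, 1.00),
}

def within_change_control(metrics: dict) -> bool:
    """Return True if every monitored metric falls inside its
    pre-approved range, i.e. the update may ship without a new
    submission under this (hypothetical) change control plan."""
    for name, (low, high) in PRE_APPROVED_BOUNDS.items():
        value = metrics.get(name)
        if value is None or not (low <= value <= high):
            return False
    return True

candidate = {"sensitivity": 0.94, "specificity": 0.90, "auc": 0.93}
print(within_change_control(candidate))  # True: inside pre-approved bounds
```

An update that drifts outside any bound, or stops reporting a monitored metric, would fail the check and trigger a new regulatory review under such a plan.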

A 2026 analysis of FDA-authorized oncology AI and ML devices found that clinical evidence supporting these authorizations varies considerably in study design and rigor (Litt H et al., Journal of Cancer Policy, 2026, PMID 42025919). A separate study of orthopedic AI/ML devices found that few FDA-cleared tools have peer-reviewed validation studies or EU MDR equivalents, raising questions about evidence standards across specialties (Bracken A et al., Clinical Orthopaedics and Related Research, 2025, PMID 41915013). Both findings are relevant for South Dakota providers evaluating vendor claims about cleared AI tools.

HIPAA Compliance for AI Systems

Any AI application that processes, stores, or transmits Protected Health Information (PHI) is subject to HIPAA's Privacy Rule (45 CFR Part 164, Subpart E) and Security Rule (45 CFR Part 164, Subpart C). The transaction and code set standards are located at 45 CFR Part 162.

Key considerations for AI include:

  • Business Associate Agreements (BAAs) must cover AI vendors who handle PHI on behalf of a covered entity (45 CFR §164.502(e)).
  • The Security Rule requires administrative, physical, and technical safeguards. This includes access controls and audit logs that must extend to AI system infrastructure.
  • De-identification under 45 CFR §164.514 is a common approach for training datasets, but the standard is strict. Safe harbor de-identification requires removing 18 specific identifiers.
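As a rough illustration of the safe-harbor approach, the sketch below drops record fields that fall into identifier categories. The field names and category list are abbreviated and hypothetical; the actual standard at 45 CFR §164.514(b)(2) enumerates 18 categories and also requires no actual knowledge that residual data could identify an individual.

```python
# Illustrative sketch only: drops fields matching a few of HIPAA's
# 18 safe-harbor identifier categories. Field names are hypothetical;
# real de-identification must cover all 18 categories plus the
# "no actual knowledge" condition in 45 CFR 164.514(b)(2).

IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "health_plan_number", "full_face_photo",
}

def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record with enumerated identifier
    fields removed and dates generalized to year only."""
    out = {}
    for key, value in record.items():
        if key in IDENTIFIER_FIELDS:
            continue
        if key.endswith("_date"):      # dates more specific than year must go
            out[key] = str(value)[:4]  # keep year only (naive ISO-date assumption)
        else:
            out[key] = value
    return out

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "admit_date": "2025-03-14",
    "diagnosis_code": "E11.9",
}
print(strip_identifiers(patient))
# {'admit_date': '2025', 'diagnosis_code': 'E11.9'}
```

Note that dropping enumerated fields is only half the standard; free-text fields can still leak identifiers, which is why many organizations prefer the expert determination pathway for AI training datasets.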

Enforcement is handled by the HHS Office for Civil Rights (OCR). South Dakota providers are subject to the same penalty tiers as any other state, ranging from $100 to $50,000 per violation category under 45 CFR §160.404.

ONC and Interoperability Standards

The Office of the National Coordinator for Health Information Technology (ONC) does not regulate AI directly. However, its certification program for Electronic Health Record (EHR) technology under 45 CFR Part 170 shapes the data environment AI tools operate in. ONC's information blocking rules (45 CFR Part 171), finalized under the 21st Century Cures Act, affect how health data flows between systems. This directly impacts AI training pipelines and real-time clinical decision support integration. South Dakota providers using ONC-certified EHR systems already operate within this framework.

South Dakota's Existing Healthcare Laws and Their Application to AI

Medical Practice Acts and Standard of Care

The South Dakota Board of Medical and Osteopathic Examiners licenses and disciplines physicians under SDCL Title 36, Chapter 4. The board has not issued specific guidance on AI use in clinical practice as of mid-2025. Consult the South Dakota Board of Medical and Osteopathic Examiners directly for any interim policy statements.

Licensed physicians retain professional responsibility for clinical decisions. Using an AI tool does not shift the physician's liability for erroneous recommendations onto the software vendor. The standard of care under SDCL §36-4-30 (unprofessional conduct provisions) requires physicians to exercise appropriate clinical judgment, even when relying on AI output.

Data Privacy and Security

South Dakota does not have a comprehensive consumer data privacy law equivalent to California's CCPA or Virginia's CDPA as of mid-2025. The state's primary data breach notification requirement is found at SDCL §22-40-20. This statute requires notification to affected residents following a breach of personal information, which includes categories relevant to patient records.

For health data specifically, HIPAA preempts state law where state law is less protective (45 CFR §160.203). South Dakota's breach statute does not extend HIPAA's protections in meaningful ways, so federal rules remain the primary floor.

Telehealth Regulations

South Dakota has codified telehealth provisions at SDCL §34-12-1.1 and §34-12-1.2. These establish that telehealth services must meet the same standard of care as in-person services. AI tools embedded in telehealth platforms, such as symptom checkers, remote monitoring algorithms, or AI-assisted video triage, fall under this standard. A telehealth provider using AI in South Dakota cannot claim a lower duty of care because the interaction was software-mediated.

Consumer Protection

The South Dakota Consumer Protection Act (SDCL Chapter 37-24) prohibits deceptive acts or practices in trade or commerce. An AI vendor making unsupported efficacy claims to a South Dakota hospital or patient population could face scrutiny under this chapter. The Attorney General's Consumer Protection Division enforces these provisions.

Professional Licensing Boards

Beyond medicine, the South Dakota Board of Nursing (operating under SDCL Title 36, Chapter 9) and the South Dakota State Board of Pharmacy (SDCL Title 36, Chapter 11) govern how their licensees use technology in practice. Neither board has published AI-specific rules as of mid-2025. Consult each board directly for current policy positions on AI-assisted clinical tasks.

Ethical Considerations and Best Practices for AI in SD Healthcare

Bias and Fairness

South Dakota's patient population includes significant rural communities and federally recognized tribal nations. AI tools trained predominantly on urban or non-Native datasets may perform poorly or inequitably for these groups. The American Medical Association's policy on augmented intelligence (AMA Policy H-480.940) calls for developers to test algorithms across diverse populations before deployment. The NIST AI Risk Management Framework (NIST AI RMF 1.0, January 2023) provides a structured approach to identifying and documenting bias risks across the AI lifecycle. It is a practical starting point for South Dakota health systems building vendor evaluation criteria.

Transparency and Explainability

Clinicians cannot exercise professional judgment over a recommendation they cannot interrogate. When evaluating AI tools, South Dakota providers should require vendors to explain, in plain terms, what inputs drive outputs and what the model's known failure modes are. The American College of Physicians has emphasized that AI tools used in clinical decision support should be interpretable to the clinician at the point of care (ACP Ethics Manual, 7th Edition).
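For a linear model, the plain-terms explanation described above can be as simple as listing each input's contribution to the score. The features and weights below are invented for illustration; real clinical models are rarely this transparent, which is exactly why vendor-supplied explanations matter.

```python
# Illustrative only: per-feature contributions for a linear risk score.
# Feature names and weights are invented, not from any clinical model.

WEIGHTS = {"age": 0.03, "systolic_bp": 0.02, "hba1c": 0.40}

def explain_score(inputs: dict) -> list:
    """Return (feature, contribution) pairs, largest magnitude first,
    so a clinician can see what drove the score."""
    contribs = [(name, WEIGHTS[name] * inputs[name]) for name in WEIGHTS]
    return sorted(contribs, key=lambda pair: abs(pair[1]), reverse=True)

patient = {"age": 70, "systolic_bp": 150, "hba1c": 8.5}
for feature, contribution in explain_score(patient):
    print(f"{feature}: {contribution:+.2f}")
# hba1c: +3.40
# systolic_bp: +3.00
# age: +2.10
```

A vendor that cannot produce at least this level of attribution for its model's outputs leaves the clinician unable to satisfy the interpretability expectation described above.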

Patient Safety and Accountability

Clear accountability chains are crucial before an adverse event. Contracts with AI vendors should specify who bears responsibility for errors, what postmarket surveillance the vendor conducts, and how errors are reported. For FDA-regulated devices, postmarket reporting obligations under 21 CFR Part 803 (Medical Device Reporting) apply to manufacturers and, in some cases, to facilities.

Data Governance

Training data quality determines model quality. South Dakota health systems should maintain documented data governance policies covering data provenance, access controls, retention schedules, and de-identification procedures. The NIST AI RMF's "Govern" function provides a practical checklist for these policies.
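One minimal way to make such a policy auditable is to record it as structured data rather than prose. The fields below loosely mirror the governance topics named above (provenance, access, retention, de-identification); this is an illustrative structure, not the NIST AI RMF's official schema.

```python
# Illustrative structure for a dataset governance record.
# Fields mirror the topics above; not an official NIST AI RMF schema.
from dataclasses import dataclass

@dataclass
class DatasetGovernanceRecord:
    dataset_name: str
    provenance: str                 # where the data came from
    access_roles: list              # who may read it
    retention_years: int            # how long it is kept
    deidentification_method: str    # e.g. "safe harbor", "expert determination"

    def gaps(self) -> list:
        """List governance fields that are still empty or unset."""
        missing = []
        if not self.provenance:
            missing.append("provenance")
        if not self.access_roles:
            missing.append("access_roles")
        if self.retention_years <= 0:
            missing.append("retention_years")
        if not self.deidentification_method:
            missing.append("deidentification_method")
        return missing

record = DatasetGovernanceRecord(
    dataset_name="oncology_training_v1",
    provenance="EHR extract, 2020-2024",
    access_roles=["data_steward", "ml_engineer"],
    retention_years=7,
    deidentification_method="safe harbor",
)
print(record.gaps())  # [] means no gaps in this example
```

Running `gaps()` across every training dataset gives a compliance team a quick inventory of where documentation is missing before an audit, rather than after.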

No South Dakota statute currently requires specific informed consent disclosures for AI use in clinical care. However, general informed consent principles under SDCL §20-9-16 (medical consent) and common law require patients to receive material information about their treatment. Whether AI involvement in diagnosis or treatment planning is "material" is an evolving question. The conservative and defensible position is to disclose AI use when it meaningfully influences a clinical recommendation.

Future Outlook: Anticipated Regulatory Developments for AI in Healthcare

Federal Initiatives

The White House Executive Order on Safe, Secure, and Trustworthy AI (Executive Order 14110, October 2023) directed HHS to develop a strategy for responsible AI use in healthcare. HHS released its AI Strategy in 2024, signaling increased federal attention to clinical AI governance. NIST's role in developing AI standards under the AI Risk Management Framework is expected to inform future FDA guidance and potentially CMS coverage and reimbursement policy.

Congress has introduced multiple AI-related bills in recent sessions, though none specific to healthcare AI had been enacted at the federal level as of mid-2025. Providers should monitor the FDA's Digital Health Center of Excellence for updated guidance documents, as the agency has indicated it will issue more specific rules for adaptive AI algorithms.

Several states have moved ahead of South Dakota on AI regulation. Colorado enacted SB 24-205 (effective 2026), which imposes requirements on developers of "high-risk" AI systems, including those used in healthcare. California has passed multiple bills addressing automated decision-making. These models are likely to inform future South Dakota legislative discussions, particularly if federal legislation stalls.

The National Conference of State Legislatures tracks AI-related bills across all states, and its database is an efficient way to monitor comparative state activity. South Dakota's Legislative Research Council could be tasked with a study of AI regulation, a common precursor to legislation in smaller-population states.

Industry Standards and Self-Regulation

The American Medical Association, the American College of Radiology (which has published its own AI appropriateness criteria), and the Healthcare Information and Management Systems Society (HIMSS) are all developing guidance that may precede formal regulation. For South Dakota providers, adherence to these standards now reduces regulatory risk later and demonstrates a good-faith compliance posture.

Resources and Compliance Assistance for AI in South Dakota Healthcare

South Dakota State Agencies

South Dakota Department of Health Website: doh.sd.gov Role: General healthcare facility oversight, public health standards. No AI-specific guidance published as of mid-2025. Contact the department directly for current health technology policy positions.

South Dakota Board of Medical and Osteopathic Examiners Website: sdbmoe.gov Contact for questions about professional responsibility and standard of care when using AI tools in clinical practice.

South Dakota Board of Nursing Website: nursing.sd.gov

South Dakota State Board of Pharmacy Website: pharmacy.sd.gov

South Dakota Attorney General, Consumer Protection Division Website: atg.sd.gov/consumers Relevant for AI vendor deceptive practice complaints under SDCL Chapter 37-24.

Federal Agencies

FDA Digital Health Center of Excellence Website: fda.gov/medical-devices/digital-health-center-excellence Primary resource for SaMD classification, 510(k) submissions, and AI/ML guidance documents.

HHS Office for Civil Rights (HIPAA Enforcement) Website: hhs.gov/ocr File HIPAA complaints, access compliance guidance, and review recent enforcement actions.

Office of the National Coordinator for Health Information Technology (ONC) Website: healthit.gov Interoperability standards, EHR certification, and information blocking rules.

Professional Associations

  • American Medical Association (ama-assn.org): AI policy statements and augmented intelligence guidance
