AI Healthcare Regulations in Alabama (2025 Guide)

Understand AI healthcare regulations in Alabama: state laws, federal overlap, licensing rules, liability, and compliance steps for providers and vendors.

Last updated April 21, 2026 · 3 statute sources

Alabama has no dedicated AI-in-healthcare statute as of mid-2025. Existing state medical practice, telehealth, and data privacy laws apply to AI clinical tools by extension. Federal frameworks from the FDA, HHS, and FTC also apply. Providers and vendors must satisfy state and federal requirements simultaneously.

Quick Answer: Is AI in Healthcare Regulated in Alabama?

Yes, but not through a single, purpose-built law. Alabama has not enacted a standalone AI healthcare statute. A review of the 2025 and 2026 regular sessions confirms no healthcare AI bills advanced. The notable legislative activity in those sessions was limited to education appropriations (HB 169, 2025rs; HB 238 and SB 141, 2026rs).

This absence of specific legislation does not create a regulatory vacuum. Three layers of law govern AI tools used in Alabama clinical settings:

Layer 1: Alabama state law applied by extension. The Alabama Medical Practice Act (Ala. Code Title 34), the Alabama Telehealth Act (Ala. Code § 34-24-500 et seq.), and the Alabama Data Breach Notification Act (Ala. Code § 8-38-1 et seq.) all reach AI-assisted clinical tools despite predating them. The Alabama Board of Medical Examiners sets scope-of-practice rules that determine whether an AI recommendation crosses into the practice of medicine.

Layer 2: Federal device and privacy regulation. The FDA's Software as a Medical Device (SaMD) framework governs whether an AI tool requires premarket clearance. HIPAA's Privacy and Security Rules (45 CFR Parts 160 and 164) apply when a tool processes protected health information. Section 5 of the FTC Act reaches deceptive health claims made by vendors.

Layer 3: Federal civil rights and interoperability rules. HHS Office for Civil Rights guidance under Section 1557 of the ACA now explicitly addresses algorithmic discrimination. The 21st Century Cures Act information-blocking provisions apply to AI tools embedded in certified health IT.

A hospital deploying an AI diagnostic imaging tool in Birmingham faces FDA clearance requirements, HIPAA Business Associate Agreement obligations, Alabama Board of Medical Examiners scope-of-practice constraints, and potential Section 1557 scrutiny. The FDA's 2021 AI/ML-Based Software as a Medical Device action plan remains the primary federal roadmap for device regulation.


Alabama State Laws That Directly Affect AI Healthcare Tools

Alabama Telehealth Act

The Alabama Telehealth Act (Ala. Code § 34-24-500 through § 34-24-507) governs healthcare services delivered via electronic communications. AI-driven remote diagnostic tools, symptom checkers, and automated patient triage platforms can fall within the statute's scope when used to facilitate a clinical encounter between a patient and a licensed provider.

The Act requires that telehealth services meet the same standard of care as in-person services. This requirement extends to AI-assisted encounters. If a provider relies on an AI-generated recommendation during a telehealth visit, the provider remains responsible for ensuring that recommendation meets the applicable standard of care. The Alabama Board of Medical Examiners has codified telemedicine-specific rules in Administrative Code Chapter 540-X-9, which address prescribing, patient-provider relationships, and documentation. Consult the Alabama Board of Medical Examiners for current rule text and guidance on AI-assisted telehealth workflows.

Clinical Judgment Delegation and Scope of Practice

The Alabama Medical Licensure Commission and the Alabama Board of Medical Examiners (Ala. Code § 34-24-50 et seq.) control who may exercise clinical judgment in Alabama. Autonomous AI tools that generate diagnoses or recommend treatments without physician oversight raise a core question: is the AI performing acts that constitute the practice of medicine? Alabama has not issued a formal opinion on this question as of mid-2025. Until it does, the conservative position is that a licensed physician must review and take responsibility for any AI-generated clinical recommendation before it reaches the patient.

Alabama Pharmacy Practice Act

AI-assisted prescription tools, automated dispensing systems, and clinical decision support integrated into pharmacy workflows are subject to the Alabama Pharmacy Practice Act and the Alabama Board of Pharmacy's rules. The Board has authority over the practice of pharmacy, including automated systems used in dispensing. Consult the Alabama Board of Pharmacy for any guidance on AI-assisted pharmacy tools, as no formal rulemaking specific to AI had been published as of mid-2025.

Mental Health and Behavioral Health AI

The Alabama Mental Health Act and the Alabama Department of Mental Health's oversight framework apply to AI-assisted behavioral health screening tools deployed in licensed facilities. An AI chatbot that conducts mental health triage or generates a clinical impression in a licensed behavioral health setting is subject to the same professional standards as a human clinician performing the same function. Consult the Alabama Department of Mental Health for facility-specific guidance.

Alabama Data Breach Notification Act

The Alabama Data Breach Notification Act (Ala. Code § 8-38-1 et seq.) requires covered entities and their third-party agents to notify affected individuals and the Alabama Attorney General when a breach of sensitive personally identifying information occurs. Healthcare AI vendors that process protected health information are captured under the "third-party agent" definition. If an AI system is breached and patient data is exposed, the notification clock starts. The Act requires notification without unreasonable delay and no later than 45 days after determination that a breach has occurred (Ala. Code § 8-38-6). This obligation is in addition to HIPAA breach notification requirements.
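As a concrete illustration of the 45-day outer bound, a minimal deadline calculation in Python (the statute's "without unreasonable delay" requirement still applies, so the practical deadline may be earlier than this date):

```python
from datetime import date, timedelta

# Ala. Code § 8-38-6: notify without unreasonable delay, and in no
# event later than 45 days after determining a breach has occurred.
AL_NOTIFICATION_WINDOW_DAYS = 45

def al_notification_deadline(determination_date: date) -> date:
    """Latest permissible notification date under the Alabama Data
    Breach Notification Act. This is only the outer bound; the Act
    may still require acting sooner."""
    return determination_date + timedelta(days=AL_NOTIFICATION_WINDOW_DAYS)

# A breach determined June 1, 2025 must be reported by July 16, 2025.
print(al_notification_deadline(date(2025, 6, 1)))  # 2025-07-16
```

Remember that the HIPAA Breach Notification Rule runs on its own clock (60 days for individual notice), so a vendor handling PHI tracks both deadlines independently.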


Federal Regulations Alabama Providers and Vendors Must Follow

FDA Software as a Medical Device Framework

The FDA classifies AI tools that meet the definition of a medical device under 21 U.S.C. § 321(h) as Software as a Medical Device. Depending on risk classification, a tool may require 510(k) premarket notification, De Novo authorization, or Premarket Approval. The quality system requirements at 21 CFR Part 820 apply to device manufacturers, including developers of AI software that meets the device definition.

The 21st Century Cures Act (Pub. L. 114-255) created a clinical decision support software exemption at section 520(o) of the Federal Food, Drug, and Cosmetic Act (codified at 21 U.S.C. § 360j(o)), carving out certain CDS tools from device regulation. To qualify, tools must display the basis for recommendations and allow clinicians to independently review that basis. The FDA's final Clinical Decision Support Software guidance (September 2022) defines the exemption's boundaries. Tools that fall outside the exemption, including most AI diagnostic imaging software and autonomous clinical recommendation engines, require premarket review.

HIPAA Privacy and Security Rules

Any AI vendor that receives, creates, maintains, or transmits protected health information on behalf of a covered entity is a Business Associate under 45 CFR § 160.103. A signed Business Associate Agreement is legally required before the vendor may access PHI (45 CFR § 164.308(b)). The Security Rule's administrative, physical, and technical safeguard requirements (45 CFR Part 164, Subpart C) apply to the AI system. Risk analysis requirements mean providers must assess the specific security risks introduced by each AI tool they deploy.

FTC Act Section 5

AI health tool vendors making clinical efficacy, accuracy, or safety claims that are not substantiated by competent and reliable scientific evidence face enforcement under Section 5 of the FTC Act for unfair or deceptive practices. The Act is relevant for direct-to-consumer AI health apps and for vendors whose marketing claims to providers exceed clinical validation.

21st Century Cures Act: Information Blocking

The information-blocking provisions of the 21st Century Cures Act (Pub. L. 114-255) prohibit practices that interfere with the access, exchange, or use of electronic health information. AI tools embedded in certified health IT systems that restrict data portability or create proprietary lock-in affecting patient data access may trigger information-blocking liability. The Office of the National Coordinator for Health Information Technology (ONC) defines the information-blocking rules and exceptions; the HHS Office of Inspector General enforces civil monetary penalties against health IT developers and exchanges, and providers face disincentives administered through CMS.

CMS Conditions of Participation

Hospitals participating in Medicare and Medicaid must meet CMS Conditions of Participation (42 CFR Part 482). AI tools integrated into clinical workflows at participating hospitals must be consistent with those conditions, particularly around medical staff oversight, patient rights, and quality assessment. CMS has not issued AI-specific CoP guidance as of mid-2025, but existing requirements apply.

HHS OCR Section 1557 and Algorithmic Bias

The HHS Office for Civil Rights 2024 final rule under Section 1557 of the ACA (89 Fed. Reg. 37522, May 6, 2024) explicitly addresses the use of patient care decision support tools, including algorithmic tools. Covered entities that use AI tools in clinical decision-making must ensure those tools do not discriminate on the basis of race, color, national origin, sex, age, or disability. The rule requires covered entities to take reasonable steps to identify and mitigate discriminatory outputs from AI systems they deploy.


Liability and Malpractice Considerations for AI-Assisted Care in Alabama

Standard of Care Under the Alabama Medical Liability Act

The Alabama Medical Liability Act (Ala. Code § 6-5-480 et seq.) governs medical malpractice claims. The standard of care is defined at Ala. Code § 6-5-484 as the reasonable care, skill, and treatment recognized by reasonably competent healthcare providers as appropriate under similar conditions.

When an AI tool contributes to a negative patient outcome, the analysis focuses on the provider's conduct. Did the provider exercise reasonable judgment in selecting the tool, interpreting its output, and applying it to the patient's circumstances? A provider who blindly follows an AI recommendation without independent clinical review is exposed. A provider who uses an AI tool as one input among several and documents the reasoning is in a stronger position.

Negligence Per Se and Non-Cleared AI Tools

Using an AI diagnostic tool that requires FDA clearance but has not received it may support a negligence per se argument in Alabama malpractice litigation. Alabama courts recognize negligence per se when a defendant violates a statute designed to protect a class of persons that includes the plaintiff, and that violation causes the plaintiff's injury. Deploying a non-cleared, device-grade AI tool fits that framework. While not settled case law in Alabama, this is a credible litigation theory.

Informed Consent and AI Disclosure

Alabama's informed consent doctrine, developed through cases including Fain v. Smith, 479 So. 2d 1150 (Ala. 1985), requires providers to disclose material information that a reasonable patient would want to know before consenting to treatment. Whether AI involvement in diagnosis or treatment planning is material information that must be disclosed is an open question in Alabama. No statute or board rule currently mandates AI disclosure. Providers should consider disclosing when AI plays a substantive role in diagnosis or treatment recommendations, particularly for high-stakes decisions.

Vendor Indemnification

Vendor contracts for AI healthcare tools routinely include indemnification clauses, limitation-of-liability provisions, and warranty disclaimers. These are enforceable under Alabama contract law, subject to general public policy limits. Providers should negotiate for indemnification coverage when a vendor's tool malfunctions and causes patient harm. A vendor's FDA clearance does not automatically transfer liability away from the provider. Alabama has no AI-specific liability safe harbor statute.

Professional Liability Insurance

Professional liability carriers are increasingly asking about AI tool usage in renewal applications. Providers should disclose AI tool deployment accurately, confirm coverage applies to AI-assisted clinical decisions, and request written confirmation if uncertain. Failure to disclose material information about AI tool use could affect coverage in a claim.


Compliance Comparison: Key Requirements by AI Tool Category

| AI Tool Type | FDA Oversight Level | Alabama Board Notification Required | HIPAA BAA Needed | Estimated Compliance Timeline |
| --- | --- | --- | --- | --- |
| Clinical Decision Support (non-device, meets § 520(o) exemption) | None (exempt) | No codified mandate; consult ABME for informal guidance | Yes, if PHI processed | 4–8 weeks |
| Clinical Decision Support (device-grade, outside § 520(o)) | 510(k) or De Novo required | No codified mandate; consult ABME | Yes | 6–18 months (FDA review dependent) |
| AI Diagnostic Imaging | 510(k) or PMA required | No codified mandate; consult ABME | Yes | 6–24 months |
| AI-Powered Telehealth Platform | Varies by function; may be exempt or device-grade | No codified mandate; consult ABME; review 540-X-9 | Yes | 3–12 months |
| AI Chatbot for Mental Health Triage | Varies; likely device-grade if generating clinical impressions | No codified mandate; consult ABME and ADMH | Yes | 6–18 months |
| AI-Assisted Billing and Coding | Generally not a device | No | Yes, if PHI processed | 2–6 weeks |

"Alabama Board Notification Required" reflects current best practice, not a codified mandate. No Alabama statute or rule as of mid-2025 requires providers to notify the Alabama Board of Medical Examiners before deploying an AI clinical tool. This column reflects the recommended step of seeking informal guidance before deployment of autonomous clinical AI, pending future rulemaking.

The FDA's September 2022 Clinical Decision Support Software final guidance and the section 520(o) exemption criteria (21 U.S.C. § 360j(o)) are the primary tools for determining whether a tool escapes device classification. Key factors for the exemption include whether the tool allows a clinician to independently review the basis for its recommendations and whether it is intended to support, not replace, clinical judgment.
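The exemption factors above lend themselves to a first-pass screening checklist. The sketch below encodes them as boolean fields; the field names and the `CdsScreen` type are this article's illustrative shorthand, not FDA terminology, and a `True` result is a starting point for counsel review, never a legal conclusion:

```python
from dataclasses import dataclass

@dataclass
class CdsScreen:
    """Illustrative screening record; field names are shorthand for
    factors in the FDA's September 2022 CDS guidance, not legal terms."""
    analyzes_medical_images_or_signals: bool   # e.g. radiology pixel data
    displays_basis_for_recommendation: bool    # clinician can independently review
    supports_rather_than_replaces_judgment: bool
    drives_time_critical_decisions: bool       # no practical time to review basis

def likely_outside_device_regulation(t: CdsScreen) -> bool:
    """Rough screen against the section 520(o) criteria. False means
    premarket review is likely needed; True is not a legal conclusion."""
    return (
        not t.analyzes_medical_images_or_signals
        and t.displays_basis_for_recommendation
        and t.supports_rather_than_replaces_judgment
        and not t.drives_time_critical_decisions
    )

# An imaging analysis tool fails the first factor, consistent with the
# point above that most AI diagnostic imaging software needs review.
imaging_tool = CdsScreen(True, True, True, False)
print(likely_outside_device_regulation(imaging_tool))  # False
```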


What Has Changed Recently (2024–2025)

HHS OCR Section 1557 Final Rule (May 2024)

A significant recent development for Alabama healthcare AI is the HHS OCR Section 1557 final rule published at 89 Fed. Reg. 37522 (May 6, 2024). The rule requires covered entities using patient care decision support tools, including AI, to ensure those tools do not produce discriminatory outputs. Covered entities must take reasonable steps to identify and address bias. This rule applies to any Alabama hospital, health system, or insurer receiving federal financial assistance.
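One "reasonable step" a covered entity might take is a periodic disparity screen on a tool's outputs. The sketch below computes the largest gap in favorable-outcome rates across patient groups; the metric, threshold, and group labels are illustrative assumptions, not requirements drawn from the rule:

```python
from collections import defaultdict

def max_rate_gap(decisions):
    """decisions: iterable of (group_label, favorable: bool) pairs.
    Returns the largest difference in favorable-outcome rates across
    groups -- a crude screen, not a substitute for a formal Section
    1557 bias assessment."""
    totals = defaultdict(int)
    favorable = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        favorable[group] += int(ok)
    rates = [favorable[g] / totals[g] for g in totals]
    return max(rates) - min(rates)

# Example: group A favorable in 2 of 3 cases, group B in 1 of 3.
records = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False)]
gap = max_rate_gap(records)
if gap > 0.1:  # illustrative review threshold, not a legal standard
    print(f"flag for review: gap={gap:.2f}")
```

A screen like this only surfaces candidates for review; the rule's "reasonable steps" language contemplates investigation and mitigation, which no single metric can satisfy on its own.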

FDA Laboratory Developed Test Final Rule (May 2024)

The FDA published a final rule at 89 Fed. Reg. 37286 (May 6, 2024) asserting regulatory authority over laboratory developed tests, including AI-powered diagnostic algorithms run in hospital labs. Alabama hospital labs using in-house AI diagnostic tools should assess whether those tools now require FDA premarket review. The rule phases in requirements over several years, but compliance planning is an immediate obligation. Consult the Alabama Department of Public Health for any state laboratory licensing implications.

Executive Order 14110 and HHS Follow-On Actions

Executive Order 14110 (88 Fed. Reg. 75191, Nov. 1, 2023) directed HHS and other agencies to develop AI safety standards. In response, HHS has issued internal AI use policies and begun developing sector-specific guidance. Alabama has not issued a corresponding executive order or state-level AI policy directive as of mid-2025.

Alabama Legislature: No Healthcare AI Bills

The 2025 and 2026 Alabama legislative sessions produced no healthcare AI legislation. The notable bills in those sessions were education appropriations measures (HB 169, 2025rs; HB 238 and SB 141, 2026rs).

Model Law Pressure from Other States

Colorado enacted SB 205 (2024), requiring developers of high-risk AI systems to use reasonable care to avoid algorithmic discrimination. Utah enacted HB 452 (2024), addressing AI transparency and disclosure. These statutes create frameworks that legislators in other states may reference. Alabama providers with multi-state operations should assess compliance with these laws, as they may signal the direction of future legislation.

Alabama Medicaid Agency

As of mid-2025, the Alabama Medicaid Agency had not issued formal guidance specific to AI-generated prior authorization or utilization management. Consult the Alabama Medicaid Agency Administrative Code Chapter 560-X and the agency's bulletin archive for updates. The CMS final rule on prior authorization (CMS-0057-F, effective January 2024) imposes interoperability and transparency requirements on Medicaid managed care plans, which have downstream implications for AI tools used by Alabama Medicaid contractors.


Next Steps: How to Achieve Compliance and Who to Contact in Alabama

Six-Step Compliance Checklist

Step 1: Classify your AI tool using the FDA's SaMD framework before deployment. Use the FDA's September 2022 Clinical Decision Support Software final guidance and the section 520(o) exemption criteria to determine whether the tool qualifies for the clinical decision support exemption or requires premarket review.