StateReg.Reference

Maryland AI Healthcare Regulations: A Comprehensive Guide

Navigate Maryland's evolving regulations for Artificial Intelligence in healthcare. Understand state-specific compliance, federal oversight, and future trends impacting medical AI. Essential for providers and developers.

Verified April 26, 2026
AI-drafted, human-reviewed



Quick Answer: Maryland's Stance on AI in Healthcare

Maryland's regulatory landscape for AI in healthcare is still developing and somewhat scattered. As of mid-2025, no single Maryland law specifically governs AI in clinical settings. Instead, healthcare providers and AI developers must manage compliance through three overlapping layers: federal agency rules (FDA, HIPAA, ONC), existing Maryland health and data privacy laws, and a growing number of proposed state bills.

The Maryland Department of Health (MDH) is the main state agency overseeing healthcare delivery and health data management. Currently, MDH has not issued specific guidance or advisories for using AI in clinical settings. For the latest updates, consult MDH directly at health.maryland.gov.

In recent legislative sessions, the Maryland General Assembly has introduced bills related to AI transparency and automated decision-making, but none that specifically targets clinical AI has been enacted. The House Health and Government Operations Committee is the key committee to watch for health-related AI legislation.

At the federal level, the FDA, the Department of Health and Human Services (HHS) Office for Civil Rights (OCR), and the Office of the National Coordinator for Health Information Technology (ONC) set the fundamental rules that entities in Maryland must follow. Maryland's state-level actions supplement, rather than replace, these federal requirements.


Federal Frameworks Guiding AI in Maryland Healthcare

Federal laws provide most of the enforceable rules for AI in healthcare. Before considering state-specific details, Maryland providers and developers must understand three key federal areas.

FDA and Software as a Medical Device (SaMD)

The FDA regulates AI and machine learning (ML) tools that qualify as Software as a Medical Device under 21 U.S.C. §321(h) and related guidance. The agency's 2021 AI/ML-Based Software as a Medical Device Action Plan outlines pre-market approval processes (510(k), De Novo, PMA) and highlights the need for post-market monitoring systems designed for adaptive algorithms.

A 2026 analysis of FDA-approved oncology AI and ML devices found that the clinical evidence supporting many of these devices is limited in scope and generalizability (Litt H et al., Journal of Cancer Policy, 2026 Apr 21 [42025919]). This finding suggests that for Maryland oncology centers using these tools, FDA clearance is a basic regulatory requirement, not a full clinical endorsement.

Algorithms used in wearable devices and remote monitoring present similar challenges. Research on algorithms estimating blood volume decompensation showed that transfer learning can improve how well these algorithms work across different patient groups. However, significant validation is still needed before algorithms trained on one demographic can reliably perform on another (Tangolar D et al., Computers in Biology and Medicine, 2026 May 15 [41955753]). Given Maryland's diverse population, including many immigrant communities in the Baltimore-Washington corridor, this gap in generalizability is a direct patient safety concern.

Post-market surveillance requirements under 21 CFR Part 822 and the FDA's proposed framework for predetermined change control plans apply to any FDA-regulated AI device used in Maryland facilities.

HIPAA Privacy and Security Rules

Most clinical AI tools, such as diagnostic algorithms, predictive risk models, and AI-assisted documentation systems, handle Protected Health Information (PHI). These tools are therefore subject to HIPAA's Privacy and Security Rules (45 CFR Parts 160 and 164).

Key HIPAA requirements relevant to AI include:

  • Business Associate Agreements (BAAs) are required between covered entities and AI vendors that access, process, or store PHI (45 CFR §164.502(e)).
  • The Security Rule's requirements for administrative, physical, and technical safeguards (45 CFR §164.308, §164.310, §164.312) apply to AI systems and their data handling processes.
  • Breach notification rules under 45 CFR §164.400-414 are triggered if an AI system fails or is compromised, leading to unauthorized PHI disclosure.
  • De-identification standards under 45 CFR §164.514(b) determine whether data used to train or validate AI models is truly outside HIPAA's scope.

HHS OCR enforces these rules. State data privacy laws add to, but do not replace, HIPAA compliance.
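The de-identification standard in the last bullet is often operationalized as an automated pre-screen before records enter a model training set. The sketch below checks a record against a few of the Safe Harbor identifier categories; the field names are hypothetical, and a real pipeline would have to cover all 18 categories listed in 45 CFR §164.514(b)(2) (or rely on expert determination instead):

```python
import re

# Hypothetical subset of the 18 Safe Harbor identifier categories
# (45 CFR §164.514(b)(2)); the field names here are illustrative only.
DIRECT_IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "email", "ssn",
    "mrn", "health_plan_id", "device_serial", "ip_address",
}

FULL_ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")

def safe_harbor_violations(record: dict) -> list[str]:
    """Return fields that would block Safe Harbor de-identification."""
    problems = [f for f in DIRECT_IDENTIFIER_FIELDS if record.get(f)]
    # Safe Harbor permits at most the first 3 digits of a ZIP code
    # (and only when that 3-digit area contains > 20,000 people).
    if FULL_ZIP_RE.match(record.get("zip", "")):
        problems.append("zip (must be truncated to 3 digits)")
    # Ages over 89 must be aggregated into a single "90 or older" category.
    if record.get("age", 0) > 89:
        problems.append("age (must be binned as 90+)")
    return problems
```

A screen like this is a coarse filter, not a compliance determination: whether a dataset is "truly outside HIPAA's scope" still turns on the full regulatory test.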

ONC and Health IT Certification

The ONC 21st Century Cures Act Final Rule (85 Fed. Reg. 25642, codified at 45 CFR Parts 170 and 171) established rules against information blocking and set interoperability standards that affect AI-enabled health IT systems. AI tools integrated into certified Electronic Health Record (EHR) systems must meet the requirements of the ONC's Health IT Certification Program. Developers seeking ONC certification for AI-enabled modules should review the certification criteria published by ONC at healthit.gov.


Maryland-Specific Laws and Initiatives for AI in Healthcare

Existing State Statutes

Maryland does not have a law specifically regulating AI in clinical practice. However, existing laws apply indirectly.

The Maryland Health-General Article (Md. Code Ann., Health-Gen.) covers healthcare licensing, facility operations, and patient rights. Its informed consent provisions (Health-Gen. §5-601 et seq.) are relevant when AI tools influence clinical decisions, raising the question of whether patients have a right to know when algorithmic recommendations affect their care.

The Maryland Confidentiality of Medical Records Act (Health-Gen. §4-301 et seq.) restricts the disclosure of medical records and applies to AI vendors handling patient data under agreements with Maryland-licensed providers.

Data Privacy: Maryland Online Data Privacy Act

The Maryland Online Data Privacy Act (MODPA), passed in 2024 (2024 Md. Laws Ch. 24, codified at Md. Code Ann., Com. Law §14-4601 et seq.), is Maryland's most significant recent data privacy law. MODPA grants consumers rights regarding their personal data, including sensitive data. Health data processed outside HIPAA-covered contexts, such as in wellness apps or consumer-facing AI health tools, falls under MODPA.

MODPA includes provisions on automated decision-making: consumers can opt out of profiling that results in legal or similarly significant effects. AI developers creating consumer health tools in Maryland must comply with MODPA. The Maryland Attorney General's office enforces this law.

Note: MODPA's exemptions for health data align with HIPAA-covered data. Therefore, clinical AI operating within a covered entity framework is primarily governed by HIPAA, not MODPA. Consult the Maryland Attorney General's Consumer Protection Division for guidance on edge cases.

Legislative Activity

The Maryland General Assembly has introduced bills in recent sessions concerning automated decision-making and algorithmic accountability. However, as of mid-2025, none specifically targeting clinical AI have been enacted. To track current and proposed bills, visit the Maryland General Assembly's legislative information portal (mgaleg.maryland.gov). Use search terms like "artificial intelligence," "automated decision," "health data," and "algorithm."

The Health and Government Operations Committee (House) and the Finance Committee (Senate) are the main committees responsible for health-related technology legislation.


Key Regulatory Considerations for AI Developers and Providers in Maryland

Data Governance and Privacy

Establish data governance programs based on HIPAA's Security Rule (45 CFR §164.302 et seq.) as a foundation. Then, incorporate MODPA requirements for non-HIPAA data flows. Key practices include:

  • Document the origin of training datasets, including how data was de-identified and its demographic composition.
  • Sign BAAs with all AI vendors before sharing PHI (45 CFR §164.502(e)).
  • Conduct annual HIPAA Security Risk Analyses (45 CFR §164.308(a)(1)) that specifically include AI systems.
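The first practice above, documenting training-data origin, can be as simple as a structured provenance record kept alongside each dataset. The schema below is an illustrative assumption, not a mandated format; neither HIPAA nor MODPA prescribes one:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetProvenanceRecord:
    """Illustrative provenance entry for one AI training dataset.

    Field names are assumptions for this sketch, not a regulatory schema.
    """
    dataset_name: str
    source_system: str                  # e.g. EHR extract, claims feed
    deidentification_method: str        # "safe_harbor" or "expert_determination"
    covered_by_baa: bool                # was a BAA in place when PHI was shared?
    collection_period: tuple[date, date]
    demographic_composition: dict[str, float] = field(default_factory=dict)

    def coverage_gap(self, expected_groups: set[str]) -> set[str]:
        """Groups expected in the deployment population but absent from training data."""
        return expected_groups - set(self.demographic_composition)
```

Keeping the demographic composition in the record makes the coverage question from the bias discussion below answerable at procurement time rather than after an incident.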

Algorithmic Bias and Equity

Maryland's diverse patient population means an AI tool validated primarily on white, insured individuals may not perform well for Baltimore's uninsured or Medicaid populations. An analysis of Medicaid managed care procurement found that claims of equity performance were often emphasized without strong evidence, a pattern that could apply to AI vendor marketing (Basu S et al., Inquiry, 2026 [42012014]).

Require vendors to provide performance data broken down by race, ethnicity, age, sex, and insurance status before deploying AI tools. Include contract clauses that allow for auditing of algorithmic performance after deployment.
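A subgroup performance review of the kind described above can be sketched in a few lines: compute sensitivity per demographic group and flag any group that trails the best performer. The 10-point gap threshold is an illustrative assumption, not a regulatory standard:

```python
def subgroup_sensitivity(labels, preds, groups):
    """Sensitivity (true-positive rate) per subgroup.

    labels/preds are 0/1 sequences; groups tags each record with a
    subgroup (race, ethnicity, insurance status, etc.). Illustrative only.
    """
    counts = {}  # group -> [true positives, actual positives]
    for y, p, g in zip(labels, preds, groups):
        tp_pos = counts.setdefault(g, [0, 0])
        if y == 1:
            tp_pos[1] += 1
            if p == 1:
                tp_pos[0] += 1
    return {g: tp / pos for g, (tp, pos) in counts.items() if pos}

def disparity_flags(rates: dict[str, float], max_gap: float = 0.10) -> list[str]:
    """Groups whose sensitivity trails the best-performing group by more than max_gap."""
    best = max(rates.values())
    return [g for g, r in rates.items() if best - r > max_gap]
```

The same breakdown belongs in the post-deployment audit clause: a vendor's aggregate accuracy figure can hide exactly the gaps this computes.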

Transparency and Explainability

Maryland does not currently require clinical AI systems to be explainable. The NIST AI Risk Management Framework (NIST AI RMF 1.0, January 2023) offers the most widely recognized voluntary standard. NIST's GOVERN, MAP, MEASURE, and MANAGE functions provide a structured way to document model behavior and communicate uncertainty to clinicians.

Clinicians who rely on AI recommendations must be able to explain the basis for their clinical decisions if questioned in a malpractice case. Simply stating "the algorithm recommended it" is not a sufficient defense.

Clinical Validation and Post-Deployment Monitoring

FDA-regulated AI devices must undergo post-market surveillance according to 21 CFR Part 822. While no Maryland regulation mandates post-deployment performance monitoring for clinical decision support tools not regulated by the FDA, the standard of care increasingly requires it. Develop internal protocols to monitor AI performance, tracking model drift, error rates, and adverse events linked to AI recommendations.
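A minimal internal protocol of this kind might track a rolling error rate against a pre-deployment baseline and alert when performance degrades. The window size and alert threshold below are illustrative assumptions, not values from any regulation or guidance:

```python
from collections import deque

class RollingErrorMonitor:
    """Track a rolling error rate for a deployed AI tool and flag drift.

    A minimal sketch: parameters are assumptions chosen for illustration.
    """

    def __init__(self, window: int = 500, baseline_error: float = 0.05,
                 alert_multiplier: float = 2.0):
        self.outcomes = deque(maxlen=window)   # 1 = error, 0 = correct
        self.threshold = baseline_error * alert_multiplier

    def record(self, prediction, ground_truth) -> None:
        self.outcomes.append(0 if prediction == ground_truth else 1)

    @property
    def error_rate(self) -> float:
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 0.0

    def drift_alert(self, min_samples: int = 100) -> bool:
        """True once enough samples accrue and the error rate exceeds the threshold."""
        return len(self.outcomes) >= min_samples and self.error_rate > self.threshold
```

An alert from a monitor like this would feed the adverse-event tracking described above; it does not replace the structured post-market surveillance that 21 CFR Part 822 requires for FDA-regulated devices.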

Liability and Licensure

Maryland's medical malpractice law (Md. Code Ann., Cts. & Jud. Proc. §3-2A-01 et seq.) does not exempt AI-related errors. Liability is determined by the standard of care. If a physician relies on an AI recommendation that a reasonably competent physician would have questioned, that physician may be held liable.

As of mid-2025, the Maryland Board of Physicians (mbp.state.md.us) has not issued specific guidance on AI. For information on the Board's stance on technology-assisted practice, which is the closest analogue, consult the Board directly.


Recent Developments and Future Outlook for AI Healthcare Regulation in Maryland

Legislative Pipeline (Last 18 Months)

The enactment of MODPA in 2024 is the most significant recent development with direct implications for AI. The law's provisions on opting out of automated decision-making and protections for sensitive data create new compliance requirements for consumer-facing health AI tools.

The 2025 Maryland General Assembly session continued to show interest in legislation concerning AI transparency and accountability. Check mgaleg.maryland.gov for specific bill numbers and their current status, as the legislative calendar changes frequently.

Anticipated Regulatory Directions

Based on national trends and Maryland's legislative history, likely near-term developments include:

  • Mandatory requirements for bias audits for AI tools used in state-funded healthcare programs, including Medicaid managed care.
  • Rules requiring patients to be informed or give consent when AI tools significantly influence clinical decisions.
  • Guidance from MDH on standards for state-licensed facilities when procuring AI systems.
  • Potential participation in multi-state AI policy initiatives led by organizations like the National Governors Association or the National Academy for State Health Policy.

Maryland's Role in National Discussions

Maryland's location near federal agencies (FDA, HHS, ONC) and its concentration of major academic medical centers (University of Maryland Medical System, Johns Hopkins Medicine) position it to influence national AI policy. Stay informed about MDH's participation in federal comment periods on FDA AI/ML guidance updates and ONC rulemaking.


Comparative Overview: Maryland's Approach vs. Federal Guidelines

| Regulatory Focus Area | Federal Framework | Maryland State Approach |
| --- | --- | --- |
| Medical device oversight | FDA SaMD review (510(k), De Novo, PMA); post-market surveillance under 21 CFR Part 822 | No state equivalent; MDH has issued no clinical AI guidance |
| Health data privacy | HIPAA Privacy and Security Rules (45 CFR Parts 160 and 164), enforced by HHS OCR | Confidentiality of Medical Records Act (Health-Gen. §4-301 et seq.); MODPA for non-HIPAA consumer health data |
| Health IT certification and interoperability | ONC Health IT Certification Program; information blocking rules | No state-specific certification requirements |
| Automated decision-making | No general federal opt-out right | MODPA opt-out from profiling with legal or similarly significant effects |
| Liability for AI-related errors | No AI-specific federal liability regime | Standard of care under existing malpractice law (Cts. & Jud. Proc. §3-2A-01 et seq.) |
