{"id":868,"date":"2026-05-08T14:06:14","date_gmt":"2026-05-08T18:06:14","guid":{"rendered":"https:\/\/blogs.duanemorris.com\/healthlaw\/?p=868"},"modified":"2026-05-08T14:06:15","modified_gmt":"2026-05-08T18:06:15","slug":"no-ai-law-no-problem-pennsylvania-applies-its-medical-practice-act-to-an-ai-chatbot","status":"publish","type":"post","link":"https:\/\/blogs.duanemorris.com\/healthlaw\/2026\/05\/08\/no-ai-law-no-problem-pennsylvania-applies-its-medical-practice-act-to-an-ai-chatbot\/","title":{"rendered":"No AI Law, No Problem: Pennsylvania Applies Its Medical Practice Act to an AI Chatbot"},"content":{"rendered":"\n<p>By<a href=\"https:\/\/www.duanemorris.com\/attorneys\/ryanwesleybrown.html\"> Ryan Wesley Brown<\/a><\/p>\n\n\n\n<p>Much of the national discussion around artificial intelligence regulation has centered on whether legislatures need to pass new, AI-specific laws to address the technology&#8217;s risks. In health care, though, AI is already regulated in important ways. State licensing laws, consumer protection statutes, and professional practice acts likely apply to AI-driven tools the same way they apply to people, and state enforcers are starting to make that point.<\/p>\n\n\n\n<p>On May 1, 2026, the Pennsylvania Department of State sued Character Technologies, Inc. in Commonwealth Court. Character Technologies operates an AI chatbot platform called Character.ai. The state&#8217;s complaint does not rely on any new AI statute. It relies on Pennsylvania&#8217;s Medical Practice Act, a law that has been on the books for decades, to allege that a user-created chatbot character on the platform engaged in what the Commonwealth calls the unlawful practice of medicine and surgery. 
The allegations have not been adjudicated, and Character Technologies has not been found liable.<\/p>\n\n\n\n<p>According to the complaint, an investigator from the Bureau of Enforcement and Investigation interacted with a chatbot &#8220;doctor&#8221; named &#8220;Emilie&#8221; that allegedly claimed to be a licensed psychiatrist authorized to practice medicine in Pennsylvania. The Commonwealth alleges that the chatbot provided a fabricated Pennsylvania medical license number and offered mental health guidance while the investigator posed as someone experiencing symptoms of depression. The complaint states that, as of mid-April, the chatbot had logged about 45,500 user interactions.<\/p>\n\n\n\n<p>In a news release, Governor Josh Shapiro framed the enforcement action as a matter of public safety rather than technology policy. &#8220;Pennsylvanians deserve to know who\u2014or what\u2014they are interacting with online, especially when it comes to their health,&#8221; Shapiro said. Secretary of State Al Schmidt struck a similar note: &#8220;Pennsylvania law is clear\u2014you cannot hold yourself out as a licensed medical professional without proper credentials. We will continue to take action to protect the public from misleading or unlawful practices, whether they come from individuals or emerging technologies.&#8221;<\/p>\n\n\n\n<p>The Shapiro administration described the suit as the first enforcement action arising from a broader Department of State investigation into AI companion bots that may be falsely presenting themselves as licensed professionals, and reportedly the first such action announced by a governor. Pennsylvania has also set up a formal complaint reporting process through the Department of State&#8217;s AI enforcement task force, focused on bots that may be engaged in unlicensed professional practice.<\/p>\n\n\n\n<p>While the case is unresolved, it already shows that the absence of a dedicated AI statute does not leave a regulatory gap. 
Medical practice acts, scope-of-practice rules, and professional licensing requirements in every state define who, and now potentially what, may provide medical advice, diagnose conditions, or claim to be a licensed provider. These laws are usually technology-neutral, and regulators are showing a willingness to apply them to AI tools that wander into clinical territory.<\/p>\n\n\n\n<p>For companies evaluating AI-powered patient engagement platforms, virtual health assistants, or clinical decision-support tools, the Pennsylvania action is a useful reminder: compliance obligations generally do not turn on whether a legislature has spoken specifically to artificial intelligence. Any AI tool that interacts with patients in ways that look like the practice of medicine\u2014offering diagnostic impressions, recommending treatment, or suggesting that it is licensed\u2014can trigger existing regulatory regimes and create enforcement risk for both the vendor and the institution that deploys it.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>By Ryan Wesley Brown Much of the national discussion around artificial intelligence regulation has centered on whether legislatures need to pass new, AI-specific laws to address the technology&#8217;s risks. In health care, though, AI is already regulated in important ways. 
State licensing laws, consumer protection statutes, and professional practice acts likely apply to AI-driven tools &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/blogs.duanemorris.com\/healthlaw\/2026\/05\/08\/no-ai-law-no-problem-pennsylvania-applies-its-medical-practice-act-to-an-ai-chatbot\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;No AI Law, No Problem: Pennsylvania Applies Its Medical Practice Act to an AI Chatbot&#8221;<\/span><\/a><\/p>\n","protected":false},"author":346,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[991,1006,446],"ppma_author":[932],"class_list":["post-868","post","type-post","status-publish","format-standard","hentry","category-health-law","tag-ai","tag-medical-practice-act","tag-pennsylvania"],"authors":[{"term_id":932,"user_id":346,"is_guest":0,"slug":"rwbrown","display_name":"Ryan Wesley Brown","avatar_url":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-content\/uploads\/sites\/8\/2019\/01\/brownryanwesley-125x150.jpg","0":null,"1":"","2":"","3":"","4":"","5":"","6":"","7":"","8":""}],"_links":{"self":[{"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/posts\/868","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/users\/346"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/comments?post=868"}],"version-history":[{"count":0,"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/posts\/868\/revisions"}],"wp:attachment":[{"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/media?parent=868"}],"wp:term":[{"ta
xonomy":"category","embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/categories?post=868"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/tags?post=868"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/blogs.duanemorris.com\/healthlaw\/wp-json\/wp\/v2\/ppma_author?post=868"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}