On January 7, 2026, OpenAI announced plans to launch ChatGPT Health (“Health”), a new model that will allow users to connect their health records and wellness applications to the chatbot. Every week, hundreds of millions of people use ChatGPT to ask about health and wellness. OpenAI has set out the privacy protections and controls it intends to implement in handling this highly personal and sensitive information, including data encryption, data isolation, user options to delete chats from its system, and a commitment that inputs to Health will not be used to train the foundational model. Like its existing system, Health will use a large language model (LLM) to serve users by chatting about health, reviewing medical records, summarizing visits, and providing nutrition advice, among other functions.
Executive actions have shifted toward limiting AI regulation, attempting to maintain the United States as a global leader in AI innovation and encouraging industries to adopt automation. In December 2025, President Trump issued Executive Order 14365, “Ensuring a National Policy Framework for Artificial Intelligence,” seeking to deter state regulations from creating a patchwork of regulatory regimes and instead establish national consistency. This action alone does not prevent state-level AI or privacy laws; however, it does establish a task force to challenge them. The EO followed an earlier action that rescinded Biden-era regulations placed on AI, characterizing them as a hindrance to innovation and free markets.
The Food and Drug Administration (“FDA”) regulates AI health technology, classifying certain products as Software as a Medical Device (SaMD) under the Federal Food, Drug, and Cosmetic Act (“FD&C Act”). On January 6, 2026, the FDA issued guidance on its oversight of AI devices, clarifying that low-risk products used for general wellness will not be regulated as medical devices. Software that is “unrelated to the diagnosis, cure, mitigation, prevention, or treatment of a disease or condition” is not a medical device under the FD&C Act. The FDA explicitly classified such software programs as general wellness products, likely placing Health in a regulation-exempt status under the FD&C Act.
Systems that function solely to transfer, store, convert, format, and display medical device data are characterized as Medical Device Data Systems (MDDS) and are subject to the FD&C Act. However, the FDA has also clarified that Non-Device-MDDS with software functions that store patient data, convert digitally generated data, or display previously stored patient data are exempt from regulation so long as they do not analyze or interpret that data. This distinction produces uncertainty for Health’s classification, because its interactive responses to user-supplied records may go beyond merely storing or displaying data.
The Health Insurance Portability and Accountability Act (“HIPAA”) Privacy Rule requires covered entities and business associates to properly handle protected health information (“PHI”). Users submitting medical records to Health would not render OpenAI a covered entity or business associate, leaving its status as a consumer health product outside of HIPAA’s regulatory scope. Data sharing of the kind Health proposes across Apple Health, MyFitnessPal, and other applications falls outside the HIPAA framework, which governs only disclosures for treatment, payment, and healthcare operations, or disclosures otherwise requiring authorization under the Privacy Rule, 45 C.F.R. § 164.508.
The Federal Trade Commission (“FTC”) may serve as a backstop amid these regulatory rollbacks. The FTC regulates healthcare privacy in part by requiring data breach notifications. Compliance is enforced through the Health Breach Notification Rule (“HBNR”), which requires vendors of personal health records to notify the FTC and consumers if a data breach occurs. A vendor under the HBNR is any non-HIPAA entity or business associate that “offers or maintains a personal health record.” It is uncertain whether Health will be subject to regulation under this category, or any other, despite its handling of users’ personal health record uploads. As an alternative method of accountability, the FTC may bring enforcement actions, such as its recent settlement with Flo Health Inc. for sharing users’ health data with Facebook, Google, and others without consent.
As the regulatory landscape surrounding Health continues to evolve, it is uncertain how privacy concerns will be addressed. Federal agencies and the executive branch are giving developers broad autonomy over privacy practices as AI integrates into healthcare, leaving much of the accountability to be exercised through litigation or after-the-fact FTC enforcement.
