Category: Blog

Embryos Are Children: What Does This Mean For Reproductive Rights?

In a ruling on February 16th, 2024, the Alabama Supreme Court declared that frozen embryos outside of the womb are classified as children. In vitro fertilization (IVF) involves a few steps: hormones are taken to prepare for egg retrieval, mature eggs are collected from the ovaries and fertilized in a lab with sperm, and one or more of the fertilized embryos are then transferred into the uterus. Many complications can arise from transferring multiple fertilized embryos at once, and since a retrieval cycle typically produces additional embryos, it is standard procedure to freeze the embryos not used in the current IVF cycle and keep them preserved at the fertility clinic. By freezing the extra embryos, couples can undergo additional IVF cycles without the strains of another egg retrieval, which can include surgery and hormonal treatments.

In the current case, three couples went through a standard IVF treatment and kept their extra fertilized embryos cryo-preserved at the fertility clinic, which resided within a hospital. Unfortunately, those embryos were destroyed when a hospital patient entered the cryo-preservation area, opened a tank where the frozen embryos were stored, and dropped them to the ground.

The couples sued under the Wrongful Death of a Minor Act. To rule on the matter and determine whether there was proper standing under the Act, the court first had to consider whether the embryos were unborn children. In analyzing the statute, the Alabama Supreme Court determined that fertilized embryos qualify as children, as nothing in the Act specifies that unborn children must be in utero; thus, the Act “applies on its face to all unborn children, without limitation.” The Court concluded that the fertilized embryos fall within the definition of children because “all parties to these cases, like all members of this Court, agree that an unborn child is a genetically unique human being whose life begins at fertilization and ends at death.” Dobbs v. Jackson Women’s Health Organization is cited as a factor in the ruling, with the court stressing that unborn fetuses were widely recognized as living persons with rights and interests dating as far back as the 18th century.

The ruling poses serious risks not only to those who rely on IVF treatments, but also to general family planning, as the holding implies that life begins at fertilization. Treating embryos as children creates a slippery slope in which individuals can be held civilly, and possibly criminally, liable for destroying embryos.

IVF clinics in Alabama are already closing over concerns that the clinics, or individual embryologists, will be penalized for what they deem safe to freeze or, in turn, discard. With worry over IVF in the state, the Alabama House and Senate rallied together and voted to approve legal protections for providers of IVF. The bill would extend civil and criminal immunity to providers of IVF services. While this creates legislation specifically for IVF providers, nothing has been discussed in relation to family planning measures. This poses a serious question: how far can this holding go in terms of family planning, birth control, and autonomy over one’s own body?

In relation to IVF, even individuals who choose to discard their frozen embryos could, in theory, be liable for killing a child. What if a couple decides they no longer want to keep the embryos viable? What if one parent wants to unfreeze an embryo and the other does not? If the embryos are never destroyed, but also never given the opportunity to be born, does this constitute the embryos essentially dying? These questions are just the beginning of larger disputes regarding parental rights over fertilized embryos.

Outside of IVF, emergency contraceptives like Plan B contain a hormone that can keep a fertilized egg from implanting in the uterus. Under this ruling, taking emergency contraception could be deemed killing a child, and those who take it may be at risk of civil or criminal penalties. Additionally, after Dobbs, Alabama reinstated a total abortion ban. Under the same logic, a mother who miscarries could be liable for the death of her own child. While this may seem severe, the door is now open for courts to rule in this manner.

“Aging in Place” and the Importance of the 2024 OAA Final Rule

The demographic landscape in the United States is undergoing an unprecedented shift as the older population continues to increase. In the next decade, the number of people aged 65 or older is projected to outnumber those under 18. Older adults face unique social, physical, and economic challenges. As this population grows, our nation will need to find comprehensive alternatives to the current systems, which are already struggling to adequately support the complex needs of persons 65 and older.

As we age, daily tasks and activities become more difficult. While moving into an assisted living facility or nursing home may seem like the obvious next step for aging adults who need additional care and support, for many, this move may be unnecessary, unattainable, or unwanted.

Studies show that approximately 95 percent of older adults want to continue residing in their own homes and communities. This concept, known as aging in place, allows the nation’s growing population of older adults to live independently and “age with dignity.” This desire is driven by numerous factors, including high facility costs, the uncertainty of transitioning to a new and unfamiliar environment, the desire for independence, and the absence of their support systems.

However, there are challenges and risks associated with living independently in one’s own home, such as increased difficulty performing daily tasks, lack of transportation, a risky home setup, potential for isolation, and the need for additional care. Additionally, as the nation’s aging population grows, so will the fraction of vulnerable older adults who will require more assistance than those who are in good health, have strong social connections, and possess adequate resources.

Nevertheless, with support, individuals can safely remain in their homes and communities. Given the growing number of older adults who will inevitably age in place and face challenges that make independent living more difficult, additional services and supports are imperative.

On February 6, 2024, the Administration for Community Living (ACL) announced its final rule updating the regulations that implement its Older Americans Act (OAA) programs. The update reflects the needs of a senior population that has nearly doubled since the last substantial revision of the OAA regulations in 1988.

First passed in 1965, the OAA authorized programs and services to help older adults remain in their homes and communities. In the 36 years since the last substantial update to the OAA regulations, we have gained a deeper understanding of how social determinants of health can reshape health care and service delivery models through the use of non-medical services. OAA programs provide older adults with resources to remain in their communities and places of residence, empowering them to age in place and avoid institutionalization or hospitalization. The OAA offers a range of services and supports, such as home-delivered meals, congregate meals, caregiver assistance, personal and home care services, preventive health services, and transportation. These services play a crucial role in enabling older adults to maintain their independence and their place in their communities, ensuring that aging in place becomes a viable and sustainable option for individuals across the country.

Health Care Privacy Concerns Around Mental Health Apps

Between 2019 and 2021, the market for mental health and wellness apps such as Calm, Headspace, and BetterHelp grew by 54.6%. In 2020 alone, the top 100 wellness apps were installed over 1.2 billion times. This increase can be attributed to the COVID-19 pandemic: lockdowns and the rise of remote work led more people to turn to mobile apps to work on their health and fitness goals from home. Studies have also shown that individuals with existing mental health problems reported that their symptoms worsened during the pandemic. The convenience of mental health apps has enabled users to connect with therapists and mental health coaches from the comfort of their homes. Today, an increasing number of physicians recommend that their patients use mental health apps to reduce stress and improve sleep habits, with apps such as Calm allowing patients to access content through their health insurance plans.

But as mental health apps continue to grow in popularity, researchers have begun raising privacy concerns about the collection of user data. For instance, in 2023 the Federal Trade Commission (FTC) issued a proposed order banning BetterHelp, an online counseling service, from sharing sensitive information about consumers’ mental health challenges for advertising, and requiring the app to pay $7.8 million to affected consumers. In this context, it is useful to consider the current privacy concerns as well as policy solutions to address them. One of the main issues is the sheer volume of data collected by mental health apps. Data collection can include geolocation information, such as clinic visits, and purchase history for health care services. Furthermore, mental health apps may collect other sensitive information such as a user’s full name, date of birth, or home address, as well as health data on a user’s mood, sleep habits, or symptoms. In the case of BetterHelp, the app’s privacy policy now states that it is not paid for user data, but that it may share visitor data with third-party advertisers such as Meta, TikTok, Snap, and Reddit.

This leads to the question of what improvements can be made to protect the sensitive data of users on mental health apps, and what solutions are available to developers and policymakers. Currently, HIPAA regulations do not apply to most mental health apps unless a covered entity, a business associate relationship, and the disclosure of electronic protected health information (ePHI) are involved. However, there are various steps that can be taken to safeguard user data. One study created a list of actions stakeholders could take to ensure data privacy, including conducting a privacy impact assessment and writing readable privacy policies in accessible language. De-identification is another option developers can take to protect user data, which can involve removing specific identifiers such as names and addresses. In addition, apps that do not fall under HIPAA regulations are now required to follow the FTC’s Health Breach Notification Rule, which the agency clarified in 2023 applies to health apps, and which requires apps providing health care services or supplies to notify consumers, the FTC, and often the media of data breaches. Overall, both new and current developers of mental health apps can take these steps to better protect sensitive data and build trust among users in the digital space.
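To make the de-identification idea above concrete, here is a minimal sketch of how an app might strip direct identifiers from a record before exporting it for analytics. The field names, the email regex, and the sample record are hypothetical illustrations, not drawn from any real app or from the HIPAA Safe Harbor standard, which a real pipeline would follow.

```python
import re

# Hypothetical direct-identifier fields an app might strip before export.
DIRECT_IDENTIFIERS = {"full_name", "date_of_birth", "home_address"}

# Simple illustrative pattern for email addresses embedded in free text.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def deidentify(record: dict) -> dict:
    """Drop direct-identifier fields and redact emails in free-text notes."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if isinstance(clean.get("notes"), str):
        clean["notes"] = EMAIL_RE.sub("[REDACTED]", clean["notes"])
    return clean

record = {
    "full_name": "Jane Doe",
    "date_of_birth": "1990-01-01",
    "mood_score": 4,
    "notes": "Follow up with jane@example.com about the sleep log.",
}
print(deidentify(record))
```

Note that removing names and addresses alone is not a guarantee of anonymity; combinations of remaining fields (location, dates, rare conditions) can still re-identify users, which is why formal standards enumerate a much longer list of identifiers.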

Navigating the Ethical and Legal Maze of IVF: A Closer Look at Industry Challenges

In the rapidly evolving world of reproductive technologies, in vitro fertilization (IVF) stands out as a beacon of hope for countless individuals striving to conceive. As this technology advances, it is increasingly confronted with complex ethical, legal, and medical challenges. These challenges highlight the ongoing struggle to maintain a balance between facilitating medical innovation and ensuring rigorous regulatory practices while also prioritizing comprehensive patient care.

A recent incident involving CooperSurgical, a major supplier in the fertility industry, brings to light this balance between innovation and regulation in the fertility industry. The company is facing several lawsuits from patients who claim that one of its products—a nutrient-rich liquid that helps fertilized eggs develop into embryos—was defective, destroying the embryos for potentially thousands of patients worldwide. This incident not only underscores the vulnerabilities inherent in assisted reproductive technology (ART) industries but also the critical importance of ensuring that the advancements in ART are matched with effective legal frameworks and ethical standards. Addressing these challenges is essential to safeguarding patient interests, and ensuring the pursuit of fertility treatment is safe and effective.

This week, the Food and Drug Administration (FDA) issued a notice stating that CooperSurgical recalled three lots of the liquid, which had been used for embryo transfer in multiple clinics during November and December. While CooperSurgical notified the affected clinics on December 13th, it is unclear how many bottles of the botched media the clinics used before the recall. In lawsuits filed by impacted patients, the plaintiffs assert that the defective product lacked magnesium, a key nutrient whose absence halted the development of their embryos, rendering those embryos unviable for transfer. These lawsuits further demonstrate the challenging and emotional journey of IVF for individuals seeking fertility treatment, a journey further complicated by corporate negligence and the product recall. In the most recent of eight lawsuits filed, a couple received a phone call on Thanksgiving notifying them that all of their embryos had stopped growing. Unaware of the product recall, the plaintiff attributed the failure to herself, blaming her age, before being notified two months later that the clinic had used the defective CooperSurgical media on her embryos.

Infertility affects approximately 1 in 8 couples in the United States, many of whom turn to IVF as a solution. Fertility medicine and ART are relatively new fields, with the first live birth from IVF occurring in 1978. Since this breakthrough, IVF has rapidly become a common remedy for those seeking infertility treatment and now accounts for 1.6% of all live births in the United States and 4.5% in Europe. The growing demand for IVF incentivizes companies like CooperSurgical to position themselves as leaders in a largely unregulated industry. The regulatory landscape of assisted reproduction is complex, as the U.S. Department of Health and Human Services exerts only limited oversight over fertility clinics. The lack of stricter federal regulation incentivizes corporations to prioritize economic considerations over ethical principles in their decision-making. CooperSurgical owns multiple large sperm and egg banks and brought in $1.2 billion in revenue last year, with 40 percent coming from fertility services and supplies. This lucrative market for fertility treatment underscores the recent trend of manufacturing issues resulting from rapid growth and consolidation among the companies that supply the industry.

The IVF journey, while filled with hope for individuals facing infertility, is fraught with challenges that demand attention from medical professionals, regulatory bodies, and legal authorities. The issues arising out of the case against CooperSurgical are a stark reminder of the devastating impact that corporate oversights, a lack of stricter regulatory practices, and overall disregard for an already emotional process can have on individuals’ lives. As the field of reproductive medicine continues to evolve, it must do so with a commitment to protecting patient care and ethical standards.

For their own good: Involuntary commitment and patients’ rights in Washington, D.C.

As we approach the holidays, we are encouraged to consider what we are grateful for. Walking around the WCL campus and seeing the posterboards, I am struck by how many people express open gratitude for therapy and psychiatric medication. Over the last week, I’ve seen notes that say: “I’m grateful for my meds” and “I am grateful for Celexa” (an antidepressant). In past years, I’ve seen: “I’m grateful for Prozac” and “I’m grateful for Lexapro,” which are also antidepressants. These notes indicate to me that there is not only less shame in disclosing mental health conditions and treatments, but also more desire to speak openly about how we live with these conditions. Considering the stressful, highly competitive nature of law school, students on psychiatric medications should be aware that their conditions can become unmanageable and require treatment in a psychiatric hospital ward. This raises the question: are they aware of their rights once they have voluntarily accepted, or have been forced into, treatment? It is important to understand the basic rights of involuntarily committed patients because these patients are often vulnerable to misuse of their information and arrive at the hospital in a state not conducive to self-advocacy. Even in the midst of a mental health crisis, patients have a right to privacy and humane treatment.

What follows is a summary of the rights of mental health patients in the D.C. metropolitan area. A person can be involuntarily committed for psychiatric treatment in Washington, D.C. when a psychiatrist, qualified physician, or qualified psychologist has examined the person and determined that the person has a mental illness that makes them a danger to themself or others, and that hospitalization is the least restrictive setting that meets the person’s needs. If the patient is a minor (under 18), their parent or legal guardian must be served notice of the patient’s admission to the hospital no more than 24 hours after admission. If the patient is an adult, the patient must authorize the Department to serve notice to their spouse, domestic partner, or legal guardian, as long as the notice is consistent with the District of Columbia Mental Health Information Act of 1978, D.C. Code §7-1201-0.1. Patients cannot be detained for more than 48 hours unless the chief officer of the department files a written petition with the court for emergency observation and diagnosis of the patient, in which case the patient’s detention cannot exceed seven days from the time the order is entered. Patients are also entitled to a complete record of their treatment.

Not all mental health conditions qualify for involuntary commitment. A national survey of psychiatrists in the United States revealed that most believe simply having a mental illness diagnosable under the DSM should not be grounds for involuntary commitment, but that being a danger to one’s self or others should be. Most psychiatrists surveyed believed conditions with a psychotic component should be grounds for involuntary commitment, and fifty-one percent believed that having depression should be.

We fear what we do not understand, and mental health commitment is not something to be feared. Psychiatric wards offer a place to rest and disconnect from the stresses of daily life until we are ready to face them again. The goal of involuntary commitment is to prepare patients for a successful re-entry into society. Commitment may seem frightening at first, but it may well be an experience you are grateful you had.

ChatGPT, MD? Artificial Intelligence in Healthcare 

ChatGPT is a natural language processing tool created by OpenAI, a research company specializing in artificial intelligence tools such as DALL-E 2, an AI art generator. Since its launch in November 2022, ChatGPT has become one of the fastest-growing apps in recent memory. By January 2023, the chatbot had reached 100 million active users, with an average of 13 million visitors per day. Available for both desktop and mobile devices, ChatGPT employs a dialogue format similar to messaging apps, where users can type prompts and ask questions on a variety of topics. Numerous articles offer tips on the best prompts to give ChatGPT, from drafting cover letters to solving complex math problems and editing videos. Given its ability to form human-like responses, some users have turned to ChatGPT for medical advice. Patients can ask general questions about health conditions and use AI tools for summaries or resources to prepare for medical visits. However, the popularity of ChatGPT and its accelerated development have led industries to question how artificial intelligence may affect patient care in the near future, including concerns about privacy and clinical care. As a result, it is worth asking how AI tools such as ChatGPT may be used to improve the standards of quality health care, and what risks are involved.

Prior to ChatGPT, people frequently turned to the Internet to self-diagnose. Studies have shown that between 67.5% and 81.5% of American adults have searched for health information online. While this is not a new phenomenon, the conversational nature of ChatGPT and the lack of regulation around artificial intelligence raise ethical and moral questions about using AI tools for health advice. Generally, health experts have recommended against using ChatGPT for medical advice. However, doctors have reported that AI tools may be useful for patients learning about certain conditions such as COVID, including symptoms, causes, risk factors, and treatments, as well as the side effects of prescription drugs. Early studies have also indicated that AI models, including ChatGPT, could be used to screen patients for mental health conditions such as depression or anxiety and to help determine treatment options.

While ChatGPT has the potential to change the medical field in terms of diagnosing and treating various health conditions, it also raises new liability risks for health care providers. Physicians who rely on AI tools may be subject to greater liability for medical malpractice, as courts would likely find it unreasonable for health professionals to rely on AI-generated content. In addition, companies have noted that there is no way for physicians to use ChatGPT with patients’ protected health information (PHI) while remaining compliant with HIPAA. For instance, a physician who uses ChatGPT to transcribe handwritten notes or recordings from an appointment may violate HIPAA regulations without knowing it.

Although the use of artificial intelligence in health care settings is fairly limited today, researchers have begun considering how AI systems can be built to improve the efficiency and effectiveness of medical services. This includes proposals for “human-centered AI” that employ a problem-solving approach to key issues in clinical care. One possible method is to train AI models on large amounts of data to look for certain health conditions in specific populations. In recent news, the Stanford School of Medicine and the Stanford Institute for Human-Centered Artificial Intelligence (HAI) announced the creation of RAISE-Health (Responsible AI for Safe and Equitable Health), an initiative to integrate AI into health care. This includes developing a platform for the responsible use of AI systems, enhancing clinical services, and educating patients and providers on best practices.

As ChatGPT becomes an increasingly prominent aspect of medicine and other industries, it is important to consider how AI tools have the potential to streamline the work of healthcare systems while also cautioning physicians of the legal and ethical risks involved in its use.