A final rule again revising Sec. 1557 of the Affordable Care Act protects against sex discrimination based on gender identity, sexual orientation and pregnancy, and brings back notice and other requirements that were dropped from the 2020 version of the rule.[1] The HHS Office for Civil Rights (OCR) and CMS, which released the rule April 26, also require covered entities to have a Sec. 1557 coordinator and grievance procedure and to give patients a “notice of availability” about free language assistance and auxiliary aids, among other things. And there’s something brand new at the intersection of discrimination protections and artificial intelligence (AI).
Covered entities are required “to make reasonable efforts” to identify patient care decision support tools used in their health programs and activities “that employ input variables or factors that measure race, color, national origin, sex, age, or disability” and then “make reasonable efforts to mitigate the risk of discrimination” resulting from use of the tools, according to the rule.
The rule doesn’t prescribe a method for minimizing discrimination in the use of patient care decision support tools. “It establishes a general rule that covered entities must not discriminate on the basis of a protected class through the use of a wide range of tools and technologies, including AI,” said attorney Kyle Gotchy, with King & Spalding in Sacramento, California. The rule essentially requires covered entities not to let it happen to them, whether they developed the tools or are just the end users, he said.
But covered entities could be on the hook for penalties even if discrimination stemming from AI isn’t intentional. “It doesn’t matter if the AI tools don’t have a discriminatory animus to start with,” said attorney Rachel Carey, with Whiteford in Richmond, Virginia. “If certain tools result in discrimination in one of the covered areas, they can bring pretty big scrutiny.”
Sec. 1557 advances protections against discrimination in health care on the basis of race, color, national origin, sex, age and disability. The rule applies to health programs or activities that get HHS funding (e.g., hospitals, physicians) or are administered by HHS (e.g., Medicare Part D), or are in the health insurance marketplace, according to FAQs.[2]
The various Sec. 1557 requirements take effect at different times, with the AI provision given a 300-day deadline from the date the regulation is published in the Federal Register.[3]
This is the latest in a series of nondiscrimination rules from the Biden administration. For example, on May 1, OCR released a final rule it said strengthens protections for people with disabilities under Section 504 of the Rehabilitation Act,[4] and the U.S. Department of Justice on April 24 published a final rule on nondiscrimination on the basis of disability.[5] “The administration has put out statements they want to focus on access and health equity,” Carey said, and she expects OCR to focus its investigations on complaints in this area.
Back and Forth ‘On the Basis of Sex’
The Sec. 1557 final rule covers a lot of ground. A high-profile aspect of the rule relates to the back and forth over the definition of sex discrimination. HHS in the 2016 version defined discrimination “on the basis of sex” to include termination of pregnancy and gender identity, but the 2020 rule eliminated gender identity and termination of pregnancy from that definition. Then came the landmark 2020 decision from the U.S. Supreme Court in Bostock v. Clayton County, which said that Title VII of the Civil Rights Act of 1964, which bans sex discrimination in employment, applies to discrimination against gay and transgender people in the workplace.[6] It was followed by a 2021 notice from the Biden administration, which declared it “will interpret and enforce Section 1557’s prohibition on discrimination on the basis of sex to include: (1) discrimination on the basis of sexual orientation; and (2) discrimination on the basis of gender identity.”[7]
And now the Sec. 1557 rule clarifies that sex discrimination “includes discrimination based on sex characteristics, including intersex traits; pregnancy or related conditions; sexual orientation; gender identity; and sex stereotypes.” But federal protections for religious freedom and conscience remain, and recipients may apply for exemptions from provisions of the rule, according to the FAQs.
The final rule noted that the 2022 U.S. Supreme Court decision in Dobbs v. Jackson Women’s Health Organization, which overturned the right to abortion enshrined in Roe v. Wade, has altered the legal landscape in the realm of abortion. “OCR emphasizes that a covered provider’s decision not to provide abortions does not itself constitute discrimination in violation of section 1557.”
Taglines Are Still Out, Notice of Availability Is In
The definition of sex discrimination is not the only place where the three sets of rules differ. The first round of regulations under the Obama administration included compliance requirements that were eliminated by the Trump administration.
Specifically, the 2020 regulation ditched the notice and tagline requirements from the 2016 regulations. OCR also killed the requirement for covered entities with 15 or more employees to have a compliance coordinator and a written procedure for patients to file grievances alleging violations of Sec. 1557.
Almost everything came back in the Biden administration’s rule, with some variations. It requires a Sec. 1557 coordinator, although “it is the covered entity’s prerogative to designate any qualified individual to serve as its Coordinator,” and that person may have other responsibilities. Instead of taglines, OCR now requires a notice of availability of language assistance services and auxiliary aids and services in the 15 most common languages spoken by people with limited English proficiency. Covered entities also must have a written nondiscrimination policy, a grievance procedure and procedures for reasonable modifications for individuals with disabilities.
AI: ‘You Can’t Bury Your Head in the Sand’
When it comes to AI, the rule gets more aspirational. Although OCR doesn’t prohibit covered entities from using patient care decision support tools (which were called clinical algorithms in the proposed rule), it tells them not to discriminate. The rule applies “to all patient care decision support tools used in a covered entity’s health programs or activities to support clinical decision-making, including patient care decision support tools that are autonomous and those that assist or augment a covered entity’s clinical decision-making.”
Gotchy said there are several challenges here. For one thing, “we don’t have a ton of guidance about what actions—from the government’s point of view—will be considered reasonable for a given provider,” although he anticipates OCR will post more guidance on this and related expectations under Sec. 1557 before the requirements take effect. “I’m optimistic they’ll issue more guidance that’s tailored to different types of covered entities, tools and use cases,” Gotchy said.
Another challenge is that many health care organizations don’t have a full inventory of their AI technologies and other patient care decision support tools. “Commonly, organizations lack a comprehensive, centralized understanding of what’s being used, and nailing down what’s being used within your four walls is getting more complicated by the day because of the proliferation and democratization of this technology,” he said.
But not knowing where these tools are being used or how they work won’t necessarily get you off the hook. “You can’t bury your head in the sand anymore and say, ‘We didn’t develop this so it’s out of our control’ because the government is saying, ‘If you use these tools, you’re expected to make reasonable efforts to determine whether they use inputs about protected classes and then mitigate the risk of discrimination resulting from your use of that tool,’” he said.
Gotchy has been working with chief information officers and other leaders on developing AI governance for assessing and overseeing these technologies, and “the first step is often figuring out what’s already there” because a lot of tools will have been adopted ad hoc in health care organizations. “As we think about what it will entail to make sure you’re in a position to meet the challenge this rule creates, there’s a lot of legwork providers will need to go through to get a sense of the current state.”
Contact Gotchy at kgotchy@kslaw.com and Carey at rcarey@whitefordlaw.com.