What issues should employers consider before using automated decision-making systems in the workplace?
Employers using automated decision-making systems, including artificial intelligence, algorithms, machine learning, and other tools (collectively, “ADMs”), in connection with employment decisions face a rapidly changing legal landscape. The Equal Employment Opportunity Commission (“EEOC”) is preparing to issue its final strategic enforcement plan addressing the use of ADMs in employment. States and localities are enacting or introducing their own legislation and regulations on the subject; New York City has announced that it will begin enforcing Local Law 144 of 2021 (“Local Law 144”) on April 15, 2023. Meanwhile, the EEOC and private litigants increasingly are commencing lawsuits alleging discrimination in the use of ADMs in employment.
In light of these legal developments, employers should prepare for the coming changes and assess how the ADMs they use may leave them vulnerable to claims of employment discrimination.
EEOC Enforcement and Guidance
On January 10, 2023, the EEOC issued a Draft Strategic Enforcement Plan (“Draft SEP”), which places elimination of employment discrimination in the use of ADMs at the top of its strategic priorities list. The EEOC signaled that it will use investigations and litigation to “eliminat[e] barriers in recruitment and hiring” arising out of “the use of automated systems, including artificial intelligence or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups.”
The EEOC held a public hearing on January 31, 2023, addressing the use of ADMs in employment. Much of the testimony urged the EEOC to issue additional guidance establishing best practices for employers to ensure that their use of ADMs complies with employment discrimination laws. The EEOC is considering the public feedback it received and may issue the final strategic enforcement plan at any time.
The EEOC previously issued guidance in May 2022 focusing on unique issues pertaining to the use of ADMs as to individuals with disabilities. The guidance identifies potential violations of the Americans with Disabilities Act (“ADA”) where: (1) an employer using ADMs does not provide reasonable accommodations to applicants and employees with disabilities to ensure fair assessment; (2) the employer’s ADMs—intentionally or unintentionally—screen out individuals with disabilities who could perform the essential functions of a job with reasonable accommodations; and (3) the employer’s ADMs violate ADA restrictions on disability-related inquiries and medical examinations. Litigants may seek to hold employers responsible for ADA violations resulting from the use of ADMs, even when third-party vendors develop and administer them.
Local Law 144
Meanwhile, New York City enacted Local Law 144, effective January 1, 2023, the first law to attempt direct and comprehensive regulation of the use of ADMs in the workplace. The ordinance makes it unlawful for employers and employment agencies to use an “automated employment decision tool” (“AEDT”) to “screen a candidate or employee for an employment decision” within the city—unless the tool has been subjected to a “bias audit” within one year before its use, information about the bias audit and the tool is published, and required notices are given to employees and candidates residing in the city. N.Y.C. Admin. Code § 20-871. The ordinance defines a “bias audit” as “an impartial evaluation by an independent auditor” that includes testing of the tool’s disparate impact on the component 1 categories (i.e., sex and race/ethnicity) that employers covered by Title VII of the Civil Rights Act of 1964 (“Title VII”) are required to report. Id. § 20-870.
In December 2022, the New York City Department of Consumer and Worker Protection (“DCWP”) announced that it would not begin enforcing the ordinance until April 15, 2023, and issued a revised set of proposed regulations seeking to clarify the ordinance. Among other things, the proposed rules would refine the ordinance’s requirement that an AEDT is a tool used “to substantially assist or replace discretionary decision making” to mean that the AEDT’s “simplified output” must be relied on solely to make an employment decision; given greater weight than other criteria when making an employment decision; or used to overrule conclusions derived from other factors, including human decision-making. The proposed rules also would clarify that an “independent auditor” must be “capable of exercising objective and impartial judgment” and must not have been involved in using, developing, or distributing the AEDT or have an employment relationship or financial interest with the employer, employment agency, or vendor whose AEDT is being audited.
Further, the DCWP’s proposed rules specify the calculations required for the bias audit. Essentially, when an AEDT is used to select or score applicants (or employees for promotion), the bias audit must calculate the “selection rate” or “scoring rate” and “impact ratio” for sex and race/ethnicity categories and “intersectional categories” of sex, ethnicity, and race (as well as the median score for the full sample of applicants for a scoring rate calculation). The proposed rules include exemplar calculations required for a bias audit.
The DCWP received additional public comments and held a hearing on the revised proposed rules on January 23, 2023. The DCWP is finalizing the rules but has not provided a date by which the final rules will be issued or extended the enforcement timeline.
Other States and Localities
Employers should be aware of pending developments relevant to the use of ADMs in employment in other states and localities. For example, New York and New Jersey are considering new legislation regarding ADMs used in employment. New York 2023 Leg., 246th Sess. (Jan. 9, 2023); New Jersey 220th Leg., A.B. 4909 (Dec. 5, 2022). In California, legislation specific to the use of ADMs for employment decisions stalled out in 2022. But the California Privacy Rights Act (“CPRA”), which amended and expanded the California Consumer Privacy Act (“CCPA”), became operative on January 1, 2023, resulting in the expiration of previous exemptions pertaining to the collection and use of employment-related personal information—which may be collected and used by ADMs. The California Privacy Protection Agency is currently in the rulemaking process and considering regulations specific to ADMs. Enforcement is set to begin on July 1, 2023. A detailed discussion of these developments is beyond the scope of this article.
Litigation
Beyond the changing legislative landscape, employers using ADMs and vendors of ADMs are encountering increasing litigation from the EEOC and private litigants.
On May 5, 2022, the EEOC filed its first lawsuit addressing employers’ use of ADMs. In Equal Employment Opportunity Commission v. iTutorGroup, Inc., et al., Case No. 1:22-cv-02565 (E.D.N.Y.), the EEOC alleges that the defendants, providers of English-language tutoring services, discriminated against more than 200 applicants for tutor positions by programming their application software to reject female applicants over the age of 55 and male applicants over the age of 60. The charging party allegedly applied and was rejected because she was older than 55, then applied again the next day using a more recent date of birth and was offered an interview. The EEOC seeks injunctive and monetary relief.
Recently, on February 21, 2023, an individual filed a putative class action against Workday, Inc., alleging that it provides an algorithm-based screening system that disproportionately denies employment opportunities to applicants based on race, age, and disability. Mobley v. Workday, Inc., Case No. 4:23-cv-00770 (N.D. Cal.). Specifically, Mr. Mobley alleges that Workday’s screening tools enable customers to select candidates based, at least in part, upon their protected classifications or lack thereof. Mr. Mobley seeks class certification and injunctive and monetary relief.
Criticisms of ADMs
ADMs offer employers significant cost and time savings in hiring and managing employees. Commentators have cautioned, however, that ADMs may introduce unintended bias into employment decisions or have a discriminatory impact on members of protected classes.
For instance, at the EEOC’s hearing on the Draft SEP, ReNika Moore (Director of the American Civil Liberties Union’s Racial Justice Program) testified (among other things) that:
- Racial and ethnic minority groups are overrepresented in data containing negative information (such as criminal records, evictions, and poor credit records) that ADMs may consider, which can lead members of those groups to be disproportionately excluded from employment opportunities.
- ADMs may be trained with data drawn from pools of individuals who are not representative of the group to which the ADMs will be applied, rendering the tool “less accurate for people in the underrepresented group.”
- Algorithms used to assess whether employees are meeting or candidates are likely to meet performance targets are trained with historical data, which may cause the algorithm to carry forward discriminatory impact from the past.
- ADMs may use certain data as “proxies” for protected characteristics (such as zip codes, names, or educational institutions).
- ADMs that continually learn may experience a “feedback loop” and reinforce their own discriminatory impact.
As one example of an ADM that may introduce bias into the recruiting process, Ms. Moore testified that tools that target job advertisements based on individuals’ personal information can exclude members of protected classes from receiving the advertisements. Such exclusion may occur, for example, where employers select the characteristics of their desired audience or where they upload individuals’ data to “lookalike” tools that target recipients by their similarities to such individuals. Either process may select a group of recipients of job advertisements that is not representative of the applicable population.
In light of the evolving legal landscape focusing on bias in the use of ADMs, employers may wish to prepare for the patchwork of laws that will soon apply to their use of ADMs. Among other things, employers may wish to ascertain whether they are using technology covered by new and pending laws on the subject. Further, even employers outside of New York City may wish to establish processes consistent with the requirements of Local Law 144 (or other new laws), such as conducting bias audits, publishing audit results, and issuing notices to candidates or employees of the use of ADMs. Of course, employers should always ensure their use of ADMs complies with existing employment discrimination laws, such as by providing reasonable accommodations to individuals with disabilities.
Reprinted with permission from the April 4, 2023 edition of the NEW YORK LAW JOURNAL © 2023 ALM Media Properties, LLC. All rights reserved. Further duplication without permission is prohibited. ALMReprints.com – 877-257-3382 – firstname.lastname@example.org.
 See https://rules.cityofnewyork.us/wp-content/uploads/2022/12/DCWP-NOH-AEDTs-1.pdf (last visited March 23, 2023).
 The “selection rate” is equal to the number of employees or candidates in a given category who were selected divided by the total number of employees or candidates in such category. The “scoring rate” is the rate at which individuals in a category receive a score above the full sample’s median score. The “impact ratio” is equal to the selection rate or scoring rate for a category divided by the selection rate or scoring rate of the most selected or highest scoring category.
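The selection-rate and impact-ratio calculations described in this footnote can be sketched in a few lines of Python. This is a hypothetical illustration only—the function name, data, and category labels are assumptions, and such a sketch is no substitute for a bias audit conducted by an independent auditor as Local Law 144 requires:

```python
from collections import Counter

def selection_rates_and_impact_ratios(outcomes):
    """Compute per-category selection rates and impact ratios.

    outcomes: list of (category, selected) pairs, where `selected`
    is True if the candidate in that category was selected.
    """
    totals = Counter(cat for cat, _ in outcomes)
    chosen = Counter(cat for cat, sel in outcomes if sel)

    # Selection rate: number selected in a category / total in that category
    rates = {cat: chosen[cat] / totals[cat] for cat in totals}

    # Impact ratio: a category's selection rate divided by the
    # selection rate of the most-selected category
    top_rate = max(rates.values())
    ratios = {cat: rates[cat] / top_rate for cat in rates}
    return rates, ratios

# Hypothetical data: 10 applicants in each of two sex categories
data = ([("male", True)] * 8 + [("male", False)] * 2
        + [("female", True)] * 6 + [("female", False)] * 4)
rates, ratios = selection_rates_and_impact_ratios(data)
# rates:  male 0.8, female 0.6
# ratios: male 1.0, female 0.75
```

Here the “female” category’s impact ratio is 0.6 / 0.8 = 0.75, i.e., female applicants were selected at 75% of the rate of the most-selected category. An actual bias audit must also cover race/ethnicity and intersectional categories, and scoring-rate calculations where the tool produces scores rather than selections.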
 See https://www.eeoc.gov/meetings/meeting-january-31-2023-navigating-employment-discrimination-ai-and-automated-systems-new/moore#_ftnref48 (last visited March 23, 2023).