On May 18, 2023, the U.S. Equal Employment Opportunity Commission (EEOC) released a technical assistance document titled “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964.” 

This follows the Commission’s May 2022 guidance on the use of AI tools and the ADA, which we covered at the time (here). The latest technical assistance document addresses potentially discriminatory impacts of artificial intelligence (AI) in employment selection procedures under Title VII. While the document does not establish new policies, businesses that use AI-driven tools should take it into account. This article briefly discusses the EEOC’s technical assistance document and its implications for businesses and employees.

Understanding AI in Employment Selection Procedures

The EEOC guidance clarifies that Title VII of the Civil Rights Act applies to all employment practices, with a specific focus on “selection procedures” such as hiring, promotion, and firing. It defines AI as a machine-based system that can make predictions, recommendations, or decisions based on human-defined objectives. 

The document provides various examples of AI tools used in employment selection, including:

  • Resume scanners
  • Employee monitoring software
  • Virtual assistants
  • Video interviewing software
  • Testing software

Disparate Impact and Disparate Treatment

The technical assistance document primarily addresses disparate impact discrimination resulting from AI-driven tools; it does not cover intentional discrimination (disparate treatment). In 1978, the EEOC adopted the Uniform Guidelines on Employee Selection Procedures (UGESP) to address how employers should determine whether a selection procedure has an adverse impact.

Disparate impact occurs when a facially neutral selection procedure disproportionately excludes individuals based on a protected characteristic, such as race, color, religion, sex, or national origin. Employers may still use such a tool if it is job-related and consistent with business necessity and there is no equally effective, less discriminatory alternative.

Highlights of the Technical Guidance

Although the technical assistance document does not have the force of law, employers that use AI screening tools should note the following: 

Liability for AI Tools Designed or Administered by a Vendor or Third Party

Employers may be held liable for disparate impact discrimination resulting from AI-powered selection tools, even if the tools were developed or administered by external vendors. The EEOC recommends that employers assess potential adverse impacts before relying on an outside party and ask vendors about their evaluations of the tool’s impact. Employers remain accountable even if a vendor provides incorrect information regarding adverse impact.

The “Four-Fifths Rule” Is Not Determinative

The “four-fifths rule” adopted in the UGESP is not the sole indicator of disparate impact. Under this rule, if the selection rate for a protected group is less than 80 percent (four-fifths) of the rate for the group with the highest selection rate, that difference generally indicates disparate impact. However, the EEOC cautions that smaller differences in selection rates may still amount to adverse impact, particularly where the tool makes a large number of selections or discourages certain applicants from applying.
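
To make the arithmetic concrete, here is a minimal sketch that computes selection rates and impact ratios for two hypothetical applicant groups. The group labels and counts are invented for illustration only, and the 0.80 threshold is simply the UGESP rule of thumb, not a bright-line legal test.

```python
# Minimal sketch of the four-fifths calculation described above.
# Group names and counts are hypothetical; this is an illustration,
# not an EEOC tool, and it does not replace a legal analysis.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Map each group to its selection rate: selected / applied."""
    return {group: selected / applied
            for group, (selected, applied) in outcomes.items()}

def impact_ratios(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Return each group's rate divided by the highest group's rate.
    Ratios under 0.80 are what the UGESP rule of thumb would flag."""
    rates = selection_rates(outcomes)
    highest = max(rates.values())
    return {group: rate / highest for group, rate in rates.items()}

if __name__ == "__main__":
    # Hypothetical data: (number selected, number who applied)
    data = {"Group A": (48, 80),   # 60% selection rate
            "Group B": (12, 40)}   # 30% selection rate
    for group, ratio in impact_ratios(data).items():
        flag = " <- below four-fifths" if ratio < 0.80 else ""
        print(f"{group}: impact ratio {ratio:.2f}{flag}")
```

In this example, Group B’s selection rate (30 percent) is half of Group A’s (60 percent), an impact ratio of 0.50, well below four-fifths. As the EEOC notes, however, a ratio above 0.80 does not by itself establish that a tool is lawful.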

Employers Should Self-Audit AI Tools

The EEOC encourages employers to conduct ongoing self-audits of AI selection tools to identify potential adverse impacts on protected groups. If a disparate impact is found, employers should consider modifying the tool to minimize it. However, employers should consult legal counsel to ensure compliance with both disparate treatment and disparate impact provisions of Title VII.

The Takeaway

AI is ushering in a brave new world for employers and employees alike. While the EEOC’s technical assistance document does not establish new policies, it offers valuable guidance for employers navigating the intersection of AI and employment discrimination law. And if you suspect that an employer or its AI hiring tools have treated you unfairly, talk to an experienced employment lawyer.