Employers’ use of artificial intelligence (AI) tools in making hiring and other employment decisions is drawing increased scrutiny. The Equal Employment Opportunity Commission (EEOC) published guidance on how employers’ use of AI tools may violate the Americans with Disabilities Act (ADA), and the New York City Council enacted a new law mandating a yearly bias audit of the tools, along with certain notice requirements. In light of this heightened focus, companies should carefully evaluate their use of AI tools to ensure they are compliant with current requirements.
EEOC Guidance on How the Use of AI Tools May Violate the ADA
The EEOC issued guidance warning employers that using algorithms and artificial intelligence in making hiring decisions can result in discrimination based on disability. These tools can include any software and applications that use algorithms to process data to evaluate, rate, or make other decisions about candidates and employees, such as resume scanners, employee monitoring software, chatbots, video interviewing software, and testing programs. While the EEOC’s guidance does not have the force of law, it serves as notice to employers so that they can avoid engaging in discriminatory action.
In the guidance, the EEOC focused on three situations in which an employer’s use of these algorithmic decision-making tools could violate the ADA: (i) the employer fails to provide a reasonable accommodation necessary for a candidate or employee to be rated fairly and accurately by the tool; (ii) the tool intentionally or unintentionally “screens out” an individual with a disability who is able to do the job with a reasonable accommodation; and (iii) the tool makes disability-related inquiries or seeks information that amounts to a medical examination in violation of the ADA.
New York City’s Law on the Use of Automated Employment Decision Tools
Companies with candidates or employees in New York City also should be aware that the city enacted a law, which will go into effect on January 1, 2023, governing the use of automated decision tools in making hiring and other employment decisions. The New York City Department of Consumer and Worker Protection has proposed rules to clarify certain provisions in the law. The proposed rules are open for public comment in advance of a public hearing on October 24, 2022.
Overview of New York City’s Law
The law requires any company that uses an “automated employment decision tool” for candidates or employees in New York City to conduct a yearly “bias audit” of the tool and publish the results on the company’s website. A “bias audit” for these purposes is an impartial evaluation by an independent auditor that must include an assessment of whether the tool has a disparate impact on individuals based on race/ethnicity or sex. The proposed rules, if adopted, would add details regarding the bias audit, including requirements for calculating the selection rate and the impact ratio for each category.
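For illustration only, the sketch below shows one common way a selection rate and an impact ratio could be computed for hypothetical candidate categories. The category labels, counts, and function names are assumptions made for this example; the law and the proposed rules leave the precise audit methodology to the independent auditor and the final rules.

```python
# Illustrative sketch only. Assumes a simple screening tool where candidates in
# each demographic category are either selected (advanced) or not. Categories,
# counts, and function names are hypothetical, not prescribed by the law or rules.

def selection_rate(selected: int, total: int) -> float:
    """Share of candidates in a category whom the tool selected to advance."""
    return selected / total if total else 0.0

def impact_ratios(counts: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Each category's selection rate divided by the highest category's selection rate."""
    rates = {cat: selection_rate(sel, tot) for cat, (sel, tot) in counts.items()}
    highest = max(rates.values(), default=0.0)
    return {cat: (rate / highest if highest else 0.0) for cat, rate in rates.items()}

# Hypothetical outcomes: {category: (number selected, number of candidates screened)}
outcomes = {
    "Category A": (48, 120),  # 40% selection rate
    "Category B": (30, 100),  # 30% selection rate
}

print(impact_ratios(outcomes))  # {'Category A': 1.0, 'Category B': 0.75}
```

A materially lower impact ratio for one category is the kind of disparity a bias audit of this type is designed to surface.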
An “automated employment decision tool” means “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” It does not include a tool that “does not automate, support, substantially assist or replace discretionary decision-making processes and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set, or other compilation of data.” Based on the text, the law likely covers any computerized tool or algorithm-based software used to evaluate, select, or assess candidates or employees, such as programs or software that review resumes, conduct testing, rank employees or candidates, evaluate performance, or conduct personality assessments.
Notice Requirements
The law also contains notice requirements. Employers who use an automated employment decision tool must, for each employee or candidate residing in New York City: (i) notify the employee or candidate of the fact that such tool will be used in connection with the assessment or evaluation of such employee or candidate; (ii) notify the employee or candidate of the job qualifications and characteristics that such tool will use in the assessment; and (iii) allow the employee or candidate to request an alternative selection process or accommodation. The notice must be provided at least 10 business days prior to the employer’s use of the tool. The rules propose details on how the company may furnish the notice to candidates and employees, such as by including the notice in the job posting or mailing or emailing the notice to the individual. Further, upon written request by an employee or candidate, the employer must, within 30 days of the request, make available the type of data collected, the source of the data, and the company’s data retention policy.
Under the text of the law, it is unclear who is considered a “candidate” entitled to notice. For instance, if an employer uses a program that screens LinkedIn profiles for particular qualifications, it is not clear whether every individual in New York City whose profile was reviewed would be considered a candidate who must receive the notice. The proposed rules would clarify this issue. Under the proposed rules, a candidate for employment is someone who has actually applied for a specific employment position by submitting the necessary information and/or items in the format required by the company. Therefore, individuals whose LinkedIn profiles are screened but who have not applied for employment would not be considered candidates entitled to notice.
The law specifically states that the notices must be provided to an employee or candidate who “resides in the city.” As a result, companies may need to determine where candidates reside to understand their notice obligations, rather than providing notice only to candidates for job postings located in New York City. It is not clear how companies can practically determine a candidate’s residence if they do not have a resume from the candidate containing a home address. Further, because the obligation turns on where the candidate resides, employers located outside of New York City must still comply with the law for any candidates or employees who reside in the city.
Penalties for Violations
Employers who violate this New York City law can be liable for up to $500 for a first violation and each additional violation that occurs on the same day as the first violation. Subsequent violations can result in penalties of between $500 and $1,500 each. Each day an automated employment decision tool is used in violation of the law is considered a separate violation, and the failure to provide any required notice to an employee or candidate also is considered a separate violation.
Next Steps for Employers
Given the varied ways use of AI tools may violate employment laws, employers are advised to discuss the particulars of their situation with employment counsel. In the meantime, employers should consider taking steps now in light of the guidance from the EEOC and, if covered, to prepare for the New York City law taking effect.
Wilson Sonsini Goodrich & Rosati’s employment and trade secret litigation group is actively following developments related to the use of artificial intelligence tools in making employment decisions. For more information, please contact Rico Rosales, Marina Tsatalis, Jason Storck, Rebecca Stuart, or another member of the employment and trade secret litigation group.