Employer algorithms and AI software can discriminate against applicants and employees with disabilities | McAfee & Taft

By Lance T. Lee

May 20, 2022

On May 12, 2022, the Equal Employment Opportunity Commission (EEOC) and the U.S. Department of Justice (DOJ) released guidance warning employers about using artificial intelligence (“AI”) and software tools to make employment decisions. The guidance, titled “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees,” warns that using these tools without safeguards could violate the Americans with Disabilities Act (ADA).

The ADA requires that people with disabilities have full access to public and private services and facilities. Federal law also prohibits employers from discriminating on the basis of disability and requires employers to provide reasonable accommodations to applicants and employees with disabilities to enable them to apply for or perform employment.

Employers warned against using certain decision-support technology tools

Companies are increasingly relying on online applications and computerized pre-employment assessments to make hiring decisions. These tools include applicant-screening software that prioritizes applications containing certain keywords or automatically filters out applications lacking certain qualifications, as well as testing software that rates applicants or employees on personality traits, aptitudes, or cognitive skills. Many employers also rely on automated software tools to monitor the locations, productivity, and performance of existing employees, and sometimes use these tools to make decisions about compensation, discipline, and termination.

The new guidance warns employers who rely on AI and software to make decisions about compensation, performance reviews, discipline, hiring, and firing that doing so can lead to workplace discrimination against applicants and employees with disabilities. This can happen because (a) the software tool used is sometimes not accessible to applicants or employees, or (b) the metrics measured by performance- or productivity-tracking software may not fairly or accurately reflect an applicant’s or employee’s ability to perform the requirements of the job with a reasonable accommodation.

Common accessibility barriers in web-based or computer-based tools include incompatibility with the screen-reading software used by blind users; poor color contrast, which can cause problems for people with low vision or color blindness; videos without captions for deaf or hard-of-hearing users; and the use of timers, which can cause problems for people with developmental disabilities or with dexterity issues that make it difficult to use a mouse or keyboard. Without adequate safeguards, applicants or workers with disabilities can be screened out before they even have a chance to apply or take an assessment.

As the guidance explains, a person’s disability can “prevent[] the algorithmic decision-making tool from measuring what it is intended to measure.” If an applicant is rejected because a disability resulted in a low or unacceptable rating, the applicant may effectively have been screened out because of the disability. The guide goes on to state: “For example, video interviewing software that analyzes applicants’ speech patterns in order to reach conclusions about their ability to solve problems is not likely to score an applicant fairly if the applicant has a speech impediment that causes significant differences in speech patterns.”

Along with its warnings about performance-tracking metrics, the EEOC cautions that employers should not use algorithms, software, or AI to make employment decisions without giving applicants and employees the opportunity to request reasonable accommodations. Such software and AI tools are typically designed to evaluate people against predefined specifications or against an average or ideal worker. As the technical guidance notes, “persons with disabilities do not always work under typical conditions if they are entitled to reasonable accommodation in the workplace.”

The guidance also cautions against using software that violates ADA restrictions on disability inquiries. The ADA only allows employers to inquire about an applicant or employee’s medical conditions in limited circumstances. An algorithmic decision-making tool that asks questions that may elicit information about medical conditions, or that directly excludes applicants with certain conditions, may violate the ADA.

Good practices for employers

Employers looking to use algorithms, AI, and other job-screening and performance-measurement software should ensure that applicants and employees are made aware of their ability to request an accommodation. Staff should be trained to recognize and respond to accommodation requests, which often do not use the word “accommodation.” As the technical guidance notes, such requests could include asking to take a test in an alternative format or to be assessed in an alternative way. Employers should strive to minimize the risk that these tools will disadvantage people with disabilities, including by seeking software that has been tested by users with disabilities, providing clear instructions for requesting accommodations, and avoiding screening for traits that may reveal disabilities. A best practice is to use the tools only to measure qualifications that are truly necessary for the job, and to measure those qualifications directly rather than through proxy characteristics or personality-assessment scores. The vendor from whom the tool is purchased should also be able to confirm that the tool does not solicit information about an applicant’s medical conditions.

Employers would do well to remember that applicants and employees are humans, and sometimes decisions about humans should be made by humans, not computers.
