New EEOC guidance raises concerns about AI in the hiring process

A friend of mine has a stellar résumé, but can’t get it through prospective employers’ algorithmic screening processes, probably because it’s too easy to infer he has a disability and would need an accommodation to work. Artificial intelligence isn’t inherently bad, according to the EEOC, which issued guidance addressing the interaction of AI and the Americans with Disabilities Act in the workplace earlier this month. But you must use AI carefully, especially when hiring or evaluating employee performance.

Using AI in the reasonable accommodation process

You may tell job applicants or employees what steps an evaluation process includes and ask whether they'll need reasonable accommodations to complete it. Even if you don't notify everyone beforehand, the EEOC says you must still provide a reasonable accommodation when one is requested, including for individuals whose disabilities aren't obvious. In those cases, you may request supporting medical documentation.

No magic words: The EEOC says job applicants or employees have requested a reasonable accommodation whenever they tell you a medical condition may make it difficult to take an employment-related evaluation or may result in a score or assessment you find less than acceptable.

Best practices to help employers meet this requirement include:

  • Training staff to recognize and process requests for reasonable accommodation as quickly as possible, including requests to retake a test in an alternative format or requests to be assessed in an alternative way after an individual has received poor results.
  • Training staff to develop or obtain alternative means of rating job applicants and employees when a current evaluation process is inaccessible or otherwise unfairly disadvantages someone who has requested a reasonable accommodation because of a disability.
  • Instructing any agent that administers the AI-based evaluation on your behalf to forward accommodation requests to you promptly, or authorizing that third party to provide the reasonable accommodations itself.

Using AI to intentionally or unintentionally screen out an individual with a disability

Under the ADA, you can’t use selection criteria that screen out job applicants or employees with disabilities who could do the job with a reasonable accommodation. AI could screen out an individual because of a disability if the disability causes them to receive a lower score or a less than acceptable assessment result and they lose a job opportunity as a result.

Examples:

  • A chatbot rejects job applicants who indicate during the conversation that they have significant gaps in their employment history due to cancer treatment (a hypothetical sketch of this kind of rule follows these examples).
  • An applicant with a speech impediment isn’t likely to score highly on video interviewing software designed to analyze applicants’ speech patterns to determine their problem-solving ability.
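
To make the screening mechanics concrete, here is a minimal, purely hypothetical Python sketch of the kind of rule the chatbot example describes. Nothing in it comes from the EEOC guidance: the Applicant fields, the six-month cutoff, and the passes_screen function are all invented for illustration. The rule rejects anyone whose longest employment gap exceeds a fixed cutoff and never asks why the gap exists, which is how a disability-related absence such as cancer treatment can end in a rejection.

    # Hypothetical illustration only; not an actual hiring tool or EEOC example code.
    from dataclasses import dataclass

    @dataclass
    class Applicant:
        name: str
        longest_gap_months: int   # largest gap between jobs on the resume
        gap_reason: str           # e.g., "cancer treatment", "relocation"

    MAX_GAP_MONTHS = 6  # arbitrary cutoff assumed for this sketch

    def passes_screen(applicant: Applicant) -> bool:
        # The rule looks only at the length of the gap, never the reason,
        # so a disability-related absence is rejected like any other gap.
        return applicant.longest_gap_months <= MAX_GAP_MONTHS

    candidates = [
        Applicant("A", longest_gap_months=2, gap_reason="between contracts"),
        Applicant("B", longest_gap_months=14, gap_reason="cancer treatment"),
    ]

    for c in candidates:
        print(c.name, "advances" if passes_screen(c) else "rejected")

Because the cutoff is applied without regard to cause, the applicant whose gap reflects medical treatment is screened out even though the gap says nothing about their ability to do the job, which is precisely the disparate effect the guidance warns about.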

Beware a claim of “validation.” Software developers may say their algorithms don’t discriminate on the basis of a disability because they’re validated. This isn’t enough for the EEOC. To say AI has been validated means it meets certain professional standards showing it accurately measures or predicts a trait or characteristic important for a specific job. But the AI may still be inaccurate when applied to job applicants with disabilities.

Example:

  • An assessment of memory based on a computer game may be validated because it accurately measures most people’s memory, yet it may still screen out applicants who have good memories but are blind.

Best practices to help employers meet this requirement include:

  • Using AI designed to be accessible to individuals with as many different kinds of disabilities as possible.
  • Informing all job applicants and employees who are being rated of the available reasonable accommodations and providing clear and accessible instructions for requesting an accommodation.
  • Describing, in plain language and in accessible formats, the traits the AI is designed to assess, the method by which those traits are assessed, and the variables or factors that may affect the rating.
  • Ensuring the AI instrument measures only the abilities or qualifications truly necessary for the job, even for people who are entitled to an on-the-job reasonable accommodation.
  • Measuring applicants’ abilities or qualifications directly, rather than using characteristics or scores correlated with those abilities or qualifications.

Using AI to screen out job applicants or employees in violation of the ADA’s restrictions on disability-related inquiries and medical examinations

You might also violate the ADA if your AI poses disability-related inquiries or conducts what amounts to a medical examination before you make a conditional offer of employment. This type of violation may occur even if a job applicant doesn’t have a disability.

Here, the EEOC’s best practice is to ask your AI vendor to confirm that its algorithm doesn’t ask applicants or employees questions likely to elicit information about a disability, or seek information about an individual’s physical or mental impairments or health, unless those inquiries relate to a request for reasonable accommodation.