Restricting Use of AI in Employment Applicant Screening

September 7, 2022 by Cheryl Gernand

AI is exploding. It is being put to use in many different industries and toward many different applications, including various human resources-related functions. As we previously discussed, the use of AI can be extremely helpful to employers, if done properly.

Recently, New York City became one of the first jurisdictions to address the proliferation of AI tools in the employment context. NYC passed a local ordinance prohibiting employers from using AI tools, which it calls Automated Employment Decision Tools, unless the employer takes certain steps to “vet” and disclose use of the tool.

The local law goes into effect on January 1, 2023 and will apply to employers operating in New York City that target NYC residents. It will require employers to subject the AI screening tool to a validation test, which the ordinance calls a bias audit. The bias audit must be conducted by an impartial auditor and must evaluate the tool's potential disparate impact on protected traits. In addition, the employer must post a summary of the bias audit on the employer's website.
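To make the disparate-impact concept concrete, the sketch below shows one common way such a check might be computed: comparing each group's selection rate to the highest group's rate and flagging ratios that fall below the widely cited four-fifths (80%) heuristic. The function name, data layout, and threshold here are illustrative assumptions only; the ordinance does not prescribe this particular calculation, and an actual bias audit's methodology is left to the auditor.

```python
from collections import defaultdict

def impact_ratios(outcomes, threshold=0.8):
    """Compute per-group selection rates and impact ratios.

    `outcomes` is a list of (group, selected) pairs, where `selected` is True
    if the tool advanced the applicant. The 0.8 threshold reflects the common
    "four-fifths rule" heuristic, not a standard set by the NYC ordinance.
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in outcomes:
        counts[group][0] += int(selected)
        counts[group][1] += 1

    rates = {g: sel / total for g, (sel, total) in counts.items()}
    best = max(rates.values())  # highest selection rate across groups
    return {
        g: {"selection_rate": round(r, 3),
            "impact_ratio": round(r / best, 3),
            "flagged": (r / best) < threshold}
        for g, r in rates.items()
    }

# Made-up numbers for illustration: 40 of 100 applicants in group A advanced,
# 25 of 100 in group B. Group B's impact ratio is 0.625, below the 0.8 heuristic.
sample = ([("A", True)] * 40 + [("A", False)] * 60 +
          [("B", True)] * 25 + [("B", False)] * 75)
print(impact_ratios(sample))
```

A real audit would look at actual protected categories, larger samples, and statistical significance, but even this simple ratio illustrates the kind of screening-out effect the ordinance is designed to surface and disclose.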

Employers using Automated Employment Decision Tools will also need to provide certain notices to applicants. Employers will need to notify applicants at least 10 days in advance that the AI tool will be used and must allow the applicant to request a different testing/screening method. The employer must also give notice of the characteristics that the AI tool will assess, the types of information that will be collected, and how that information will be retained by the employer.

This approach seeks to directly address two of the most significant concerns with the use of AI tools in the screening process: (1) the potential to accidentally or unintentionally screen out applicants based on a protected trait (disparate impact); and (2) potential disability-related accommodation obligations in the hiring process. Recall that we took a close look at these risks in our previous post. The ordinance goes further and addresses applicant privacy concerns as well.

As with many new developments, employers relying on tried-and-true HR best practices will be ahead of the curve. That's because it has long been a best practice to require any pre-employment screening tool to be properly validated to protect against disparate impact concerns. Organizational psychologists have been talking about this type of validation for years, and many reputable third-party providers of applicant screening tools have already conducted similar validation tests or bias audits.

While the NYC ordinance may be the first local or state law to address the use of AI in the hiring process, we certainly do not think it will be the last. Employers adopting these tools will need to be ready to comply with this changing legal landscape as the AI tools themselves continue to develop and increase in popularity.

 

September 2022, McNees Wallace & Nurick
