Aug 22, 2023
Lawmakers and regulators are increasingly scrutinizing whether AI hiring tools unintentionally perpetuate bias. While these tools can streamline hiring processes, they may pose a compliance risk if they discriminate against certain candidates.
From basic computer screening to advanced AI, various tools are used to evaluate employment candidates. These include resume scanners, employee monitoring software, virtual assistants, video technologies and job-fit algorithms that assess candidates' potential success in certain positions.
However, these tools can introduce discrimination into the hiring process, which is why legislators at the federal, state and local levels are focusing on AI in employment. Although there is no consensus on what constitutes AI and how it contributes to discrimination in hiring, upcoming laws will aim to address these concerns.
Under a New York City law that took effect this year, employers are banned from using AI in employment decisions unless they take affirmative steps to mitigate potential bias. These steps include conducting a bias audit of the AI tool, publishing a summary of the audit results and notifying job candidates that the tool is being used. Candidates and employees must also be allowed to request an alternative evaluation process or accommodation if needed.
What is a bias audit?
A bias audit involves having an independent person or group, one not involved in developing or using the AI tool, review it for potential bias. Further details on what the audit must entail are expected. A key threshold is whether the AI tool substantially assists or replaces discretionary employment decisions, for example by overruling human decision-making.
In these instances, the bias audit must calculate the selection rate for each race/ethnicity and sex category that employers report to the Equal Employment Opportunity Commission. An impact ratio is then computed for each category by dividing its selection rate by the rate of the most selected category. Where the tool produces a score rather than a selection, the average score of individuals in a given category is compared with the average score of the highest-scoring category.
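To make that arithmetic concrete, the sketch below shows one way the selection rates and impact ratios could be computed. The data, category labels and approach are illustrative assumptions only, not a prescribed audit methodology.

```python
# Minimal sketch of the impact-ratio arithmetic described above.
# The outcomes, category labels and column layout are hypothetical.
from collections import defaultdict

# Hypothetical screening outcomes: (category, selected) pairs, where
# "category" is a race/ethnicity or sex category and "selected" marks
# whether the tool advanced the candidate.
outcomes = [
    ("Category A", True), ("Category A", True), ("Category A", False),
    ("Category B", True), ("Category B", False), ("Category B", False),
]

# 1. Selection rate per category: selected candidates / total candidates.
totals, selected = defaultdict(int), defaultdict(int)
for category, was_selected in outcomes:
    totals[category] += 1
    selected[category] += int(was_selected)
selection_rates = {c: selected[c] / totals[c] for c in totals}

# 2. Impact ratio per category: selection rate divided by the rate of
#    the most selected category.
top_rate = max(selection_rates.values())
impact_ratios = {c: rate / top_rate for c, rate in selection_rates.items()}

for category in selection_rates:
    print(f"{category}: selection rate {selection_rates[category]:.2f}, "
          f"impact ratio {impact_ratios[category]:.2f}")
```

For a tool that scores candidates instead of selecting them, the same structure applies, with average scores per category in place of selection rates and the highest-scoring category as the denominator.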
Government responses to AI
Various government bodies are scrutinizing the use of AI and automated tools in employment decision-making. The EEOC has issued technical guidance warning employers that the use of these tools could violate disability protections under the Americans with Disabilities Act.
The Department of Commerce has appointed 27 members to the National Artificial Intelligence Advisory Committee, which advises the President on AI-related issues. The White House has released its Blueprint for an AI Bill of Rights, which includes provisions related to employment decisions made with AI tools.
The Federal Trade Commission has also issued reports expressing concern about the use of AI and automated tools in decision-making without human review. In addition, two pending federal bills would require the FTC to mandate impact assessments of automated hiring systems and algorithm usage.
Several states are taking action
California has proposed draft revisions that would expand existing discrimination and employment laws to address liability risks for employers and for vendors that sell or administer AI-powered automated hiring tools. Colorado, Vermont and Washington have created task forces to study AI, and Illinois and Maryland have enacted laws regulating these tools to some extent, though not as comprehensively as New York City.
When it comes to using AI in hiring, keep a few things in mind. The definition of AI is broad and often unclear, so work with legal counsel to determine whether an automated hiring tool falls within it. When using automated hiring decision tools, plan for data retention, work with third-party vendors to ensure compliance with any new laws or regulations, and be prepared to address implementation issues.
Although automated employment decision tools can improve efficiency, their use in employment decision-making will soon require transparency and accountability, as well as clear standards for auditing the process.