NYC’s AI Law Affects Organizations Differently
Confusion is brewing among HR executives since New York City’s law governing the use of artificial intelligence in HR went live in July.
Some leaders I’ve talked to have different impressions, interpretations, and levels of risk tolerance. I’m hearing things like “How much are these audits going to cost?” and “We don’t know what we don’t know yet.”
That’s fair. After all, it’s the first law of its kind in the country; there are bound to be questions.
Are similar conversations coming up in your C-suite? If so, what are your organization’s main concerns with the audit requirements for automated employment decision tools? Are you huddling with your general counsel and data analytics teams to figure out what they can and can’t do with AI?
In the spirit of knowledge-sharing, I want to tee up a discussion by sharing some of the feelings that others at your senior level are having about this new regulatory landscape. You, too, likely have someone on your team who’s using one of several best-fit selection tools in their talent acquisition process, including a psychometric assessment.
Either way, if these leaders have concerns, chances are you have the same questions, too.
To Interview Or Not?
A leader from a global payment processing company said there was hesitancy about using a best-fit tool to interview a candidate if the tool was used as the main data point vs. one of several data points.
“That’s the main argument here,” she said. “What other decisions should go into that? How would a recruiter use that (tool) and make sure that it is just one data point versus the main one?”
I found it helpful and interesting to hear that her organization would actually like to use an assessment tool but needs to determine its risk tolerance first. Is the tool in scope of the law or not? Her team is starting to make the case for using it within their talent CRM and on the sourcing side, but is more hesitant about using AI-enhanced tools in the interview process.
What can we take away from her comments?
There are appropriate uses of best-fit tools, and they’re not necessarily relevant to every workflow. But they should at least inspire us to start thinking about integrating them in a useful way for interviews.
That’s exactly what one major bank is doing. It uses an automated tool as one way to prioritize candidates to potentially interview. “It’s clearly not a decision maker,” a leader there told me. “We still are measuring diverse candidate pools.”
I heard quite a bit of back-and-forth in the HR community about the audit requirements. NYC’s law basically says that organizations using AI or similar technology in their hiring process must conduct an annual audit. The audits must be performed by a third party and check for instances of bias, whether intentional or not.
The price tag for the audits was brought up by a senior HR leader at a tech company with some 60,000 employees. A twice-yearly audit by a third party would have cost the organization about $100,000, which is not petty cash, even for a multi-billion-dollar tech powerhouse.
The other choice is going with an internal audit team, but finding people with that level of expertise remains a challenge in a tight labor market.
Outside or internal auditor? “Either way it goes, this is a pretty decent price tag,” she said.
Another leader pointed out that her organization simply doesn’t have the infrastructure in place for a team that’s qualified to conduct audits. “Actually getting the funding to hire the person and getting them on board, I mean, how many roles would you do that for?” was her concern.
“Companies that don't have that infrastructure built up — that will be the biggest hurdle for us to really take advantage of the (AI) technology, but the appetite is definitely there,” she added.
Phenom’s Bias Audit
I’ve received numerous inquiries about how the law affects Phenom. It’s worth pointing out that our Fit Score (which helps recruiters decide whether or not to invite a candidate to a job interview) does not meet the definition of an “automated employment decision tool” if it is used properly, that is, if it is not being used to substantially assist or replace discretionary decision making. It is just a quick, at-a-glance summary of whether a candidate potentially meets enough criteria to be interviewed.
We did an internal audit on a significant data set: 650,000 applicants, 6,000 jobs, and more than 20 job classifications. We were looking at something called “adverse impact,” which refers to the tendency of certain policies, practices, or procedures to negatively affect protected classes. In plain terms, it asks: are you making decisions that favor one demographic group over another?
Phenom can say definitively that over 97% of the jobs that we considered had no adverse impact for gender, race, or ethnicity.
The remaining two-something percent showed less adverse impact than you would expect from chance alone.
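For readers curious what an adverse-impact check actually computes, here is a minimal sketch of the widely used “four-fifths rule”: a group whose selection rate falls below 80% of the highest group’s rate gets flagged for review. The numbers below are hypothetical illustrations, not Phenom’s audit data, and real bias audits under the NYC law involve more than this single ratio.

```python
# Sketch of a four-fifths-rule adverse-impact check.
# Counts are hypothetical examples, not real audit data.

def selection_rates(outcomes):
    """outcomes maps group -> (selected, total applicants); returns group -> rate."""
    return {group: sel / total for group, (sel, total) in outcomes.items()}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Return group -> (impact ratio, flagged?).

    Impact ratio = group's selection rate / highest group's selection rate.
    A ratio below the threshold (conventionally 0.8) is flagged.
    """
    rates = selection_rates(outcomes)
    top_rate = max(rates.values())
    return {
        group: (rate / top_rate, rate / top_rate < threshold)
        for group, rate in rates.items()
    }

# Example: group_b is selected at 20% vs. group_a's 30%,
# an impact ratio of about 0.67, which falls below 0.8 and is flagged.
counts = {
    "group_a": (120, 400),  # 30% selected
    "group_b": (90, 450),   # 20% selected
}
print(adverse_impact_flags(counts))
```

The same calculation is typically repeated per job classification and per protected category (gender, race, ethnicity), which is why audits over hundreds of thousands of applicants are substantial undertakings.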
What’s On Your Mind?
I’d like to hear about some of the discussions taking place in your organization about the new AI regulatory environment since other states are following NYC’s lead. Thoughts? Concerns? Connect with me on LinkedIn and let’s talk.
Jess Elmquist is the Chief Human Resources Officer and Chief Evangelist at Phenom. In a previous career as the Chief Learning Officer at Life Time, the healthy way of life company, Jess hired more than 200,000 people and spoke to hundreds of his executive peers about talent trends.