Regulating the Future: A Conversation on AI Legislation
AI has been revolutionary in many ways, but without an understanding of the laws in place to regulate its use, organizations put themselves at risk.
Josh Goren, Senior Director of Contracts and Legal Compliance at Phenom, joined us on Talent Experience Live to share his knowledge on the rules and regulations that govern the use of AI. Learn about emerging legislation affecting the use of AI in HR and Goren’s advice for staying out of “trouble” in applying this technology.
Get the highlights below, or watch the full episode right here.
AI is in our lives in many forms these days, from work-related applications to art, entertainment, and online shopping. Generative AI (GenAI for short) specifically has been on the rise, but with this mass adoption comes the need for guardrails to ensure it’s used for good. These precautions are coming in the form of new laws and regulations, according to Goren, who ensures legal compliance for Phenom.
How have you seen AI used irresponsibly?
In his own field, Goren has seen lawyers use GenAI to write their legal arguments. In at least one circumstance, the judge questioned the lawyer about some cases cited in a claim. “As it turned out, the Generative AI had just simply made up the cases. The lawyer did not do their due diligence in checking the work that they gave,” Goren said.
“That’s a pretty flagrant, terrible use of GenAI,” he added, as well as a great example of why we always need humans in the loop to keep tabs on the AI they’re using. This type of fabricated information, known as “hallucinations” in the world of AI, remains a major challenge that needs to be solved.
What type of legislation is emerging to regulate AI?
Currently, more than 100 jurisdictions are drafting and implementing rules and regulations, Goren said. “It’s a hot topic, and it’ll affect people all around the world in various industries.”
New York City Local Law 144 (NYC 144) is perhaps the most prominent piece of legislation regarding AI and HR practices. The law, which took effect in July 2023, requires that employers perform a bias audit if they are using automated employment decision tools (AEDTs) to guide their hiring decisions. An AEDT, as defined by nyc.gov, is a computer-based tool that uses machine learning, statistical modeling, data analytics, or artificial intelligence to substantially help with employment decisions.
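To make the audit requirement concrete: the rules implementing NYC 144 center on an “impact ratio,” where each demographic category’s selection rate is divided by the selection rate of the most-selected category. The sketch below illustrates that calculation only; the category names and counts are hypothetical, and a real bias audit has additional scope and reporting requirements not shown here.

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a category who were selected."""
    return selected / applicants


def impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """For each category, divide its selection rate by the highest
    selection rate across all categories (the NYC 144 impact ratio)."""
    rates = {g: selection_rate(sel, total) for g, (sel, total) in groups.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}


# Hypothetical audit data: category -> (candidates selected, total candidates)
data = {"group_a": (48, 120), "group_b": (30, 100)}
print(impact_ratios(data))  # group_a: 1.0, group_b: 0.75
```

A ratio well below 1.0 for a category is the kind of disparity an audit is meant to surface and explain.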
“The law is attempting to eliminate bias in hiring practices that may stem from such tools,” said Goren, adding that there had been a certain amount of resistance on the part of tech vendors and some companies. However, “companies should be auditing their hiring processes regardless,” he pointed out.
Why the pushback? Cost may come into play if companies don’t want to invest in any new infrastructure necessary to implement the auditing process. “But I see that as being pretty shortsighted — the cost resulting from lawsuits and claims about biased hiring would be far greater, and if you consider the PR hit that could follow, it could be quite devastating,” Goren added.
Although NYC 144 applies only to companies within the city’s borders, it’s worth taking note of for any employer using an AEDT, Goren said. He used the EU’s GDPR data privacy laws as an example: when GDPR went live, the requirements technically applied only to European companies, but he made sure Phenom brought its practices into compliance with them.
Laws like these are the writing on the wall for what’s to come, he noted. In fact, several other localities are considering similar legislation. “It’s always best to be in compliance with the law, because this is going to be a growing trend,” he said.
What are some positive trends you’re seeing in AI?
Used correctly, generative AI makes a great assistant. In fact, Goren said he uses it to help him structure his thoughts and find information for presentations more quickly. The caveat? AI’s role should remain as assistant, not chief creator. Think of those lawyers mentioned earlier as a cautionary tale.
Then there are the fun ways to use it:
AI image generators
AI-generated entertainment like the comedy skit featuring “Tom Brady” (which in itself is a bit of a cautionary tale, considering Brady is suing the creators)
In the realm of HR, a positive use case for AI is its ability to relieve talent acquisition and talent management professionals of administrative tasks like manual sourcing, screening, and scheduling so they can focus on more strategic work and relationship building.
Related reading: Introducing Phenom X+: Generative AI for HR
What are your key takeaways on AI regulations?
“With any technology, get to know it first. Don’t rely on it completely,” Goren said. “Remember the episode of The Office where Michael blindly followed GPS right into a lake? You don’t want to be the lawyer citing fake cases, and you don’t want to drive your car into a lake,” he said.
“We’re in the infancy of Generative AI, and there’s obviously a ton of unknowns about the potential consequences of its use,” Goren said. “Never completely rely on it … make it a tool that enhances your work, and make sure the output you’re getting is correct.”
If you want to go a step further and dig into Phenom’s data models powering HR technology, impact and use cases of AI across industries, regulatory compliance, and integrations that streamline data into models, you can view AI Day, Phenom’s annual AI tech showcase for HR, right here.
*The information provided on this website does not, and is not intended to, constitute legal advice. All information, content, and materials available on this site are for general informational purposes only.
Maggie is a writer at Phenom, bringing you information on all things talent experience. In addition to writing, she enjoys traveling, painting, cooking, and spending time with her family and friends.