Artificial intelligence is being used in almost every aspect of business, and human resources is no exception. But while AI increases efficiency in sorting through job applications and evaluating the performance of employees, it can also present pitfalls for organizations trying to diversify their staff and their customer bases, according to panelists at Providence Business News' 2024 Diversity, Equity and Inclusion Summit and Awards program on Dec. 5.

There has been an increased focus on ensuring these technologies do not perpetuate biases, especially as AI tools become more integrated into hiring practices, performance evaluations and decision-making processes, said moderator Kevin Matta, board president for Diversity and Inclusion Professionals Inc. and senior director of people and culture for United Way of Rhode Island Inc.

Matta noted that AI use - from health care to banking - has led to increasing concern among DEI practitioners that it could "take over jobs or take over workplaces," and he cautioned that complex data sets will almost certainly contain biased information, dooming the output to prejudice.

While it is true that AI-produced data can play a critical role "in seeing the totality of a human being ... it is also important to consistently understand how to leverage this information," he said. "It has to be discreet."

AI is now being used across the public and private sectors to streamline the search and decision-making process for human resources departments, scanning resumes for designated keywords. But data provided by AI is never 100% neutral; it is gleaned from historical data generated by human beings, so unseen discriminatory patterns and habits can be tough to break.

It can be a double-edged sword, said the panelists, who are finding the best ways to apply this emerging, increasingly complex technology while also promoting progress toward DEI goals.

A 2023 Pew Research Center poll found that 71% of Americans opposed using AI to make final hiring decisions, and a majority were against AI analysis being used in firing decisions. A majority also opposed using AI to review job applications or to determine whether a worker should be promoted.

Talia Brookshire, Neighborhood Health Plan of Rhode Island's chief diversity officer, said that since the organization began embedding DEI into its structure, diversity hiring increased 41% from 2019 to 2023, with people of color now comprising 43% of its workforce.

AI used in one capacity can be double-checked by using it in another way, said Neenee Shin, CBIZ Inc.'s director of diversity, equity and inclusion. The company uses Microsoft Copilot to scan applications from all job postings. "There could already be biases in your candidate pool in the way you write your job descriptions," she said.

To protect against AI-driven backsliding, HR specialists advise remedies such as assembling diverse teams with a cross-section of perspectives and backgrounds from across the organization to oversee AI systems. Or companies can simply hire an auditor.

Shameem Awan, assistant vice president of talent management and DE&I at Amica Mutual Insurance Co., said the company uses a third-party auditor and has made sure to limit the use of AI to tasks where the risk of creeping bias is lower.

"It is super helpful when it comes to [maximizing] efficiencies," she said. "But we need to be careful about using it too much when we are looking at things like talent acquisition. We use that [auditor] to make sure we are doing what we should be doing and are on the right track."

There are plenty of examples of things going haywire. Researchers at Lehigh University in Pennsylvania looked at 6,000 sample loan applications curated from 2022 Home Mortgage Disclosure Act data and found that white applicants were 8.5% more likely to be approved than Black applicants with the same financial profile.

Carolyn Belisle, vice president of corporate social responsibility at Blue Cross & Blue Shield of Rhode Island, said a detailed policy on AI should frame its use as more of a guide than a catchall solution.

"It's a great way to start to chart a course or a path. Or to compensate people for the extra things we ask them to do," she said. "It is a tool. But it is important we do not overuse it."

As with any emerging technology, organizations must learn to harness its power rather than be consumed by it.

"This is something new," Matta said. "For organizations that are not so resource rich, it is wise to have an outside group come in and test it to make sure policies are inclusive enough. Or [determine if they are] exclusive."
PBN summit panel: Even AI has biases that can hamper diversity
By Christopher Allen