As artificial intelligence (AI) advances in business, the most successful adopters are taking a holistic approach to AI, according to PwC's 2022 AI Business Survey. Comprising 36% of respondents, these “AI leaders,” as PwC calls them, use AI to focus simultaneously on business transformation, improved decision-making, and systems modernization rather than tackling one area at a time.
These organizations and others are also beginning to use AI to inform more complex business decisions around diversity, equity, and inclusion (DEI). In fact, 46% of AI leaders use AI to make workforce decisions that include DEI, compared to 24% of other companies, according to PwC.
"Companies are using AI for recruiting and hiring, as well as retention and engagement," said Bret Greenstein, AI and data analytics partner at PwC and co-author of the report.
The disastrous past of AI in hiring
While many companies are experimenting with AI as a tool to assess DEI in these areas, Greenstein noted, they are not delegating these processes entirely to AI, but augmenting them with AI. Part of the reason for their caution is that, in the past, AI often did more harm than good in terms of DEI in the workplace, as biased algorithms discriminated against women and non-white applicants.
“There has been a lot of news about the impact of bias on algorithms looking to identify talent,” Greenstein said. For example, in 2018, Amazon was forced to scrap its secret AI recruiting tool after the tech giant realized it was biased against women. And a 2019 Harvard Business Review article concluded that AI-enabled recruiting algorithms can introduce anti-Black bias into the process.
AI bias is caused, often unconsciously, by the people who design AI models and interpret the results. If an AI is trained on biased data, it will in turn make biased decisions. For example, if a company has hired mostly white software engineers with degrees from certain universities in the past, a hiring algorithm might favor candidates with similar backgrounds for open engineering positions.
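As a toy illustration of that feedback loop (all names and numbers here are hypothetical, not any vendor's actual system): a naive scorer built on skewed hiring history simply reproduces the skew, favoring candidates who resemble past hires.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (university, hired).
# The past data over-represents successful hires from "univ_a".
history = [
    ("univ_a", True), ("univ_a", True), ("univ_a", True), ("univ_a", False),
    ("univ_b", True), ("univ_b", False), ("univ_b", False), ("univ_b", False),
]

def hire_rate(records):
    """A naive 'model' that scores candidates by the historical hire
    rate of their university -- it can only echo past patterns."""
    totals, hires = defaultdict(int), defaultdict(int)
    for school, hired in records:
        totals[school] += 1
        hires[school] += hired
    return {s: hires[s] / totals[s] for s in totals}

rates = hire_rate(history)
print(rates)  # {'univ_a': 0.75, 'univ_b': 0.25}

# Two equally qualified candidates get very different scores purely
# because of where past hires happened to come from.
assert rates["univ_a"] > rates["univ_b"]
```

Nothing in the data says univ_b graduates perform worse; the gap in scores is inherited entirely from who was hired before.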
As AI developers become more aware of the potential for bias built into recruiting and hiring software, they can work to protect against it. In fact, 45% of organizations identified by PwC as leaders in AI said they planned to address fairness issues in their AI systems in 2022.
“I think using AI [for DEI] will move from experimentation to production for recruiting and hiring as people better understand and identify biases and understand how to better assess future performance,” Greenstein said.
Use AI to highlight biases
According to Gartner, 62% of HR managers say they use DEI data as input to talent management processes such as hiring and performance management. However, few use it to effectively influence management decisions regarding workers. To create a diverse, equitable and inclusive workforce, HR leaders must better integrate DEI data strategies into everyday employee experience practices, said Emily Strother, research director at Gartner.
Organizations are increasingly integrating AI technology into their talent acquisition and management processes to highlight potential bias, Strother said. “In particular, we see it in the way [they] manage hiring and how [they] work with performance management. This is one of the places where organizations worry the most about bias, but AI can help.”
For example, some companies are using AI-powered tools to identify biased language hiring managers might use when interviewing candidates. Corrective measures could include setting up bias reminders throughout the interview process or alerting managers when their language is biased or potentially unfair, Strother said.
Manager biases can also creep in when it comes to setting goals for employees. AI can help by comparing employee goals to others with the same seniority and then alerting managers if they consistently assign lower or less important goals to certain workers.
"It helps managers realize some of their unconscious goal-setting biases and helps them correct their behaviors," Strother said.
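A minimal sketch of that kind of goal-gap check, assuming a made-up record format of (employee, seniority level, goal weight) where goal weight approximates how significant an assigned goal is:

```python
from statistics import mean

# Hypothetical goal assignments: (employee, seniority_level, goal_weight).
goals = [
    ("ana", 2, 8), ("ana", 2, 7),
    ("ben", 2, 4), ("ben", 2, 3),
    ("cho", 3, 9), ("cho", 3, 8),
]

def flag_goal_gaps(records, threshold=0.75):
    """Flag employees whose average goal weight falls well below the
    average for peers at the same seniority level."""
    by_level, by_employee = {}, {}
    for emp, level, weight in records:
        by_level.setdefault(level, []).append(weight)
        by_employee.setdefault(emp, (level, []))[1].append(weight)
    flagged = []
    for emp, (level, weights) in by_employee.items():
        if mean(weights) < threshold * mean(by_level[level]):
            flagged.append(emp)
    return flagged

print(flag_goal_gaps(goals))  # ['ben']
```

The flag is only a prompt for a manager to review the pattern, not a verdict; the 0.75 threshold is an arbitrary illustrative choice.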
AI can also help organizations ensure that their job postings are as unbiased as possible. “We are seeing organizations using AI to review some of the job sites, like LinkedIn or Indeed, to ensure that the language they use when posting [open jobs] is accurate and consistent with the skills [needed for the job], and AI can [flag bias],” Strother said.
Kay Formanek, founder and CEO of diversity education company KAY Diversity and Performance and author of Beyond D&I: Leading Diversity with Purpose and Inclusiveness, offers an example. "If a company says, 'We're looking for a driven leader, we're looking for someone who's ambitious, we're looking for someone who's going to deliver results,' we call that masculine-coded language, and research has shown that women will tend to withdraw" even when they are well qualified for the job, she said.
According to Formanek, women respond to more feminine-coded language, such as: "We are looking for a leader who, together with the team, supports the company's growth agenda. We are looking for someone to build a team."
AI can help companies remove biased language from their job postings and send alerts when wording may be gender-coded or tied to skill sets that could exclude qualified candidates from more diverse or underrepresented backgrounds, according to Strother.
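A stripped-down sketch of such a job-posting audit. The word lists below are short illustrative samples loosely inspired by published research on gender-coded job-ad language; a production tool would rely on validated lexicons and context-aware models rather than bare keyword matching.

```python
import re

# Illustrative (not validated) gender-coded word lists.
MASCULINE_CODED = {"driven", "ambitious", "dominant", "competitive", "fearless"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "together", "community"}

def audit_posting(text):
    """Return the gender-coded terms found in a job posting."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We're looking for a driven, ambitious leader who delivers results."
print(audit_posting(ad))  # {'masculine': ['ambitious', 'driven'], 'feminine': []}
```

An alerting workflow would surface these matches to the posting's author with suggested neutral alternatives, rather than rewriting the ad automatically.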
"It's very important," Formanek said. "Because if you don't, you're going to alienate people who are very important to your diversity."
Use AI to identify disengaged employees
PwC's Greenstein sees great potential for AI in worker retention. Retaining employees is key to business success, he said. The factors that drive people to leave a company have a lot to do with workers feeling marginalized, disconnected and disengaged.
Companies can use AI to identify departments or roles at high risk of burnout, dissatisfied or disengaged workers, and even people who feel isolated because they work remotely, Greenstein said.
“In general, remote work has had a greater impact on diverse employees, as there [are] greater degrees of isolation. Less connection can be more harmful in most cases,” he said.
Artificial intelligence tools can help managers understand whether some employees are more at risk than others, Greenstein said. "Managers can use AI to look for indicators in the data about how people interact to identify how isolated people are feeling, as well as look for triggers to determine when people seem most disengaged."
Although there are no standard tools yet for this purpose, PwC sees clients identifying the data they consider most important (travel, locations, time, performance, compensation, workload, and so on) to explore the impact isolation has on engagement and, ultimately, attrition, Greenstein said. After bringing the potentially relevant data into data lakes or cloud data warehouses, companies mostly use bespoke cloud-native analytics tools to look for correlation and causality, build predictive models, and determine best practices, he said.
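Since no standard tool exists, here is a deliberately crude sketch of one such bespoke screen: flagging employees whose interaction levels sit far below the group norm. The data, metric, and cutoff are all hypothetical; a real pipeline would combine many signals and validate the model against actual attrition outcomes.

```python
from statistics import mean, stdev

# Hypothetical weekly interaction counts per employee
# (e.g., messages sent plus meetings attended).
interactions = {
    "ana": [42, 38, 40, 44],
    "ben": [12, 9, 8, 6],
    "cho": [30, 28, 33, 29],
}

def isolation_flags(data, z_cutoff=-1.0):
    """Flag employees whose average interaction level is well below
    the group mean (a crude z-score screen, not a validated model)."""
    averages = {emp: mean(weeks) for emp, weeks in data.items()}
    mu, sigma = mean(averages.values()), stdev(averages.values())
    return [emp for emp, avg in averages.items()
            if (avg - mu) / sigma < z_cutoff]

print(isolation_flags(interactions))  # ['ben']
```

As the article notes, a flag like this is only a starting point: the meaningful work is the follow-up that makes a disconnected employee feel included.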
Once companies identify employees who feel disconnected or marginalized, it's up to them to take steps to make those workers feel respected and included. But knowing who feels left out is an important first step.
The dynamics of talent attrition and acquisition have changed dramatically in recent years and continue to evolve, so companies that know their data, and employees equipped with the analytical skills to interpret it, have an advantage, Greenstein said. “I believe that these tools can help us become better managers, partners and colleagues for our employees.”
Copyright © 2022 IDG Communications, Inc.