Gen AI for HR   //   July 25, 2024

How HR leaders can tackle blindspots – like how workers are actually using AI

This article is part of a series that looks at specific ways HR professionals can leverage generative AI.

The idea of an AI audit is still foreign to a lot of companies. 

The speed of the generative AI boom took everyone by surprise, leaving organizations without a clear picture of how exactly their workers are using it in their jobs. In a nutshell: who is auditing the adoption of generative AI at work?

It could be that workers are inputting sensitive company information into open-source AI tools, which poses a risk for the organization. Or it could be that a company has invested heavily in AI tools, only to find they’re barely used due to a lack of training or guidance. That’s a serious stumbling block, which is contributing to a general slowdown in AI adoption within some businesses, according to experts.

“It’s one thing to set up a tool for an organization and go through a standard implementation and launch process, it’s another thing to say ‘this tool or technology is set up for the organization and what their current procedures might be,’” said Alicia O’Brien, svp of innovation, consulting, and customer success at executive recruitment firm WilsonHCG. “The latter oftentimes is forgotten about. We’re so focused on getting this technology set up that we forget that it has to actually work for the organization that it was sold to.”

HR, in partnership with other leaders, has an opportunity to take a closer look at how AI is being used within a company. But what does that look like in action?

Companies are panicking with policies

To start, companies need to know how long it takes to complete their processes without the help of new tools. That baseline makes it easier to measure whether AI is actually speeding those processes up.

From there, HR leaders could send out pulse surveys to ask for anecdotes on how people are leveraging AI on their own, especially if there isn’t an internal GPT or vendor in place yet. This can help companies get a better understanding of how employees are using unsanctioned open-source tools that might put the company’s data or sensitive information at risk, a phenomenon known as shadow AI. A significant part of auditing AI adoption is risk management.

“There is no real risk management, so companies are panicking with policies because right now it’s very hard to monitor that,” said Michael Beygelman, founder of Claro Analytics. “Right now, most companies are issuing general policies, and the auditing is for the few that are kind of venturing in this.”


Seeing how AI tools are working in action

Beygelman says that in a lot of ways, it will be a self-fulfilling prophecy: “They’ll audit the systems they bought, that they know work. It’s a false positive concept.” Although such an audit may confirm that a company invested in the right tool, it won’t capture usage beyond that, of AI tools the company hasn’t bought. Done well, an audit is also an opportunity to spot areas where AI could help but hasn’t yet been deployed.

For now though, the most useful metric is seeing how much time using AI tools saves employees.

O’Brien says that governance, change management, and training can all help set the standard for how tools are meant to be used, and how they are not. WilsonHCG has conducted focus groups and drop-in sessions to work through challenges or review training as and when it has launched new AI tools internally. That has helped provide additional context beyond just the stats.

“For example, the data might be showing us that we’re spending 20 minutes in a tool,” said O’Brien. “Is 20 minutes a good amount of time? Are we spending 20 minutes getting through all we need to? Or are we spending that 20 minutes on what we are doing and maybe didn’t get any results from it?”

Asking those questions is crucial so that companies can truly understand where enhancements can be made. AI vendors can help, going beyond initial implementation to drive ROI by showing employees how to best use the tool over time. O’Brien says that when this isn’t done, there is significantly less adoption.

It is especially helpful when someone within the company takes on the role of monitoring usage alongside the vendor. That could be an HR manager or a tech manager.


“It’s sometimes an argument between tech and HR because IT isn’t necessarily going to understand how it’s being used, they’re just going to see how it integrates,” said O’Brien. “Whereas HR might not be as much tech-focused, but on the ‘so what?’ A lot of the time it’s a tech-minded HR or talent acquisition person that is a good fit.”

But when AI use is properly audited, and more and more employees start to see how it is impacting their colleagues’ work experience, more people hop on board. 

“What’s happening is the productivity desire is driving the AI adoption because your workload says you have 27 hours worth of work, but 8 hours to do it,” said Beygelman. “You’re going to use tools that will help you. HR will realize this.” 

Leveraging AI tools to track AI learning

There are many ways that companies are utilizing certain AI-powered software to help them understand how employees spend their time with these tools. For example, AI-powered skills intelligence platform Workera works with companies like Accenture, the U.S. Air Force, and Booz Allen Hamilton to help their employees accurately measure AI competencies.

With four tiers of AI readiness assessments, Workera is helping companies get a true grasp on how their employees are performing with AI. 

“You cannot take a one-size-fits-all approach when the workforce is so diverse,” said Kian Katanforoosh, CEO and co-founder of Workera. “Assessments allow us to understand the strengths and gaps in everyone and tell us where AI is going to help you be a better worker.”

Companies are using AI assessments to help find that starting point in employees’ skills and see how much they grow from there when they revisit it. It’s also been helpful for business leaders who might not be as familiar with AI as they’d like to be. Katanforoosh found that there are two types of business leaders when it comes to AI: those who pretend they know and those who are transparent with the fact they don’t know.

“Leaders who pretend they know something, like AI, when they don’t have the background in it and it’s new to them, are going to foster a culture of dangerous amateurs,” said Katanforoosh. “You’re going to have a lot of people in the organization who are going to try to mimic the leader by showing they know when they don’t. On the other hand, when a leader says ‘I’m not accomplished yet, we’re all in this together,’ it fosters a culture of transparency and life-long learning.”