How AI will influence the future of work · June 5, 2023 · 6 min read

How companies are training next-gen AI talent to avert a skills shortage

This article is part of WorkLife’s artificial intelligence special edition, which breaks through the hype around AI – both traditional and generative – and examines what its role will be in the future of work for desk-based workers.

Amid the current generative AI boom, individuals and entire industries are asking both existential and practical questions about the potential of artificial intelligence. And while workforce adoption of AI isn’t novel on its own, rapid innovations and their accelerated proliferation have created a new race to recruit, train and apply the best AI talent, to fill what some have described as an “AI skills shortage crisis.”

In just a few months, developing new AI tools and teaching how to use them have become top priorities across a range of industries. Increased attention — and the money that comes with it — has led many business leaders to feel both urgency and concern: What do employees need to learn, how can they start, and what will it all mean for current and future jobs?

Since Microsoft sparked a frenzy in January with its $10 billion investment in OpenAI, startups and giants alike have made innumerable bets on various types of generative AI. In April, the accounting giant PwC announced a $1 billion investment to scale generative AI and upskill 65,000 people with ways to “work faster and smarter.” A few weeks later, the analytics firm SAS also announced a $1 billion three-year plan. Other companies have recently created AI-focused programs for both employees and clients. In March, Accenture announced a new Large Language Model Center of Excellence, which was followed by IBM in May with a Center of Excellence focused on generative AI.

Generative AI is also becoming more common on the job search engine Adzuna. Ads for roles that require working with generative AI jumped from just 185 in January to 1,500 in May, while average salaries rose from $131,000 in February to $146,000 in May. Meanwhile, Apple is reportedly looking to fill at least a dozen open generative AI roles — not to mention the countless startups and larger firms building out their own teams.

Education platforms like Coursera are noticing more user interest in learning about AI. Between April 2022 and April 2023, searches on the platform related to generative AI increased 230%, including terms like “GPT,” “LLMs” and “GANs.” As AI becomes more popular, learners want to know more about it, said Hadi Moussa, Coursera’s managing director for Europe, the Middle East and Africa. He added that workers, companies and governments all seem to be responding to changes in the job market.

“Especially in this kind of environment where employment is quite tight and at the same time companies are really trying to figure out how they can to some extent reduce the level of spend in the current economic environment,” Moussa said.

To meet growing demand, the company has added new AI courses such as “AI for good” from the ed-tech company DeepLearning.AI and “ChatGPT Teach-Out” from the University of Michigan. It’s also integrating AI tools into the platform itself, including “Coursera Coach,” which answers user questions and provides personalized content. Other new tools include an AI-assisted course builder and a new way to translate courses into other languages with machine learning.


The new tools might be welcome for anyone looking to teach themselves. In Coursera’s recent survey of U.K. business leaders, 80% of respondents reported already using generative AI in their operations, but 34% said a lack of skilled workers was a top challenge. Although 83% of U.K. decision-makers expect AI will “change the skills their company requires,” just 67% of British bosses said AI skills were important for job candidates.

Developing the next generation of AI talent

For the AI startup Fusemachines, the focus is partially on developing the next generation of global AI talent. With its “Democratize AI” program, Fusemachines helps facilitate AI education programs with K-12 schools, colleges and government partnerships in places like Nepal, the Dominican Republic and Rwanda. The company is also developing AI-automated tools including a new “Interviewing Agent” that uses natural language processing to reduce the time engineers spend interviewing job candidates.

Because the AI industry is still fairly new, senior AI engineers are often the hardest to find, said Sameer Maskey, founder and CEO of Fusemachines. Organizations also don’t always have plans for training or mentoring younger AI engineers.

“It also doesn’t help that the academic and training courseworks aren’t dynamic enough to keep pace with the changing landscape of innovation and new discoveries with AI,” Maskey said.

Other startups are also creating new industry-specific AI roles, including some at the entry-level. When developing an AI-powered tax platform, the Israeli fintech startup April hired “tax engineers” who could work with both tax experts and actual engineers, develop data sets, sample baseline models and improve overall AI accuracy. Tax engineers — including people with diverse backgrounds in other fields such as education — are also given ways to become full-stack engineers, according to April CTO and co-founder Daniel Marcous.


“In order to maximize [AI’s] efficiency, you need a collaborative manner and to marry it with domain experts,” said Marcous, who was previously CTO of Waze. “That requires a lot of effort so you need to build a platform for them to interact with AI and tune it.”

Finding the right talent to question AI ethics

In a February survey conducted by Monster, 49% of the 900 employee respondents had used ChatGPT or another AI generator for work, ranging from writing (50%) to more specialized applications like writing code and financial forecasting. However, 38% of workers expressed concern about an AI generator replacing them, and 26% said they were more scared of ChatGPT than of getting into an accident, switching careers, being disliked by a manager or speaking in public.

The impact large language models like ChatGPT and others could have on the labor market is still uncertain. However, a March research paper from OpenAI and the University of Pennsylvania predicted LLMs could affect at least 10% of tasks for 80% of the U.S. workforce and affect at least 50% of tasks for another 19%.

For some, AI talent isn’t necessarily a major concern yet. Gartner Research found that CIOs are sourcing AI talent both internally and externally while looking to deploy AI with four types of experts: data scientists, data engineers, AI engineers and business experts. In a May survey of chief information officers, 33% of CIOs said they’d already deployed AI and another 15% said they planned to within the next year.

AI talent isn’t limited to people with backgrounds in science, technology, engineering and math. For example, Accenture’s “Responsible AI” team includes historians, linguists and philosophers to address ethics issues such as bias, according to Jatinder Singh, Accenture Song’s head of data and analytics. He thinks too much focus is on building large language models instead of discussing how to “bring that expression to life.”

“The best books and the best authors are the ones that unleash the mind,” Singh said. “It’s generating creativity in our minds so we mustn’t forget our liberal arts friends in that conversation…They’re brilliant storytellers. They’re the ones that are going to humanize this technology.”