Technology // September 5, 2023 · 5 min read

How AI can rewrite job descriptions to be more gender inclusive

Women aren’t applying to the same jobs as men, and the language used in job descriptions is playing a big part.

Early results from new University of Pittsburgh research indicate that subtle word choices in job descriptions are likely deterring women from applying for certain roles in the first place. The work comes on the heels of a recent McKinsey & Company report that found AI will take more jobs from women than from men by 2030, and that women will be 1.5 times more likely than men to move into a new occupation.

In the University of Pittsburgh’s data, collected in collaboration with global HR tech company Phenom, the researchers found that in male-dominated professions only 16% of people viewing job postings are women. Of those viewers, 52% of the men went on to apply but only 19% of the women did, leaving women with less than 7% of the applicant pool.
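As a quick sanity check on that figure, here is the back-of-the-envelope arithmetic implied by the numbers above (a rough illustration, not the researchers’ own calculation):

```python
# Share of the applicant pool made up of women, given the rates reported
# above: 16% of viewers are women, 19% of women viewers apply, and 52%
# of men viewers apply.
women_viewers, men_viewers = 0.16, 0.84
women_apply_rate, men_apply_rate = 0.19, 0.52

women_applicants = women_viewers * women_apply_rate   # 0.0304
men_applicants = men_viewers * men_apply_rate         # 0.4368
share = women_applicants / (women_applicants + men_applicants)
print(f"{share:.1%}")  # about 6.5%, i.e. less than 7% of the applicant pool
```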

“There’s a downstream effect,” said University of Pittsburgh researcher Dor Morag, who was joined by David Huffman, PhD, and Alistair Wilson, PhD, for this research. “What surprised me is the scope of the impact the job listings might have. Something happened that discouraged them from applying.”

On the flip side, Datapeople, a hiring analytics company, found that job posts for remote work arrangements received on average 4.5 times more female candidates. Are those listings written in a way that is geared toward women?


“Every year we see a gap between onsite roles and remote roles really increase,” said Amit Bhatia, CEO of Datapeople. “The flexibility of these listings really attract more people.”

Companies are facing a pipeline problem in recruiting women for different roles, and it starts with the job ad, says Bhatia. 

But overall, women are at a disadvantage. By surveying college seniors over multiple years, economists at the University of Pittsburgh have found that, on average, women are willing to give up 7.3% of their salary (compared to 1.1% for men) to have more flexible work arrangements.

“We are currently working on mapping and quantifying the factors,” said Morag. “For each job ad, we want to look at what the inferred or perceived job flexibility is by looking at wording use.”

So how can companies make sure job ads pique interest based on skills and not gender? Rewriting job descriptions using AI language tools could improve the diversity of the applicant pool.

“How job descriptions are written right now, it makes some people believe, even subconsciously, they’re not qualified for the work,” said Cliff Jurkiewicz, vp of global strategy for Phenom. “They aren’t intending that, but things just become part of our vernacular and we write it that way. That’s the reason why AI can help so much. It doesn’t have an emotional connection or experience that can create an unconscious bias.”

There are tools like Gender Decoder designed to help make job-ad language more inclusive. The free-to-use tool identifies masculine- and feminine-coded words and gives an overall summary of which way the ad is leaning. Textio is another tool that interrupts bias in performance feedback and recruiting.
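The underlying approach is essentially word matching. The sketch below illustrates the general idea with a pared-down vocabulary; it is not Gender Decoder’s actual code or word lists.

```python
# Minimal illustration of the word-coding approach behind tools like
# Gender Decoder. The word lists are small, illustrative samples only.
import re

MASCULINE_CODED = {"competitive", "dominant", "independent", "driven", "fearless"}
FEMININE_CODED = {"collaborative", "supportive", "interpersonal", "nurturing", "loyal"}

def code_job_ad(text: str) -> str:
    """Count gender-coded words and report which way the ad leans."""
    words = re.findall(r"[a-z]+", text.lower())
    masculine = sum(word in MASCULINE_CODED for word in words)
    feminine = sum(word in FEMININE_CODED for word in words)
    if masculine > feminine:
        return f"masculine-coded ({masculine} masculine vs. {feminine} feminine words)"
    if feminine > masculine:
        return f"feminine-coded ({feminine} feminine vs. {masculine} masculine words)"
    return "neutral"

print(code_job_ad("We need a driven, competitive, independent self-starter."))
```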

Psychologists have found evidence that women report being less interested in applying to a job described with stereotypically masculine words, like “competitive”, “force”, and “independent.” Relatively subtle word choices and phrasing could suggest that job security or flexibility are lacking, even if that’s not the case. If job language, rather than substantive employment conditions, is a source of the gender gap, then simple remedies are possible, say the University of Pittsburgh researchers.

But that largely focuses on word replacement. AI takes it a step further, bringing in context that can help nail down a job listing that is inclusive for everyone. And despite frequent complaints that ChatGPT’s output needs revision, Jurkiewicz says writing job descriptions is one thing it’s actually very good at.

“I would say there are a few things that ChatGPT is very good at out of the box and has a higher level of accuracy, and writing job descriptions is at the top of that list,” said Jurkiewicz. “It’s quite good at it. You can say take all of my job descriptions and analyze them for the right keywords, and inclusive language and context.” 


Using AI also speeds things up. If a large company has 1,000 active job listings, it would take a significant amount of time to review each one and ensure it’s inclusive. That’s part of the challenge AI solves: it takes only minutes to produce new job listings. Of course, a human will always need to look them over, but AI moves the process along. “The tools are already available and don’t take much time,” said Jurkiewicz.
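In practice, that workflow can be a short script: loop over every active listing, ask a language model for an inclusive-language rewrite, and queue the results for human review. The sketch below uses the OpenAI Python client as one possible backend; the prompt, model name, and helper function are illustrative assumptions, not Phenom’s actual tooling.

```python
# Rough sketch of batch-rewriting job listings for inclusive language.
# Assumes the openai package (>=1.0) and an OPENAI_API_KEY in the environment;
# every rewrite should still go to a human reviewer before publishing.
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "Rewrite this job description using gender-inclusive language. "
    "List any masculine- or feminine-coded wording you replaced."
)

def rewrite_listing(description: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat-capable model would do
        messages=[
            {"role": "system", "content": PROMPT},
            {"role": "user", "content": description},
        ],
    )
    return response.choices[0].message.content

# e.g. rewritten = [rewrite_listing(ad) for ad in active_listings]
```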

Bhatia says it’s important to be cautious, though, and that he’s skeptical about how much AI tools can truly create fairness and inclusion – something that has long been a part of the AI ethics conversation.

“Recruiting is a very specific type of problem,” said Bhatia. “With job descriptions comes compliance documents, the Department of Labor, the EEOC, documents for the company. Using AI has a high risk factor.”

The research, though, suggests that using AI to write more inclusive job ads could significantly increase the number of women applying across a wide range of jobs.

“We think language plays a significant role,” said Jurkiewicz. “It’s one thing to simply change the way you advertise for a particular audience. We need to really involve, in this case, women, in this research to understand if the output matches the expected outcome. Generative AI will help us, but only human beings can tell you that.”