Recently, we plugged a "gender decoder" into our database to analyze potential gender bias in job listings. The decoder is based on the research paper "Evidence That Gendered Wording in Job Advertisements Exists and Sustains Gender Inequality" (Journal of Personality and Social Psychology, July 2011, Vol. 101(1), pp. 109-128) by Danielle Gaucher, Justin Friesen, and Aaron C. Kay. In it, the three examined language in job descriptions that they classified as "feminine" or "masculine" in nature, and tested whether men and women were put off by sample job descriptions.

With this new tool, we wanted to analyze the language used in job openings en masse, evaluating entire companies, industries, and the overall job market on ESG (Environmental, Social, and corporate Governance) metrics. Rather than survey the entire job market at once, we decided to test the waters with a single company: Google ($GOOG).

This past April, Google saw its Chief Diversity Officer, Danielle Brown, leave after she reported that the company was falling short of its self-set diversity standards. This came after Laszlo Bock, Google's Human Resources director, stepped down after a decade in the job. Brown was replaced by Melonie Parker, who joined Google from the U.S. Department of Energy after an 18-year stint at Lockheed Martin.

After seeing this, we put our mass gender decoder to the test on Alphabet Inc.'s career websites, especially after we found Google's PR team touting the decoder on its official blog in March. We found that Google has changed the way it writes new job listings to equalize language for potential gender bias: the gender decoder shows that the language of Google's job listings has become more gender-egalitarian over time.

Within Google, job postings categorized under Google proper and Google Fiber showed less observable gender bias. Meanwhile, YouTube's job postings, on average, used language coded more toward women than men. On the flip side, the average job posting at Chronicle, Google's cybersecurity team, leaned toward more masculine words as determined by the gender decoder.

Interestingly, this wasn't always the case. Since October 2018, job listings for Chronicle have seen an uptick in biases that may put off female applicants.

Meanwhile, YouTube has made enough changes to its job listings that, over time, what was once a strong male bias has become just the opposite, in what might be described as an overcorrection.

Year/Month | Brand        | % Masculine Bias (Average)
2019/05    | Chronicle    | 75%
2019/05    | Google       | 51.54%
2019/05    | Google Fiber | 54.71%
2019/05    | Loon         | 25%
2019/05    | X            | 50%
2019/05    | YouTube      | 20.64%

The big "however" with the gender decoder

We found that this methodology has some caveats. For instance, we studied the full job-listing page, converting every word of the listing into a format readable by the gender decoder. That did not account for the placement of sections, including the "Equal Opportunity Statement" that appears on every one of Google's job postings. The statement sits at the bottom of each listing, in a font significantly smaller than the rest of the text.
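To illustrate the approach, here is a minimal sketch of a word-list gender decoder of the kind described above. The word lists shown are an illustrative subset we made up for the example; Gaucher et al.'s paper publishes the full lists of masculine- and feminine-coded word stems, and our production pipeline and scoring differ in detail.

```python
import re

# Illustrative (hypothetical) subsets of coded word stems, not the full
# published lists from Gaucher et al.
MASCULINE_STEMS = ["active", "compet", "decis", "lead", "ambiti"]
FEMININE_STEMS = ["loyal", "collab", "support", "interperson", "commit"]

def count_coded_words(text: str, stems: list) -> int:
    """Count words in `text` that begin with any of the given stems."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(1 for w in words for s in stems if w.startswith(s))

def masculine_bias(text: str) -> float:
    """Share of gender-coded words that are masculine-coded (0.0 to 1.0)."""
    m = count_coded_words(text, MASCULINE_STEMS)
    f = count_coded_words(text, FEMININE_STEMS)
    total = m + f
    return m / total if total else 0.5  # no coded words: call it neutral

listing = "We want a decisive, competitive leader who is loyal and collaborative."
print(masculine_bias(listing))  # 3 masculine vs. 2 feminine coded words -> 0.6
```

Because the score is computed over every word on the page, a long boilerplate section like the Equal Opportunity Statement can shift the ratio for the whole listing, which is exactly the caveat described above.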

For example, when we manually ran the YouTube Recruiter job listing (above) without the statement, the listing showed more masculine language than feminine language.

Furthermore, the word lists used to determine whether a job listing's language is "masculine" or "feminine" sparked debates around the office. What's to say "active" is a masculine word? Why is "loyal" a feminine word?

We don't know, but we are siding with the social scientists here and basing this analysis on their research.

Maybe the solution for Google is to make this statement more prominent in its listings. If our algorithms are picking it up but the average job seeker stops reading when they reach the fine print, it may be reasonable to make the statement stand out more, which, according to this study, could make applicants of all genders more comfortable applying.

As human resources experts, social scientists, and investors look into anything and everything that impacts ESG more than ever before, this gender decoder may be a step toward holding those who write these listings accountable, or at the very least making them aware of the words that may put off their next great worker.

We'll be experimenting with the gender decoder in the coming weeks and months as we seek to understand how companies and industries write their job listings and how those listings are perceived by potential applicants, especially as companies look to become not just more responsible but also more diverse in the coming years.

This article was written in collaboration with Thinknum Media.
