AI-assisted recruitment is biased. Here’s how to make it more fair

(Image: Unsplash, 2019)

This article is brought to you thanks to the collaboration of The European Sting with the World Economic Forum.

Author: Julius Schulte, Research and Analysis Specialist, Future of Digital Economy and Society, World Economic Forum


Chances are that you have sent hundreds or even thousands of resumes and cover letters over the years to potential employers. This observation is supported by data from the US Bureau of Labor Statistics: wage and salary workers in the US have been with their current employer for an average of 4.2 years. Younger workers, however – those aged between 25 and 34 – have on average been with their current employer for just 2.8 years, which suggests they are part of an increasingly transient workforce that is more accustomed to applying for jobs.

This pattern can also be observed in other countries – with a few notable exceptions, such as Japan. As a result, many millennials from around the world are growing increasingly accustomed to this new career model of sending out a near-constant barrage of cover letters and CVs.

Eventually, these applications may lead to you accepting a job offer that determines your quality of life – your income, the time you can spend with friends and family, and the neighbourhood you live in.

In some cases, however, the chances of getting the job for which you have applied are systematically biased. For example, it has been shown that in the US labour market, applicants with African-American names are systematically discriminated against, while those with white-sounding names receive more callbacks for interviews. However, bias arises not only from human error: the algorithms increasingly used by recruiters are not neutral either; rather, they reproduce the same human biases they are supposed to eliminate. For example, the algorithm that Amazon employed between 2014 and 2017 to screen job applicants reportedly penalised words such as ‘women’ or the names of women’s colleges on applicants’ CVs. Similarly, researchers from Northeastern University, the University of California and Upturn, a public-interest advocacy group, have demonstrated that Facebook’s delivery of housing and employment ads follows gender and race stereotypes.

What can you do about it?

Given these biases, what steps can you take to maximise the chances that your CV and cover letter will land you an interview?

Today, recruiters at large companies such as Target, Hilton, Cisco, PepsiCo and Amazon use predictive hiring tools to reduce the time and cost of each new hire – and, hypothetically, to increase their quality and tenure. Understanding at which points algorithms come into play in the hiring process can help identify the origins of bias.

Typically, hiring is not a single decision, but a process involving many small decisions that culminate in a job offer. The aim of the first step – known as sourcing – is to generate a strong set of applicants (see figure 1). This can be done via advertisements, active headhunting or attractive job descriptions. Usually, artificial intelligence (AI) is used to optimise the display of job ads as well as their wording, as is done by companies that provide ‘augmented writing’, such as Textio.

The second step, screening, is crucial, as this is where algorithmic bias can strongly influence whether your application is rejected. Screening uses algorithms that systematically decipher your cover letter and CV and save this information in the company’s HR database. This information could include your years of experience, the languages you speak, the university degrees you obtained and the countries in which you have worked. Algorithms are then used to narrow down the selection of candidates automatically – not in an affirmative way, but by rejecting those who do not fit. The company CVViZ, for example, employs machine learning algorithms to screen resumes for keywords in context and to create relative rankings between the different candidates.
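To make the mechanics of this step concrete, here is a minimal, purely illustrative sketch of keyword-based screening and relative ranking in Python. The job description, resumes and scoring rule are invented for the example and do not represent CVViZ’s or any other vendor’s actual algorithm.

```python
# Minimal sketch of keyword-based resume screening, loosely modelled on how
# applicant tracking systems rank candidates. All inputs are hypothetical.
import re
from collections import Counter

def tokenize(text: str) -> Counter:
    """Lowercase a document and count its word occurrences."""
    return Counter(re.findall(r"[a-z+#]+", text.lower()))

def keyword_score(resume: str, job_description: str) -> float:
    """Score a resume by how much of the job description's vocabulary it covers."""
    resume_words = tokenize(resume)
    jd_words = tokenize(job_description)
    overlap = sum(min(resume_words[word], count) for word, count in jd_words.items())
    return overlap / max(sum(jd_words.values()), 1)

job_description = "Senior Python developer with machine learning and SQL experience"
resumes = {
    "candidate_a": "Five years of Python and SQL, built machine learning pipelines",
    "candidate_b": "Experienced Java developer with some exposure to data analysis",
}

# Rank candidates relative to one another, as a screening algorithm would,
# then reject those below an arbitrary cut-off rather than affirmatively selecting.
ranking = sorted(resumes, key=lambda c: keyword_score(resumes[c], job_description),
                 reverse=True)
print(ranking)
```

A ranking of this kind rewards whoever mirrors the job description’s wording most closely, which is one reason formatting and phrasing matter so much at this stage.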

Figure 1: How job applicants are whittled down

If you have made it through the screening process, you may be invited to an interview that might also use different algorithms to support the employer’s final selection decision. HireVue, a US-based company, assesses candidates based on the keywords, facial expressions and tone of voice they use in video interviews. After a video interview you may get a face-to-face interview, after which you may finally receive an offer.

The use of machine learning algorithms in each of these steps raises questions about the fairness of an AI-driven recruitment system. As in the case of Facebook, mentioned above, bias may be present in how job advertisements target potential employees. In other cases, web crawlers try to match candidates to job descriptions by scanning information from publicly available online sources – and while one might argue the unfairness here is limited because it does not prevent you from applying, this screening process may already display strong bias that is difficult to overcome.

Algorithms are often trained to read specific formats of CVs and resumes, which could mean your CV is not evaluated properly. For example, in Japan there is a common CV template (Rirekisho) used by all job applicants, and in China applicants list their work experience in reverse chronological order. Other cultural differences exist between American and European CVs: the former is usually one page long with no photo, while the latter can run to two or three pages and is usually headed with a photo.

If your CV has been successfully parsed – that is, translated into machine-readable data – another algorithm will rank your application vis-à-vis other applications based on the data in your CV and your cover letter. Each factor, such as your years of experience, languages, software skills and the set of words you use, to name but a few success metrics, will be weighted according to what is estimated to have successfully worked in the past. Past hiring decisions are used to train the algorithm to evaluate who is most likely to be the ‘right’ applicant. Often this approach inherently replicates the same biases that were present before the arrival of AI recruiting tools. If the gender distribution of the training data was strongly imbalanced, this may be replicated by an algorithm even if gender is not included in the information provided in the application documents. For example – as in Amazon’s case – strong gender imbalances could correlate with the type of study undertaken. These training data biases might also arise due to bad data quality or very small, non-diverse data sets, which may be the case for companies that do not operate globally and are searching for niche candidates.
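The following toy sketch, which relies on synthetic data and scikit-learn (assumptions of this illustration, not a description of Amazon’s or any other company’s system), shows how a screening model trained on historically biased hiring decisions can learn to penalise a proxy feature – here, attendance at a women’s college – even though gender itself is never provided as an input.

```python
# Illustrative sketch: a model trained on biased historical hiring decisions
# reproduces that bias through a proxy feature, without ever seeing gender.
# All data is synthetic; feature names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

is_woman = rng.integers(0, 2, n)                                 # hidden attribute, never a feature
womens_college = (is_woman & (rng.random(n) < 0.4)).astype(int)  # visible proxy on the CV
years_experience = rng.integers(1, 15, n)

# Historical labels: experienced candidates were hired, but women were
# rejected half the time regardless of experience (the bias to be "learned").
hired = ((years_experience > 5) & ~((is_woman == 1) & (rng.random(n) < 0.5))).astype(int)

X = np.column_stack([years_experience, womens_college])  # gender excluded from inputs
model = LogisticRegression().fit(X, hired)

# The coefficient on the proxy feature comes out negative: the model has
# learned to penalise "women's college" because it correlates with past rejections.
print(dict(zip(["years_experience", "womens_college"], model.coef_[0].round(2))))
```

The point is not the specific numbers but the mechanism: any feature correlated with a protected attribute can carry historical bias into the model’s rankings.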

Similarly, the evaluation of video interviews conducted prior to any in-person interaction may replicate biases present in the training data if that data has not been thoroughly vetted against categories such as gender, age or religion.

Recommendations

There are several steps both job applicants and employers can take to maximise the chance that the right application will be read by a human being making the ultimate hiring call.

As an applicant, you should:

1) Make sure your CV is formatted according to local norms. Evaluate which length, layout, photo and format are most appropriate. Avoid graphics and fancy fonts that may not be readable by the algorithm.

2) Elaborate on your work experience and adapt your language to that of the job description.

3) Make sure to include key information on your CV – what is not on your CV cannot be evaluated. For example, mention the month and year for each position you held instead of only the year.

4) Optimise your online brand by using the appropriate jargon. Use language that speaks to the job family you are interested in. For example, IT jobs have different titles such as ‘full stack developer’ that are often used in connection with programming languages such as C++ or PHP.

If you are an employer using machine learning algorithms in the hiring process, ensuring fairness is key. The following concepts, taken from recent research carried out at Delft University of Technology, may provide a guide:

Justification: Does it make sense for an organization of a certain size with specific hiring needs to employ AI hiring tools, given the data requirements and the need for bias remediation?

Explanation: Does the AI tool explain its decisions and are those explanations made available to the recruiter and the applicant? If algorithmic information is proprietary, are counterfactual explanations taken into consideration?

Anticipation: Are mechanisms in place to report biased decisions, and what recourse is available to affected applicants?

Reflexiveness: Is the organization aware of its changing values and their reflection in the data it uses? How is the data collected, and which limitations become evident?

Inclusion: Is diversity considered both within your team and in the evaluation of results?

Auditability: Is the training data publicly available or verified by a third party?
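As one illustration of what anticipation and auditability could look like in practice, the sketch below compares the selection rates an automated screen produces for different groups and flags those that fall below the ‘four-fifths’ rule of thumb used in US employment-selection guidance. The group labels and decisions are hypothetical; a real audit would also examine error rates, intersectional groups and statistical significance.

```python
# Minimal audit sketch: compare selection rates across groups and flag
# violations of the four-fifths rule. Groups and decisions are hypothetical.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, was_shortlisted) pairs."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, shortlisted in decisions:
        totals[group] += 1
        selected[group] += int(shortlisted)
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Pass a group only if its rate is at least 80% of the highest group's rate."""
    best = max(rates.values())
    return {g: rate / best >= 0.8 for g, rate in rates.items()}

decisions = ([("group_a", True)] * 60 + [("group_a", False)] * 40
             + [("group_b", True)] * 35 + [("group_b", False)] * 65)

rates = selection_rates(decisions)
print(rates)                     # {'group_a': 0.6, 'group_b': 0.35}
print(four_fifths_check(rates))  # group_b fails: 0.35 / 0.6 is below 0.8
```

Publishing such audit results, or having a third party reproduce them, is one concrete way to make the auditability question answerable.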
