Bias in AI is a real problem. Here’s what we should do about it


Nono-Y the robot was one of the highlights of the 2012 Geneva Inventions Fair. (Image: WIPO/Emmanuel Berrod)

This article is brought to you thanks to the strategic cooperation of The European Sting with the World Economic Forum.

Authors: Robert Gryn, Chief Executive Officer, Codewise & Pawel Rzeszucinski, Chief Data Scientist, Codewise

Hardly a day goes by without a new story about the power of AI and how it will change our world. However, there are also dangers alongside the possibilities. While we should be rightly excited by the idea of AI revolutionizing eye disease and cancer diagnostics, or the potential transport revolution promised by autonomous vehicles, we should not ignore the risks – and one of the most pressing of these is bias.

Unlike killer time-traveling robots, AI bias is unlikely to be the stuff of the next Hollywood blockbuster – and yet it is a potentially huge problem.

How can AI be biased?

Although it’s comforting to imagine AI algorithms as completely emotionless and neutral, that is simply not true. AI programmes are made up of algorithms that follow rules. They need to be taught those rules, and this occurs by feeding the algorithms with data, which the algorithms then use to infer hidden patterns and irregularities. If the training data is inaccurately collected, an error or unjust rule can become part of the algorithm – which can lead to biased outcomes.
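
To make this concrete, here is a minimal, purely illustrative sketch of how a model trained on skewed historical decisions simply reproduces the skew. The data is synthetic and scikit-learn is assumed to be available; none of the names reflect a real system.

```python
# Purely illustrative: a model trained on skewed historical decisions
# reproduces the skew. All data is synthetic; scikit-learn is assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000

group = rng.integers(0, 2, size=n)   # 0 or 1: stand-in for a demographic group
score = rng.normal(size=n)           # a genuinely relevant qualification score

# In the synthetic "history", group 1 was almost never approved,
# regardless of the score. That is the bias hidden in the training data.
approved = ((score > 0) & (group == 0)).astype(int)

X = np.column_stack([group, score])
model = LogisticRegression().fit(X, approved)

# Two applicants with identical scores but different group membership
# now receive very different predicted approval probabilities.
print(model.predict_proba([[0, 1.0], [1, 1.0]])[:, 1])
```

Nothing in this code discriminates explicitly; the unfair rule is inferred entirely from the historical outcomes the model was asked to imitate.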

Some day-to-day examples include facial recognition software that performs worse on non-white people, or speech recognition software that doesn’t recognize women’s voices as well as men’s. Or consider the even more worrying claims of racial discrimination in the AI used by credit agencies and parole boards.

The algorithm used by a credit agency might be developed using data from pre-existing credit ratings or based on a particular group’s loan repayment records. Alternatively, it might use data that is widely available on the internet – for example, someone’s social media behaviour or generalized characteristics about the neighborhood in which they live. If even a few of our data sources were biased, contained information on sex, race, colour or ethnicity, or failed to represent all stakeholders equally, we could unwittingly build bias into our AI.
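
Even when protected attributes are removed from the feature set, seemingly neutral features such as neighbourhood can act as proxies for them. One simple check, sketched below on synthetic data, is to measure how strongly each candidate feature is associated with a protected attribute; the feature names and the 0.3 threshold are assumptions made for this example, not an established procedure.

```python
# Illustrative check on synthetic data: even with protected attributes
# excluded from the feature set, remaining features can act as proxies.
# Feature names and the 0.3 threshold are assumptions for the example.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

ethnicity = rng.integers(0, 2, size=n)                   # protected attribute (synthetic)
neighbourhood = (ethnicity + (rng.random(n) < 0.1)) % 2  # closely tied to ethnicity
income = rng.normal(50_000, 10_000, size=n)              # largely independent

candidate_features = {"neighbourhood": neighbourhood, "income": income}
for name, values in candidate_features.items():
    corr = np.corrcoef(values, ethnicity)[0, 1]
    flag = "possible proxy" if abs(corr) > 0.3 else "looks independent"
    print(f"{name}: correlation with protected attribute = {corr:+.2f} ({flag})")
```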

If we feed our AI with data showing that the majority of high-level positions are filled by men, all of a sudden the AI learns that the company is looking to hire a man, even when that isn’t a criterion. Training algorithms with poor datasets can lead to conclusions such as that women are poor candidates for C-suite roles, or that someone from a minority group in a poor ZIP code is more likely to commit a crime.

As we know from basic statistics, even if there is a correlation between two characteristics, that doesn’t mean that one causes the other. These conclusions may not be valid and individuals should not be disadvantaged as a result. Rather, this implies that the algorithm was trained using poorly collected data and should be corrected.
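
A toy numerical example makes the point: two variables driven by a shared hidden factor can correlate strongly even though neither causes the other. The data below is synthetic and purely illustrative.

```python
# Toy illustration of correlation without causation (synthetic data):
# two variables are both driven by a hidden third factor, so they
# correlate strongly even though neither causes the other.
import numpy as np

rng = np.random.default_rng(2)
confounder = rng.normal(size=10_000)               # e.g. underlying economic conditions
a = confounder + rng.normal(scale=0.3, size=10_000)
b = confounder + rng.normal(scale=0.3, size=10_000)

print("correlation(a, b) =", round(np.corrcoef(a, b)[0, 1], 2))  # close to 0.9
# Intervening on `a` would do nothing to `b`; only the confounder matters.
```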

Fortunately, there are some key steps we can take to prevent these biases from forming in our AI.

1. Awareness of bias

Acknowledging that AI can be biased is the vital first step. The view that AI doesn’t have biases because robots aren’t emotional prevents us from taking the necessary steps to tackle bias. Ignoring our own responsibility and ability to take action has the same effect.

2. Motivation

Awareness will provide some motivation for change but it isn’t enough for everyone. For-profit companies creating a product for consumers have a financial incentive to avoid bias and create inclusive products; if company X’s latest smartphone doesn’t have accurate speech recognition, for example, then the dissatisfied customer will go to a competitor. Even then, there can be a cost-benefit analysis that leads to discriminating against some users.

For groups where these financial motives are absent, we need to provide outside pressure to create a different source of motivation. A biased algorithm in a government agency, for example, could unfairly affect the lives of millions of citizens.

We also need clear guidelines on who is responsible in situations where multiple partners deploy an AI: for example, a government programme based on privately developed software that has been repackaged by another party. Who is responsible here? We need to make sure we don’t end up in a situation where everyone passes the buck in a never-ending loop.

3. Ensuring we use quality data

All the issues that arise from biased AI algorithms are rooted in tainted training data. If we can avoid introducing biases both in how we collect data and in the data we feed to the algorithms, then we have taken a significant step towards avoiding these issues. For example, training speech recognition software on a wide variety of equally represented users and accents can help ensure no minorities are excluded.

If AI is trained on cheap, easily acquired data, then there is a good chance it won’t be vetted for biases. The data might have been acquired from a source which wasn’t fully representative. Instead, we need to make sure we base our AI on quality data that is collected in ways that minimize the risk of introducing bias.
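
One practical starting point is simply to audit how well each group is represented before training begins. The sketch below uses entirely hypothetical metadata for a speech dataset; the groups, counts and the 20% threshold are assumptions made up for illustration, not a recommended standard.

```python
# Illustrative audit of a hypothetical training set before any model is
# trained. The metadata, groups and the 20% threshold are assumptions
# made up for this sketch, not a recommended standard.
from collections import Counter

# Hypothetical (speaker_gender, accent) metadata for each training utterance.
utterances = ([("male", "US")] * 6_000 + [("male", "UK")] * 2_500 +
              [("female", "US")] * 1_200 + [("female", "Nigerian")] * 300)

total = len(utterances)
by_gender = Counter(gender for gender, _ in utterances)
by_accent = Counter(accent for _, accent in utterances)

for label, counts in [("gender", by_gender), ("accent", by_accent)]:
    for group, count in counts.items():
        share = count / total
        note = "  <- under-represented, collect more data" if share < 0.2 else ""
        print(f"{label}={group}: {count} utterances ({share:.0%}){note}")
```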

4. Transparency

The AI Now initiative believes that if a public agency can’t explain an algorithm or how it reaches its conclusions, then it shouldn’t be used. When algorithms are explainable, we can identify why biased or unfair decisions are being reached, give people the chance to question the outputs and, as a consequence, provide feedback that can be used to address the issues appropriately. It also keeps those responsible accountable and prevents companies from relinquishing their responsibilities.
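
As a simple illustration of what such an explanation could look like, a linear model’s per-feature contributions to an individual decision can be read off directly. The sketch below uses synthetic data and invented feature names, and assumes scikit-learn is available; it shows one possible approach, not a prescription.

```python
# Illustrative only: for a linear model, per-feature contributions to an
# individual decision can be read off directly, which is one simple way
# an automated outcome could be explained to the person affected.
# Synthetic data; feature names are assumptions; scikit-learn assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
feature_names = ["income", "existing_debt", "years_at_address"]

X = rng.normal(size=(500, 3))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
model = LogisticRegression().fit(X, y)

applicant = np.array([0.4, 1.2, -0.3])      # one hypothetical applicant
contributions = model.coef_[0] * applicant  # contribution of each feature

for name, value in zip(feature_names, contributions):
    print(f"{name}: {value:+.2f}")
print(f"baseline (intercept): {model.intercept_[0]:+.2f}")
print("decision:", "approve" if model.predict([applicant])[0] == 1 else "decline")
```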

While AI is undeniably powerful and has the potential to help our society immeasurably, we can’t pass the buck of our responsibility for equality to the mirage of supposedly all-knowing AI algorithms. Biases can creep in without intention, but we can still take action to mitigate and prevent them. It will require awareness, motivation, transparency and the right data.
