
Uber’s AI may racially discriminate against drivers

August 24, 2023 (updated August 30, 2023) | Case Review, Current Affairs

Uber’s integration of Artificial Intelligence (AI) facial recognition for driver verification has sparked controversy, with drivers alleging racial discrimination. The case has not yet been heard in tribunal; however, it is a noteworthy topic as technology continues to integrate rapidly into working practices. Two unions are supporting two separate cases against Uber on this matter.

What is the case about?

Uber, already well known in employment law for litigation over employment status, began using facial recognition software in April 2020 to confirm the identity of its drivers. That software is now alleged to discriminate against drivers on the grounds of race.

The circumstances

Uber introduced a ‘Real Time ID check’, developed by Microsoft, which requires drivers to take a photograph of themselves to verify their identity before they are allowed to use the app that gives them access to work. There are several legitimate practical reasons for introducing this, including the need to ensure the security of passengers.

The concern is that facial recognition algorithms can fail to correctly identify people of colour. Indeed, the US National Institute of Standards and Technology has found that individuals classified in a database as African American or Asian were 10–100 times more likely to be misidentified than those classified as white. Furthermore, the Equality and Human Rights Commission has raised concerns over how this technology is regulated in the UK.

If the app used by Uber does not recognise the individual in the photo, access to the app is suspended for 24 hours and the driver cannot work. Further failed attempts can result in the driver’s account being terminated and their removal from the platform. Drivers dismissed in this way can also automatically have their Transport for London private hire licence revoked. Before a driver is removed from the platform, there is a human review, and the driver is allowed to appeal.

The claims

In the cases of Raja v Uber and Manjang v Uber, both drivers are claiming indirect discrimination on the grounds of race for the ways in which they have been affected by the app’s failure to identify them. One is also claiming harassment and victimisation on the same grounds.

Indirect discrimination is usually less obvious than direct discrimination and is normally unintended.

Generally speaking, it occurs when a rule or policy of some sort is put in place which applies to everyone and is not in itself discriminatory, but which could put those with a certain protected characteristic at a disadvantage.

In law, it is where a ‘provision, criterion or practice’ (PCP) meets all four of these conditions:

  • the ‘PCP’ is applied equally to a group of people, only some of whom share the protected characteristic
  • it has (or will have) the effect of putting those who share the protected characteristic at a particular disadvantage when compared to others who do not have the characteristic
  • it puts, or would put, the person at that disadvantage
  • the employer is unable to objectively justify it.

The Equality Act does not define what a ‘PCP’ is. Acas say ‘the term is most likely to include an employer’s policies, procedures, requirements, rules and arrangements, even if informal and whether written down or not.’

Although all four elements must apply for a claim to be successful, it would be the responsibility of the employee (or ‘claimant’) to demonstrate point 2, and to demonstrate that point 3 applies to themselves personally.

In limited circumstances indirect discrimination may be lawful if the employer can objectively justify it as appropriate and necessary. (The law calls this ‘a proportionate means of achieving a legitimate aim’.)

Harassment is ‘unwanted conduct’ related to a protected characteristic. It must have the purpose or effect of violating a person’s dignity or creating an intimidating, hostile, degrading, humiliating or offensive environment for them.

Victimisation occurs when an employee suffers a ‘detriment’ because they have done (or because it is suspected that they have done or may do) one of the following things in good faith:

  • make an allegation of discrimination
  • support a complaint of discrimination
  • give evidence relating to a complaint about discrimination
  • raise a grievance concerning equality or discrimination
  • do anything else for the purposes of (or in connection with) the Equality Act, such as bringing an employment tribunal claim of discrimination

Learning points

Although we do not yet have an outcome in these cases, they highlight a modern consideration that employers need to be aware of in relation to their practices and their use of technology and AI to manage staff.

Certainly, appropriate measures will need to be put in place to ensure that technology, which often improves and streamlines practices, is subject to control measures so that it is applied fairly and is not left to make decisions about employment unchecked.

Artificial intelligence is also being used more frequently in recruitment, for example to analyse responses and facial cues in remote interviews or to reduce the number of CVs to be reviewed. It should be remembered that candidates are similarly able to bring claims against a prospective employer.

The use of AI also has potential implications under human rights legislation with regard to privacy, and under data protection legislation, which prohibits automated decision-making that has a legal or similarly significant effect on the data subject without human intervention.

We are here to help

The use of Artificial Intelligence in organisations is increasing. We’ve created an in-depth policy and impact assessment to help you to leverage the benefits of AI in a compliant manner. In addition, our Guide to Using AI in the Workplace takes an in-depth approach on implementation, challenges and benefits.
