Fair UI design for the ‘gig economy’
Fair UI publication · Blog post · Help us find out how designers can counteract implicit bias online.
By Roxanne Leitão · October 27, 2017
Recently, online bias and discrimination have been receiving growing attention, from individual accounts of discrimination, such as the Dyne Suh Airbnb case, to company initiatives that aim to design out opportunities for racial profiling, such as NextDoor’s.
The disturbing news is that this is not limited to a select few platforms. Research into several online labour marketplaces and sharing-economy platforms has found bias across the board.
The ‘gig economy’, where workers compete for short-term freelance work, is growing all over the world. The reasons for this growth are twofold: first, the economic and social implications of high unemployment and the need for jobs where they do not currently exist; and second, increasing global connectedness, with about 51.7% of the world’s population now having access to the internet. An estimated 77 million people formally identify as freelance and/or gig economy workers across the USA, UK, and Europe. Statistics on Asia, Africa, and South America are harder to find, but in Singapore an estimated 11% of the workforce are freelancers. By 2018, the gig economy is expected to be worth $5 billion.
Indeed, freelancing and short-term contract work offer flexibility and access to remote work, whether for those in geographic locations where they cannot find work, those who cannot access certain workplaces due to disability, or parents who need to care for a newborn. However, this flexibility often shifts the burden of economic risk from businesses onto individuals, since businesses are excused from offering the benefits they would be legally obligated to offer employees.
Many of the platforms that give employers access to casual workers (e.g., TaskRabbit, YunoJuno, Fiverr, Upwork, Twago, peopleperhour) allow potential employers to browse lists of candidates by selected skills, and they generally display workers’ CVs, profile pictures, and names. This practice, often unintentionally, reveals individuals’ protected characteristics, opening the door to both explicit and implicit bias and, in turn, to varying degrees of discrimination.
A review of TaskRabbit and Fiverr, published earlier this year, revealed that on both platforms workers perceived as Black received significantly worse reviews than similarly qualified workers perceived as White. The authors also found that TaskRabbit’s algorithms discriminate by race and gender in search result rankings, and that this varies with the city in which a search is conducted, effectively tailoring discrimination to the particular biases of a given geographical place.
A 2013 study of an (anonymous) online labour marketplace with workers from Sub-Saharan Africa and South-east Asia showed that workers often list their location as the USA or Australia, due to the perception that people from these locations get more assignments. Workers also struggle with misconceptions about Africa, stating that many employers think of the African population as illiterate, uneducated, and willing to work for whatever fee the employer stipulates.
On a more positive note, Applied is already setting an example by exploring ways to run online recruitment while eliminating, or at least reducing, opportunities for bias. Applied was created by the Behavioural Insights Team as part of a UK government initiative. It makes the process of selecting candidates for interview anonymous: all identifying information, including photos, names, and CVs, is stripped out. Instead, every candidate answers the same 5 questions, each relating specifically to the job they’re applying for. The answers are then evaluated by up to 3 reviewers at the employer, without them knowing which answer belongs to which candidate. Although this doesn’t guarantee that bias will be eliminated during an interview, it at least gets candidates through the door.
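To make the idea more concrete, here is a minimal sketch of how such a blind review step could be structured. It is not Applied’s actual implementation; all types and function names are hypothetical, and it assumes applications have already been collected with one answer per question.

```typescript
// Hypothetical sketch of a blind review step: drop identifying fields,
// keep only the answers, and shuffle them so reviewers cannot link
// answers back to candidates or to each other.

interface Application {
  candidateId: string;
  name: string;      // never shown to reviewers
  photoUrl: string;  // never shown to reviewers
  cvUrl: string;     // never shown to reviewers
  answers: string[]; // one answer per question; same questions for everyone
}

interface BlindAnswer {
  token: string;         // opaque id; the mapping back to candidateId lives elsewhere
  questionIndex: number;
  text: string;
}

function anonymise(applications: Application[]): BlindAnswer[] {
  const blind: BlindAnswer[] = applications.flatMap((app, i) =>
    app.answers.map((text, questionIndex) => ({
      token: `candidate-${i}`, // name, photo and CV are deliberately left out
      questionIndex,
      text,
    }))
  );
  // Fisher-Yates shuffle, so reviewers cannot infer identity from ordering.
  for (let j = blind.length - 1; j > 0; j--) {
    const k = Math.floor(Math.random() * (j + 1));
    [blind[j], blind[k]] = [blind[k], blind[j]];
  }
  return blind;
}
```

In this sketch, each reviewer would score BlindAnswer items independently, and the scores would only be joined back to candidates once all reviews are in.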
The way we design UIs for recruitment platforms, and for the sharing economy, can have a significant impact on users’ behaviour, either by allowing certain behaviours to manifest or by intentionally designing them out. Strategies explored so far include removing names and profile pictures, or only revealing them once a transaction has been completed, with the aim of making it harder to immediately identify someone’s ethnicity, gender, and/or age. However, to the best of our knowledge, very little evidence exists to support the design strategies that have been tried so far, which makes designing with any degree of certainty quite difficult. As designers, we need to start thinking not only about how we might design out opportunities for implicit and explicit bias, but also about collecting data and evidence to support or refute different design strategies.
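As one possible reading of the ‘reveal after the transaction’ strategy, the sketch below withholds a worker’s name and photo in a listing card until the booking has been confirmed. It is a hypothetical React/TypeScript component, not a pattern we have tested; the prop names and placeholder copy are assumptions.

```tsx
// Hypothetical sketch: show skills and ratings up front, but only reveal
// identifying details (name, photo) once the transaction is complete.
import React from "react";

interface WorkerCardProps {
  skills: string[];
  rating: number;
  name: string;
  photoUrl: string;
  transactionComplete: boolean; // e.g. booking confirmed or contract signed
}

export function WorkerCard(props: WorkerCardProps) {
  const { skills, rating, name, photoUrl, transactionComplete } = props;
  return (
    <div className="worker-card">
      <ul>
        {skills.map(skill => (
          <li key={skill}>{skill}</li>
        ))}
      </ul>
      <p>Rating: {rating.toFixed(1)}</p>
      {transactionComplete ? (
        <>
          <img src={photoUrl} alt={name} />
          <p>{name}</p>
        </>
      ) : (
        <p>Profile details are revealed once the booking is confirmed.</p>
      )}
    </div>
  );
}
```

Whether a strategy like this actually reduces biased outcomes, or simply shifts bias to a later stage such as reviews, is exactly the kind of question that needs empirical evidence.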
Fair UI is an exploratory, experimental project launched by Samhæng precisely to look into designing out opportunities for bias in UIs, with the aim of delivering a set of evidence-based guidelines and/or design patterns that can give designers guidance in their daily practice. In the near future, we will be running a series of empirical tests to determine the efficacy of a selected set of design strategies, which will then form the basis of a set of UI design best-practice materials.