Deal breaks and heartbreaks: ethnic preference settings make dating apps even harder for people of colour

Illustration by Naomi Gennery

As online dating becomes vital to modern romance and hookups, all is not fair in love and design, especially for minorities. As previously documented on apps like Tinder, for many people of colour, online dating has led to fetishisation, exclusion or abuse. If technology mediates how we date, then design mediates how we experience it: developers are building the new interface of intimacy.

Around 40% of couples now meet online. Tinder is currently the highest-grossing non-game app on Apple’s app store. To streamline and speed up online matchmaking, virtually all dating apps enable filtering preferences, from religion to politics and even astrology. Hinge, which launched in 2012, markets itself as the anti-Tinder relationship app but is owned by Match Group, the parent company of major dating sites Tinder and OkCupid. It employs a machine learning algorithm, which takes into account basic user information such as likes and activities in order to collate matches for its “Most Compatible” feature.

However, automated matchmaking is flawed, largely ineffective and conservative by nature. Matching algorithms define a “good” match based on a user’s past matches without accounting for the biases in how those matches got there. These algorithms don’t calculate what a “good” future match might be; they merely repeat the past. Putting aside concerns that one company could potentially monopolise online dating, in terms of the product, Hinge isn’t that different from what’s already out there. However, unlike Bumble or Tinder, which don’t offer ethnic-based filters, Hinge does.

“Automated matchmaking is flawed, largely ineffective and conservative by nature”

Dating app design in its essence suggests that compatibility is something that can be automated (which sounds like the antithesis of romance). Ethnicity and race-based filters are some of its most problematic and contentious aspects. You can take your pick of candidates via the click of a button: “black/African descent”, “East Asian”, “American Indian”. Racial preferences pop up all the time in pop culture – from Samira’s unhappy time on Love Island, to Molly from Insecure deciding not to date an Asian guy because she says she wants to marry a black man. Of course, prejudice exists in society, but should our products facilitate it?

“The thing that tech is consistently bad at understanding is that it still thinks, when we’re using the internet, it’s just the individual behind the screen. It doesn’t model people in a context, which is a social and tech one,” says Florence Okoye, a UX designer and part of the collective AfroFutures_UK.

As a result, the online world is designed as hyper-personalised, where people are imagined as separable from society, community and politics.

“There’s an imbalance in tech where we’re happy to set goals like ‘increase download numbers’, yet we’re less comfortable with setting goals like ‘making people less racist’,” says Florence. “It’s not that designers are intentionally bigoted, but not thinking to test reveals biases.”

A good illustration of this lack of forethought is Hinge’s “dealbreaker” feature, which turns a preference into an absolute requirement. If someone doesn’t meet your requirement, then you won’t see their profile. This isn’t unreasonable in the context of preferring to meet someone who lives near you or a non-smoker. But to allow it in the ethnicity category is problematic: it’s where preferences become prejudice.
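The distinction matters in code. A minimal sketch (with hypothetical profiles and field names, not Hinge’s actual implementation) shows the difference: a soft preference reorders the feed, while a “dealbreaker” removes profiles from it entirely.

```python
# Hypothetical profile data - illustrative only.
profiles = [
    {"name": "A", "distance_km": 3, "smoker": False},
    {"name": "B", "distance_km": 40, "smoker": True},
    {"name": "C", "distance_km": 8, "smoker": False},
]

def soft_preference(profiles, key, preferred):
    # Preferred profiles rank first, but no one is hidden.
    return sorted(profiles, key=lambda p: p[key] != preferred)

def dealbreaker(profiles, key, required):
    # Anyone failing the requirement is never shown at all.
    return [p for p in profiles if p[key] == required]

print([p["name"] for p in soft_preference(profiles, "smoker", False)])  # ['A', 'C', 'B']
print([p["name"] for p in dealbreaker(profiles, "smoker", False)])      # ['A', 'C']
```

With a soft preference, profile B still appears, just lower down; with a dealbreaker, B simply ceases to exist in the feed. Applied to distance or smoking, that is a convenience; applied to ethnicity, it is total erasure.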

To be clear: filtering people based on race is problematic. There is a quagmire of complexities in trying to separate racial preferences from prejudices and societal standards of beauty. Implicit biases already rear their ugly heads through the seemingly docile “likes”, but explicitly eliminating an entire group by choosing to “deal break” them is particularly crude.

Still from Insecure / HBO

Take Cosmopolitan’s list of the 30 most swiped Tinder users – it is overwhelmingly white. Our current dating apps make recommendations based on a technique called “collaborative filtering”, used on platforms such as Netflix, Amazon and Hinge. This is where majority opinion and, inevitably, biases are used to assume a user’s preferences. On mainstream Western apps, white people are the majority user base and therefore more popular in the dating pool, meaning minority profiles are frequently excluded. In a pernicious feedback loop, over time, collaborative filtering favours the majority at the expense of the minority. Unfortunately, this means black women and Asian men are the least likely to get responses, according to research by OkCupid, while other minorities are fetishised.
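The feedback loop can be sketched in a few lines. This is a toy model with invented numbers, not any app’s real algorithm: a new user’s feed is ranked by how often other users liked each profile, so profiles popular with the majority dominate even when the new user expressed no preference – and whoever makes the feed collects the next round of likes.

```python
# Invented like-counts from an existing user base - illustrative only.
past_likes = {
    "profile_white_1": 90,
    "profile_white_2": 75,
    "profile_minority_1": 20,
    "profile_minority_2": 15,
}

def recommend(past_likes, feed_size=2):
    # Rank purely by past popularity - the "majority opinion" - and
    # truncate to the feed size, so less-liked profiles never surface.
    ranked = sorted(past_likes, key=past_likes.get, reverse=True)
    return ranked[:feed_size]

feed = recommend(past_likes)
print(feed)  # ['profile_white_1', 'profile_white_2']

# Profiles in the feed collect the next round of likes, so the gap
# widens each cycle: the loop favours the majority over the minority.
for p in feed:
    past_likes[p] += 10
```

Nothing in this sketch ever asks what the new user wants; the ranking is inherited entirely from the crowd, which is exactly how past bias becomes future recommendation.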

All of this makes mainstream dating apps an unequal playing field. It’s for these reasons – and, of course, experiences of fetishisation, microaggressions and prejudice on mainstream apps – that there are arguments in favour of people of colour being able to filter out matches from white people. But overall, “deal break” is a design choice that encourages racism, erasing whole groups of people based on race. Preventing an entire group from ever showing up in a user’s feed negates the one clear benefit of dating apps: expanding your pool of potentials and broadening your horizons.

Even if you can reconcile people’s ethnic preferences on the app, you should at least be able to make the “open to all” ethnicity option a deal breaker, but you can’t. So those who have racial preferences can’t be filtered out.

“The preference section allows people to exclude races but it can also target them, which is worse because I’ll never date people who have bad reasons for rejecting me, but I’ve dated people who I discovered fetishised me,” says one 26-year-old Hinge user, Dan*, from London.

“The ‘dealbreaker’ function exacerbates the problem. It seems to promise that you can screen out everyone except your target races – the perfect tool for someone with a fetish,” he continues. “Personally, I don’t want to date anyone who isn’t ‘open to all’, but you can’t make ‘open to all’ a deal breaker, so there’s no way of filtering out the creeps.”

Another user, Stacy*, 33, says: “Lots of guys who are into black women would describe me as ‘forbidden fruit’ and it just turned me off dating apps forever. I want to be known for things like my art and music tastes, not the colour of my skin.” If Hinge insists on enabling “ethnicity” filters (which is what it calls its narrow set of categories that are actually racial categories), then at least it should redesign the interface to allow users to filter out those with ethnic preferences. “There’s a lot of racist language on apps, but at least then I know to swipe left. With filters, I don’t know who’s liked me for my race so it’s more insidious,” says Joe*, another 25-year-old user. Hinge did not respond to requests for comment.

“There is a quagmire of complexities in trying to separate racial preferences from prejudices and societal standards of beauty”

In a 2018 paper, researchers at Cornell University argued that dating apps which allow race filters reinforce racial division and biases. Filters can be useful to self-identify and identify others, but they also “allow users to explicitly or implicitly exclude, demote, or fetishise others on the basis of race or other protected characteristics (which can) hold serious implications for social equality.”

The public consequences of personal choice aren’t shaped by filters alone, but also by the algorithms used to sort matches. Online dating is serious, and how it impacts our lives and socioeconomic patterns should be treated as such. What are the systemic implications when design enables ethnic uniformity, when dating is coded on such logic? All these incremental choices can have monumental significance in aggregate, in what media theorist Geert Lovink describes as the “invisibility of technological violence”. This is notoriously exemplified by Facebook’s facilitation of the Rohingya crisis in Myanmar.

“The onus is on the people who have designed it, because I don’t think policing the user base is of any use,” says Lovink. “We’re talking here not just about behaviour, but code, about choices engineers have made. We can have another sexual imagination. To design another dating app is not a societal challenge. It can easily be done.”

Ari Waldman, a law professor at New York University, says matching algorithms can deepen racial divides because they show us people who tend to look like us. “That’s called a network effect, and it can make dating less diverse,” he says. “The effects of this are profound: it erodes our autonomy, perpetuates inequality and tribalism, and reduces dating to code-able inputs and outputs.”

For example, in 2016, San Francisco-based dating site CoffeeMeetsBagel’s algorithm faced criticism for its bias towards presenting matches of the same race as the user, even when they had expressed no racial preference. Its algorithm inferred a user’s preference based on the racial preferences of other people with similar characteristics. So white users without a stated preference were more likely to see white matches and fewer matches of other races.
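That kind of inference is easy to sketch. The following toy example (invented users and fields, not CoffeeMeetsBagel’s actual code) copies the stated preferences of “similar” users onto a user who never chose one, here with similarity crudely defined as sharing the same ethnicity field:

```python
from collections import Counter

# Hypothetical user records - illustrative only.
users = [
    {"id": 1, "ethnicity": "white", "stated_pref": "white"},
    {"id": 2, "ethnicity": "white", "stated_pref": "white"},
    {"id": 3, "ethnicity": "white", "stated_pref": None},  # no preference set
]

def infer_pref(user, users):
    # Respect an explicit choice if one exists.
    if user["stated_pref"] is not None:
        return user["stated_pref"]
    # Otherwise borrow the most common stated preference among
    # "similar" users - a choice the user never actually made.
    similar = [u["stated_pref"] for u in users
               if u["ethnicity"] == user["ethnicity"]
               and u["stated_pref"] is not None]
    return Counter(similar).most_common(1)[0][0] if similar else None

print(infer_pref(users[2], users))  # 'white' - inferred, never chosen
```

The design choice is the point: “no preference” is treated as missing data to be filled in from the crowd, rather than as a preference in its own right.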

“Dating apps are not just about you but also your interaction with others. You’re not realising that you also come under somebody else’s umbrella or are excluded from it,” says Afreen, a UX researcher who has worked at Cambridge University. “When I downloaded Hinge, I didn’t say I wanted a type of ethnicity or religion, but then I started to see I was getting ads [on the web] and profiles for Muslims.”

Dating apps can be spaces to explore and expand sexual possibilities by exposing us to different people. Bad design, though, can regress that. Hinge’s major claim is that it’s “designed to be deleted”, but perhaps not for the reason it wants.

*Names have been changed
