male programmer’s website more relevant than a female programmer’s – ‘even if the two websites are identical except for the names and gender pronouns’. So a male-biased algorithm trained on corpora marked by a gender data gap could literally do a woman out of a job.

But web search is only scraping the surface of how algorithms are already guiding decision-making. According to the Guardian, 72% of US CVs never reach human eyes,45 and robots are already involved in the interview process, with their algorithms trained on the posture, facial expressions and vocal tone of ‘top-performing employees’.46 Which sounds great – until you start thinking about the potential data gaps: did the coders ensure that these top-performing employees were gender and ethnically diverse and, if not, does the algorithm account for this? Has the algorithm been trained to account for socialised gender differences in tone and facial expression? We simply don’t know, because the companies developing these products don’t share their algorithms – but let’s face it, based on the available evidence, it seems unlikely.

AI systems have been introduced to the medical world as well, to guide diagnoses – and while this could ultimately be a boon to healthcare, it currently feels like hubris.47 The introduction of AI to diagnostics seems to be accompanied by little to no acknowledgement of the well-documented and chronic gaps in medical data when it comes to women.48 And this could be a disaster. It could, in fact, be fatal – particularly given what we know about machine learning amplifying already-existing biases. With our body of medical knowledge being so heavily skewed towards the male body, AIs could make diagnosis for women worse, rather than better.

And, at the moment, barely anyone is even aware that we have a major problem brewing here. The authors of the 2016 Google News study pointed out that not a single one of the ‘hundreds of papers’ about the applications for word-association software recognised how ‘blatantly sexist’ the datasets are. The authors of the image-labelling paper similarly noted that they were ‘the first to demonstrate structured prediction models amplify bias and the first to propose methods for reducing this effect’.

Our current approach to product design is disadvantaging women. It’s affecting our ability to do our jobs effectively – and sometimes to even get jobs in the first place. It’s affecting our health, and it’s affecting our safety. And perhaps worst of all, the evidence suggests that when it comes to algorithm-driven products, it’s making our world even more unequal. There are solutions to these problems if we choose to acknowledge them, however. The authors of the women = homemaker paper devised a new algorithm that reduced gender stereotyping (e.g. ‘he is to doctor as she is to nurse’) by over two-thirds, while leaving gender-appropriate word associations (e.g. ‘he is to prostate cancer as she is to ovarian cancer’) intact.49 And the authors of the 2017 study on image interpretation devised a new algorithm that decreased bias amplification by 47.5%.

CHAPTER 9

A Sea of Dudes

When Janica Alvarez was trying to raise funds for her tech start-up Naya Health Inc. in 2013, she struggled to get investors to take her seriously. In one meeting, ‘investors Googled the product and ended up on a porn site. They lingered on the page and started cracking jokes’, leaving Alvarez feeling like she was ‘in the middle of a fraternity’.1 Other investors were ‘too grossed out to touch her product or pleaded ignorance’, with one male investor saying ‘I’m not touching that; that’s disgusting.’2 And what was this vile, ‘disgusting’ and incomprehensible product Alvarez was pitching? Reader, it was a breast pump.

The odd thing is, the breast-pump industry is one that is ripe for ‘disruption’, as Silicon Valley would have it. Breast-pumping is huge business in the US in particular: given the lack of legally mandated maternity leave, for most American women breast-pumping is the only option if they want to follow their doctors’ recommendations and breastfeed their babies for at least six months (in fact, the American Academy of Pediatrics recommends that women try to breastfeed for at least twelve months).3

And one company, Medela, has pretty much cornered the market. According to the New Yorker, ‘Eighty per cent of hospitals in the United States and the United Kingdom stock Medela’s pumps, and its sales increased thirty-four per cent in the two years after the passage of the Affordable Care Act, which mandated coverage of lactation services, including pumps.’ But the Medela pump is just not very good. Writing for the New Yorker,4 Jessica Winter described it as a ‘hard, ill-fitting breast shield with a bottle dangling from it’, which, as it sucks milk out of a woman’s breast, ‘pulls and stretches the breast like it’s taffy, except that taffy doesn’t have nerve endings’.5 And although some women manage to make it work hands-free, most can’t, because it doesn’t fit well enough. So they just have to sit and hold their personal milking contraptions to their breasts, for twenty minutes at a time, several times a day.

So, to sum up: captive market (currently estimated at $700 million with room to grow)?6 Check. Products that aren’t serving consumer needs? Check. Why aren’t investors lapping it up?

Addressing the under-representation of women in positions of power and influence is often framed as a good in itself. And, of course, it is. It is a matter of justice that women have the same chance of success as their equally qualified male colleagues. But female representation is about more than a specific woman who does or doesn’t get a job, because female representation is also about the gender data gap. As we saw with Sheryl Sandberg’s story about pregnancy parking, there will be certain female needs men won’t think to cater for because they relate to experiences that men simply won’t have. And it’s not always easy to convince someone a need exists if they don’t have that need themselves.

Dr Tania Boler, founder of women’s health tech company Chiaro, thinks that the
