The UK aims to become an “AI superpower” but in its National AI Strategy, published last year, the government acknowledged that the country’s AI sector needs greater diversity. “While diverse opinions, skills, backgrounds and experience are hugely important in designing any service – digital or otherwise – it is particularly important in AI because of the executive function of the systems,” it said.
The strategy identified improved diversity in both the AI sector and in the application of AI as crucial objectives. To that end, the UK’s Office for AI and the Department for Digital, Culture, Media and Sport (DCMS) recently announced £23m in funding for 2,000 scholarships to AI and data science conversion courses for graduates from underrepresented groups, including “women, black people and people with disabilities”, and those with a non-STEM background.
This is the second round of AI scholarships the government has funded. In the first round, launched in 2019, 76% of scholarship recipients were women, 45% were black and nearly a quarter had disabilities. Over 80% of recipients were based outside London and the South East.
Experts welcomed the new funding but warned that more is needed to address the structural inequalities that women and ethnic minorities will face once they enter the tech industry.
“Participation and inclusion in AI and data science is a massively important part of the puzzle when we’re thinking about the fairness of AI systems and AI working for society as a whole,” says Dr Erin Young, research fellow at the Alan Turing Institute. “But it’s by no means a ‘fix all’ for the kinds of problems related to diversity and inclusion that we’ve seen so far.”
Structural inequalities hamper diversity in AI
Recent diversity statistics show that the AI industry has a long way to go in achieving parity across gender and ethnic lines. According to the World Economic Forum’s Global Gender Gap Report, women are significantly under-represented in the global AI and cloud computing sectors, making up 32% and 14% of those workforces respectively. Just two of the eight ‘jobs of tomorrow’ tracked by the WEF have reached gender parity.
And while the proportion of women in ‘Data and AI’ jobs is more than twice that in cloud computing, according to the WEF, other studies have shown that “persistent structural inequalities” within the former have created gendered careers in the field.
Research by the Alan Turing Institute, for example, found that women are more likely than men to hold jobs “associated with less status and pay” within the AI and data science industry. Based on an analysis of LinkedIn data, the researchers discovered that women hold more data preparation and exploration jobs, while men hold more of the advanced and higher-paid jobs in machine learning, big data, general-purpose computing and computer science.
This stratification of women into lower-paid subfields and specialities risks exacerbating the gender pay gap, according to the report. “It’s one thing to increase the number of women and people from underrepresented groups in the workforce, but we also need industry to pay close attention to career trajectories,” says Young. “This really translates into women and minorities having a seat at the decision-making table, but also working in frontier AI roles like machine and deep learning.”
Another indication of the make-up of the AI workforce can be gleaned from the diversity reports of Google and Meta (formerly Facebook), two of the world’s largest employers of AI professionals. These reveal limited female representation in senior positions – little more than a third of Meta employees in leadership roles are female (36%), and only 28% at Google.
Google’s US workforce is just 3% black; in EMEA the figure is 3.3%. Meta’s US workforce (the only region for which it provides racial diversity figures) is 4.7% black.
For most AI employers, diversity figures beyond gender are harder to come by. This in itself could deter women and ethnic minorities from entering the industry, says Flavilla Fongang, founder of Black Women in Tech. She calls on more AI companies to publish such figures, and predicts they may soon have little choice, as investors demand greater transparency on environmental, social and governance (ESG) matters. “The race to the top is happening now but lots of companies don’t realise that they’re lagging behind,” she says.
The lack of intersectional diversity statistics also has consequences for policymaking, according to Young. “Responsible reporting is such a key part of this because without the data and a very clear picture of what’s happening in the UK AI workforce, particularly on an intersectional level, there’s no way we can know when and where policy interventions will be the most impactful and make a difference,” she says.
Diversity in AI requires private sector participation
Much of the success of the UK government’s initiative in levelling the playing field in data science and AI hinges on support from the private sector. The DCMS has called on industry to match the funding for the AI scholarships, arguing that it will go a long way towards solving the country’s existing skills shortage. An independent organisation to encourage industry investment and participation will also be unveiled later this year.
Professor Dame Wendy Hall, regius professor of computer science at the University of Southampton, and one of the architects of the scheme, is optimistic that employers will heed this call. “They need the skills, whether it’s an AI company like DeepMind, or a manufacturing company in Sheffield that needs people to help them apply AI in their processes,” she says. She also hopes that universities and industries within their communities will pick up the scheme once government funding ends. “We just need a kickstart from the government and we need it to grow from there.”
Fongang is similarly optimistic about the scheme, but says that monitoring outcomes, such as where scholarship recipients end up after they graduate, is crucial. “It’s a good thing that we’re seeing a high number in diversity for once and I hope there’s some real success that comes off the back of that,” she says. “But it’ll be counter-productive if we’re not monitoring these deliverables and tracking the achievements of this policy.”
Afiq Fitri is a data journalist for Tech Monitor.