Is bias deterring women from health and safety careers? AI reveals “pale, male, stale” image

Images of health and safety professionals generated by AI are exclusively white, male, and middle-aged, despite the wealth of job opportunities health and safety offers to women.

AI image generators use machine learning models trained on massive datasets of existing images, often sourced from the internet, to produce new images.

When the AI was prompted to generate images of health and safety professionals, the results were distinctly lacking in diversity, suggesting a serious bias in how these professionals are represented online. By contrast, when prompted to generate images of other professions, such as HR and marketing, the AI consistently represented them as women.

If left unchecked, such stereotyping could deter women from lucrative health and safety careers and worsen the UK’s already severe shortage of health and safety professionals, warn health and safety training experts RRC International.

On top of the skills shortage, UK health and safety also suffers from a lack of diversity: only 21 per cent of health and safety professionals in the UK are women, according to the British Safety Council. Things are slowly changing. At Leeds Beckett University, for example, there is a 60/40 split of women to men studying health and safety at postgraduate level, but progress remains slow. RRC International, which commissioned the AI-generated images, believes the bias they reveal could be part of the reason why.

Vicky Campbell, Director of Compliance at RRC International, commented on the bias:

“AI output is only as good as the human input it’s fed, so it’s like a mirror. Health and safety is often perceived as being pale, male and stale. These AI-generated images substantiate that and potentially give us some insight as to why; it’s clear that the industry is representing these roles in such a manner for the AI to fail to generate a single image of a woman, a young person, or a person of colour. This output is reflecting a tired and enduring stereotype back at us.”


The observed bias in AI-generated images not only reflects existing stereotypes but also suggests a potential barrier to entry for women and other underrepresented groups. Addressing these biases is essential for broadening participation in health and safety careers, which in turn can enrich the field with diverse perspectives and experiences.

Click here to see the full gallery of AI-generated health and safety professionals.

Joanne Swann, Content Manager, WorkWellPro

Joanne is the editor of Workplace Wellbeing Professional and has a keen interest in promoting the safety and wellbeing of the global workforce. After earning a bachelor’s degree in English literature and media studies, she taught English in China and Vietnam for two years. Before joining WorkWellPro, Joanne worked as a marketing coordinator for luxury property, where her responsibilities included blog writing, photography, and video creation.
