Are you a victim of data bias?

Posted: October 26, 2023


HSJ article link: https://www.hsj.co.uk/comment/are-you-a-victim-of-data-bias/7035821.article

Investing in professional analytics teams is vital to eliminate data bias in healthcare and address health inequalities, ensuring better decision-making and patient care, write co-authors Ruth Holland and Emma Wright

If you’ve been reading AphA’s blogs, you’ll know by now that your organisation is sitting on a mountain of data that you can mine and refine with the help of professional analysts and data engineers.

With the right team, the right skills and the right support, this data can help you make better business decisions and improve the health of countless patients.

But there’s another reason you should invest in developing your data and analytics team. Without the right expertise, the data you base critical choices on can fall prey to bias: leaving resources poorly managed and patients underserved.

And as innovations like artificial intelligence become more and more embedded in our systems, the risks of biased data will only grow.

 

Tackling health inequalities

Marginalised groups have faced stubborn health inequalities in the UK for decades — a situation brought into focus by the covid pandemic, which highlighted the serious inequalities faced by some ethnic minority groups. And a follow-up Marmot Report released in 2020 warned of a growing health gap between those in the richest and poorest parts of the country.

Whether it’s the high maternal mortality rates faced by black mothers or the decade-shorter life expectancy faced by men in the most deprived postcodes, data shows us that we aren’t meeting the needs of our populations equally.

Good data analysis can help us understand the people we serve and figure out where we most need to channel resources. It can help us spot those most at risk of illness and deliver tailored preventive programmes to stop disease in its tracks.

It’s already informing pilot schemes that offer targeted interventions for communities with particular health needs.

With integrated care systems finally on a statutory footing, leaders have more opportunities than ever to reduce inequalities. But bad data, and bad analysis, can exacerbate biases and end up doing more harm than good.

 

Incomplete and unspecific data

One of the biggest problems with a lot of NHS data is that it doesn’t always properly represent the population your organisation serves. This leaves any analysis built on it vulnerable to bias, and resources allocated on that basis risk going where they are least needed.

NHS health data is often incomplete, with information on certain protected characteristics, which we have a legal duty not to discriminate against, remaining relatively patchy.

Some of these characteristics (sex, age and deprivation, derived from postcodes) are held for virtually every patient. But others either aren’t routinely collected or are poorly recorded.

Take ethnicity, for example, which acute trusts commonly record as “not stated”: an acceptable national code. Some organisations are missing proper ethnicity data on up to 20 per cent of their patients, making it hard to even measure how equitably services are being delivered.
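To make that gap measurable, here is a minimal sketch in Python (pandas), assuming a hypothetical patient extract; the column name and codes are invented for illustration, with “Z” standing in for the “not stated” national code:

```python
import pandas as pd

# Hypothetical patient extract; column name and codes are illustrative.
patients = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5],
    "ethnicity_code": ["A", None, "Z", "H", "Z"],  # "Z" stands in for "not stated"
})

# For equity analysis, nulls and "not stated" are equally unusable.
unusable = patients["ethnicity_code"].isna() | patients["ethnicity_code"].eq("Z")

print(f"Ethnicity unusable for {100 * unusable.mean():.0f}% of patients")
```

A routine completeness report along these lines, run per service or per trust, is a first step towards knowing whether an equity analysis is even possible.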

In primary and secondary care, knowing a patient’s first language can make all the difference to their access to services. But this information is often recorded haphazardly.

Sometimes, a lack of data comes down to patients’ own reluctance to allow secondary uses of their data. As leaders, you will need to be aware of the impact that controversies around the upcoming Federated Data Platform, for example, may have on citizens’ trust.

And just because these are ultimately a patient’s own choices, that doesn’t mean such decisions aren’t also on us. We need to make sure we are doing everything we can to protect patient privacy and build confidence in our use of data (another reason to professionalise your analysts, as AphA has previously argued).

Now, say you’ve checked all the boxes and found yourself with a complete (or relatively complete) dataset with plenty of information about the patients on your patch.

That’s great, but it can still be prone to bias.

A lack of granularity in data — where the boxes themselves lack specificity — can prevent good analysis. Again, ethnicity data is a prime example of this. The broad definitions of ethnic groups used widely across healthcare can often mask inequalities at a granular level.
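A toy illustration of the masking effect, with invented uptake figures purely for the sketch: a broad-group average can look acceptable while the granular breakdown shows a subgroup being left behind.

```python
import pandas as pd

# Invented screening-uptake figures, purely to illustrate masking.
df = pd.DataFrame({
    "broad_group":    ["Asian", "Asian", "Asian"],
    "granular_group": ["Indian", "Pakistani", "Bangladeshi"],
    "eligible":       [4000, 3000, 1000],
    "screened":       [3200, 1800, 450],
})

# Broad-level uptake looks tolerable...
broad = df.groupby("broad_group")[["screened", "eligible"]].sum()
print(broad["screened"] / broad["eligible"])               # ~0.68

# ...while the granular view exposes a much lower-uptake subgroup.
print(df.assign(uptake=df["screened"] / df["eligible"]))   # 0.80, 0.60, 0.45
```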

There is a balance to be struck between a level of detail that becomes burdensome to collect and one that is too coarse to be useful. But with ICSs focused on enabling richer, cohesive datasets that can be used across organisations, accessing better data on your population will hopefully become easier.

 

Bias in AI algorithms

AI represents a new and exciting way to understand data, with massive potential in diagnosis. But don’t forget, it’s prey to bias too. Predictive algorithms, for example, tend to rely on historic datasets, which may already contain disparities in outcomes. These disparities then permeate the algorithm itself.

We know in healthcare that many vulnerable groups are underserved and do not always access services when needed. This can leave them underrepresented in the data that informs a predictive model.

A very simple example could be found in an AI algorithm used to predict skin cancer that has been “trained” mostly on images of white patients. Such an algorithm could lead to missed diagnoses in ethnic minority groups, perpetuating the inequalities already seen in cancer outcomes across ethnic groups.
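One concrete guard against this is to report model performance per group rather than overall. A minimal sketch, assuming hypothetical labels, predictions and a simplified group marker per patient (all values invented):

```python
import pandas as pd

# Hypothetical evaluation set: true labels, model predictions and a
# simplified group label for each patient.
eval_df = pd.DataFrame({
    "y_true": [1, 1, 0, 1, 1, 0, 1, 0],
    "y_pred": [1, 1, 0, 0, 0, 0, 1, 0],
    "group":  ["A", "A", "A", "B", "B", "B", "A", "B"],
})

def sensitivity(g):
    # Of the true positives in this group, how many did the model catch?
    positives = g[g["y_true"] == 1]
    return (positives["y_pred"] == 1).mean()

# A gap between groups is a red flag for data bias, even when the
# headline (overall) sensitivity looks healthy.
print(eval_df.groupby("group").apply(sensitivity))
```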

Similarly, women often face misdiagnoses because clinical trials have traditionally excluded them. As a recent study from Imperial College London found, if algorithms for reading chest X-rays are trained with data from primarily male patients, the results will not be as accurate when applied to chest X-rays of female patients.

AI algorithms that can work faster than humans have the potential to relieve some of the burden facing clinicians. But we need well-trained analysts to guide these programs and help reduce the bias they may fall prey to.

For this reason, we believe data bias in AI should be featured in professional development programmes for digital and data professionals.

 

How can analysts help prevent data bias?

Collecting fuller, more granular data is not a job for analysts alone. But by working closely with both your healthcare professionals and analytics professionals, you can find which areas of your organisation’s data are patchy and work to find solutions.

Analysts can also make sure the algorithms you do have don’t become biased as the demographics of your area change over time.

Just like human employees, these algorithms need regular performance reviews to keep them on track. They need to be tested and validated on data representative of the populations on which they’ll be used to minimise data bias.
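As a sketch of what such a “performance review” might look like in code (the threshold, column names and data are illustrative assumptions, not a standard):

```python
import pandas as pd

def performance_review(eval_df, floor=0.75):
    """Flag subgroups whose accuracy has slipped below an agreed floor.
    Column names and the floor itself are illustrative assumptions."""
    accuracy = (
        eval_df.assign(correct=eval_df["y_true"] == eval_df["y_pred"])
               .groupby("group")["correct"]
               .mean()
    )
    return accuracy, accuracy[accuracy < floor]

# Hypothetical predictions gathered since the last review.
recent = pd.DataFrame({
    "y_true": [1, 0, 1, 1, 0, 1, 0, 0],
    "y_pred": [1, 0, 1, 0, 1, 0, 0, 0],
    "group":  ["A", "A", "A", "B", "B", "B", "B", "B"],
})

accuracy, flagged = performance_review(recent)
print(accuracy)                     # A: 1.00, B: 0.40
if not flagged.empty:
    print("Revalidation needed for:", list(flagged.index))
```

Scheduled as part of routine reporting, a check like this makes “regular performance reviews” for algorithms as concrete as those for staff.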

Organisations like NHS England’s AI Lab and the Medicines and Healthcare products Regulatory Agency have tools and resources to help analyst teams keep data in check.

NHSE AI Lab has a whole stream on AI ethics with advice on how to counter the inequalities that may arise from the ways AI-driven technologies are developed and deployed in health and care.

The MHRA has developed synthetic data which mimics ground truth data so closely that it can be used to validate algorithms, including AI algorithms, in medical devices.

Tools like this can help analysts make best use of the data they have. However, there is a data skills gap in the UK — particularly in areas like data science.

A review in 2021 by the Department for Digital, Culture, Media and Sport estimated that there are likely no more than 10,000 new data scientists graduating from UK universities each year.

“It is therefore vital to look at upskilling the current workforce to try and plug the gap in demand for data skills,” the authors wrote.

 

If you want to make sure your resources go where they are really needed, you need to invest in your analytics professionals on an ongoing basis, making sure their skills are up to date and their careers are developing.

 

At AphA, this is our speciality.

We’ve worked closely with NHSE to develop a competency framework that lays out clear career progression pathways for healthcare analysts, data engineers and data scientists.

We’re also working to offer professional registration that gives you confidence that your expert analytics team meets rigorous professional standards.

Battling data bias needs to be a collaborative effort between healthcare leaders, clinical professionals and data experts.

So, as you reflect on what you can do, remember that promoting professionalism and investing in your analytics workforce is a crucial piece of the puzzle: it will enable you to eliminate data bias.

 

Are you a leader working across health and care who believes that data is the new currency for transformation? Join us at the inaugural congress, which combines HSJ’s access and understanding of NHS leadership challenges with AphA’s deep knowledge of the transformative potential of analytics, for a unique and collaborative conversation.

Register here.