Digital health tools could help address gender inequities in healthcare, or they could make them worse. Here’s how to do it right.
Women’s health apps and websites, also known as FemTech, boomed during the COVID-19 pandemic and continue to grow. These women’s digital health technologies aim to support women through pregnancy and menopause, track periods and fertility, and help manage lifestyle and mental health.
The global FemTech market (defined as any form of technology that focuses on women’s health) was valued at US$45.75 billion in 2022 and is projected to reach US$139.51 billion by 2031.
The rise of AI-driven platforms is giving the industry a further boost, opening opportunities to tailor healthcare to an individual’s needs and advance personalised medicine for women.
Investing in women’s health is crucial, but progress on gender equity has slowed since the COVID-19 pandemic.
There has been an increase in online violence against women, and there are alarming reports of more maternal deaths worldwide. To compound these problems, healthcare workers, mostly women, face burnout.
Healthcare is not a level playing field for women, particularly women from minority backgrounds.
Funding for research on diseases that primarily affect men remains higher than funding for women’s health.
Women, especially women of colour, remain underrepresented in medical research, and only 2 percent of medical research funding is spent on pregnancy, childbirth, and female reproductive health.
Data gaps and biases lead to women being diagnosed later, receiving the wrong medication dose, or having their symptoms dismissed, all of which happen more often to women of colour.
In a UK study, for example, women were 50 percent more likely than men to have a missed diagnosis of a heart attack.
Women’s digital health can bridge this gap and revolutionise women’s healthcare in several ways.
It can help reduce the gender data gap by collecting detailed data about women’s health, and provide personalised health advice using AI and data analytics.
Digital health tools can break down barriers to care that women face more than men, such as lack of time, money, or childcare responsibilities. They can help ease the workload of low-status healthcare workers, who are more often low-income and women of colour.
Digital health can also provide support with family planning, avoiding early pregnancy, and escaping gender-based violence.
Unfortunately, current women’s digital health tools are tailored towards a specific group of women, usually white, highly educated, healthy, and affluent. Because of this, women from underserved populations do not benefit, which risks increasing health inequities.
App developers often treat women as a homogeneous group, but different women have different needs based on their racial or ethnic background, socioeconomic status, and health conditions.
Crucially, women of colour are more likely to experience poverty, social exclusion and domestic violence, which are all related to health, and face greater challenges in accessing affordable healthcare.
Indigenous Australians have a significantly shorter life expectancy than their non-Indigenous counterparts, while maternal mortality rates are nearly three times higher for Aboriginal women compared to non-Aboriginal women.
Black women in the US are more likely to die from cardiovascular disease, hypertension, stroke, lupus and several cancers.
Yet digital health studies often do not include Black women, Indigenous women, or other women of colour, women with complex medical needs, or people who identify as trans or non-binary. The apps themselves are also often not designed in an accessible or inclusive way.
Only 17 percent of apps for managing diabetes during pregnancy assessed cultural appropriateness — whether the app’s content and instructions were in line with the patients’ culture, language, religion, customs, and beliefs — and 25 percent assessed digital literacy.
With the increased use of AI, there is now a risk that biases related to race, ethnicity and gender will become integrated into algorithms, leading to less effective care or even harmful outcomes.
Researchers have found that large language models like ChatGPT and Bard, which are increasingly being used in health apps, have race-based bias, such as the false idea that Black people have a lower pain threshold.
Many popular women’s health apps have inadequate privacy, sharing, and security practices, and do not follow regulations like the European Union’s General Data Protection Regulation.
If these apps are not secure, it could make women vulnerable to targeted advertising and cyber attacks.
Policies to protect health information privacy, like the Health Insurance Portability and Accountability Act of 1996 in the US, fall short when it comes to reproductive health data.
This is especially concerning since recent changes to abortion laws in the US have made it even more important to safeguard this information.
How this could be fixed
Researchers and developers in digital health could include marginalised groups in app design and testing, and ensure those groups benefit in tangible ways from the research.
Researchers could co-create digital solutions with marginalised groups and promote an approach in which women themselves set the research agenda, leading to digital health research that can really make a difference in their lives.
To make FemTech accessible to all, researchers could also ensure apps are affordable, easy to navigate, written in plain, high school-level language, and available in different languages.
Incorporating gender-sensitive design, such as representing women with a range of skin colours and body shapes, and avoiding gender-biased content, such as a focus on women being thin and beautiful or men being strong and tough, could also help.
Before implementing AI, researchers and developers could seek to understand the benefits and harms, and only proceed if the benefits outweigh the harms.
Providing better access to mobile internet for people in low-income areas could help ensure everyone can access digital health.
Protecting women’s personal data is crucial: there is no uniform privacy framework, either globally or at the national level, for health data in general or FemTech data in particular.
Users need to be protected, especially those who may be more vulnerable to data breaches, such as undocumented people or victims of domestic abuse, and they should be able to permanently delete their data.
More funding and more women, particularly women of colour, leading this area can help FemTech reach its full potential of improving gender equity in health.
Dr Caroline Figueroa is an assistant professor at Delft University of Technology, the Netherlands. Her research focuses on developing and testing personalised digital health tools to help individuals lead healthier lives, with an emphasis on social justice by tailoring to the needs of underserved populations.
Originally published under Creative Commons by 360info™ at 360info.org.