CyFer: Cyber Security, Privacy, Trust and Bias in FemTech

CyFer Project and Team 

CyFer examines cybersecurity, privacy, bias and trust in female-oriented technologies (FemTech) such as fertility and period trackers, focusing on apps and IoT devices. CyFer is funded by the EPSRC PETRAS National Centre of Excellence for IoT Systems Cybersecurity. Comments from the reviewers and selection panel: "Very highly praised proposal, both by the Peer Reviewers, and the Selection Panel: This is a clear, well-thought-out proposal. It has far-reaching potential benefits to women, and to industry, in particular. The team are entirely appropriate and have evidenced their specialist expertise via their publications. The issues which this proposal address around FemTech and privacy go beyond the 'normal' levels of privacy violation germane to IoT and cybersecurity, to a deeply intimate level of privacy. It is essential that IoT privacy controls can defend such privacy as a standard."

FemTech promises to enable women to take control of their bodies and lives, helping them overcome the many existing challenges in health and medical care and research. There are already over 1,300 FemTech companies offering a huge range of products, with a market size of $40.2 billion in 2020 alone. These technologies collect user-entered data and take body measurements via sensors. By collecting a vast amount of data and processing it through advanced algorithms (e.g. AI), these technologies assist in managing reproductive and sexual health and give scientists more insight into people’s bodies. However, there is a lack of clarity in the law (e.g. GDPR) and in industry practice in relation to this extremely sensitive data on several levels, i.e. user consent, third-party sharing, and algorithmic bias, which may be exploited for malicious purposes. There is evidence that the main audience of these products (women) has historically been discriminated against by algorithms (e.g. AI).

The CyFer project builds on the research team’s previous work, which demonstrated that the majority of FemTech IoT devices and apps start tracking the user as soon as the app is opened and before any user consent, and that new sensors (e.g. on IoT devices) can put users at serious risk, yet users perceive these risks as far lower than they actually are. CyFer aims to (1) evaluate the security and privacy of FemTech, (2) investigate user perception and practice, and (3) study socio-technical bias and trust in data, algorithms and AI systems.

CyFer is a collaboration between multiple researchers, industrial partners, artists, designers, etc. across the world and we welcome more collaborations. The team members include: 

In August 2022, we (led by Joe) invited artists, designers and creative technologists and commissioned 5 teams competitively from an open call. These teams include: 

Media and News Coverage

This work has been featured in international and national news, with articles, interviews, print, online, and TV coverage. This included (selected) worldwide coverage in the online media outlets listed below, with TV news-segment interviews by TV4 and SVT in Sweden. The SVT interview led to a follow-up segment by SVT (public broadcasting, number one in Sweden) in which commentary by Anders Ygeman, then Sweden’s Minister for Information Technology (2021), further showed how important this research on intimate data and technology is to the general public and the institution of law.

Other examples include: 

Authors: Teresa Almeida, Maryam Mehrnezhad, Stephen Cook

Accepted in the 17th Annual UK Fertility Conference 2024

Abstract: There is an abundance of digital sexual and reproductive health technologies, which raises concerns about potential breaches of sensitive data. We analyzed 15 Internet of Things (IoT) devices with sexual and reproductive tracking services and found that this ever-extending collection of data implicates many beyond the individual, including partner, child, and family. Results suggest that digital sexual and reproductive health data privacy is both an individual and collective endeavor.

Dr Maryam Mehrnezhad has done an extensive interview about the CyFer project and its findings with Women of Wearables (WoW), a leading global organisation and ecosystem that brings together like-minded women and allies in health tech, digital health and women’s health from more than 50 countries worldwide. WoW's members are startup founders, designers, technologists, industry experts, researchers, bloggers, journalists, investors, and many more. Read the interview here

Authors: Maryam Mehrnezhad, Teresa Almeida 

Full title: "My sex-related data is more sensitive than my financial data and I want the same level of security and privacy": User Risk Perceptions and Protective Actions in FemTech

Abstract: The digitalization of the reproductive body has engaged myriads of cutting-edge technologies in supporting people to know and tackle their intimate health. Generally understood as female technologies (aka female-oriented technologies or 'FemTech'), these products and systems collect a wide range of intimate data which are processed, transferred, saved and shared with other parties. In this paper, we explore how the "data-hungry" nature of this industry and the lack of proper safeguarding mechanisms, standards, and regulations for vulnerable data can lead to complex harms or faint agentic potential. We adopted mixed methods in exploring users' understanding of the security and privacy (SP) of these technologies. Our findings show that while users can speculate the range of harms and risks associated with these technologies, they are not equipped and provided with the technological skills to protect themselves against such risks. We discuss a number of approaches, including participatory threat modelling and SP by design, in the context of this work and conclude that such approaches are critical to protect users in these sensitive systems.

How users (102 UK participants) protect their privacy and security in general vs. FemTech

Examples of user drawings of the FemTech ecosystem 

Sep 2023: Blog Post: Challenges of Extracting Data from Social Media: The Case of Women's Health Misinformation

Author: Dr Adrian Bermudez-Villalva

In the field of cybersecurity research, gathering data from social media platforms has become an essential task for understanding various trends, behaviours, and patterns on the internet. However, when it comes to sensitive and intimate topics such as women's health and misinformation, researchers face a multitude of challenges in extracting relevant data from popular platforms like Facebook, Twitter, and Reddit [1]. In this blog post, we will explore the difficulties researchers encounter and the limitations imposed by the social media giants, which hinder their efforts to acquire valuable data.

Facebook: Restricted API and Crawler Roadblocks

As one of the largest social media platforms, Facebook poses significant challenges for researchers trying to access public posts on topics such as women's health, including abortion. The primary obstacle is the restrictive nature of Facebook's API (Application Programming Interface), which severely limits the type of data that can be extracted from the platform.

To bypass the limitations of the API, some researchers resort to web crawling methods, often utilising tools like Selenium to extract data from Facebook. While this approach seems promising, it comes with its own set of difficulties:

1.     Account Blocking: Facebook employs stringent anti-scraping measures to protect user privacy and platform integrity. As a result, researchers who create dedicated accounts for web crawling purposes are likely to encounter account blocks and restrictions due to the platform's automated security measures.

2.     Account Verification: The process of creating multiple Facebook accounts for data extraction is further complicated by the mandatory verification requirements, which involve phone numbers and other personal information. This makes it challenging for researchers to create the necessary number of accounts for their studies.

3.     Dynamic Website Elements: Facebook's web interface is highly dynamic, relying heavily on JavaScript to generate and display content. Consequently, web crawlers built with Selenium or other tools often struggle to locate and extract the required data consistently, as the website structure may change frequently [2].

4.   Ethical and Legal Considerations: While web scraping itself might not be illegal, scraping Facebook's data can raise ethical and legal questions, as it involves accessing user-generated content without explicit consent from all parties involved [3].
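To make the crawling difficulties above concrete, here is a minimal, hypothetical sketch of the retry-with-backoff logic a Selenium-based crawler might wrap around its page loads. The function names and parameters are illustrative, not part of any platform's or library's API:

```python
import random
import time

def backoff_delays(attempts, base=2.0, cap=300.0, jitter=0.5, rng=None):
    """Yield exponentially growing, jittered delays (in seconds) between
    retries, so the crawler slows down instead of hammering a throttled page."""
    rng = rng or random.Random()
    for attempt in range(attempts):
        delay = min(cap, base * (2 ** attempt))
        # Jitter avoids many crawler instances retrying in lock-step.
        yield delay + rng.uniform(0, jitter * delay)

def fetch_with_retries(fetch, url, attempts=5, base=2.0):
    """Call `fetch(url)` (e.g. a Selenium page load wrapped in a function),
    sleeping between failed attempts; re-raise after the last failure."""
    last_error = None
    for delay in backoff_delays(attempts, base=base):
        try:
            return fetch(url)
        except Exception as exc:  # in practice: the driver's timeout/blocked errors
            last_error = exc
            time.sleep(delay)
    raise last_error
```

Such throttling can reduce, but does not remove, the risk of account blocking described in point 1, and it does not address the ethical and legal questions raised in point 4.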

Twitter: Limited API Access for Researchers

Twitter, another prominent platform, has historically provided academic researchers with access to its API, allowing them to collect valuable data for various studies, including those related to misinformation. However, the landscape has changed, and researchers now face significant obstacles:

1.     API Restrictions: Twitter's API access for academic researchers has been severely limited, leaving researchers with reduced access to public posts and historical data [4]. As a result, studies focusing on timely topics such as intimate health and women's health misinformation may face data scarcity and incompleteness.

2.     Challenges in Data Collection: The limited API access hinders researchers' ability to gather real-time data from Twitter, which can be crucial for tracking and analysing the rapid spread of misinformation.

3.     Lack of Data Granularity: With restricted API access, researchers may struggle to obtain detailed information such as user demographics, retweet networks, and engagement metrics, limiting the depth of analysis possible.
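For concreteness, the sketch below shows roughly what a minimal request against Twitter's v2 recent-search endpoint looks like using only the Python standard library; on a restricted access tier, the call in `search_recent` is exactly where researchers now encounter 429 (rate limit) or 403 (insufficient access) responses. The query and requested fields are illustrative:

```python
import json
import urllib.parse
import urllib.request

SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"

def build_search_request(query, bearer_token, max_results=100):
    """Build an authenticated request for the v2 recent-search endpoint."""
    params = urllib.parse.urlencode({
        "query": query,
        "max_results": max_results,
        # Fields like public_metrics are often truncated or unavailable on
        # restricted tiers, which is the granularity problem noted above.
        "tweet.fields": "created_at,public_metrics",
    })
    request = urllib.request.Request(f"{SEARCH_URL}?{params}")
    request.add_header("Authorization", f"Bearer {bearer_token}")
    return request

def search_recent(query, bearer_token):
    """Execute the search; expect HTTPError 429/403 on restricted tiers."""
    with urllib.request.urlopen(build_search_request(query, bearer_token)) as resp:
        return json.load(resp)
```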

Reddit: Ephemeral Nature of Intimate Health and Misinformation

Reddit is a popular platform with various subreddits dedicated to discussions on intimate topics such as women's health. However, extracting misinformation content from Reddit poses unique challenges:

1.     Content Moderation: Reddit's content moderation combines actions taken by human central administrators, automated programs, and community users [5]. As a result, posts containing misinformation related to women's health are swiftly removed within seconds or minutes of publication, making it extremely challenging to capture such content for research purposes.

2.     Transparency and Access: Reddit's API provides limited access to historical data, and the platform does not offer a comprehensive way to retrieve deleted posts. This lack of transparency hinders researchers from gaining insights into the spread and impact of misinformation over time.
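When a listing can be crawled at all, spotting which posts have already been moderated away is straightforward. The sketch below assumes the field names of Reddit's public listing JSON (`selftext`, `removed_by_category`), where removed and author-deleted posts are replaced by the markers shown:

```python
def classify_post(post):
    """Given one Reddit post dict (the `data` object inside a listing JSON),
    report whether it is still retrievable or has been moderated away."""
    body = post.get("selftext", "")
    if body == "[removed]" or post.get("removed_by_category"):
        return "removed"   # taken down by moderators, admins, or automod
    if body == "[deleted]":
        return "deleted"   # taken down by the author
    return "live"

def survival_rate(posts):
    """Fraction of posts in a crawled listing that are still live -- a crude
    measure of how quickly moderation erases the content under study."""
    if not posts:
        return 0.0
    return sum(classify_post(p) == "live" for p in posts) / len(posts)
```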

Conclusion: The Uphill Battle for Researchers

For cybersecurity researchers in academia, the pursuit of knowledge about intimate health and other sensitive topics, such as misinformation related to vaccines and women’s health on social media platforms, is key to developing countermeasures. However, the challenges imposed by restricted APIs, web crawling complexities, and limitations in historical data access have turned this endeavour into an uphill battle. To bridge the gap between cybersecurity research and understanding the nuances of misinformation, it is imperative for social media platforms to collaborate with academic institutions, facilitating responsible access to data and fostering a culture of open and transparent research.

As we move forward, it is crucial for researchers, policymakers, and social media platforms to work together to find ethical and sustainable solutions that empower researchers to explore the intricate landscape of misinformation and ultimately contribute to a safer and more informed online environment.

If you are an active researcher in this field and have expertise and experience to offer, please get in touch with Adrian via email.


1. Stieglitz, S., Mirbabaie, M., Ross, B., & Neuberger, C. (2018). Social media analytics–Challenges in topic discovery, data collection, and data preparation. International journal of information management, 39, 156-168.

2. Deen Freelon (2018) Computational Research in the Post-API Age, Political Communication, 35:4, 665-668, DOI: 10.1080/10584609.2018.1477506.

3. Mancosu, M., & Vegetti, F. (2020). What You Can Scrape and What Is Right to Scrape: A Proposal for a Tool to Collect Public Facebook Data. Social Media + Society, 6(3).

4. Davidson BI, Wischerath D, Racek D, et al. Social Media APIs: A Quiet Threat to the Advancement of Science. PsyArXiv; 2023.

5. Shagun Jhaver, Iris Birman, Eric Gilbert, and Amy Bruckman. 2019. Human-Machine Collaboration for Content Regulation: The Case of Reddit Automoderator. ACM Trans. Comput.-Hum. Interact. 26, 5, Article 31 (October 2019), 35 pages.

Error when requesting data due to restricted access to the Twitter API

Aug 2023: CyFer-RISCS Research Day

Title: CyFer-RISCS Networking Event and CyFer Exhibition tour, 6th Sept 2023

Date/Time: Wednesday 6th September 2023, 12:00 – 17:00 

Venue: Emily Wilding Davison Exhibition and Event Spaces, Royal Holloway University London, Egham, TW20 0EX 

Description: Join an interdisciplinary networking event with a visit to an art exhibition associated with CyFer (an EPSRC PETRAS-funded project). We will have a series of talks and round table discussions about the complex risks and harms of modern technologies and the key role of knowledge exchange to diverse audiences. We would like to identify the research challenges and opportunities of communicating science via art and form new research and activity ideas and collaborations.


12-1 pm Lunch and networking

1:30-2 pm Dr Maryam Mehrnezhad: Welcome talk (Title: CyFer: Cybersecurity, privacy, bias and trust in FemTech)

2-3 pm CyFer Art-Science Exhibition Tour

2-3 pm Tea/Coffee break

3-3:45 pm Dr Ola Michalec: Talk (Title: Artistic responses to research exploring human dimensions of digital energy transformation)

3:45-4:45 pm Breakout sessions

4:45-5 pm Prof Genevieve Liveley (RISCS Director) closing remarks

Authors: Maryam Mehrnezhad, Thyla van der Merwe, Mike Catt

Abstract: We study the female-oriented technologies (FemTech) ecosystem, including regulations, IoT systems, mobile apps, and websites, and reveal the exploitative patterns embedded in such systems due to inadequate regulations and/or enforcement. We advocate for policymakers to explicitly acknowledge and accommodate the risks of these technologies in the relevant regulations.

Examples of FemTech products (IoT, apps) and their categories. These categories are based on FemTech Analytics, a strategic analytics agency focused on the FemTech sector.

We are excited to present the artists and designers who will participate in the CyFer project’s extraordinary artistic exhibition from 19 June to 10 September 2023. The artworks and designs presented respond to scientific research on the privacy, security, and ethics of female-oriented technology at The Exhibition Space, Royal Holloway, University of London. Artists and designers from around the world have contributed sculptures, textiles, digital art, and interactive experiences that encourage visitors to reflect on the security and privacy of their personal medical information, as well as their expectations regarding menstruation and fertility data. The CyFer exhibition core team includes Maryam Mehrnezhad, Joe Bourne, Teresa Almeida, and Ehsan Toreini.

Jul 2023: Blog Post: Women's Health-related Misinformation on Social Media

Author: Dr Adrian Bermudez-Villalva

The rise of social media has brought about a revolution in the way people access and share information. While this has created new opportunities for people to connect and learn, it has also led to an increase in the spread of misinformation [1]. This is particularly problematic when it comes to women's health, where misinformation can have serious consequences such as poor health outcomes, anxiety, and confusion. Misinformation about women's health on social media is widespread and takes many forms. Some common examples include misinformation on contraceptive methods, pregnancy, childbirth, menstrual health, breast cancer, and menopause [2]. For example, some posts suggest that unassisted home births are safer and preferable to hospital births, when in fact they can be dangerous to both mother and baby. Another example concerns COVID-19 vaccination, with posts claiming that these vaccines cause infertility [3].

Women who are exposed to inaccurate information may make poor decisions regarding their reproductive health, delaying or avoiding important medical care, or choosing treatments that are not based on scientific evidence. This can lead to negative health outcomes, such as unintended pregnancies, untreated sexually transmitted infections, delayed diagnosis and treatment of breast cancer, and unnecessary anxiety surrounding (in)fertility as well as menopause.

In the CyFer project, funded by the EPSRC PETRAS National Centre of Excellence for IoT Systems Cybersecurity, we are exploring the issue of misinformation related to women’s health on social media. The team working on this aspect of CyFer includes myself (Dr Adrian Bermudez-Villalva, Research Associate), Dr Maryam Mehrnezhad (PI), and Dr Ehsan Toreini (Co-I). We are analysing social media content to measure the prevalence of misleading information related to several aspects of women’s health. We are developing a crawler to extract posts from various platforms such as Facebook by using different keywords related to health misinformation themes. To analyse the text contained in the posts, we use natural language processing (NLP), a branch of AI that uses computational techniques to analyse and understand human language. Once the content containing misinformation has been identified, the next step is to extract key information from the content, such as the topics being discussed, the sources of the misinformation, and the sentiment of the content.
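As a simplified illustration of the first, keyword-based filtering stage described above, the sketch below matches posts against a theme lexicon and counts keyword frequencies. The keyword list here is a made-up placeholder, not the project's actual lexicon:

```python
import re
from collections import Counter

# Illustrative placeholder keywords; a real crawler would use curated,
# theme-specific lexicons developed with domain experts.
KEYWORDS = {"infertility", "detox", "miracle cure", "hormone balancing"}

def match_keywords(text, keywords=KEYWORDS):
    """Return the keywords that occur (case-insensitively, as whole
    words/phrases) in one post's text."""
    lowered = text.lower()
    return {kw for kw in keywords
            if re.search(r"\b" + re.escape(kw) + r"\b", lowered)}

def keyword_frequencies(posts, keywords=KEYWORDS):
    """Count how many posts mention each keyword -- a first, crude signal of
    which misinformation themes dominate a collection of posts."""
    counts = Counter()
    for post in posts:
        counts.update(match_keywords(post, keywords))
    return counts
```

In the full pipeline, posts surviving this filter would then go through the NLP steps mentioned above (topic, source, and sentiment analysis).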

To tackle the issue of misinformation surrounding women's health on social media, a multi-disciplinary approach is necessary. Information security plays an important role in identifying and preventing the spread of misinformation on social media in order to protect users from a range of complex risks and harms. One example of such measures is the use of content moderation. Content moderation refers to the process of reviewing and removing content that violates the policies of a social media platform. Social media platforms can use content moderation to remove false information related to women's health, which can help to reduce the spread of misinformation. Another measure is the use of data analysis to identify patterns in the spread of misinformation related to women's health. By analysing data on how false information is spread, social media platforms can develop strategies to address the issue and prevent the spread of misinformation. Social media platforms can also use ML algorithms to identify misinformation related to women's health. ML algorithms can be trained to recognise patterns and identify content that is likely to be false or misleading. By using ML, social media platforms can proactively identify and remove misinformation and the accounts associated with it.
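To make the ML step concrete, here is a deliberately tiny bag-of-words Naive Bayes classifier, a toy stand-in for the far larger supervised models platforms would actually train; the training snippets in the usage note are invented examples, not real labelled data:

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

class TinyNaiveBayes:
    """Multinomial Naive Bayes over bag-of-words features, with add-one
    (Laplace) smoothing so unseen words do not zero out a class score."""

    def fit(self, texts, labels):
        self.class_counts = Counter(labels)
        self.word_counts = defaultdict(Counter)
        self.vocab = set()
        for text, label in zip(texts, labels):
            for token in tokenize(text):
                self.word_counts[label][token] += 1
                self.vocab.add(token)
        return self

    def predict(self, text):
        total = sum(self.class_counts.values())
        best_label, best_score = None, -math.inf
        for label, count in self.class_counts.items():
            # log prior + sum of smoothed log likelihoods of each token
            score = math.log(count / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for token in tokenize(text):
                score += math.log((self.word_counts[label][token] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label
```

In practice such a classifier would only ever be one component, combined with human review, since false positives here mean silencing legitimate health discussion.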

While technical measures such as content moderation, data analysis, and the use of ML algorithms can help to mitigate the spread of false information, it is not enough. Tackling misinformation in the context of women’s health is a complex problem since it is happening not only on social media platforms, but also other places such as forums, darknet, and via communication tools too. Therefore, addressing it comprehensively requires collaboration between information security experts, medical professionals, AI and ML experts, policymakers, industrial partners, and the end users. By working together, we can help to ensure that accurate and reliable information is shared on social media platforms and that women are empowered to make informed decisions about their health and wellbeing.

Bio: I have a PhD in Security Science from University College London (UCL). I have been working on various topics related to cybersecurity and cybercrime where I used data-driven approaches to measure and understand malicious activities on the Internet. My research consisted of conducting experiments, data collection and analysis to study different types of illegal activities on the Internet such as data theft, malvertising and black markets [4,5].


1. Systematic literature review on the spread of health-related misinformation on social media. Social science & medicine, 240, 112552.

2. Nature and diffusion of gynecologic cancer–related misinformation on social media: analysis of tweets. Journal of Medical Internet Research, 20(10), e11515.

3. Widespread misinformation about infertility continues to create COVID-19 vaccine hesitancy. Jama, 327(11), 1013-1015.

4. A measurement study on the advertisements displayed to web users coming from the regular web and from tor. In 2020 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW) (pp. 494-499). IEEE.

5. The shady economy: Understanding the difference in trading activity from underground forums in different layers of the web. In 2021 APWG Symposium on Electronic Crime Research (eCrime) (pp. 1-10). IEEE.

To celebrate the end of our CyFer project, we held a research day to initiate CyberMi2 (Cybersecurity and Privacy for Minority and Minoritized People). CyberMi2 2023 took place on the 20th of June 2023 at RHUL. This was an invitation-only event where my team and our collaborators presented their work on security and privacy (SP) and minority and minoritized users. The majority of this year's research related to gender and SP. We had participants from academia, industry, government, and the regulatory sectors. See the agenda here

Authors: Diana P. Moniz, Maryam Mehrnezhad, Teresa Almeida 

Abstract: Privacy is a fundamental human right in the digital age. With the proliferation of intimate health technologies, such as data-driven apps and connected devices that track bodily care and sensitive topics, privacy is increasingly critical. In this paper, we explore the complexity of intimate data and user perspectives and the choices they make to protect themselves. We introduce a story completion study with 27 participants to examine individuals’ concerns about data privacy, their protective or avoidant actions, and the potential mismatches between privacy concerns and actual behaviors. We suggest future research that combines User-Tailored Privacy (UTP) and participatory threat modeling to create privacy solutions that account for users’ needs and the potential risks and harms associated with the use of their data.

Authors: Michalec O, Barker K, Coopamootoo K, Coventry L, Duppresoir F, Edwards M, Johnson S, Johnstone E, Jurasz O, Mehrnezhad M, Moncur W

Summary: This report introduces the topic, reviews recent related work, and provides a set of recommendations based on the literature from REPHRAIN researchers: Defining and measuring online harms, Understanding perpetrators, Developing public services, policies and standards, Co-designing safer technologies, and Improving police practice and the justice system. As future research questions and priority areas, it lists the same themes together with Revictimisation. The report concludes with a section on Policy Recommendations.

In this extended workshop at MozFest 2023 you will meet researchers and artists in an interactive experience of creative responses to research challenges on the privacy, ethics, trust and security of female-oriented technology.

The “FemTech industry” remains largely unregulated. There is a lack of clarity in the law and uncertainty concerning industry and user practice in relation to this extremely sensitive data on several levels, i.e. user consent, third-party sharing, and algorithmic bias, which may be exploited for malicious purposes.

The event will provide a short exploration of the research problems and why PETRAS invited artists to address these with us. You will be given two twenty-minute opportunities to interact in depth with the work of two artists/designers. Following these two interactions, we will all come back together as a group to share reactions and discuss the privacy, ethics, trust and security of female-oriented technology.

The session will be led by PETRAS’s Synthesis Research Fellow, the CyFer research team from Royal Holloway, University of London, the University of Surrey and Umeå University, and the five artists/designers who joined this creative exchange.

There will be lots of space in this session to ask questions to both the research team about the science and the artists/designers concerning their methods, experience and the meanings of their work.

CyFer exhibition core team and the artists and designers presenting at MozFest 2023!

Jan 2022: Blog Post: The Story behind the CyFer Project

Author: Maryam Mehrnezhad

In 2021, I was awarded the CyFer grant by EPSRC PETRAS National Centre of Excellence for IoT Systems Cybersecurity. I have been working with a fantastic team exploring cybersecurity, privacy, bias and trust in female-oriented technologies (FemTech). This proposal was highly praised by the PETRAS peer reviewers and the selection panel, and I am delighted that we went above and beyond via various high-profile activities and outcomes.

CyFer is an international collaboration between academic researchers, industrial partners, artists, designers, etc. The team includes Dr Maryam Mehrnezhad (PI, RHUL), Dr Ehsan Toreini (Co-I, University of Surrey), Dr Teresa Almeida (academic partner, Umeå University, Sweden, and ITI/LARSyS, Portugal), Dr Adrian Bermudez-Villalva (RA), Stephen Cook (RA), Dr Laura Shipp (former RA), Joe Bourne (PETRAS Synthesis Fellow, UCL, and Lancaster), Prof Mike Catt (academic partner, Newcastle University), and Swiss Precision Diagnostics (SPD) (industrial partner, makers of the Clearblue pregnancy tests).

CyFer is a result of a collaboration with Teresa Almeida which initially led to a 2021 ACM CHI paper: Caring for Intimate Data in Fertility Technologies. This long-distance collaboration happened thanks to the Covid-19 restrictions, when working with colleagues (and in this case, a dear friend from our PhD time at Newcastle University) across the globe entered a new phase!

FemTech solutions promise to enable women to take control of their bodies and lives, helping them overcome the many existing challenges in medical care and research. The market is growing fast (predicted to be over $75 billion by 2025). This industry offers a wide range of solutions, including mobile apps, IoT devices and online services covering menstruation, menopause, fertility, pregnancy, nursing, sexual wellness, reproductive health care, etc. The class of technologies is broad, ranging from stand-alone mobile menstruation apps to illness-tracking wearables to IVF services on the blockchain!

From lack of data about women in general, to bias and discrimination in health studies, data sets, and algorithms, FemTech has come a long way to centre women in the design and development of such systems. Yet, the FemTech industry remains largely unregulated, particularly when it comes to security, privacy, and safety. In our 2022 EuroUSEC paper: Vision: Too Little too Late? Do the Risks of FemTech already Outweigh the Benefits?, we show how such threats are putting users at differential and complex risks and harms; in some cases, the lack of proper safeguarding methods for this sensitive data can put human life at risk.

We believe that privacy in FemTech should be looked at via a range of lenses. These include the cases where someone has the user’s personal data, but the user does not – inverse privacy; when peer pressure causes people to disclose information to avoid the negative inferences of staying silent – unravelling privacy; when the privacy of others also matters – collective privacy; and when systems should also focus on the intersectional qualities of individuals and communities – differential vulnerabilities. More specifically, in our 2022 ACM NordiCHI paper, Bodies Like Yours: Enquiring Data Privacy in FemTech, we present the massive data collection of FemTech about users and others, including one’s baby, partner, family, etc. We have also been working on the standardisation and regulatory aspects of these products. During my visit to ETH Zurich in 2022, Dr Thyla van der Merwe (another dear friend of many years and also an ISG alumnus) and I identified several gaps and grey areas in the existing regulations and standards around FemTech solutions and data.

Our work in CyFer is not limited to academic papers. This work has been consistently in the news (check the homepage). Furthermore, in August 2022, we invited artists, designers and creative technologists and commissioned 5 teams competitively from an open call. These teams include: (1) Vasiliki Tsaknaki and Lara Reime (IT University of Copenhagen, Denmark), (2) Nadia Campo Woytuk (KTH Royal Institute of Technology, Sweden) and Nicolas Harrand (RISE Research Institutes of Sweden), (3) Sian Fan (interdisciplinary artist, between Essex and London), (4) Elena Falomo (freelance designer, between London, Berlin, and Italy), and (5) Althea Rao (University of Washington, USA). Joe Bourne is passionately leading these activities. I have not met Joe in person yet, but our collaborative work has been fantastic. These top-notch art pieces have already made an appearance at Mozilla MozFest in March 2023. But the best is yet to come!

We are delighted to be completing CyFer by organising two exciting events this summer: CyberMi2 2023 (Cybersecurity and Online Privacy for Minority and Minoritized People, 20 June 2023) and an art exhibition (June–August 2023), both at RHUL. Make sure you visit our exhibition by coming to our beautiful Egham campus. For now, enjoy a glimpse of Elena’s work on privacy notions in FemTech in Figure 1.

People often ask how I came to work on this topic. I have a background in system security and have been performing attacks on systems. I have also designed trustworthy systems and contributed to standardisation and industrial practices to prevent such attacks. However, human dimensions have consistently been a part of my work. Currently, a major strand of my research is dedicated to minority and minoritized users in cybersecurity and privacy. I have always dreamt of doing something for women’s rights. But I am not an activist, a lawyer, or a social scientist. I am a cybersecurity expert, and I decided to use my expertise to fulfil this ambition of mine. I did it in CyFer, and I continue to do so in my future projects. If you share the same passion, please get in touch!

Authors: Teresa Almeida, Laura Shipp, Maryam Mehrnezhad, Ehsan Toreini

Abstract: The digitalisation of the reproductive body has seen a myriad of cutting-edge technologies to prioritise neglected intimate health and care topics, such as fertility and contraception. The impact of these intimate data on livelihood and society is pervasive, and privacy is critical to safeguarding security as this increasing digitalisation also produces increasingly large datasets. In this paper, we enquire into the collective nature of privacy in female-oriented technologies (FemTech) to show how this ever-extending collection of data implicates many beyond the individual. We introduce a pilot study on the data collection practices of a subset of FemTech devices with fertility tracking services. We demonstrate that data is collected about the user and others, such as their immediate relationships and user groups as a whole. We suggest that it is critical we ask who is vulnerable and discuss approaches to mitigate collective harm.

Call for artists, designers and creative technologists to respond to the science of FemTech cybersecurity, privacy, ethics and trust.

Description: Researchers from the CyFer project, funded by PETRAS, UK, are examining cybersecurity, privacy, ethics and trust in FemTech. Female-oriented technologies (FemTech) promise to enable people to take control of their bodies and lives, helping them overcome the many existing challenges in medical care and research. There is a lack of data about women and other minority and minoritised groups in medical sciences, as well as bias and discrimination in health studies, data sets, and algorithms. FemTech solutions promise to centre these groups in the design and development of their systems. However, the FemTech industry remains largely unregulated. There is a lack of clarity in the law (e.g. GDPR and HIPAA), and in industry and user practice, in relation to this extremely sensitive data on different levels, i.e. user consent, third-party sharing, and algorithmic bias, which may open the door to malicious use.

You can find out more about this call by visiting the PETRAS website. 

Objects find new meanings in a revolution! These are sanitary pads used for preventing bleeding, but not menstrual bleeding! They cover security cameras in metro stations in Iran to stop the Iranian regime from identifying, tracking, arresting, torturing, and killing the protestors! The very same taboo period pads which were once carried in black bags are now means of fighting surveillance!  This is the power of a female-led revolution! #Iranrevolution2022 #mahsaamini #womanlifefreedom

Authors: Maryam Mehrnezhad, Laura Shipp, Teresa Almeida, Ehsan Toreini

Abstract: Female-oriented technologies (FemTech) promise to enable women to take control of their bodies and lives, helping them overcome the many existing challenges in medical care and research. From the lack of data about women in general, to bias and discrimination in health studies, data sets, and algorithms, FemTech has come a long way to centre women in the design and development of such systems. Yet, the FemTech industry remains largely unregulated, particularly when it comes to security, privacy, and safety. These issues can lead to catastrophe given the highly sensitive nature of the data these technologies handle. In this paper, we show how such threats are already putting women at risk; in some cases, the lack of proper security and privacy safeguards can even endanger human life. We also present the results of some of our ongoing research on the massive data collection of FemTech about end-users and others (baby, partner, family, etc.). We set an agenda for research on the security and privacy of FemTech and call for a better legal framework to regulate the industry.

Published by: Newcastle University

Fertility apps house the sensitive data of millions of users globally. Read our blog from Dr Maryam Mehrnezhad on the risks surrounding fertility app users’ privacy. 

Authors: Maryam Mehrnezhad, Teresa Almeida

Abstract: Fertility tracking applications are technologies that collect sensitive information about their users, i.e. their reproductive potential. For many, these apps are an affordable solution when trying to conceive or when managing a pregnancy. However, intimate data are not only collected but also shared beyond users’ knowledge or consent. In this paper, we explore the privacy risks that can originate from the mismanagement, misuse, and misappropriation of intimate data, which are entwined in individual life events and in public health issues such as abortion and (in)fertility. We look at differential vulnerabilities to enquire into the vulnerability of data and that of ‘data subjects’. We introduce the General Data Protection Regulation (GDPR) and how it addresses fertility data. We evaluate the privacy of 30 top ‘fertility apps’ through their privacy notices and tracking practices. Lastly, we discuss regulation and fertility data as critical to the future design of tracking technologies and privacy rights.

This work was also invited for presentation at Fertility Conference 2022, the 15th joint fertility meeting organised by the Association of Reproductive and Clinical Scientists (ARCS), the British Fertility Society (BFS) and the Society for Reproduction and Fertility (SRF), and the 2nd online Fertility meeting.