Monetizing Menstruation: Privacy and Power in Women’s Reproductive Health Apps
- Molly Bombard
- Apr 30
1. Introduction
Each time a woman uses a reproductive health app, she leaves behind a digital footprint that tells an intimate story—not just about her menstrual cycle, but about her relationships, her family planning choices, and potentially even her medical decisions. While millions of women rely on these apps to manage their reproductive health, few realize the extent to which their personal data has been transformed into a valuable commodity and traded among hundreds of companies (Fox, 2019). Although these apps promise enhanced control over reproductive health, they simultaneously expose users to privacy risks that exist outside traditional medical protections. Following the overturning of Roe v. Wade, these privacy implications have gained new moral urgency as digital health data could potentially be used in legal proceedings.
This paper argues that women’s reproductive health apps fundamentally violate user privacy through practices that undermine both user autonomy and meaningful consent. Through Marmor's framework for privacy rights, which establishes that privacy violations occur when users lose control over their self-presentation through a manipulative informational environment, I demonstrate how these apps create privacy violations that cannot be resolved through current regulatory frameworks. The violation manifests in three key ways: through deliberate opacity in data collection practices, through the commercial transformation of private health information into tradable assets, and through the complete regulatory failure to protect this intimate data. While current privacy frameworks like GDPR attempt to broadly address digital privacy concerns, they fail to adequately protect the unique sensitivity of reproductive health data. After examining how reproductive health apps violate user privacy, this paper will conclude by offering a reimagined model of consent that could transform these essential health tools into trusted partners in women's reproductive health management.
2. Background & Context
Research estimates that over 50 million women use reproductive apps worldwide (Kelly, 2023). From tracking ovulation to planning pregnancies, women’s health apps have become essential tools for navigating everyday reproductive health. In many cases, what once required regular visits to reproductive specialists can now be monitored directly through an app.
This shift has made managing reproductive health more accessible for many. Period-tracking apps emerged in the late 2000s, with companies such as Flo and Clue launching in 2013. Today, their popularity has skyrocketed, with the global women’s health app market valued at $4.85 billion in 2024, projected to grow at an annual rate of nearly 18% through 2030 (Grand View Research, 2024).
These apps allow users to log their period start and end dates and track symptoms like cramps or mood changes. They can also analyze cycle data, basal body temperature, and other metrics to predict fertile windows. Some even sync with wearable devices for more accurate data. Additionally, many apps cater to women experiencing menopause by tracking hot flashes and mood swings while providing advice on managing hormonal changes. Beyond reproductive health, these apps often include features on general wellness, such as fitness and sleep tracking. For example, apps like MyFlo integrate menstrual health with personalized recommendations for diet and exercise based on the user’s cycle phase.
Yet, the vast majority of these apps cost nothing to use. And often, when apps are free, the user’s data is the commodity. While these apps are marketed as tools for period-tracking, the information they collect can be extrapolated to reveal far more intimate details about the user. For example, the data can indicate when an individual has sex, whether they are trying to expand their family, or whether they have had a miscarriage. The purposes of collecting and analyzing this type of information are less obvious.
3. Why Reproductive Privacy Matters
Privacy in reproductive healthcare represents more than just a practical concern about data protection—it speaks to fundamental aspects of autonomy and self-determination. To understand why privacy matters so deeply in this context, we must first understand what privacy protects and enables in human life. Privacy creates the conditions necessary for intimate relationships and the exercise of reproductive autonomy. When women lose privacy over their reproductive health information, they lose more than just confidentiality. They lose a crucial space for making intimate health decisions about their bodies without external surveillance or influence.
The intimate nature of reproductive health data makes privacy particularly crucial because this information reveals deeply personal aspects of one’s life narrative. When a woman tracks her menstrual cycle, she is documenting intimate details about her bodily experiences and family planning choices. This data forms a narrative identity—the story we tell about ourselves that shapes our understanding of who we are and the choices we make. When this narrative is commodified by apps, women lose control over a fundamental aspect of themselves.
Additionally, privacy in reproductive health protects decisional autonomy—the ability to make intimate life choices free from external pressures. When every aspect of one’s reproductive health is monitored and recorded, it creates an autonomy deficit. Women may begin to self-censor or alter their health tracking not based on their actual needs, but on the knowledge that their data could be used against them. They may hesitate to log certain symptoms or health concerns, knowing this information could be accessed by third parties or used to make predictions about their reproductive status. This might include avoiding tracking pregnancy-related symptoms altogether in states where abortion is criminalized. Some might even deliberately input incorrect information about their cycles or sexual activity, undermining the very health management benefits these apps promise to provide. The mere awareness of privacy loss can shape behavior, producing a form of reproductive self-censorship in which health monitoring is driven not by a woman’s actual needs but by concerns about how her intimate data might be used against her. This undermines a woman’s fundamental ability to maintain an authentic relationship with her reproductive health.
The commercialization of reproductive health data raises additional philosophical concerns about the proper boundaries of market logic. When intimate health information is transformed into a commodity, it violates what philosopher Michael Sandel describes as “sacred” domains, or aspects of human life that should be protected from market forces (Sandel, 2012). In his work “What Money Can’t Buy,” Sandel argues that certain human experiences and relationships lose their fundamental nature when they enter the marketplace. Reproductive health data exemplifies this concern—when a woman’s intimate bodily experiences are transformed into “behavioral surplus” for market exploitation, something essential about the meaning of that information changes.
Currently, standard digital privacy protections are insufficient for reproductive health apps. The intimate nature of this data, combined with its inherent connection to personal autonomy, demands protections that recognize its unique status. When we understand privacy as a condition for human flourishing, we can better appreciate why its violation in reproductive health apps represents a serious ethical concern.
4. Privacy Violation in Digital Health
This paper argues that reproductive health apps constitute fundamental violations of user privacy. Philosopher Andrei Marmor’s framework for privacy rights provides a systematic way to evaluate how reproductive health apps violate this privacy. For Marmor, privacy rights center on maintaining meaningful control over one’s informational environment. This control becomes especially important when dealing with intimate health data that has traditionally existed within protected medical contexts. His framework is particularly apt for analyzing reproductive health apps because it helps us understand why the standard notice-and-consent model fails to protect privacy. When an app provides a complex privacy policy while simultaneously creating an environment where users must either accept comprehensive data collection or forgo essential health tools, a user’s autonomy is violated. Consumers lose meaningful control over their intimate health information while operating in an environment structured to obscure and normalize that loss of control.
Marmor states that “a violation of the right to privacy consists in the manipulation of the environment in ways which unjustifiably diminish one’s ability to control how one presents herself to others” (Marmor, 2014). This means that for a privacy violation to occur, two conditions must hold: (1) a diminished ability to choose how one presents oneself to others, and (2) a manipulation of one’s informational environment. To establish that reproductive health apps fundamentally violate privacy rights, one must demonstrate that they satisfy both conditions of Marmor’s framework. I argue that women’s reproductive apps check both proverbial boxes.
4.1 The Ability to Choose How One Presents Themselves to Others
The first condition—a user's diminished ability to control how they present themselves to others—is demonstrated through multiple mechanisms in reproductive health apps. Most fundamentally, users lose control over their intimate health information through the apps’ data-sharing practices. The extensive collection of women’s intimate health data becomes troubling when we examine the deliberate opacity in how this information is handled. The privacy problem in reproductive health apps stems not from accidental oversight but from systematic practices of obscurity. Research reveals that 30% of period-tracking apps operate without any privacy policy—a startling statistic that becomes even more troubling when considered alongside how these apps handle intimate health data (Alfawzan et al., 2022). Without being explicitly shown and agreeing to a privacy policy before sharing intimate health information, individuals cannot make an autonomous choice about their data.
In addition, reproductive health apps are not required to follow the Health Insurance Portability and Accountability Act (HIPAA), the United States’ most fundamental health privacy law. Unlike traditional medical providers who must follow strict HIPAA guidelines for protecting patient privacy, reproductive health apps operate outside of these established healthcare privacy frameworks. This means that sensitive information that would be strictly protected in a doctor’s office—such as pregnancy attempts and miscarriages—receives no special legal protection when collected through an app.
This loss of control over self-presentation is particularly troubling because reproductive health information often carries multiple layers of meaning. A missed period logged in an app might indicate stress, a potential pregnancy, or a health condition—interpretations that a woman might choose to share differently with different audiences. In the absence of HIPAA protections, apps can aggregate these data points to create narratives about women’s reproductive lives that exist entirely outside their control. A woman who uses an app to track her cycle for health reasons might unknowingly have this data used to make predictions about her fertility status or family planning choices, creating assumptions about her reproductive life that she never intended to share. Thus, without HIPAA’s restrictions on data disclosure, apps can freely sell intimate health information to third parties.
And they certainly do—it has been revealed that 87% of period-tracking apps share collected data with companies such as Facebook and Google (Fox, 2019). At least 135 companies have purchased this intimate health information, revealing users’ ages and behavioral profiles (Fox, 2019). Users are not privy to the types of information that are collected from them and later shared with third parties. In this way, women lose control over how they present themselves to others. A reproductive health app does not just collect isolated data points about menstruation; it gathers information that can reveal patterns of sexual activity and even mental health status. This transformation of bodily data into predictive insights creates a digital representation of our intimate health that takes on a life of its own in the data economy.
Further, the transformation of intimate health data into marketing insights becomes particularly concerning when we examine historical precedents of how companies exploit reproductive information. The Target pregnancy prediction case is an important example. In 2012, Target’s algorithm identified a teenage girl as pregnant based on subtle changes in her purchasing patterns before she was able to tell her family (Duhigg, 2013). Target’s interest in identifying pregnant customers stems from a crucial marketing insight: pregnancy represents one of the rare moments when shopping habits become flexible. As Target statistician Andrew Pole explained, most shopping patterns are deeply ingrained and resistant to change (Duhigg, 2013). However, during pregnancy, these patterns become shapeable as consumers develop new needs and brand loyalties. This makes pregnant consumers extremely valuable to retailers—so valuable that Target invested significant resources in developing algorithms to identify them before competitors could.
While Target used this data to send targeted coupons for baby products, today’s data brokers can create detailed profiles that track a woman’s reproductive journey and pregnancy outcomes. These profiles can be bought and sold, with at least hundreds of companies documented as purchasing intimate health information from reproductive apps (Fox, 2019). The scale of this data sharing means women have effectively lost control over how their most intimate health information is presented and interpreted by commercial entities. In our post-Roe society, this loss of control takes on new urgency. The same predictive insights that companies use for marketing could potentially be used as evidence in legal proceedings.
The Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization in June 2022, which overturned Roe v. Wade, altered the privacy landscape for reproductive health data in the United States. After nearly 50 years of federal protection for abortion rights, the Dobbs decision returned the power to regulate abortion to individual states. This shift has created a complex legal situation, with some states enacting near-total bans on abortion while others move to strengthen protections for reproductive rights.
With 61% of reproductive health apps tracking location data (Fox, 2019), this new legal landscape has important implications for reproductive health apps and the intimate data they collect. In states where abortion is now restricted or criminalized, the vast amounts of reproductive health data collected by these apps could potentially be used as evidence in criminal investigations. This is because period-tracking apps can reveal patterns that might indicate miscarriage or abortion through the menstrual cycle data they collect.
The legal vulnerability of this data is particularly concerning given that reproductive health apps operate outside the protections of HIPAA. As noted above, while medical providers are strictly limited in what patient information they can share with law enforcement without a court order, these apps face no such restrictions. Since many of these apps freely share user data with third parties, they may be compelled to hand over information to law enforcement without a warrant.
The risks of reproductive health data being used in investigations are not hypothetical. In 2022, prosecutors in Nebraska successfully obtained Facebook messages through a court order to Meta as evidence in a criminal case involving an alleged abortion (Kaste, 2022). These messages between a mother and daughter were ultimately used to bring criminal charges against both individuals. Additionally, in a 2017 Mississippi case, prosecutors attempted to use a woman’s online search history related to abortion medication as evidence after she experienced a pregnancy loss, though these charges were later dropped (Dellinger, 2024). These cases demonstrate how digital evidence can be weaponized in abortion prosecutions. However, reproductive health apps present an even greater privacy risk. Unlike individual searches or messages, these apps collect comprehensive and longitudinal data about users’ reproductive lives. This data collection is far more detailed than sporadic communications or searches.
The combination of detailed reproductive health tracking and the post-Roe legal environment creates an unprecedented privacy crisis for women. Until meaningful privacy protections are established, millions of women face a difficult choice—either sacrifice useful digital health tools or accept that their most intimate health data may be used against them.
4.2 Manipulation of One’s Informational Environment
It would be one thing if the risks of sharing reproductive health data were disclosed, yet this is not the case. Marmor’s second condition for a privacy violation—that the flow of our informational environment is manipulated—is equally established through examination of reproductive health apps' practices. Currently, more than a third of American adults say they “never” read a privacy policy before agreeing to it (Auxier, 2019). Further, just 9% of individuals say they “always” read a company’s privacy policy (Auxier, 2019). Yet, it cannot be that only 9% of individuals care about their privacy.
The answer becomes clearer when we take into account that even when reproductive health apps provide users with privacy policies—and recall that nearly a third do not—the documents are deliberately written in complex language. In the study “We Read 150 Privacy Policies. They Were an Incomprehensible Disaster,” Litman-Navarro reveals that understanding the average privacy policy requires a college-level reading ability, placing these documents beyond the comprehension of most Americans (Litman-Navarro, 2019). Thus, documents meant to inform users about what happens to their most intimate health information are written at a level that most of the intended audience cannot meaningfully understand.
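This readability claim can be checked mechanically. As a rough illustration, the sketch below computes the Flesch-Kincaid grade level of a short privacy-policy-style clause; the clause itself and the syllable-counting heuristic are my own illustrative assumptions, not drawn from any real app’s policy.

```python
import re

def syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels (y included).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syl / len(words)) - 15.59

# Invented clause written in the register of a typical policy (not from a real app):
clause = ("We may disclose aggregated, pseudonymized information to affiliated "
          "third parties for analytics, personalization, and advertising purposes.")
assert fk_grade(clause) > 12  # scores well beyond a 12th-grade reading level
```

Even this single invented sentence scores far above a high-school reading level, which is consistent with the study’s finding that whole policies written in this register are inaccessible to most readers.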
The implications of this deliberate complexity are concerning. Women tracking their menstrual cycles or pregnancy symptoms through these apps often have no real way of understanding that their intimate health data might end up in the hands of data brokers or even law enforcement. The complexity of these privacy policies inhibits the flow of information in one’s environment. It's worth noting that this is not accidental. Writing clear, understandable privacy policies is possible. The choice to write them at a college reading level represents a strategy to maintain legal compliance while minimizing user understanding of data practices.
The apps' exploitation of proprietary information laws represents another form of environmental manipulation. Even if privacy policies were clear, reproductive health apps are not required to disclose the exact ways they use data. This practice is allowed due to legal protections that prevent a corporation's valuable assets from being disclosed and used by competitors. In this context, reproductive health apps employ proprietary algorithms to analyze women’s health data and generate health insights. These algorithms process deeply personal information including sexual activity and mood patterns, yet the companies are not required to disclose how this information is analyzed or shared.
This protection of proprietary information creates a dangerous asymmetry. Apps can collect and analyze users' most intimate health data, yet users have no right to understand how their information is being processed or by whom. For example, when a reproductive health app claims to predict fertility windows or detect potential health conditions, the algorithms making these determinations remain hidden behind proprietary protection. Consider what happens when a woman logs her menstrual cycle data. Using proprietary algorithms, the app could identify potential fertility windows or predict health conditions. However, because these algorithms are protected as trade secrets, the woman has no way to understand how her intimate health information is being assembled into predictions or insights about her reproductive status. She cannot know whether sudden changes in her cycle might trigger marketing profiles or be shared with data brokers. In this way, her informational environment has been manipulated and her privacy has thus been violated.
Through these multiple mechanisms, reproductive health apps satisfy both conditions of Marmor's privacy violation framework. They systematically diminish users' ability to control how their intimate health information is presented to others while simultaneously manipulating the informational environment to prevent meaningful understanding or control of data practices. The violation is not incidental but fundamental to how these apps currently operate, creating what I argue is an inherent privacy violation that cannot be resolved without fundamental changes to their business models.
5. A New Way Forward
One might argue that reproductive apps should apply the same data collection practices required under the General Data Protection Regulation (GDPR), which took effect in the European Union in 2018. The GDPR establishes strict requirements for data collection and user consent, with potential fines of up to 4% of global revenue for violations (Zaeem & Barber, 2020). At its core, the GDPR classifies certain types of personal data as "special category data" requiring enhanced protection. This includes information such as genetic data, race or ethnicity, and political opinions. Special category data requires explicit consent for processing and mandates additional safeguards like data protection impact assessments and dedicated data protection officers. However, even within this robust framework, the protection status of reproductive health app data remains ambiguous. In the UK and the European Union, it is unclear whether data from female-oriented technologies falls under "special category data" in the GDPR framework or whether such data qualifies as "medical" data under UK healthcare regulations. The regulatory uncertainty stems from the hybrid nature of reproductive health apps, which combine elements of lifestyle tracking and personal health data. For example, when a woman logs her menstrual cycle, is this considered health data requiring special protection or lifestyle data subject to standard processing rules? This ambiguity creates a pressing need for a new privacy system specifically designed for reproductive health apps.
Helen Nissenbaum’s theory of contextual integrity provides a valuable framework for developing such protections. She argues that “privacy as contextual integrity ties adequate protection for privacy to norms of specific contexts, demanding that information gathering and dissemination be appropriate to that context and obey the governing norms of distribution within it” (Nissenbaum, 2004). Under Nissenbaum’s framework, proper consent requires maintaining appropriate information flows that respect both the context of data collection and users’ reasonable expectations. Applied to reproductive health apps, this would require significant changes to how consent is structured and maintained.
First, apps would need to clearly delineate different contexts of data use—separating health management functions from commercial exploitation. The opacity of current privacy policies must be replaced with contextually appropriate disclosure. Rather than lengthy policies that obscure actual data practices, apps should provide transparent and context-specific explanations of how reproductive health data will be used and shared. This transparency would address the current situation, in which 30% of period-tracking apps operate without any privacy policy and many others employ deliberately complex language to obscure their data practices (Alfawzan et al., 2022).
Additionally, the current all-or-nothing approach to data collection in reproductive health apps undermines user privacy, requiring what I call "granular consent architecture"—a multi-layered approach to user privacy. This architecture would separate data collection and processing into distinct categories that users can individually accept or decline. For example, users should be able to consent to basic cycle tracking while declining mood tracking, or accept local data storage while rejecting cloud synchronization. The granular approach would extend to third-party sharing, allowing users to specifically approve or deny data access for different purposes. A user might, for instance, consent to anonymized data being used for medical research while declining its use for advertising profiles. This granular control should extend to the frequency and duration of data collection, letting users decide how long their data is stored and enabling them to delete specific types of data while retaining others.
Further, this architecture must include clear explanations of the implications of each consent choice, helping users understand what services or features they might lose by declining certain data collection. This would reform the current all-or-nothing status quo into a system that respects user autonomy while preserving app functionality.
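To make the idea concrete, the following sketch models a granular consent ledger in Python. The category names and the default-deny rule are illustrative assumptions of mine, not a description of any existing app.

```python
from dataclasses import dataclass, field

# Hypothetical consent categories; names are illustrative, not from any real app.
CATEGORIES = {"cycle_tracking", "mood_tracking", "cloud_sync",
              "research_sharing", "ad_profiling"}

@dataclass
class ConsentLedger:
    """Per-user record of individually granted data-use categories."""
    granted: set = field(default_factory=set)

    def grant(self, category: str) -> None:
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self.granted.add(category)

    def revoke(self, category: str) -> None:
        self.granted.discard(category)

    def allows(self, category: str) -> bool:
        # Default-deny: data may flow only where consent was explicitly granted.
        return category in self.granted

ledger = ConsentLedger()
ledger.grant("cycle_tracking")
ledger.grant("research_sharing")
assert ledger.allows("cycle_tracking")
assert not ledger.allows("ad_profiling")   # never granted, so denied by default
ledger.revoke("research_sharing")
assert not ledger.allows("research_sharing")
```

The key design choice is that denial is the default state: every data flow must be gated by an `allows` check, so declining one category (say, mood tracking) has no effect on the others the user has accepted.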
Finally, to address the vulnerability of reproductive health data in the post-Roe landscape, a reformed consent model must explicitly acknowledge the legal risks of using reproductive health apps. Corporations should implement automatic data deletion policies, regularly purging historical data that is not essential for health tracking. At the policy level, apps must provide clear disclosures about how they would handle law enforcement requests for user data. Additionally, apps should offer privacy-preserving alternatives for potentially sensitive features—for example, allowing pregnancy tracking without explicitly logging pregnancy status, or offering simplified cycle tracking that does not record detailed symptoms or sexual activity. These safeguards would give users meaningful control over their privacy while protecting them from potential legal vulnerabilities.
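An automatic deletion policy of this kind can be sketched as a per-category retention purge. The categories and retention windows below are hypothetical values chosen for illustration, not recommendations or the practice of any real app.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention windows per data category (illustrative values only).
RETENTION = {
    "cycle_dates": timedelta(days=365),   # kept a year for prediction accuracy
    "symptoms":    timedelta(days=90),    # purged after one quarter
    "location":    timedelta(days=0),     # never retained
}

def purge(records, now=None):
    """Drop any record older than its category's retention window.

    `records` is a list of dicts with 'category' and 'timestamp' keys;
    unknown categories default to zero retention (delete immediately)."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for r in records:
        window = RETENTION.get(r["category"], timedelta(0))
        if window > timedelta(0) and now - r["timestamp"] <= window:
            kept.append(r)
    return kept

now = datetime.now(timezone.utc)
records = [
    {"category": "cycle_dates", "timestamp": now - timedelta(days=30)},
    {"category": "symptoms",    "timestamp": now - timedelta(days=120)},
    {"category": "location",    "timestamp": now},
]
assert [r["category"] for r in purge(records, now)] == ["cycle_dates"]
```

Note the deliberate asymmetry with typical analytics pipelines: here the burden is on the app to justify keeping each record, and anything without an explicit retention rule is deleted by default.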
By building upon Nissenbaum's contextual integrity framework while addressing the gaps in current regulations like HIPAA and the GDPR, we can develop privacy protections that specifically address the informational challenges of reproductive health apps. This framework would transform these apps from privacy violators into trusted health tools that respect both user autonomy and privacy rights.
6. Conclusion
This paper has demonstrated how reproductive health apps violate user privacy through systematic practices of data collection and sharing. Through Marmor’s framework, we can understand how these violations extend beyond simple data collection into the realm of compromised privacy and autonomy. The current landscape creates an ethically problematic situation for millions of women. While reproductive health apps have revolutionized how women manage their health, they have simultaneously created privacy risks that current regulatory frameworks are ill-equipped to address. The solution lies not in abandoning these valuable digital health tools, but in reimagining how they operate. By implementing granular consent architecture and contextually appropriate privacy protections, we can ensure that users are afforded meaningful privacy. The stakes could not be higher—in an era where digital health data can be weaponized in ways previously unimaginable, protecting reproductive privacy is not just about individual rights but about preserving women's fundamental autonomy over their health and lives.
References
Auxier, B., Rainie, L., Anderson, M., Perrin, A., Kumar, M., & Turner, E. (2019). Americans’ attitudes and experiences with privacy policies and laws. Pew Research Center: Internet, Science & Tech.
Dellinger, J., & Pell, S. (2024). Bodies of Evidence: The Criminalization of Abortion and Surveillance of Women in a Post-Dobbs World. Duke Journal of Constitutional Law & Public Policy, 19(1), 1-108.
Duhigg, C. (2013). How companies learn your secrets. In The Best Business Writing 2013 (pp. 421-444). Columbia University Press.
Grand View Research. (n.d.). Women’s health app market size, share & trends analysis report by type (menstrual health, fitness & nutrition, pregnancy tracking & postpartum care), by region, and segment forecasts, 2023–2030. Retrieved December 23, 2024, from https://www.grandviewresearch.com/industry-analysis/womens-health-app-market
Fox, S., et al. (2019). Vivewell: Speculating near-future menstrual tracking through current data practices. In DIS ’19: Proceedings of the 2019 Designing Interactive Systems Conference. http://nourahowell.com/static/pdf/19_DIS_Vivewell.pdf
Kaste, M. (2022). Nebraska cops used Facebook messages to investigate an alleged illegal abortion. NPR.
Kelly, B. G., & Habib, M. (2023). Missed period? The significance of period-tracking applications in a post-Roe America. Sexual and reproductive health matters, 31(4), 2238940. https://doi.org/10.1080/26410397.2023.2238940
Litman-Navarro, K. (2019). We read 150 privacy policies. They were an incomprehensible disaster. The New York Times, 12.
Alfawzan, N., Christen, M., Spitale, G., & Biller-Andorno, N. (2022). Privacy, data sharing, and data security policies of women’s mHealth apps: Scoping review and content analysis. JMIR mHealth and uHealth, 10(5), e33735.
Nissenbaum, H. (2004). Privacy as contextual integrity. Wash. L. Rev., 79, 119.
Sandel, M. (2012). What money can’t buy: The moral limits of markets. Farrar, Straus and Giroux.
Zaeem, R. N., & Barber, K. S. (2020). The effect of the GDPR on privacy policies: Recent progress and future promise. ACM Transactions on Management Information Systems (TMIS), 12(1), 1-20.