
Originally published on Medium.
The short version
When a nonprofit offers a free service and generates revenue by monetizing your data, you are the product. This is especially troubling for intimate data: health, mental health, behavioral patterns of people in crisis. The greater-good defense ("we use these insights to help people") rarely traces clearly from "sold anonymized health data" to "helped someone." GDPR forces transparency in the EU; the US has only sector-specific protections like HIPAA and FERPA. AI makes this more urgent because machine learning extracts deeper insights and infers sensitive information you never explicitly shared. The call isn't to stop monetizing data. It's to be honest about it and require informed consent from vulnerable populations, especially children.
A Disclaimer
I want to be clear upfront - I'm not writing about any single nonprofit. I'm writing about a pattern I'm seeing across the sector. The specific examples below are illustrative, not accusations.
If You're Not Paying, You Are the Product
That phrase gets thrown around a lot about Silicon Valley tech companies. But it applies equally to nonprofits.
When a nonprofit offers a free service and generates revenue by selling insights about your data or licensing your information to other organizations, you are the product. Not the customer.
Your Data as a Commodity
This is particularly troubling when the data is intimate. Health data. Mental health data. Behavioral patterns of people in crisis.
That data has value. Pharmaceutical companies want to know about drug efficacy. Insurance companies want to know about risk patterns. Employers want to know about productivity and wellness. Researchers want datasets to train models.
None of these uses are inherently bad. But the question is: did you consent? Did you understand?
The "Greater Good" Defense
Many nonprofits defend data monetization with the greater good argument. "We use these insights to help people."
But here's the problem: it's hard to trace from "we sold anonymized health data to a pharmaceutical company" to "this actually helped people." The connection is theoretical.
And in the meantime, your intimate information is in someone else's database.
The Ethical Dilemma
It gets worse when children are involved. A child using a mental health app consented to what, exactly? Their parents consented to what?
There's a difference between implicit and informed consent. "You can use this app" is not the same as "we will monetize insights from your mental health data and share them with third parties."
Transparency and Accountability
The EU's GDPR has forced some transparency and accountability around data. If a company processes your data, you have rights. You can ask what data they have. You can ask who they shared it with.
The United States has no federal equivalent. We have sector-specific rules - HIPAA for health, FERPA for education - but no comprehensive framework.
The Vulnerability Problem
Here's what really bothers me. People using these services are often vulnerable. They're in crisis. They're struggling with mental health. They're desperate for help.
When you're desperate, you don't carefully review privacy policies. You don't contemplate the long-term implications of handing over your data. You just sign up because you need help now.
AI as a New Frontier
AI makes this all more urgent. Machine learning algorithms can extract deeper insights from data than ever before. They can predict behavior. They can infer sensitive information you never explicitly shared.
The data you gave to a nonprofit five years ago becomes far more valuable, and far more revealing, with modern AI.
The Call for Honesty
I'm not saying nonprofits shouldn't monetize data. I'm saying they should be honest about it.
If you're offering a free service and generating revenue from user data, say so. Be transparent about:
- What data you collect
- How you use it
- Who you share it with
- What legal protections apply
- What users can do to opt out
And for vulnerable populations - especially children - you need informed consent. Real informed consent. Not fine print. Not "the service is free because we monetize your data." Clear, explicit agreement.
What Comes Next
This matters because the line between nonprofits and for-profits is blurring. Many "nonprofits" are now venture-backed. They have exit strategies. They're businesses wearing nonprofit clothing.
That's not inherently bad. But it means they should follow the same ethical and legal frameworks as for-profit companies.
Transparency. Accountability. User rights. Informed consent.
Your data is valuable. If someone is profiting from it, you should know. You should understand what's happening. And you should have the right to say no.