Leadership · Falk Gottlob · 4 min read

When Your Data Becomes the Product

Many nonprofits offer free services but monetize the insights from your data. With AI making deeper analysis possible, the stakes are higher than ever.

data privacy · nonprofits · ethics · AI · GDPR

Originally published on Medium.

The short version

When a nonprofit offers a free service and generates revenue by monetizing your data, you are the product. This is especially troubling for intimate data: health, mental health, behavioral patterns of people in crisis. The greater-good defense ("we use these insights to help people") rarely traces clearly from "sold anonymized health data" to "helped someone." GDPR forces transparency in the EU; the US has only sector-specific protections like HIPAA and FERPA. AI makes this more urgent because machine learning extracts deeper insights and infers sensitive information you never explicitly shared. The call isn't to stop monetizing data. It's to be honest about it and require informed consent from vulnerable populations, especially children.

A Disclaimer

I want to be clear upfront - I'm not writing about any single nonprofit. I'm writing about a pattern I'm seeing across the sector. The specific examples below are illustrative, not accusations.

If You're Not Paying, You Are the Product

That phrase gets thrown around a lot about Silicon Valley tech companies. But it applies equally to nonprofits.

When a nonprofit offers a free service and generates revenue by selling insights about your data or licensing your information to other organizations, you are the product. Not the customer.

Your Data as a Commodity

This is particularly troubling when the data is intimate. Health data. Mental health data. Behavioral patterns of people in crisis.

That data has value. Pharmaceutical companies want to know about drug efficacy. Insurance companies want to know about risk patterns. Employers want to know about productivity and wellness. Researchers want datasets to train models.

None of these uses are inherently bad. But the question is: did you consent? Did you understand?

The "Greater Good" Defense

Many nonprofits defend data monetization with the greater good argument. "We use these insights to help people."

But here's the problem: it's hard to trace from "we sold anonymized health data to a pharmaceutical company" to "this actually helped people." The connection is theoretical.

And in the meantime, your intimate information is in someone else's database.

The Ethical Dilemma

It gets worse when children are involved. A child using a mental health app consented to what, exactly? Their parents consented to what?

There's a difference between implicit and informed consent. "You can use this app" is not the same as "we will monetize insights from your mental health data and share them with third parties."

Transparency and Accountability

The EU's GDPR has forced some transparency and accountability around data. If a company processes your data, you have rights. You can ask what data they have. You can ask who they shared it with.

The United States has no federal equivalent. We have sector-specific rules - HIPAA for health, FERPA for education - but no comprehensive national framework.

The Vulnerability Problem

Here's what really bothers me. People using these services are often vulnerable. They're in crisis. They're struggling with mental health. They're desperate for help.

When you're desperate, you don't carefully review privacy policies. You don't contemplate the long-term implications of handing over your data. You just sign up because you need help now.

AI as a New Frontier

AI makes this all more urgent. Machine learning algorithms can extract deeper insights from data than ever before. They can predict behavior. They can infer sensitive information you never explicitly shared.

The data you gave to a nonprofit five years ago becomes far more valuable, and far more revealing, once modern AI is applied to it.
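To make that concrete, here's a minimal sketch using entirely synthetic data and hypothetical feature names. A plain logistic regression recovers a sensitive attribute (a crisis flag the user never shared) from innocuous-looking usage signals. The feature names and correlations below are invented for illustration; real inference attacks are more sophisticated, but the principle is the same.

```python
# Minimal sketch: inferring a never-shared sensitive attribute
# from "innocuous" usage signals. All data is synthetic; the
# correlations are assumed purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000

# Signals a free app might plausibly log (hypothetical):
late_night = rng.poisson(2, n)            # late-night sessions per week
session_len = rng.normal(12, 4, n)        # average session length, minutes
days_active = rng.integers(1, 8, n)       # days active per week

# Hidden sensitive attribute, made correlated with the signals
# (the coefficients are arbitrary, chosen for the demo):
logit = 0.8 * late_night + 0.15 * session_len - 0.4 * days_active - 2.0
in_crisis = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([late_night, session_len, days_active])
X_tr, X_te, y_tr, y_te = train_test_split(X, in_crisis, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC inferring the never-shared attribute: {auc:.2f}")
```

A few lines of off-the-shelf code, no consent form in sight. That's the gap between what users think they handed over and what the dataset actually reveals.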

The Call for Honesty

I'm not saying nonprofits shouldn't monetize data. I'm saying they should be honest about it.

If you're offering a free service and generating revenue from user data, say so. Be transparent about:

  • What data you collect
  • How you use it
  • Who you share it with
  • What legal protections apply
  • What users can do to opt out

And for vulnerable populations - especially children - you need informed consent. Real informed consent. Not fine print. Not "the service is free because we monetize your data." Clear, explicit agreement.

What Comes Next

This matters because the line between nonprofits and for-profits is blurring. Many "nonprofits" are now venture-backed. They have exit strategies. They're businesses wearing nonprofit clothing.

That's not inherently bad. But it means they should follow the same ethical and legal frameworks as for-profit companies.

Transparency. Accountability. User rights. Informed consent.

Your data is valuable. If someone is profiting from it, you should know. You should understand what's happening. And you should have the right to say no.
