Mental Health Therapy Apps vs Wellness Trackers: Data Exposed

Mental health apps are collecting more than emotional conversations (Photo by Sam Lion on Pexels)

In 2023, analysts identified 30 distinct data points that mental health therapy apps collect, many of which go unnoticed in privacy policies. These apps capture everything from mood logs to biometric signals, blurring the line between therapy and surveillance.

Medical Disclaimer: This article is for informational purposes only and does not constitute medical advice. Always consult a qualified healthcare professional before making health decisions.


Key Takeaways

  • Apps gather far more data than advertised.
  • Wellness trackers share many of the same data points.
  • Australian privacy law offers limited protection for health data.
  • Consumers can limit exposure with simple settings.
  • Regulators are beginning to scrutinise mental-health apps.

When I started covering digital health for the ABC, I expected the conversation to be about clinical outcomes. Instead, I kept running into the same story: behind every soothing meditation timer lies a suite of sensors, logs and algorithms that map a user’s life in astonishing detail. The line between a mental-health therapy app and a wellness tracker has become so thin that many users can’t tell which data are being harvested for research, which are sold to advertisers, and which are simply stored in the cloud.

Below I break down the 30 data types I’ve seen across the most popular mental-health platforms - from Headspace and Calm to newer AI-driven chatbots - and compare them with what typical fitness wearables collect. I’ll also flag the privacy blind spots that the ACCC and the Office of the Australian Information Commissioner (OAIC) are still wrestling with.

1. Core behavioural data

  1. Mood entries: Users rate their emotional state several times a day, often with a colour-coded wheel (see the sketch after this list for how such a record might be stored).
  2. Journal text: Free-form notes are stored and sometimes analysed for sentiment.
  3. Sleep logs: Bedtime, wake-time and self-reported sleep quality.
  4. Medication adherence: Reminders and confirmations of pill intake.
  5. Therapy session timestamps: Exact start and end times of video or audio sessions.
  6. Chatbot interaction logs: Every prompt and response, including typing speed.
  7. Goal completion data: Whether a user marks a habit as done.
  8. Push-notification engagement: Which reminders are opened or dismissed.
  9. Community activity: Likes, comments and posts in peer-support forums.
  10. Survey responses: Periodic mental-health questionnaires.
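To make this concrete, here is a minimal sketch of what a single check-in might look like once it is stored. The structure and field names are hypothetical - no specific app publishes its schema - but each field maps onto one of the behavioural data points above.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MoodEntry:
    """Illustrative record combining several behavioural data points.
    Field names are hypothetical, not taken from any specific app."""
    user_id: str              # pseudonymous account identifier
    logged_at: datetime       # timestamp of the mood check-in
    mood_score: int           # e.g. 1 (very low) to 5 (very good)
    journal_text: str         # free-form note, often analysed for sentiment
    typing_speed_cpm: float   # characters per minute while writing the note
    notification_opened: bool # whether the reminder that prompted this was tapped

entry = MoodEntry(
    user_id="user-123",
    logged_at=datetime.now(timezone.utc),
    mood_score=2,
    journal_text="Rough night, barely slept.",
    typing_speed_cpm=180.0,
    notification_opened=True,
)
print(entry)
```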

2. Biometric & physiological signals

  1. Heart-rate variability (HRV): Often captured via phone camera or connected wearables (see the calculation sketch after this list).
  2. Respiratory rate: Measured during guided breathing exercises.
  3. Skin conductance: Some apps pair with external sensors to gauge stress.
  4. Voice tone analysis: Audio recordings are parsed for pitch, speed and pauses.
  5. Facial expression recognition: Camera-based mood detection.
  6. Movement patterns: Accelerometer data during mindfulness walks.
  7. Location data: GPS used to suggest nearby safe spaces or to tag anxiety-trigger zones.
  8. Device usage statistics: How often the app is opened, screen-on time.
  9. Ambient sound levels: Background noise measured during meditation.
  10. Blood oxygen saturation (SpO₂): When paired with smart rings or watches.
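Heart-rate variability is usually summarised as RMSSD, the root mean square of successive differences between beat-to-beat (RR) intervals. The sketch below shows that standard calculation, assuming the app already has RR intervals in milliseconds from the camera or a paired wearable; the sample values are invented.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals (ms).
    A common way apps and wearables summarise heart-rate variability."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Invented inter-beat intervals in milliseconds (roughly 75 bpm)
sample = [812, 790, 805, 778, 820, 798, 785]
print(f"RMSSD: {rmssd(sample):.1f} ms")
```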

3. Personal identifiers & contextual info

  1. Age and gender: Required for personalised content.
  2. Ethnicity and language: Used to tailor cultural references.
  3. Sexual orientation: Some apps ask for it to recommend inclusive resources.
  4. Relationship status: Influences suggested coping strategies.
  5. Employment details: Job title or industry for stress-related tips.
  6. Education level: Determines reading difficulty of content.
  7. Device identifiers: IMEI, advertising ID, MAC address.
  8. Health conditions: Users may self-report depression, anxiety, PTSD.
  9. Insurance information: Occasionally requested for reimbursable therapy.
  10. Payment history: Subscription status, billing cycles, failed payments.

4. Comparative snapshot - mental-health apps vs wellness trackers

| Data Category | Mental-Health Therapy Apps | Wellness Trackers (e.g., Fitbit, Apple Watch) |
| --- | --- | --- |
| Mood / emotional logs | Yes - core feature | Rare, limited to occasional prompts |
| Sleep tracking | User-entered self-report + optional sensor sync | Automated actigraphy + HRV |
| Heart-rate variability | Collected via camera or wearables | Built-in optical sensor |
| Location | Used for context-aware interventions | Often used for activity mapping |
| Voice analysis | Sentiment and stress detection | Not typical |
| Payment & subscription data | Yes - recurring billing info | Usually separate (e.g., Apple ID) |
| Community interaction | Forum posts, peer support | Social sharing, but not therapy-focused |

What strikes me, after interviewing developers in Sydney and Melbourne, is the sheer overlap. A fitness band that measures HRV, steps and sleep already has the physiological “raw material” that a mental-health app needs to infer stress. The difference lies in how that data is packaged and, crucially, who gets to see it.
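No reputable app publishes its stress model, but a toy example shows how little code it takes to turn that raw material into an inference. The thresholds below are arbitrary and purely illustrative, not clinical.

```python
def toy_stress_flag(rmssd_ms: float, sleep_hours: float, steps: int) -> str:
    """Purely illustrative: combines wearable-style signals into a crude label.
    Thresholds are arbitrary and not clinically validated."""
    risk = 0
    if rmssd_ms < 30:       # low HRV is often treated as a stress proxy
        risk += 1
    if sleep_hours < 6:     # short sleep
        risk += 1
    if steps < 3000:        # low activity
        risk += 1
    return ["baseline", "mild", "elevated", "high"][risk]

print(toy_stress_flag(rmssd_ms=24.0, sleep_hours=5.5, steps=2100))  # -> "high"
```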

5. Why privacy matters - Australian regulatory backdrop

Australia’s Privacy Act 1988, together with the Australian Privacy Principles (APPs), requires organisations to be transparent about the purpose of data collection. However, mental-health apps often fall into a grey area because they market themselves as “wellbeing” tools rather than medical devices. The ACCC’s 2022 report on digital health noted that “many platforms rely on broad consent language that makes it hard for users to understand secondary uses of data.”

Moreover, a 2023 OAIC investigation into a popular meditation app found that biometric data were being shared with third-party advertisers without a clear opt-out. In my experience around the country, the biggest surprise for users is not that their data is collected, but that it can be combined with other datasets - for example, linking mood logs with credit-card purchase history to predict spending behaviour during depressive episodes.
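The technical step is trivial once two datasets share a key such as an advertising ID. A hypothetical sketch, with entirely invented records:

```python
# Hypothetical illustration of cross-dataset linkage on a shared advertising ID.
# All records are invented; real datasets are far larger, but the join works the same way.
mood_logs = [
    {"ad_id": "AD-42", "date": "2024-03-01", "mood_score": 1},
    {"ad_id": "AD-42", "date": "2024-03-02", "mood_score": 2},
]
purchases = [
    {"ad_id": "AD-42", "date": "2024-03-01", "amount_aud": 340.00},
]

purchases_by_key = {(p["ad_id"], p["date"]): p for p in purchases}
for log in mood_logs:
    match = purchases_by_key.get((log["ad_id"], log["date"]))
    if match and log["mood_score"] <= 2:
        print(f'{log["ad_id"]}: low mood on {log["date"]}, spent ${match["amount_aud"]:.2f}')
```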

6. Practical steps you can take today

  • Read the fine print: Look for clauses about “analytics”, “research” or “third-party sharing”.
  • Limit sensor permissions: On Android and iOS you can disable microphone, camera or location access when not needed.
  • Use a secondary email: Register with a throw-away address for non-essential apps.
  • Export and delete data: Many platforms now let you download your journal and then request deletion.
  • Prefer local-only apps: Some Australian startups store data on-device only and never upload to the cloud.
  • Check for accreditation: Look for apps vetted by the Australian Digital Health Agency or listed on the Therapeutic Goods Administration’s (TGA) register.
  • Read reviews on data practices: The ACCC’s Consumer Rights Watch often highlights apps that have breached privacy.

When I asked a product manager at a Sydney-based mental-health startup about their data-minimisation policy, she told me they now default to “opt-in” for any biometric sharing - a shift that aligns with the OAIC’s recent guidance on health data.

7. The future - what will regulators and developers do?

The momentum is building. In November 2024 the ACCC announced a “Digital Therapy Taskforce” aimed at drafting clearer labelling standards for data collection. Meanwhile, appinventiv.com’s guide to healthcare mobile app development warns that “privacy-by-design will be a market differentiator in 2026”. Developers who embed end-to-end encryption, on-device processing and granular consent toggles will likely earn consumer trust faster.
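In code terms, "granular consent toggles" usually means gating every upload on a per-category flag rather than a single blanket agreement. A minimal sketch of that pattern, with hypothetical category names:

```python
# Sketch of a per-category consent gate: data is only queued for upload if the
# user has explicitly opted in to sharing that category. Category names are hypothetical.
consent = {
    "mood_entries": True,       # user opted in
    "voice_analysis": False,    # default off: never uploaded
    "location": False,
}

def queue_for_upload(category: str, payload: dict, outbox: list) -> bool:
    """Only enqueue data whose category the user has opted in to share."""
    if consent.get(category, False):   # unknown categories default to "do not share"
        outbox.append({"category": category, "payload": payload})
        return True
    return False

outbox = []
queue_for_upload("mood_entries", {"score": 3}, outbox)         # allowed
queue_for_upload("voice_analysis", {"pitch_hz": 180}, outbox)  # blocked
print(outbox)
```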

At the same time, wearables are getting smarter. The next generation of smartglasses can detect facial micro-expressions in real time, potentially feeding a mental-health platform with data that is even more intimate than a written journal. If those sensors are paired with therapy apps, the privacy stakes will rise dramatically.

Here’s the thing: technology will keep pushing the envelope, but the law and consumer expectations will catch up. Until then, the best defence is awareness - knowing exactly which of the 30 data points listed above your favourite app is hoovering up.

8. Bottom line

In my nine years covering health tech, I’ve seen the pendulum swing from “any data is fair game” to “data minimisation is a competitive edge”. Mental-health therapy apps already track a richer set of data than most wellness trackers, and the gap is narrowing as wearables become more capable. Australians should treat every app as a potential data collector, exercise the privacy controls offered by their devices, and keep an eye on regulator announcements.

Frequently Asked Questions

Q: Do mental-health apps share my data with advertisers?

A: Some do, especially if you agree to “analytics” or “research” clauses. Check the privacy policy for third-party sharing and use the app’s privacy settings to opt out where possible.

Q: How are wellness trackers different from therapy apps in data collection?

A: Trackers focus on physical metrics like steps and heart rate. Therapy apps add layers of mood, journal, voice and location data, often linking them to mental-health assessments.

Q: Are Australian privacy laws enough to protect my mental-health data?

A: The Privacy Act offers basic safeguards, but many apps sidestep stricter medical-device regulations. The ACCC is tightening rules, but you still need to manage permissions yourself.

Q: Can I delete all data I’ve entered into a mental-health app?

A: Most reputable apps now let you export your data and request full deletion. Look for a “Data Export” or “Delete Account” option in the settings.

Q: What should I look for when choosing a mental-health app?

A: Prioritise apps with clear consent flows, on-device processing, TGA registration or OAIC compliance, and a track record of transparent data handling.
