
In therapy, “health data” almost never arrives as a clean story, and Perplexity’s latest health update leans right into that reality. The announcement centers on new integrations that let people bring together personal health information, organize it into dashboards, and use it to create clearer summaries and questions for medical visits. From the therapy room, that immediately raises a human question: what changes when a person can gather their health signals in one place and actually talk through them, instead of chasing them across apps?
From a therapist’s point of view, the best-case impact is simple and practical: structure. Many clients struggle to summarize what’s happening in a way a clinician can use: when it started, what triggers it, what makes it better or worse, what’s been tried, what changed, and how it affects sleep, work, appetite, mood, and relationships. If a tool can help draft a pre-visit summary from the mess of real life, it can reduce cognitive load, reduce shame (“I can’t explain it well”), and help someone walk into an appointment with clearer questions and fewer omissions.
But the inconveniences and challenges are real, and they’re not just about setup. The biggest one I see clinically is that a single dashboard can quietly become a “threat monitor.” For clients prone to health anxiety, panic, OCD-style reassurance seeking, trauma-related body scanning, or chronic stress, more tracking doesn’t always equal more clarity. It can increase checking, amplify normal fluctuations, and keep the nervous system on alert, especially when numbers feel like verdicts instead of context.
Another challenge is false clarity. Wearables are noisy, labs are snapshots, and medical records can be incomplete or inconsistent. When an AI-generated summary sounds confident, it can pull people toward conclusions that aren’t actually supported, sometimes in ways that increase catastrophizing, sometimes in ways that minimize something important. In therapy, I’m less worried about whether the tool is “smart,” and more worried about whether it can communicate uncertainty honestly, and whether the person using it can hold that uncertainty without spiraling.
There’s also the basic friction of access and use. Integrating accounts, permissions, and records can be confusing, and the people who need the most support are often the least resourced to troubleshoot a complicated setup, especially when they’re already exhausted, in pain, or overwhelmed. If the tool becomes another task that they “fail” at, it can reinforce the very helplessness we’re trying to reduce.
Privacy is the quieter challenge that shows up later in session. People don’t just upload “health data”; they upload fear, vulnerability, and context that crosses into mental health, relationships, substance use, sexual health, and trauma history. When someone is distressed, they tend to trade privacy for reassurance. Part of a therapist’s job is to slow that moment down: not to shame the choice, but to help the client make it with clear eyes.
If I were to incorporate something like this into therapy, I’d treat it as a collaborative artifact, not an authority. Bring the summary in, and we do what therapy does best: slow it down, reality-check it, and translate it into next steps. What’s missing? What might be an overinterpretation? Is this helping you feel more agency, or is it feeding compulsive monitoring? Used carefully, these tools can support better conversations with medical providers. Used carelessly, they can make the story feel more coherent while quietly increasing anxiety. The difference is rarely the technology alone; it’s the relationship the person forms with it.
