
© 2025 by Nymble Health Inc.

The Clinical Confidante: How GenAI Increases Self-Disclosure to Improve Obesity Care

  • Jeremy Hessing-Lewis & Megha Poddar
  • Oct 15
  • 5 min read

Chronic Under-Reporting Due to Stigma and Guilt


On average, how many alcoholic beverages do you consume in a week? 


No question better triggers our inclination to outright lie. This important health measure is so chronically under-reported that healthcare providers apply an assumed statistical multiplier (e.g. +50%) to any response they receive. In the research context, robust validation techniques are necessary to tease out the real story whenever self-reporting involves a substance or activity subject to social stigma.


While patients may understand that transparency in the clinical context will improve their care, perceived judgment during self-reporting leads patients to misrepresent important health information. At nymble, we are now recognizing the value of a “clinical confidante”: an automated, confidential patient support program that partners with the healthcare provider. Whether because nymble is present in your preferred channels (SMS/WhatsApp), available in the comfort of your own home, or because it delivers empathetic, kind, evidence-based information, the relationship between nymble and its user is entirely new to the healthcare context.


Nutrition studies are another excellent example. Not only is nutrition data highly observational; research on recall bias repeatedly shows that people underestimate their caloric intake. Limited self-disclosure in obesity treatment, whether regarding intimate details about nutrition, exercise, mental health, medication use, or overall adherence, is a major obstacle to successful treatment with a care provider. Many patients are understandably hesitant to share these details where there are strong psychosocial drivers of denial, and most data on recall bias shows that people want to believe what they say is true. They are not overtly trying to mislead the other person.


What if you were to report how you felt the moment you felt it? Or document your food as you were consuming it? Or, if you were curious about something in relation to the behaviour you were engaging in, would you be more likely to document accurately? In addition to recall bias, obesity itself is highly stigmatized. The bias and judgment associated with obesity are real, and patients know this from lived experience. More importantly, the internalized bias and self-judgment are the true challenge of successful treatment. If you had access to a safe space, where you could truly be yourself, without anyone judging you, and ask all the scary questions, would you act differently? Would you hold the mirror up to your real self rather than to the social expectation?


A Clinical Confidante


Our experience suggests that generative AI chats can provide a “safe space” to share sensitive information in a way that was not previously possible. While the media has emphasized incidents relating to self-harm in the mental health context that have led to the implementation of new parental controls, these sensational stories provide a glimpse into a broader social trend with generative AI tools: we tell the computer things we would never tell another person. The idea of a clinical confidante recognizes that computer interactions facilitate greater intimate self-disclosure under conditions of perceived anonymity. As an automated patient support program, nymble embodies the clinical confidante concept. On a consent-by-design basis, information is only disclosed to third parties according to the user’s preferences.


Our premise with nymble is that a patient is more likely to accurately self-report to an automated patient support service than to the health provider managing their care. This novel relationship allows nymble to generate deeper insights into the complexities of chronic disease management, thereby enhancing outcomes. Given that self-efficacy is closely linked to success in obesity treatment, shouldn’t our treatment strategies be designed to improve it?


Pumpkin Spice Reality


Let’s look at this in practice. Instead of soliciting nutritional behaviour proactively, let’s try a different, patient-led approach:


User: I have been drinking pumpkin spice lattes almost every day. I don’t want to tell my doctor.


nymble: Drinking a pumpkin spice latte daily can contribute to nausea while on [GLP-1], especially because it often contains caffeine and added sugars or dairy, which might irritate your stomach. It's ok to enjoy it occasionally, but having it every day could make nausea worse or affect digestion. You might try reducing the frequency or switching to a gentler option like herbal tea or a smaller coffee with less added sugar.


It’s completely understandable to feel hesitant about sharing this with your doctor, but being honest about things like how many lattes you drink is really important. Your healthcare provider is there to support you without judgment, and having the full picture helps them tailor your treatment and manage side effects like nausea more effectively. Sharing this info can lead to helpful advice on how to enjoy your favourite drinks while minimizing discomfort. Would you like some tips on how to bring this up with your doctor?


Before speaking with their care provider, the patient is now informed about the issues and hopeful that accurately sharing this information will improve their care.


Benefits of Improved Self-Disclosure


For the clinical confidante model to work effectively, trust and safety are essential prerequisites:


  • Content should reflect best evidence and be free of commercial bias;

  • Patient privacy must be respected;

  • The service should clearly disclose that it is an automated service;

  • It should only be available to attached patients, through a regulated healthcare provider;

  • Safety guardrails must constrain the scope of the service.


We must acknowledge the ways in which humans will interact differently with a machine than with a human healthcare provider. There will be both risks and benefits within this distinction.


If done effectively, improved self-disclosure introduces opportunities for all parties within the circle of care:


  • Patients can share information and ask questions judgment-free;

  • Patients are more likely to share accurate information with a care provider after validation with a clinical confidante;

  • Care providers receive questions tailored to the context;

  • If the patient consents, data can be anonymized to inform research and clinical methodologies.


This is worth celebrating with a pumpkin spice latte. But perhaps in moderation.


At nymble, we are embracing the clinical confidante model and moving forward to prioritize this approach by:


  1. Enhancing security, privacy, and safety assurance;

  2. Integrating nymble with care provider clinical workflows;

  3. Iterating on safety guardrails and labelling of synthetic content; and

  4. Engaging in public policy discussions about the roles and risks of generative AI in healthcare delivery.



To healthcare organizations interested in learning more: contact us at info@nymble.health. And if you’re going to be at HLTH (Las Vegas), Obesity Week (Atlanta), the Patient Support Summit (Toronto), or the CAN Health Network Annual Conference (Toronto), reach out via LinkedIn or email to coordinate a meeting - we’ll be there.

 
 