A Tense Human Moment
For months, Lauren Bannon, a mother of two, was given vague diagnoses of arthritis and acid reflux, even as her tests came back normal. She felt trapped in increasing pain and persistent fatigue. Then, on a desperate evening, she turned to ChatGPT. Within moments, a simple prompt, “What could mimic rheumatoid arthritis?”, yielded an unexpected suggestion: Hashimoto’s thyroiditis. Her doctor ordered the test. It was positive. An ultrasound followed, revealing thyroid cancer. As Lauren put it, “ChatGPT suggested something my doctors missed.”
A Quick Dive into the Diagnosis
Doctors initially dismissed Lauren’s joint stiffness and stomach pains. But ChatGPT’s suggestion prompted her to request a thyroid peroxidase antibody test, despite her doctor’s reluctance. The results led to scans that confirmed cancerous growths in her thyroid and lymph nodes. The early intervention likely prevented metastasis. In January 2025, she underwent surgery, and she now faces lifelong follow-up check-ups.
Without Getting Too Technical
AI tools like ChatGPT are trained on vast amounts of text, including medical literature describing symptoms and treatments, and learn the language patterns that link them. They can match clusters of symptoms, such as joint stiffness, pain, and weight loss, to known conditions. In Lauren’s case, while doctors focused on common diagnoses, ChatGPT offered a path less traveled. Other AI tools in healthcare, such as Google’s DeepMind for protein structures or specialised radiology systems, similarly uncover hidden clues in scans or data.
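To make the idea of "matching symptom clusters to known conditions" concrete, here is a toy Python sketch. It is not how ChatGPT actually works internally (LLMs are far more complex), and the condition/symptom data is invented for illustration: it simply ranks candidate conditions by how much they overlap with a patient's reported symptoms.

```python
# Toy illustration only (not a medical tool): rank candidate
# conditions by Jaccard overlap between the patient's symptoms
# and a small, invented knowledge base of symptom clusters.

CONDITION_SYMPTOMS = {
    "rheumatoid arthritis": {"joint stiffness", "joint pain", "fatigue", "swelling"},
    "hashimoto's thyroiditis": {"joint stiffness", "fatigue", "weight change", "hair loss"},
    "acid reflux": {"stomach pain", "heartburn", "regurgitation"},
}

def rank_conditions(patient_symptoms):
    """Return conditions sorted by overlap with the patient's symptoms."""
    patient = set(patient_symptoms)
    scores = {}
    for condition, cluster in CONDITION_SYMPTOMS.items():
        overlap = patient & cluster          # shared symptoms
        union = patient | cluster            # all symptoms considered
        scores[condition] = len(overlap) / len(union)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A symptom set that fits Hashimoto's better than arthritis:
ranking = rank_conditions({"joint stiffness", "fatigue", "weight change"})
```

Even this crude overlap score illustrates why an AI with a broad "memory" of conditions can surface a diagnosis a clinician anchored on the common answer might not consider.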
Hope vs Risk
- Doctor: Dr Harvey Castro, an emergency medicine physician, says, “AI can assist, alert, and even comfort but not replace medical expertise.”
- Ethicist: WHO advises responsible deployment of AI in health, stressing that tools “must put ethics and human rights at the heart”. Yet studies show AI risks misdiagnosis in rare or underserved groups.
- Government: The US Department of Health and Human Services is working on frameworks to guide safe and equitable use of AI in care, a nod to growing official interest.
- Patients: Lauren urges others to use AI cautiously and always consult professionals. Another case in Paris saw ChatGPT flag possible lymphoma months before doctors did.
Should You Trust It?
Yes, with caution.
- Strength: AI can surface overlooked diagnoses and prompt vital tests. In Harvard research using clinical case vignettes, GPT-4 reached the correct diagnosis over 90% of the time.
- Limits: AI may misinterpret symptoms or reinforce biases, missing rare diseases or harming marginalised patients.
- Responsibility: If AI prompts a wrong test or treatment, who is to blame: the software provider, the doctor, or the system?
- Transparency: Authorities like the WHO call for AI tools to explain their reasoning, not operate as opaque black boxes.
Popular AI-powered Health Apps
Here is a structured comparison of popular AI-powered health apps available in 2025, grouped by category (diagnosis, mental health, tracking). Each entry highlights the app’s main function, the AI technology behind it, and its platform availability, so you can compare features and identify the apps most relevant to your needs.
1. Ada Health
- Function: Symptom checker and personalised health assessment.
- AI Feature: Uses a probabilistic reasoning engine to suggest possible causes based on symptoms.
- Use Case: Helps users determine whether to seek medical attention.
- Available on: iOS, Android, Web.
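Ada's actual reasoning engine is proprietary, but the "probabilistic reasoning" it describes can be sketched with a naive-Bayes-style update: start from how common each condition is, then weight by how likely each observed symptom is under that condition. All priors and likelihoods below are invented for illustration.

```python
# Hypothetical numbers for illustration only: naive-Bayes-style
# update of condition probabilities given observed symptoms.

PRIORS = {"flu": 0.05, "common cold": 0.20, "allergy": 0.10}

# P(symptom | condition) -- invented likelihoods
LIKELIHOODS = {
    "flu":         {"fever": 0.9,  "cough": 0.8, "sneezing": 0.2},
    "common cold": {"fever": 0.3,  "cough": 0.7, "sneezing": 0.6},
    "allergy":     {"fever": 0.05, "cough": 0.3, "sneezing": 0.9},
}

def posterior(observed_symptoms):
    """Return normalised P(condition | observed symptoms)."""
    scores = {}
    for condition, prior in PRIORS.items():
        p = prior
        for symptom in observed_symptoms:
            # Small default for symptoms not listed under a condition.
            p *= LIKELIHOODS[condition].get(symptom, 0.01)
        scores[condition] = p
    total = sum(scores.values())
    return {c: p / total for c, p in scores.items()}

probs = posterior(["fever", "cough"])
```

The design choice matters: a prior on how common each condition is biases the engine toward likely explanations, which is useful for triage but also explains why rare conditions, like Lauren's, can be under-ranked.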
2. Babylon Health (now part of eMed)
- Function: Virtual consultations, symptom checking, and health monitoring.
- AI Feature: Chatbot for triage, which routes patients based on urgency.
- Use Case: Widely used in the UK and some US states for digital GP appointments.
- Note: Babylon filed for bankruptcy in 2023 and was restructured under eMed.
3. Woebot
- Function: Mental health chatbot.
- AI Feature: Uses NLP (natural language processing) to deliver CBT-based (Cognitive Behavioural Therapy) conversations.
- Use Case: Helps users manage anxiety, stress, and depression.
- Backed by: Clinical research from Stanford University.
4. SkinVision
- Function: Skin cancer risk assessment via image analysis.
- AI Feature: Analyses photos of moles and lesions using a deep learning model.
- Use Case: Early detection of melanoma and skin anomalies.
- Certified: CE-certified in Europe; recommended for self-checks.
5. HealthTap
- Function: Telemedicine and AI-driven health Q&A.
- AI Feature: Offers preliminary answers before routing to doctors.
- Use Case: Combines AI advice with real-time consultations.
6. K Health
- Function: Symptom checking and primary care consultations.
- AI Feature: Uses anonymised health records to provide data-backed diagnosis suggestions.
- Use Case: For conditions like UTIs, flu, COVID-19, mental health, etc.
- Partnered With: Cedars-Sinai, Mayo Clinic.
7. mySugr (for diabetes)
- Function: Blood sugar tracking, insulin logging, and reports.
- AI Feature: Provides insights and predictions based on user inputs.
- Owned By: Roche.
- Use Case: Designed for people with type 1 and type 2 diabetes.
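The "insights and predictions" such trackers offer often start from simple trend detection over logged readings. Here is a minimal sketch, with invented readings and a plain moving average; it is not mySugr's actual algorithm and is not medical advice.

```python
# Illustrative sketch only: flag a rising blood-glucose trend from
# logged readings (mg/dL) using a simple moving average, the kind
# of pattern insight a tracking app might surface to a user.

def moving_average(values, window=3):
    """Trailing moving averages over the readings."""
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

def rising_trend(readings_mg_dl, window=3):
    """True if each successive moving average exceeds the previous one."""
    avgs = moving_average(readings_mg_dl, window)
    return all(later > earlier for earlier, later in zip(avgs, avgs[1:]))

# Steadily climbing readings trigger the flag:
climbing = rising_trend([110, 118, 125, 140, 155])
```

Smoothing with a moving average before comparing values keeps a single noisy reading from triggering (or masking) an alert.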
8. Ovia Health
- Function: Fertility, pregnancy, and parenting tracking.
- AI Feature: Predictive cycle tracking, pregnancy risk insights.
- Use Case: Used by expecting parents to monitor maternal health, and offered by some employers as a family-health benefit.
9. Youper
- Function: AI-powered mental health assistant.
- AI Feature: Conversational agent using CBT and emotional insight tracking.
- Use Case: Mental health screening, mood tracking, and therapy planning.
A Thoughtful Reflection
Lauren Bannon’s story highlights both the promise and the peril of AI in healthcare. It shows that AI can help catch what humans miss, but it cannot replace the rigour of doctor-led investigation, especially given potential errors and ethical concerns. The safest path is partnership: AI as aide, not authority. “AI might not replace doctors, but it might help them save more lives if used responsibly.”