Friday, December 26, 2025

AI as the Bridge: A Shared Middle Space for Patients and Clinicians

For too long, healthcare has been a game of catch-up. 


Patients walk into appointments trying to summarize weeks (or months) of symptoms, stress, sleep, habits, and “life context” in a few minutes.

Clinicians walk in with limited time, limited context, and a chart that often misses the human story.

So we treat symptom lists… instead of treating the whole person.

What I’d love to see is AI serving as the bridge between patients and clinicians—a shared middle space where both sides can work together to diagnose and support the whole person.

Here’s the workflow I’m imagining:

1) Clarity before the appointment (patient side)
At home, a patient uses GenAI to organize what’s been happening: symptoms, patterns, triggers, stressors, lifestyle shifts, emotions, daily realities—everything that usually gets lost in the rush of a clinic visit.

The goal isn’t “Dr. Google.”
It’s getting clear and getting organized.

2) A shared workspace during the visit (with consent)
If professional care is needed, the clinician can “plug into” that same AI workspace (with the patient’s permission).

Now both walk into the visit already aligned—less time spent reconstructing the story, more time spent thinking clearly together.

3) A collaborative “AI health wiki”
Instead of a one-way conversation, the appointment becomes a partnership: a shared workspace where patient + clinician refine questions, test hypotheses, and build the best plan possible—based on both clinical insight and lived reality.

4) Support between visits (where most care actually happens)
After the appointment, the patient goes home with the clinician’s guidance in a privacy-safe format—redacted if needed—ready to use inside their personal GenAI system for ongoing follow-through.

Between visits, AI helps the patient:

  • track progress and adherence

  • notice patterns and friction points

  • adjust routines and habits

  • flag when it’s time to re-check with a professional

And if the patient chooses to allow it, the clinician could monitor high-level summaries over time—rather than relying on scattered appointments and half-remembered details.
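To make the shape of this workflow a bit more concrete, here is a minimal, purely illustrative sketch in Python of the "privacy-safe, redacted summary" idea from steps 2 and 4. Nothing here is an existing product, standard, or clinical tool; every field name, the redaction rule, and the follow-up check are hypothetical, included only to show what a patient-controlled summary that travels between a personal GenAI workspace and a clinician's view might look like.

```python
# Purely illustrative sketch: a patient-controlled pre-visit summary that can be
# shared with a clinician in redacted form. All field names and rules here are
# hypothetical assumptions, not an existing product or standard.

from dataclasses import dataclass, replace
from typing import List

@dataclass
class PreVisitSummary:
    symptoms: List[str]        # what's been happening, in the patient's own words
    patterns: List[str]        # e.g., "worse after nights of poor sleep"
    stressors: List[str]       # life context the chart usually misses
    private_notes: List[str]   # free-text journaling the patient may keep private
    share_with_clinician: bool = False  # explicit, patient-controlled consent flag

def redacted_for_sharing(summary: PreVisitSummary) -> PreVisitSummary:
    """Return a copy safe to share: keeps the clinical story, drops private notes."""
    if not summary.share_with_clinician:
        raise PermissionError("Patient has not consented to sharing this summary.")
    return replace(summary, private_notes=[])

def needs_followup(summary: PreVisitSummary, warning_signs: List[str]) -> bool:
    """Rough between-visit check: flag if any agreed-upon warning sign appears."""
    text = " ".join(summary.symptoms + summary.patterns).lower()
    return any(sign.lower() in text for sign in warning_signs)

# Example use
summary = PreVisitSummary(
    symptoms=["intermittent headaches", "low energy most afternoons"],
    patterns=["worse after nights with under six hours of sleep"],
    stressors=["new caregiving responsibilities"],
    private_notes=["journaling the patient keeps to themselves"],
    share_with_clinician=True,
)

shareable = redacted_for_sharing(summary)
print(shareable.private_notes)                  # [] -> private notes stripped
print(needs_followup(summary, ["chest pain"]))  # False -> no agreed warning sign
```

The design point this sketch tries to capture is the one the post argues for: consent and redaction live on the patient's side of the bridge, before anything crosses over to the clinician.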

The point

This isn’t about replacing clinicians.
It’s about upgrading the space between visits—and making care more continuous, collaborative, and human.

That’s the “new world” I’m trying to build—at least in my mind, and in my books. Maybe someday the tools (and the system) will catch up, and the two sides will finally merge into something seamless.

Question for both clinicians and patients:
Would you want care to work like this? What excites you… and what concerns you?

Here's just a start of what I'm doing, if you're interested: https://kdp.amazon.com/en_US/series/8RDYVK4Y0XW

#DigitalHealth #HealthTech #GenerativeAI #PatientExperience #FutureOfHealthcare #PatientAdvocacy #CareTransformation #ClinicalInnovation #AIinHealthcare #HealthcareLeadership

Friday, December 12, 2025

How ChatGPT Can Help People Experiencing Homelessness or Poverty

Most people think of AI tools like ChatGPT as something for offices, students, or tech professionals. But over the past year, I’ve become convinced of something different:

Used carefully and responsibly, tools like ChatGPT can help people who are homeless or living in deep poverty navigate daily survival and begin rebuilding their lives.



Not as a replacement for human support.
Not as a miracle solution.
But as a 24/7, free, nonjudgmental assistant that can help people think, plan, and take small next steps—especially when other support is limited or unavailable.

The Reality Many People Face

For someone experiencing homelessness or poverty, the challenges are not abstract:

  • Where can I sleep safely tonight?

  • Where can I find food today?

  • How do I apply for benefits without an address?

  • How do I look for work without a phone or computer?

  • How do I manage my health when care is hard to access?

  • How do I stay motivated when everything feels overwhelming?

These are questions people often ask late at night, between appointments, or when they feel ashamed to ask another human being. That’s where a tool like ChatGPT can quietly help.

What ChatGPT Can Actually Do (When Used Properly)

ChatGPT cannot fix systemic problems. But it can help individuals and helpers with very practical tasks, such as:

  • Organizing daily plans and priorities

  • Finding local shelters, food pantries, clinics, and services

  • Practicing job interviews or writing simple resumes

  • Understanding benefit applications and next steps

  • Creating checklists for documents, appointments, or tasks

  • Offering grounding exercises during moments of stress

  • Helping people put their thoughts into words when they feel overwhelmed

Importantly, it can do this without judgment, at any time of day, and on library computers, borrowed phones, or shared devices.

Why This Matters to Frontline Workers and Institutions

Libraries, shelters, social workers, faith organizations, outreach teams, and case managers are already overwhelmed. AI tools will never replace human care—but they can extend it.

ChatGPT can act as:

  • A between-appointments support tool

  • A digital literacy bridge

  • A planning assistant

  • A confidence-builder

  • A thinking partner for people who feel stuck or lost

When introduced responsibly, it can reduce frustration, improve follow-through, and help people feel a small sense of control again.

Why I Wrote My Book

I wrote ChatGPT for Homelessness and Poverty because I couldn’t find any clear, practical guide that showed how to use AI safely and realistically in situations of extreme hardship.

The book is written for:

  • People experiencing homelessness or poverty

  • Social workers, case managers, librarians, and volunteers

  • Faith-based and community organizations

  • Anyone who works directly with people in crisis

It focuses on real-world use, not theory—and it’s careful about privacy, safety, and limitations.

The Kindle edition is available here:
👉 https://www.amazon.com/dp/B0G5SHSTDJ
(Paperback edition also available.)

A Final Thought

Technology alone won’t solve homelessness or poverty. But access to thinking tools matters—especially for people who have been stripped of stability, confidence, and support.

When used ethically and compassionately, AI can help people take the next small step. And sometimes, that step is enough to begin moving forward.

If you work with people in crisis, I invite you to explore this possibility—and to share resources that empower dignity, clarity, and hope.


Disclaimer: This article is for information only and is not medical or legal advice.