All Eyes On You

Aditi Peyush

I’ve never liked therapists who say, “I’ve experienced anxiety [or any other disorder], so I know how to treat it.” Nope. Therapists with that perspective usually just tell you that joining a book club will help you conquer social anxiety because it worked for them. But maybe your social anxiety prevents you from joining any group—even one you’re interested in. 

In the game of mental health maintenance, what works for some doesn’t work for others. Maybe you’ve experienced this. You’re talking to a friend about everything you have to do, and they (harmlessly) give you unhelpful advice, like “take power naps” or “meditate.” This might work for some, but I can’t even lie down if I have pending work. It’s almost like your friend is paying attention to the problem, but not the human with the problem.

Sometimes, I wish clinicians focused on the whole picture of their patients, instead of fixating on one small fragment. 

Then, in August 2023, I was presented with a new perspective on the issue: a tweet by Robert Scoble about his session with a psychiatrist who was using OpenAI’s GPT to take clinical notes. You can watch the video here. The LLM generates a report stating the purpose of the therapy session and descriptive characteristics of Scoble’s appearance. 

This AI integration had potential. Here’s a voice that’s removed from our social microcosm—a voice that won’t just tell you that vitamin B12 is the cure for your PMDD, nor will it default to your self-diagnosis, like the comments on a Reddit post asking whether you have ADHD. 

Of course, there are some issues. If you’re a woman, you’re less likely to be diagnosed with ADHD because women are less likely to display the classic ADHD behaviors associated with distractible young boys. Also, I wonder what the integration would have to say about fibromyalgia, a condition that has left most licensed practitioners stumped. 

The clinical notes show that the model picked up on some of Scoble’s behaviors that a therapist might miss. His speech is at an “appropriate pace and volume,” and he’s “clear about his goals and concerns.” However, the influx of information, like factors delaying recovery, almost dominates the diagnosis instead of informing it—which was the original intent behind integrating this sort of technology in therapeutic cases. Not to mention, under billing there’s a note listing factors to “maximize billing reimbursement.”

Therapists serve as an objective voice of reason, divorced from the messy intricacies of your life. And in our distraction-based existence, sometimes that’s what we need. 

This is where Ossabot comes in. The name Ossa, also known as Pheme, comes from Greek mythology. She’s the personification of rumor or report. In this context, Ossabot is an objective voice who guides you to understand the meaning of what you tell her. Ossabot was built using the Elizabot model, primarily because I wanted users to escape the chatter of LLMs and interact with a more rudimentary bot. Ossabot focuses on you—the user—and your problems, and stirs up thoughts and feelings about why you’re speaking and thinking the way that you are. 
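To give a sense of how a rudimentary ELIZA-style bot like this works under the hood, here’s a minimal sketch in Python. The rules, templates, and function names below are illustrative assumptions, not Ossabot’s actual script: the bot matches keyword patterns in what you say, reflects your pronouns back at you, and turns your own words into a question.

```python
import re

# Illustrative sketch of an ELIZA-style responder (hypothetical rules,
# not Ossabot's real rule set).

# Pronoun reflections turn the user's words back on them:
# "my work" becomes "your work".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "are": "am",
}

# Each rule pairs a keyword pattern with a response template;
# {0} is the captured fragment after pronoun reflection.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"because (.*)", re.I), "Is that the real reason?"),
]
DEFAULT = "Tell me more."

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(text: str) -> str:
    """Return the first matching rule's response, else a neutral prompt."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return DEFAULT
```

For example, `respond("I feel anxious about my work")` turns the statement into the question “Why do you feel anxious about your work?”—the bot contributes no opinion of its own, which is exactly the escape from LLM chatter described above.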

Lately, I’ve struggled to focus on my personal goals amid a changing world. My mind has been drifting; I’ve been throwing myself into new projects, hobbies, and other things to avoid sitting with my thoughts. If distraction is a pool, I’ve gone off the deep end. 

If you can relate to my experience, I encourage you to find some time alone—without distractions—to chat with Ossa and discover what’s really bothering you.