
What I Learned Moving from Humanitarian Work to HealthTech

Jul 9

Updated: Jul 10

by Erin Okhrymenko, OHT Fellow


Where it all started 

Picture this: me, a decade ago, teaching Business English to humanitarian workers in Ukraine during active conflict. Yeah, you read that right! Try getting someone coordinating emergency aid to war zones excited about grammar lessons when they're literally saving lives between coffee breaks.

Here's what I learned fast: when someone's dealing with life-or-death stress, you can't just pretend emotions don't exist and power through your lesson plan. I had to completely redesign everything to honor their emotional reality – the stress, the urgency, the incredible weight they carried every single day.


And you know what happened? Engagement went through the roof!

That experience smacked me with a truth bomb: technology that ignores human emotion isn't just ineffective, it's completely missing the point.


When healthcare AI reality hit different

Fast forward to me diving into AI product development in healthcare, and BOOM – same pattern everywhere! We'd build these absolutely brilliant systems that could predict health outcomes and crunch massive datasets, but they'd completely flop when real humans tried to use them.

Patients weren't abandoning apps because they didn't work. They were bailing because the apps felt overwhelming, cold, or just plain stressful!

I'll never forget working on this remote patient monitoring system where I was getting completely wild feedback from every direction. The CTO was freaking out about security risks, the founder was buzzing about shiny new features, and users were stuck somewhere between confused and totally frustrated.

That's when it hit me: we needed a whole new way to visualize and tackle these emotional dynamics!


What gets me absolutely buzzing about this space

The untapped potential here is INSANE! We're literally on the edge of creating AI that doesn't just process medical data but actually gets the human experience of health and illness.

Imagine AI that picks up when you're anxious about test results and adjusts how it talks to you! Or systems that sense when someone with chronic illness is having a rough day and responds with extra compassion!

I've started thinking about emotional intelligence in AI through three game-changing lenses:

  • Understanding everyone's emotional landscape (patients, clinicians, caregivers – the whole crew!)

  • Spotting key emotional moments in the user journey

  • Gracefully handling the inevitable chaos when things go sideways
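To make that a little more concrete, here's a tiny, purely hypothetical sketch of how those three lenses could show up in code. Nothing below comes from a real product; the names (EmotionalContext, JourneyMoment, choose_tone) and the thresholds are made-up illustrations of the core idea: notice where someone is emotionally, recognize the heavy moments in their journey, and soften how the system responds when things get stressful.

```python
# Toy sketch of the three lenses. All names and numbers are hypothetical.

from dataclasses import dataclass
from enum import Enum


class Role(Enum):
    PATIENT = "patient"
    CLINICIAN = "clinician"
    CAREGIVER = "caregiver"


@dataclass
class EmotionalContext:
    """Lens 1: a rough model of where a user is emotionally right now."""
    role: Role
    anxiety: float  # 0.0 (calm) to 1.0 (very anxious), e.g. from a quick self-report
    fatigue: float  # 0.0 to 1.0


@dataclass
class JourneyMoment:
    """Lens 2: a named step in the user journey with a known emotional weight."""
    name: str
    baseline_stress: float  # how stressful this step tends to be for most users


def choose_tone(context: EmotionalContext, moment: JourneyMoment) -> str:
    """Lens 3: when things get heavy, slow down and soften instead of powering through."""
    stress = max(context.anxiety, moment.baseline_stress)
    if stress > 0.7 or context.fatigue > 0.8:
        return "gentle"      # shorter messages, reassurance first, no jargon
    if stress > 0.4:
        return "supportive"  # plain language, offer help proactively
    return "neutral"         # standard informational tone


# Example: an anxious patient opening their test results
patient = EmotionalContext(role=Role.PATIENT, anxiety=0.8, fatigue=0.3)
results_step = JourneyMoment(name="view_test_results", baseline_stress=0.6)
print(choose_tone(patient, results_step))  # -> "gentle"
```

Even a toy version like this makes the point: the intelligence isn't in the data crunching, it's in how the system responds to the human in front of it.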


The future I'm absolutely obsessed with building

The applications that make me wake up excited? Patient-facing ones – mental health support, chronic disease management, rehabilitation tools. These are spaces where trust and emotional connection aren't just nice-to-haves; they're literally make-or-break for outcomes!

I'm convinced we're heading toward devices that understand us like a trusted friend would. Not in a creepy Big Brother way, but in a way that makes healthcare tech feel supportive instead of scary.

This is why I'm absolutely in love with this space! We're not just building software – we're building bridges between human hearts and artificial intelligence. We're creating technology that truly serves human needs at the deepest level.


And honestly? I can't think of anywhere I'd rather be working, or better people to be figuring this out with, than this incredible community!

Who's with me in baking emotional intelligence right into the core of how we design and develop these systems? Because the future of healthcare AI isn't just about being smart – it's about being human!


P.S. Still can't believe my weirdest career pivot turned into my biggest superpower. Life's funny like that!


 
 