
Lizzie Barclay: Twitter Takeovers

Curated by Lizzie Barclay, Medical Director at Aidence (@LizzieBarclay1) on 31st May 2021

Every week the global OHT Twitter account is curated by a wonderful member of the OHT community. They share with us how they do what they do, what they're interested in, their top tips and general learnings. We like to turn these Tweets into blogs as there is so much goodness in them!

Go on... sign up to curate our account on a Monday, you know you want to!


I trained as a doctor in the NHS and became interested in healthtech during my Radiology specialty training. I spend most of my waking hours driven by the potential that technology has to improve patient outcomes and reduce health inequalities, and I hope I can help nurture a responsible, trustworthy culture in AI for healthcare. To switch off (because sometimes we just need to run around a field chasing a ball) I can be found playing some sort of sport: most recently Aussie rules football! Since moving to Amsterdam I’ve also developed an affinity for driving a boat around canals, and of course, cycling. I've been volunteering with One HealthTech since November 2020, am currently involved in The HealthTech Toolkit campaign, The Diversity in the Workplace campaign, and I'm setting up the OHT Amsterdam Hub (launch event will be announced soon…!) #OHTenthusiast

During my #Mondaytakeover I covered some themes on QUALITY in AI-driven healthtech, including:

- nurturing a culture of commitment to QUALITY in healthtech,

- ensuring healthtech doesn’t exacerbate health ineQUALITY,

- and developing QUALITY standards for user training in healthtech.

Qn 1: How do we nurture healthcare's commitment to quality in the healthtech sector? #NHS #corevalues #healthcare #quality #OHT21 1/6

As someone who trained and worked in the #NHS, perhaps I took for granted how well the core values resonate with my personal values. The quality of every (clinically-related) decision we make has a direct impact on #patients' & relatives' quality of life. 2/6

Understandably, in healthtech, when the office isn't near a hospital we can feel somewhat detached from the clinical setting, & yet the quality of the healthtech devices we develop has just as much impact on quality of life (whether clinical decision support tools, wearables, etc.) 3/6

So, how can we nurture a culture which is committed to quality? I would love to hear your suggestions and/or experiences... I liked @chrissyfarr's 2019 article with @omadahealth's idea of a minimally clinical viable product #mvp --> #mcvp 4/6

And in a recent #leadership programme for #womeninhealthtechinnovation, I asked the same Qn to a healthtech company founder and I loved her reply: "First, hire people based on their values, attitudes & personalities. Skills can be learned*..." (*within reason) 5/6

"Second, #reiterate your company's core values (and strategy) on a regular basis. There’s not much point having them written down if nobody reads them - you need to remind #everyone - it’s how the company #culture becomes embedded even as you scale up." 6/6

Qn 2: How can we prevent healthtech from exacerbating health inequalities? #responsiblehealthtech #ResponsibleAI #OHT21 1/7

One of my 'whys' for working in this sector is the potential that tech has to improve #patientoutcomes and to reduce #healthinequalities. But only if the tech solutions are developed responsibly and used appropriately. 2/7

For #AI to benefit society fairly, the data used to develop algorithms must be #highquality, #nondiscriminatory & #relevant for the intended use of the resulting device. We've seen what can go wrong if bias is introduced into algorithms... see @jovialjoy's #codedbias documentary. 3/7

In healthcare it's no different. Imagine training an algorithm to detect early signs of skin Ca on datasets which only include white skin. Or developing an algo to detect a brain pathology that can occur in young adults, using CT datasets only of >70yr olds. #data #bias 4/7

When working in healthtech, my suggestion is to regularly #askyourself "If my relative was ill would I be happy for their doctor to rely on our algorithm's results to help make the best clinical decision?" #patientfocus 5/7

During a recent roundtable discussion on #trustworthyAI we discussed the ongoing challenges of accessing datasets & the need to incentivise medtech investors & decision-makers to commit to accessing high-quality, relevant, non-discriminatory datasets. Any ideas how to go about it? 6/7

Talking of #trustworthyAIforhealthcare, @NHSX are driving forward conversations on this topic and I'll be joining their panel discussion at #CogX in a couple of weeks. See you there?! 7/7

Qn 3: What does QUALITY user training look like in healthtech? #usertraining #enduser #responsibleAI #explainableAI #OHT21 1/6

Currently most AI-driven medical devices in use in clinical practice are not autonomous (thankfully), and still require a 'human-in-command' i.e. to verify results +/or determine the overall clinical decision for the patient by piecing together all relevant info. 2/6

However, many devices are intended to #support clinical decisions and therefore (presumably) do *influence* clinicians. So how can we #empower clinicians (aka end users) to feel #confident using our medical devices? i.e. knowing the specific intended use/clinical limitations etc. 3/6

A suggestion for those #responsible for delivering user training for AI-driven meddevices, #askyourself "If I was using this algorithm in clinical practice tomorrow would our user training material give me confidence to use our device appropriately?" #explainableAI 4/6

In a recent talk by @DrGMcGinty, 1 of many (!) points that stuck with me was (excuse the paraphrasing): "As radiologists who will be using AI-driven devices in our daily practice, we should be demanding more [quality & explainability] from AI companies" #responsibleAI #radiology 5/6

To practice what I preach, I'm collecting clinicians' feedback on our user training so we can continually improve. I also think it would be useful to develop a #framework / #standard for good practice in delivering user training for AI-driven medical devices. Anyone agree?! 6/6

