
Data, Bias, Chatbots, Permission... and AI Cupcakes

I made a good choice for my first One HealthTech Bristol event: AI in Health and Care – How do we innovate responsibly? on 6 November. What a (literal!) treat it was. There were great speakers, a friendly crowd and tasty AI cupcakes thanks to Norts at Tiny Giant. Why did I go? I’m interested in doing my part to design a better world in my role at user-centred innovation and product development consultancy Kinneir Dufort.



Dawn Walter kicked off the afternoon by discussing the social impact of algorithmic decision-making and why tech teams need social scientists. She looked at how decision-making models can end up with problematic biases, whether from being trained on a biased data set or from being built by people whose own underlying prejudices weren’t controlled for (amongst other things). Dawn has since turned her talk into a well-written, thoroughly referenced article, so I’d recommend a read.


Laura Sobola, Senior Consultant at Unai, then tackled the topic of digital consent. It’s a tough one: data is so valuable that companies go to extreme lengths to extract that value, and the consent processes behind it often lack transparency. We’ve all experienced it when installing apps on a new phone: we’re so focussed on getting the shiny new device to work that we’ll hit OK to anything! Companies will generally ask for access to more personal data than they need, as it’s cheaper to store it than it is to delete it… Oh, and you’ve probably already clicked another “OK” box and given them consent to sell that data on to third parties. Laura reminded us that we need to be extra careful when giving digital consent for the use of any health-related data. New consent models are on the horizon as we learn more about company and consumer attitudes, alongside advances in data management and anonymisation. You can watch a video of Laura’s talk recorded at Anthropology & Technology 2019.


Next up was Ellie Foreman – Automation Research Fellow at South West Creative Technology Network. Ellie’s work focuses on the relationships between humans and chatbots. When is the right time for a human to step in, or step back? Should chatbots have a place in the mental health space at all? There are good arguments on both sides of the fence: occasions where chatbots have played a part in saving lives, and occasions where a lack of empathy and compassion has led to worst-case scenarios. Accountability is a focal point, and much more work is needed to find the balance between risk and reward. You can watch a video of Ellie’s talk recorded at Anthropology & Technology 2019. The podcast KD Conversations also discussed Ellie’s work recently. Subscribe here to catch it as soon as it’s released.


After tea with cupcakes co-created by AI and Tiny Giant, and plenty of networking, we reconvened for a fantastic panel discussion with Dawn Walter, MD & Founder at Mundy & Anson; Anne Marie Cunningham, GP & Associate Director for Primary Care at NHS Wales Informatics; John Kellas, Co-director of This Equals and innovation and community engagement consultant; and Katie Gibbs, Head of AI at BJSS, moderated by Sally Powell, Hub Lead at One HealthTech Bristol. There was great conversation and crowd engagement, with the panel speakers itching for their turn with the microphone! It was helpful to hear an NHS GP’s perspective on the challenges of setting rules for symptom data, and how the inherent biases that already exist in medicine are in danger of being replicated in AI. We also gained an insight from John at This Equals into what’s going on in Bristol around public data, including health data.


My favourite contentious comment from the audience was that we should “scrap all data and start again”, creating a kind of data ground zero. Bad data can be dangerous, unstructured data can be dangerous, and even good data can be dangerous when used in the wrong way. Becoming better informed about data, what we can do with it and the risks of bias, will help us develop better ways to collect data, understand the limits of its use, and put it to work creating improved experiences for all.


Join One HealthTech Bristol to hear about local and national events, opportunities, jobs and more.

 

Andrew Rugg is a Digital Portfolio Manager at Kinneir Dufort.


