BHBIA Winter Seminar wrap up
A few weeks ago I attended the BHBIA Winter Seminar in London, a one-day event with engaging talks from experts in the field of artificial intelligence (AI) (followed by a hearty Christmas lunch!). As well as showcasing real examples of how AI is changing the business intelligence and healthcare industry today, we debated what the implications might be for the future. What does this mean for our roles as insights professionals? And where do we stand on the ethics and morality that surround big data and relying on technology to make decisions for us?
Here are my three key takeaways:
1 / Our jobs as marketing professionals are safe… for now
At the moment, AI is great at doing specific tasks well and at taking on the jobs that we, as humans, find too time-consuming, too repetitive or too dangerous. And it needs data, a lot of data, as the input to produce the most accurate outputs and predictions. While AI is making, and will continue to make, our jobs easier and more efficient, it still needs humans to provide and ensure the quality of the inputs, interpret the outputs and understand the commercial impact for companies. So it’s reassuring to know that my job is safe, at least for the next 50 years.
2 / AI’s fundamental flaw is that it lacks what makes us ‘human’
‘What does it mean to be human?’ This is the fundamental question that Professor Hiroshi Ishiguro, the world-renowned roboticist and Director of the Intelligent Robotics Laboratory at Osaka University, is seeking to answer in his quest to advance androids and AI. Psychology is at the heart of Ishiguro’s research: if robots and AI are to interact successfully with humans in the future, we need to understand humanness and bring this humanity to AI.
Much of the AI that we use today (Siri, Amazon Echo, Google Hub) relies on Natural Language Processing (NLP) to interact with us with growing accuracy. But these assistants share a common flaw: they lack social context and emotional intelligence, a shortcoming amplified by the fact that they still feel and look very much like machines. So, when it comes to a subject as sensitive as one’s health, how comfortable would a patient feel discussing treatment options with AI rather than with a human healthcare professional?
Which would they prefer: a machine that can predict patient outcomes with high accuracy, or a human who understands the patient as an individual and has empathy and perception? The likely scenario is that a combination of both will have a role to play in the near future, but we are certainly some way from no longer needing human-to-human interaction.
3 / There are still some big questions on ethics
The subject of ethics and morality surrounding big data and AI has been, and still is, under much debate. At the seminar, it raised more questions than answers. Europe lags behind the US and Japan in the development of AI, partly because of its stricter data protection laws (GDPR). AI is only successful when it has access to huge volumes of data, so it’s unsurprising that in healthcare it has found most application in clinical trials and R&D.
But if we want to advance AI technology, what are the implications for our data privacy laws? And how comfortable are we, as consumers or patients, with allowing access to our data? There is also the question of accountability. In healthcare, professionals frequently make decisions that directly affect a patient’s quality of life, sometimes matters of life and death, and they come under scrutiny for those decisions. In a future where AI becomes responsible for making decisions, who is accountable if it all goes wrong?
What is certain is that AI is here to stay and has changed, and will continue to change, our world. As insights professionals, we need to stay on top of the impact this is having on our industry, for us and for our clients. The seminar was an engaging forum to debate and learn, and provided a sense of reassurance that we, in the healthcare industry at least, share a mix of curiosity, optimism and uncertainty about this field. I’ll certainly be keeping my eyes peeled for the latest developments.