Big data analytics has dominated the imaging market headlines in recent times. This emerging sector may appear tantalizing and visionary, but much of the hype is based on conceptual discussion, and there is a lack of substance and evidence about how big data analytics can be applied to mainstream use in radiology.
The other frustration in picking through the volumes of analytics-focused news is the scarcity of a clear definition for this emerging marketplace. All too often jargon replaces concise explanation of the wide array of types and use cases. Even worse, little is done to discuss the viability of implementation and the myriad hurdles this sector must overcome.
So, it's time to cut the hyperbole, pull out the key issues, and offer some sensible guidance.
One term, multiple meanings
Firstly, let's be clear on what we are talking about.
In the simplest terms, big data analytics describes using computer software to find patterns or useful information in large or complex sets of data. Machine learning (or artificial intelligence) takes this a step further by applying advanced computer "learning" to the analytical process, thereby speeding it up, producing better results, handling larger and more complex datasets, or all of the above.
However, the term "analytics" in healthcare has been so quickly whipped into marketing jargon that the distinction between these two key terms is soon lost or blurred.
Takeaway 1: Big data analytics is common in many healthcare IT applications in use today; machine learning is not (yet).
Analytics = Big business already
The large number of vendors in the healthcare technology sector offering analytics in some shape or form further muddies the water, both in defining the segment and in understanding its purpose and future.
Perhaps the best and simplest way to define analytics in the health space is to use the following categories (and yes, many of these are umbrella segments covering multiple subcategories):
- Analytics for clinical decision support and diagnosis: uses large volumes of clinical data, such as medical images or monitoring data, to "learn" the most likely outcome of each diagnosis and advise the clinician. Today it is mostly applied in research and development trials.
- Analytics for care management and operational efficiency: uses operational and institutional data on procedures, staffing, outcomes, and cost to manage resources better.
- Analytics for risk stratification: assesses a given patient population or group of chronic disease sufferers, i.e., highlighting the most at-risk patients and their potential burden on the healthcare system, and forecasting future resource and care needs based on defined and available criteria (usually a mix of demographics).
- Analytics as an advanced symptom checker/personal health assistant: often provided as part of a digital health and wellness device, it compares personal health data against a large dataset and advises on how to better manage personal healthcare.
To add some context to the scale of developments in this sector, there have been more than 52 deals and $265 million U.S. (240 million euros) worth of venture capital and investment funding for diagnostic analytics (imaging, clinical care, and personal health assistants) since 2014, and this doesn't include the biggest investor of them all, IBM.
If you consider that risk stratification and care management analytics are increasingly major parts of electronic health record software, hospital information systems, and population health management software, the scale of the sector becomes apparent; it's just not neatly defined.
Takeaway 2: Healthcare is already a multibillion euro market sector for big data analytics; the vast majority is focused on healthcare operational efficiency (administration), rather than diagnostic applications.
Dr. HAL-9000 will see you now...
Now for the contentious part: the role of big data analytics and machine learning in clinical diagnosis for imaging.
However, a further distinction must first be added to the definitions above, as the impact, ethics, regulation, and potential future role of each type have very different connotations for radiologists. Machine learning in the clinical decision support solutions in research and development today has two broad categories of purpose: detection or diagnosis.
The vast majority, especially among start-ups entering the marketplace, are focused only on detection. That is, applying machine learning to a large volume of images to feed the creation of an algorithm that detects a particular disease indicator, before alerting the reading physician to its existence.
To simplify, think of this as artificial intelligence (AI)-fueled computer-aided detection (CAD), albeit CAD that will become more refined and better at detection as more images and cases are fed into the system.
The benefit? Speeding up reading time and minimizing missed diagnoses.
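To make that detection workflow concrete, here is a minimal, purely illustrative sketch in Python of how an AI-fueled CAD step might sit in front of a reading worklist. The `score_abnormality` function and the alert threshold are hypothetical placeholders standing in for a trained model and a chosen operating point; they do not represent any vendor's actual product or API.

```python
# Illustrative sketch only: a hypothetical detection step in a reading worklist.
# score_abnormality stands in for a trained detection model; it is not a real API.

from dataclasses import dataclass


@dataclass
class Study:
    study_id: str
    pixels: list  # image data, greatly simplified here


def score_abnormality(study: Study) -> float:
    """Placeholder for a trained model returning a 0-1 suspicion score."""
    return 0.5  # dummy value; a real model would compute this from the image


ALERT_THRESHOLD = 0.8  # hypothetical operating point: misses vs. over-alerting


def triage(worklist: list[Study]) -> list[tuple[Study, float]]:
    """Flag studies whose suspicion score exceeds the threshold for priority reading."""
    flagged = []
    for study in worklist:
        score = score_abnormality(study)
        if score >= ALERT_THRESHOLD:
            flagged.append((study, score))  # alert the reading physician
    return flagged
```

Where that threshold sits is precisely the "overalerting" trade-off discussed next.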
There is still a lot to be addressed with machine learning for detection. Firstly, no AI-driven system has yet been approved by regulators for clinical use; most industry expectations today suggest it will be at least another five years before full-scale implementation is possible. Moreover, there are other significant issues to consider.
What if a system detects an abnormality the radiologist decides to ignore? Will "overalerting" hinder radiologist reading and diagnosis? Or will "overalerting" lead to "alert-ignorance"?
Machine learning for diagnosis adds a further layer of complexity and contention. Could machine learning improve on physician diagnosis? Will it lead to the replacement of radiologists and physicians by automated diagnosis, or will it serve as a diagnostic tool for decision support?
Very few start-ups or investors have jumped into full-scale machine learning for diagnosis, apart from the biggest and most widely known: IBM and its Watson Health initiative. Why? Because it is a hugely complex and challenging proposition from a technological, regulatory, and ethical standpoint, which means uncertain prospects of success and concerns about commercialization.
Few in the industry today believe a machine learning diagnostic solution will be in widespread clinical use within five years. Instead, expect to see a raft of trials, lobbying, and ethical discussion on the role and purpose of AI-based diagnosis across healthcare, and, you guessed it, much more marketing hype.
What's more, no one is 100% certain that diagnostic AI will ever be implemented at wide scale, so there is clearly still a long way to go to make sci-fi a reality.
Takeaway 3: There is a big difference between AI for detection and diagnosis. Detection AI solutions are likely to be implemented in the next five years; diagnostic AI could be a decade or more, if ever!
So, does machine learning and AI have massive potential? Absolutely.
Expected mainstream diagnostic use anytime soon? Unlikely.
Stephen Holloway is principal analyst and company director at Signify Research (www.signifyresearch.net), a health tech market-intelligence firm based in Cranfield, U.K.
The comments and observations expressed herein do not necessarily reflect the opinions of AuntMinnieEurope.com, nor should they be construed as an endorsement or admonishment of any particular vendor, analyst, industry consultant, or consulting group.