Deploy analytics tools
In the first blog of this series, we discussed the importance of aggregating data to provide a complete view of the patient, drawn from every source where they have received care. As soon as data has been aggregated, a range of analytic tools can be applied to advance the quality of care and improve clinician-patient interactions.
One of the simpler approaches is to analyse population-level quality measures, such as the number and identity of patients with diabetes and abnormal A1c levels, or patients with diabetes who have not had an eye check. Having both a population-level score and an immediate link to the individual patients behind it is key to enabling real-time improvements in care.
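The pattern described above can be sketched in a few lines: compute the measure for a cohort while keeping hold of the individual patients who make it up. The records, field names, and the 8% A1c threshold below are illustrative assumptions, not a real measure specification.

```python
# Hypothetical patient records; field names and values are illustrative only.
patients = [
    {"id": 1, "diabetic": True,  "last_a1c": 9.1, "eye_exam": False},
    {"id": 2, "diabetic": True,  "last_a1c": 6.4, "eye_exam": True},
    {"id": 3, "diabetic": False, "last_a1c": 5.5, "eye_exam": True},
    {"id": 4, "diabetic": True,  "last_a1c": 8.2, "eye_exam": False},
    {"id": 5, "diabetic": True,  "last_a1c": 7.0, "eye_exam": True},
]

# Cohort: patients with diabetes.
cohort = [p for p in patients if p["diabetic"]]

# Measure: A1c above 8% taken here as a proxy for poor glycaemic control.
flagged = [p for p in cohort if p["last_a1c"] > 8.0]

rate = len(flagged) / len(cohort)         # population-level score
flagged_ids = [p["id"] for p in flagged]  # linkage to the individual patients

print(f"Poor-control rate: {rate:.0%}; patients to follow up: {flagged_ids}")
```

The point of keeping `flagged_ids` alongside `rate` is exactly the linkage the text describes: the score tells a manager where the population stands, and the patient list tells a clinician who to act on.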
Quality measures are published in great number and detail by clinical governing entities and by organisations such as the National Committee for Quality Assurance (NCQA) in the US and the National Institute for Health and Care Excellence (NICE) in the UK. For ongoing healthcare quality reporting, it is essential to have analytics and reporting tools that are automatically updated to meet the latest measures.
In addition to quality measures, healthcare providers and population-level managers need tools to mine and explore whole-population and cohort-level data. To deliver advanced population health care, analytic tools must be able to explore data sets efficiently, discover meaningful insights, and identify gaps in care at both the population and the individual level.
Explore data in depth, discover insights, identify gaps in care
Soon, as a much broader range of data elements becomes integrated into the patient record, including social determinants of health (SDOH), behavioural data, patient-generated data, and genomic data, there will be a corresponding demand for analytic tools to mine it. To make sense of the vast volume of healthcare data now available to clinicians and population health managers, we need next-generation tools that enable detailed population exploration across a broad range of factors.
For example, if a healthcare provider wants to investigate and improve the immunisation rate in their jurisdiction, population exploration begins with understanding the overall rate. Simply determining that the rate is low is not enough to develop an action plan. The quality measure therefore needs to be followed by exploration of the data for causative factors. Are children of minority groups less likely to be vaccinated? Are low-income families a concern? Or is the low rate due to vaccine refusal, typically associated with upper middle-income groups?
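The questions above amount to stratifying the overall rate by candidate factors. A minimal sketch of that exploration, with invented records and group labels (the groups and numbers are purely illustrative, not real data):

```python
from collections import defaultdict

# Hypothetical child-level records; group labels and values are illustrative.
children = [
    {"group": "low-income",   "vaccinated": True},
    {"group": "low-income",   "vaccinated": False},
    {"group": "low-income",   "vaccinated": True},
    {"group": "upper-middle", "vaccinated": False},
    {"group": "upper-middle", "vaccinated": False},
    {"group": "upper-middle", "vaccinated": True},
]

# Tally immunisations per group: group -> [vaccinated count, total count].
totals = defaultdict(lambda: [0, 0])
for c in children:
    totals[c["group"]][0] += c["vaccinated"]
    totals[c["group"]][1] += 1

# Compare subgroup rates to find where the overall rate is being dragged down.
for group, (done, n) in totals.items():
    print(f"{group}: {done / n:.0%} immunised ({done}/{n})")
```

In practice the same stratification would be repeated across many candidate factors (ethnicity, income, geography, provider) until the subgroups that explain the low overall rate emerge.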
Predictive modelling using artificial intelligence and machine learning
So far, we have been completing the analysis in near real-time. Even so, it is 'after the event': after the patient was or was not vaccinated on schedule, for instance. What if we could look ahead and predict with some degree of accuracy the patients who, in the next time period, will receive a vaccination and who won't? What about finding patients in a population who are both in early-stage pregnancy and at high risk of poor outcomes? If we could see more clearly into the future, clinicians and managers would have truly powerful tools to improve the quality of care and outcomes. This requires predictive modelling, using artificial intelligence and machine learning algorithms applied to aggregated, high-quality population data.
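To make the idea concrete, here is a minimal sketch of what such a prediction looks like once a model has been trained: a logistic model scoring a patient's risk of missing their next vaccination. The features, weights, and bias below are invented for illustration; in a real system they would be learned from the aggregated historical data.

```python
import math

def miss_probability(features, weights, bias):
    """Logistic model: probability that a patient misses their next vaccination."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))  # sigmoid maps the score to a probability

# Hypothetical learned parameters (illustrative only, not a trained model).
weights = {"prior_missed": 1.4, "distance_km": 0.05, "reminders_opted_in": -0.8}
bias = -1.0

# Score one patient: two prior missed appointments, 12 km from the clinic,
# opted in to reminders.
patient = {"prior_missed": 2, "distance_km": 12, "reminders_opted_in": 1}
risk = miss_probability(patient, weights, bias)
print(f"Predicted risk of missed vaccination: {risk:.0%}")
```

A score like this turns the 'after the event' measure into a forward-looking one: patients with a high predicted risk can be offered outreach before the appointment is missed.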
It's an ambitious undertaking to have high-quality data aggregated in real-time and available for a range of analyses, including quality measures, data exploration, and predictive modelling. Nevertheless, it is achievable with the technology available today, with improved population health as the major benefit.
Interested in learning how Orion Health can help you to make sense of your data?