These are busy, information-rich times. I don't know about you, but my world is not short of content to consume through plenty of channels - Facebook, Twitter, LinkedIn and the rest. It is an unrelenting torrent of data, overwhelming at times. With my filtering goggles on, I scroll at speed for little golden nuggets of content, drawn in by snappy titles that match my interests. I expect my information fast, and I want it relevant. And if there is one domain where fast and relevant data really counts, it's healthcare.

Now we e-health bods love innovation; the art of the possible is very exciting. For example, my favourite clip of the moment shows data flowing from our health platform to subscribing restaurants (authorised by the owner of the data, of course). Smart use of Open APIs allows allergy information to be flagged to those able to consume and work with such content, so the peanut allergy appears on the waiter's iPad. It's a snazzy clip, but I think it also works for a couple of key reasons:

  1. It's quick (even the clip is quick).
  2. It's information in context. Relevant healthcare data flagged to a waiter or waitress to support the right selection from the menu.
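To make that "information in context" point concrete, here is a rough sketch in Python of what the restaurant's subscribing app might do with an allergy payload. The bundle shape loosely follows the FHIR AllergyIntolerance style, but the field names and data here are invented for illustration - this is not our platform's actual API.

```python
import json

# Hypothetical FHIR-style bundle, as a subscribing app might receive it
# once the data owner has authorised sharing. Contents are invented.
BUNDLE = json.loads("""
{
  "resourceType": "Bundle",
  "entry": [
    {"resource": {"resourceType": "AllergyIntolerance",
                  "clinicalStatus": "active",
                  "code": {"text": "Peanut"}}},
    {"resource": {"resourceType": "AllergyIntolerance",
                  "clinicalStatus": "resolved",
                  "code": {"text": "Penicillin"}}}
  ]
}
""")

def active_allergies(bundle):
    """Return display names of active allergies only - the
    in-context slice of data the waiter actually needs."""
    return [e["resource"]["code"]["text"]
            for e in bundle.get("entry", [])
            if e["resource"].get("clinicalStatus") == "active"]

print(active_allergies(BUNDLE))  # ['Peanut']
```

The filtering is the point: the waiter never sees the full record, just the one active flag relevant to choosing from the menu.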

When chatting to end users of our tech, I find the main wants and needs of health and care professionals are generally consistent. Users just want the basic information they need to do their job and to make informed decisions. If they click a button, they want data back quickly. If there is lots of data, they need it filtered to support the context of an episode of care.

I was once in a conversation with a consultant who made it clear that system performance and the quality of the data coming through are everything - and every second counts. The scenario we were discussing was a suspected stroke patient rushed into A&E. In this scenario, he said, time is brain. The decision to administer clot-busting drugs or not relies on the speed and quality of the information available about existing medications. And it's not just the medication data: the context on why someone was prescribed something matters too (the summary care record is useful, but it has its limitations). That additional insight can provide vital clues as to what is going on and can support faster, more informed decision making.

In the shared care record domain, the best view of the citizen's world is achieved by bringing together all of those useful information assets from every care domain into one place - a single view. People will log in and use such a system if the data they need is in it, and if that data appears on the screen in a suitable timeframe. That means both the speed of data return and the smart, in-context surfacing of data that would otherwise be lurking in the system but not visible enough to deliver optimum value.

Some shared care records are built on a federated model - data stays in its source system, and appears on the screen on demand when the shared care system queries each source using the relevant ID and parameters. The challenge here is the numerous different connectivity networks and speed risks that exist across the IT estate - you're only as fast as your slowest system...
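That "slowest system" risk can be sketched in a few lines of Python. The source names and latencies below are invented for illustration; the pattern - fan out to every source in parallel and cut off anything that misses a deadline - is one common way a federated front end stops one laggard from stalling the whole screen.

```python
import concurrent.futures
import time

# Hypothetical source systems and their response times in seconds -
# names and numbers are illustrative, not real endpoints.
SOURCES = {
    "gp_system": 0.1,
    "acute_trust": 0.3,
    "community_care": 1.5,  # the slow one
}

def query_source(name, latency, patient_id):
    """Simulate an on-demand query to one source system."""
    time.sleep(latency)
    return {"source": name, "patient": patient_id, "records": []}

def federated_fetch(patient_id, timeout=1.0):
    """Query every source in parallel; report any source that
    misses the deadline instead of waiting for it."""
    results, missing = [], []
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = {pool.submit(query_source, n, lat, patient_id): n
                   for n, lat in SOURCES.items()}
        done, not_done = concurrent.futures.wait(futures, timeout=timeout)
        for f in done:
            results.append(f.result())
        for f in not_done:
            # cancel() cannot stop a thread already running, but we
            # record the laggard so the screen can say what is missing.
            missing.append(futures[f])
            f.cancel()
    return results, missing

results, missing = federated_fetch("123")
print(missing)  # ['community_care']
```

Note the trade-off: the user gets a fast but possibly incomplete view, and the clinician needs to be told which sources did not respond in time.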

The alternative is to move the data into a large repository as part of shared care record planning. This will always be our recommendation, as it means data can be standardised as it moves. With the data in one source, it is easier to build useful lists, provide decision support and reason against the data with analytics. A repository approach also consolidates your performance management focus around one system, and it provides a back-up if the source systems go down.
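The "standardised as it moves" idea boils down to mapping each source's own field names into one canonical shape at ingestion time. A minimal Python sketch, with invented source formats and field names:

```python
# Illustrative adapters: each maps a source-specific record into
# one canonical shape. Field names here are invented for the example.

def from_gp(rec):
    return {"patient_id": rec["nhs_no"], "drug": rec["med"].title(),
            "origin": "gp_system"}

def from_acute(rec):
    return {"patient_id": rec["patientIdentifier"],
            "drug": rec["drugName"].title(),
            "origin": "acute_trust"}

ADAPTERS = {"gp_system": from_gp, "acute_trust": from_acute}

REPOSITORY = []

def ingest(source, rec):
    """Standardise a record as it moves, then store it centrally."""
    REPOSITORY.append(ADAPTERS[source](rec))

ingest("gp_system", {"nhs_no": "123", "med": "warfarin"})
ingest("acute_trust", {"patientIdentifier": "123", "drugName": "ASPIRIN"})

# With one shape in one place, building a medication list for an
# episode of care is a simple filter - no live queries to N systems.
meds = [r["drug"] for r in REPOSITORY if r["patient_id"] == "123"]
print(meds)  # ['Warfarin', 'Aspirin']
```

Because the normalisation happens once, at ingestion, every later list, analytics run or decision-support rule works against a single consistent shape.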

But like a top athlete or formula one car, high performance needs investment, in expertise and in time. A good system will be well used. One of our award-winning customers has over 2bn messages in the repository. That's a lot of data & a big database. That means capacity planning, provisioning of resources, logs to control and areas to optimise. It is this steady state operational management that lies at the heart of a sustained, successful programme, but is often overlooked or poorly controlled until an end user grumbles about something being too slow. Performance is also the reason why many of our most mature customers are choosing a continuous delivery approach, a platform in the cloud, with aims to minimise downtime and to cope with spikes in demand. With 50 times the amount of health data expected to be generated in the next 8 years, coping with these big data realities is crucial - and as we know from our A&E consultant mentioned earlier...every second counts.