We’ve been working with a leader in the obstetrics field to push fetal data into the HALO. Not only have we matched the color coding used in fetal charting and integrated the key data flows into an object that a mom-to-be can read at a glance, but the HALO can also mimic contractions, giving an up-to-the-moment picture through transition and delivery.
And the HALO can replay the entire birth like a movie for anyone on the birth team. We’re really excited about this one.
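The two features described above, color-coded signals and movie-style replay, can be illustrated with a minimal sketch. This is purely hypothetical: the interview does not describe the HALO’s internals, and the color bands below are loosely based on the commonly cited fetal heart rate baseline range of 110–160 bpm, not on any actual HALO mapping.

```python
from dataclasses import dataclass
from typing import List

def fhr_color(bpm: float) -> str:
    """Map a fetal heart rate sample to an illustrative color band.

    Bands are assumptions based on the common clinical baseline
    range of 110-160 bpm; they are not the HALO's real mapping.
    """
    if bpm < 110:
        return "red"    # below normal baseline (bradycardia)
    if bpm <= 160:
        return "green"  # within normal baseline range
    return "amber"      # above normal baseline (tachycardia)

@dataclass
class Sample:
    t: float      # seconds since monitoring started
    bpm: float
    color: str

class ReplayBuffer:
    """Append-only timeline that supports movie-style rewind."""

    def __init__(self) -> None:
        self.samples: List[Sample] = []

    def push(self, t: float, bpm: float) -> None:
        # Color is assigned at ingest so replay shows the same
        # coding the live view showed at that moment.
        self.samples.append(Sample(t, bpm, fhr_color(bpm)))

    def rewind(self, start: float, end: float) -> List[Sample]:
        """Return the slice of the timeline between two timestamps."""
        return [s for s in self.samples if start <= s.t <= end]
```

A usage sketch: push samples as they arrive, then call `rewind(0, t)` to re-play any window of the labor for the birth team.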
4. Can you talk about the origins of the HALO visualization itself? It’s not hard to imagine people getting a plant, or a digital pet, or other gamified visuals. Where do you see this layer of the platform evolving?
The HALO is part of a larger 3-dimensional, 4-component system called the Protostar, which is still in development.
Without getting too deep into what that is and how it works, it’s safe to say that it’s a self-populating coalescence structure that aggregates and distributes data from the life of a person or entity. Think of it as a navigable, explorable 3D ‘browser’ that, if we’re right about its application to medicine, could become a next-generation EHR.
Anyway, the Protostar is obviously a pretty large-scale development initiative. And with the pressure to move our start-up to a viable revenue model, we decided to extract and develop one component of that system as a stand-alone product: the piece that signals moment-to-moment health, the HALO.
We actually see a future where social platforms will become object-based: where the words, pictures, and sounds that comprise a person’s profile become signaling aspects of objects that form a next-level identity. Beyond the constraints of DNA, but also algorithmically precise and individualized. Like plants, and even worlds.
But that is a crazy BHAG and it’s good to have as a guiding vision. In the near-term, we can imagine a navigable platform of objects that represent an ecosystem of health organizations, their practitioners, and their patients. And a way to start seeing our communities, our geographic regions, and our civilization as a whole in a way that tells us an immediate story about our collective health.
This begins by working out a visual language that can integrate and grow through a progression of data-architected objects. And this necessarily starts with biological signals, and the multitude of applications that that project implies.
5. The healthcare system has shown little interest in improving data visualization for patient engagement. For example, most patient portal lab results are unformatted raw data that are never explained to patients. Practically speaking (culturally, politically, financially), what will it take to leapfrog forward into the modern digital era? Will this sort of thing ever become a standard of care?
I think so. Because we’ve already seen advancements in this regard.
My co-founder Peter Crnokrak says that whenever we significantly decrease the time between a person and their ability to receive, comprehend, and act on data, we mark an evolutionary thrust.
I don’t know what the initial reaction to the development of MRI was, but it would be hard to imagine the forces that would have stood against it, except time and money. And a lack of imagination. Can you imagine anyone saying, ‘We have no need for a highly versatile imaging technique that can give us pictures of our anatomy that are better than X-rays’?
MRI is a prime example of improving data visualization for patient engagement. I know because when my father, who was a scratch golfer, started missing two-foot putts, it allowed him to be diagnosed with a neurological cancer. But like the X-ray before it, these are practitioner-side innovations.
The developed world is in the throes of a paradigm shift in medicine. For lots of reasons that I am sure every one of your readers is deeply familiar with, people are going to have to start learning to take care of themselves. They are going to have to understand this amazing technology called the human body and how it works. And how it lives and why it dies. And to the extent that they take up that challenge, there will be parallel innovations that give them the tools and processes to do that.
There will be demand for visualizations of the body and its components that not only show their present and historical health but also guide the user to optimize them, and technologists and caregivers will meet that demand.
Moreover, we are now moving into a stage, call it population management, in which large medical institutions want to track patients remotely. And there is still a role for humans in that endeavor. By this I mean, we still need people to watch and care for people. In the future this will mean pattern recognition of individuals and their health signals, as opposed to an entire health monitoring system being controlled by machine-learned thresholds. (Talk about a dystopian nightmare.)
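To make the contrast concrete, here is a minimal sketch of the kind of naive threshold monitor the passage above cautions against relying on exclusively. The readings and bounds are illustrative only; the point is that a fixed-threshold alarm flags out-of-range values but carries no sense of trend, context, or pattern, which is the part that still calls for a human reader.

```python
from typing import List, Tuple

def threshold_alerts(readings: List[float],
                     low: float,
                     high: float) -> List[Tuple[int, float]]:
    """Flag every reading outside the fixed band [low, high].

    Returns (index, value) pairs. Note what is missing: no trend,
    no context, no pattern -- only isolated out-of-range points.
    """
    return [(i, v) for i, v in enumerate(readings) if v < low or v > high]

# Illustrative heart-rate-style series with two out-of-band values.
alerts = threshold_alerts([72, 68, 185, 70, 40], low=50, high=120)
```

Here `alerts` catches the spike and the crash, but a gradual, clinically meaningful drift that stays inside the band would pass silently; that gap is what human pattern recognition over readable objects is meant to fill.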
And in that scenario, which is already coming fast upon us, would you rather have nurses staring at screens of multiple complex data flows, or at objects that can instinctively be read, isolated, and acted upon?
I’m going with the latter.