#HIMSS17 Perspective: Connected Healthbots Will Wake Up Clinical Decision Support

This article was first published on the HIMSS Conference site on January 29, 2017:

The computer “woke up once it got connected to other people.” –Anil Dash

On a recent broadcast of Krista Tippett’s OnBeing, Anil Dash talked about how, for the first half of his life, the computer was an island that wasn’t plugged into anything. But the computer “woke up once it got connected to other people” via the Internet. He jokes that the young people he now mentors find it hard to understand the use of a computer that doesn’t communicate with other computers, and they wonder what computer users did with those early machines: stare at the screen?

Those of us who follow the development of EHRs and their effect on the doctor-patient relationship don’t think there’s anything funny about a computer that doesn’t communicate with other computers. Unfortunately, in the health IT community, we’re still pondering what we could do ‘if only’ data could flow more freely between computers.

But waking up computers and using them to their fullest requires more than just creating a communications network between the boxes; it requires transmitting data that can be programmed or executed. My colleague at the InfoCommerce Group, Russell Perkins, who has been producing conferences on the theme of Data Content for over a decade, has long spoken of the power of “data that can do stuff.” HIMSS prefers the hashtag #PutData2Work. The sentiment is the same: we need data that are interoperable and can be communicated efficiently across networks, so that important information can be consulted at the point of need and integrated with other data to support health care decisions.

Stocks versus Flows

I view it as a stock-versus-flow problem. The major EHR vendors focus on creating record-keeping systems; they don’t specialize in the communications layer that moves information from one place to another. Their reluctance to develop—or even enable—inter-organization communication is similar to the reaction of computer hardware and software vendors when faced with the Internet in the mid-90s. Those vendors underestimated the changes that would occur once users were allowed to communicate and exchange data across a wide network of computers. It took a major external development effort to invent the Web. In health IT, we don’t have to reinvent the Internet or the Web; they already exist. But, based on events to date, we should look to a new class of vendors that understand data science and APIs to introduce enhanced communications and cross-institution data analysis. The legacy EMR/EHR vendors that market record-keeping systems don’t have the skills, imagination, or urgency to extend their field of competency into data exchange.

Enter the Bots

Technology continues to make it easier and faster to find information. Early search engines required training courses to master the interfaces. Then Google came along and offered a simple interface that sat atop algorithms that aimed to present the most relevant results first.

Now, we’ve entered a new phase, where the search box itself is being replaced by bots that surface relevant information without requiring a search query. For example, with my Google Pixel phone, all I have to do is ask Google Assistant to find me direct flights from point A to point B and I’ll receive verbal information about the length of the flights, along with search results for specific flights that I can click on. If I’ve previously searched for flights to that destination, Google will remember the dates I chose and use them as the starting point. Once I’ve booked the flight, I can easily retrieve the information with a quick request to Google. Note, this is an incremental step that focuses on advances in the chat interface and builds upon data extraction and presentation tools that Google has been developing for years.

Through cognitive computing advances, consumer chatbots like Google Assistant, Amazon Alexa, and Apple’s Siri have matured to the point where bots can learn what is important to a user and continuously hone their ability to personalize their services. The popularity of these consumer chatbots is paving the way toward adoption by other segments; telemedicine interfaces are a good example.


I define healthbots as special purpose bots that perform automated context-specific information retrieval and analysis services via voice, text, or other natural language interface.

One of the more obvious areas where healthbots could improve efficiency in health IT is in EHR interfaces. Bots are ideal for navigating through complicated structured databases. Healthbots could speed up the process of finding the right location for both data entry and data retrieval and they could perform countless advanced operations, specialized for each use case.

Still, to achieve higher-value implementations, healthbots need to operate across broader collections of data than what is stored in a single healthcare provider’s EHR system. With the massive amounts of data being produced by medical and life science researchers, devices, sensors and health data analytics services, healthbots will need to work in tandem with APIs to enable the degree of information flows needed to take clinical decision support solutions to the next level, where data from multiple sources can be interpreted in context and delivered within the clinician’s or patient’s workflow.
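To make this concrete, here is a minimal sketch of the kind of aggregation a healthbot could perform, assuming it has already pulled FHIR-style Observation records from an EHR API and a home sensor (the field names follow the public FHIR Observation resource; the summarization logic and sample readings are purely illustrative):

```python
# Sketch: a healthbot turning FHIR-style Observation records from
# multiple sources into one natural-language summary it could speak
# or display within the clinician's or patient's workflow.

def summarize_observations(observations):
    """Render a list of FHIR-style Observation dicts as a single summary string."""
    lines = []
    for obs in observations:
        name = obs["code"]["text"]                       # e.g. "Systolic blood pressure"
        value = obs["valueQuantity"]["value"]
        unit = obs["valueQuantity"]["unit"]
        source = obs.get("device", {}).get("display", "clinic record")
        lines.append(f"{name}: {value} {unit} (from {source})")
    return " | ".join(lines)

# Example: one reading from an EHR, one from a home sensor.
readings = [
    {"code": {"text": "Systolic blood pressure"},
     "valueQuantity": {"value": 128, "unit": "mmHg"}},
    {"code": {"text": "Heart rate"},
     "valueQuantity": {"value": 72, "unit": "beats/min"},
     "device": {"display": "wearable sensor"}},
]
print(summarize_observations(readings))
```

The interesting work in a real healthbot lies upstream of this function, in the APIs that retrieve and reconcile the records; the point of the sketch is that once the data arrive in a standard structure, combining sources becomes straightforward.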

What to Watch at HIMSS17

Each year that I’ve attended HIMSS since 2010, I’ve sought out signs of progress in integrating evidence-based medical information into the clinician’s workflow. Good progress has been made in appending links to relevant content at the point-of-need within patient records (via Infobuttons & more recently CDS Hooks), but there’s still a long way to go before we see a clinical decision support resource that pulls together relevant data on a topic from diverse collections of data and includes collaborative features that help construct a learning system based on collective experience. The biggest limiting factor has been business models for licensing or sharing data across competing publishers and my hope is that APIs will continue to advance in how they manage business terms for data licensing or sharing.

This year at HIMSS17, I’ll be seeking out companies that produce healthbots, as well as those that provide APIs that embed business rules for exchanging data. APIs are a critical piece of the solution needed to wake up data stored in siloed repositories so that healthbots can reinvent the way we communicate with broader collections of health data.

It’s important to recognize that siloed repositories exist across the entire health information landscape, not just in EHR/EMR systems. Medical and life science research information is scattered across many public and private publishing and research organizations. Publishers and information services companies try to aggregate and synthesize results of new research, but there’s no single source for such a complex and constantly-changing set of data, especially considering the number of potential use cases for the data.

We need a solution that allows collaboration across the many sources of data to create learning systems in healthcare. Medical publishers that are implementing APIs to their data silos will definitely catch my attention at HIMSS17.


HIMSS17 Social Media Ambassador Janice McCallum on moving patient engagement beyond paternalistic compliance

This interview was originally published on Feb 1, 2017:

Healthcare IT News asked the HIMSS17 Social Media Ambassador what she’s looking forward to at this year’s show and what gets under her skin, and she reveals facts about her professional background that even devout followers might not yet know.

Q: What are you most looking forward to at HIMSS17?
 I’m looking forward to Ginni Rometty’s (CEO, IBM) opening keynote this year. IBM Watson Health has the might to create and market breakthrough technologies and it is exciting to see a woman, Deborah DiSanzo, leading that unit, as well as a woman leading the $80 billion parent company.

Q: What issues do you think are top-of-mind for your social media followers?
My social media followers look to me for evidence of big trends in health IT and healthcare delivery models. At HIMSS17, I expect concerns about how the new administration will affect regulations and reimbursements will be top of mind.

Q: Who’s your favorite healthcare hero? Why?
 I’ve never been one for hero worship; instead I reserve my admiration for all the people involved in healthcare who exhibit empathy and understand that what may be right for one patient may not be right for another patient. I include clinicians, researchers, other industry insiders and patient advocates in this group and I can say without hesitation that all of my fellow SMAs, past and present, are stellar exemplars of the people I admire!

Q: What’s your pet peeve? (Either on- or off-line?)
 My pet peeve is the disconnect between what providers and vendors call patient engagement programs and what patients actually need to become more engaged with their healthcare providers. For starters, patients need to have a voice in their care and they should have full access to data related to their care, including their complete health record.  Without fully including patients in their own health care decisions, patient engagement programs are nothing more than paternalistic compliance programs.

Q: What is something your social media followers do not know about you?
Most of my social media friends and followers don’t know about my early hands-on experience with data modeling, which includes work at the OECD in Paris, the Urban Institute in DC, and in graduate school in Chicago. There are lots of stories I can tell, from transferring data from mag tape to mainframes, to licensing data from Alan Greenspan, to staying awake powered by coffee to run econometric models all night! While studying for my MBA, I worked for my econometrics professor, John Abowd, who is now chief scientist leading research and methodology at the US Census Bureau.


HIMSS17 Schedule, Meetups & Themes

Navigating the HIMSS annual conference is a challenge. This year, I’ve booked almost all of my time in advance—from dawn to way beyond sundown. I’ll wear comfortable walking shoes, since I’ll likely be speed-walking from one meeting to another across the exhibit hall in near constant motion for 3 days. Here are a few places you can catch me during the conference:

Monday, Feb 20:

11 - 11:45 am Meet the Social Media Ambassadors, HIMSS Spot Lobby C

evening           HIStalkapalooza

Tuesday, Feb 21

11 - 11:45 am HealthITChicks meetup, HIMSS Spot Lobby C

 4 -  5 pm        New England HIMSS Social Event at Lenovo Booth 6170

 6 - 8 pm         New Media Meetup, Cuba Libre Restaurant (will be late arrival!)

Wednesday, Feb 22

11 - 11:30 am  Social Media Ambassador Debates, HIMSS Media booth 2123

                        (Thrilled to be paired with Dr. Rasu Shrestha, Chief Innovation Officer at UPMC, for a debate/discussion on physician engagement with technology)

  2:30 pm         Facebook Live interview at Conduent Health booth 951

  5:00 pm         The Walking Gallery meetup, sponsored by Conduent Health


See my industry perspective article on healthbots and advances in clinical decision support here:

Look forward to seeing a few tens of thousands of other healthIT enthusiasts next week! You can reach me at



Searching for Healthbots That Advance Research and Clinical Information Discovery at HIMSS17

Improving the flow of information is a consistent motivator in everything I do in my professional life. My early experience includes work as a researcher and product manager at a pre-Internet-era search engine, and my consulting practice has since focused on helping information-centric companies disseminate their content more effectively within the context of their business objectives. With this long view of the publishing and information dissemination segments in mind, I’d like to offer some observations and predictions for trends at HIMSS17.

1)      Growth in usage of digital devices and sensors will be a catalyst for progress in interoperability. This is a safe prediction, but I include it because it sets the stage for my other predictions. With the proliferation of devices and sensors, all of which produce data, we need some rationalization in the way the data are recorded and integrated for analysis. It won’t be sufficient to suggest that IT systems manage each device and its data separately. The devices will have to interact with other devices and with EHR & other systems. Another way to state this prediction: as the Internet of Things (IoT) develops into the Industrial IoT, transmitting data in multiple directions in real time for clinical and research purposes will be commonplace. Through consolidation among startups and the entry of big players, more resources will be devoted to resolving health data interoperability issues.

To make sense of all of the data being produced by sensors and other devices, we will need information systems that can interpret the data in context (i.e., AI/cognitive computing systems that incorporate machine learning techniques). That brings me to my second prediction:

2)      Healthbots increasingly become the new interface to health information & health data for patient information. Chatbots have evolved from simple voice recognition technologies to cognitive computing interfaces that can execute complex commands and improve their utility over time with machine learning technologies. I expect success in the consumer space via Apple’s Siri, Google Assistant, Amazon’s Alexa and other examples to carry over to the patient engagement and patient education space quite rapidly, although a secure channel will be required for healthbots, whether the bot uses a voice, haptic, or typing interface. Telemedicine services represent an obvious segment where chatbot interfaces are already in place.

Applications in clinical decision support for professionals will emerge in areas where the knowledgebase is complex, but mostly contained to similar datasets (e.g., EHRs and medication reconciliation use cases). Chatbots make sense, too, in areas where hands-free use is important. But, overall, adoption within clinical enterprises will be hampered by data access issues and will take longer to reach wide acceptance.

See for an excellent round-up of opinions from industry leaders on the future of chatbots/healthbots.

My final prediction is a cross-industry trend that will improve information flows in general, but I address it here in context of clinical decision support (CDS).

3)      Information discovery will no longer require an active search. Search will still exist, but it will exist primarily in the background. In fact, Susannah Fox, former CTO at HHS, once called search “wallpaper technology, something we don’t even see anymore, yet it’s clearly an activity worth discussion”[1]. [This prescient quote comes from a post written in early 2010; Susannah is one of the best prognosticators in health care, after all!]

This shift from blank search boxes and look-up tables toward surfacing relevant information based on context, prior behavior, collaborative filtering algorithms & other patterns has been occurring for some time. One example that bridges the search and discovery paradigm is Google’s inclusion of knowledge graph items that are displayed at the top of search results when one enters a disease such as ‘diabetes’ in the search box.  

Another example is the TrendMD model[2], which appends personalized contextual links to the article someone is reading. The links can be sourced from any of the 3,000+ sources within the TrendMD network of scholarly publishers and professional news sites, which allows related information from other fields or specialties to be surfaced, but offers the assurance that links won’t be sourced from unwanted advertising sites. As with machine learning-enhanced healthbots, the quality of the related links improves over time with increased usage, as the algorithms learn about an individual’s preferences and gain knowledge from the broader community of users.

Closer to home for the HIMSS audience, CDS Hooks, an open standard within the SMART on FHIR framework[3], will advance the clinical decision support goal of delivering the right information to the right person at the right time in the right format within the right channel. However, as described above, cognitive computing and machine learning technologies can take this type of information alert to the next level and act on the data that are surfaced. It will take time for executable CDS to become widespread; mistakes are too costly and clear rules for executing clinical orders aren’t sufficiently established yet to create workflows that are generally acceptable.   

At this point, I’d like to insert a cautionary note about the importance of privacy and transparency in CDS and healthbot systems. Bots are becoming popular and can be rather addictive when they learn from large numbers of information sources and deliver personalized results. For example, it’s great when Google Maps redirects us around traffic accidents in near real time, but the downside of a mistake there, say a 10-minute detour, doesn’t compare to the downside of an incorrect dosage or incorrect rehab instructions. We don’t want to become too dependent on bots without understanding how they calculate outputs and maintain the privacy of the individuals using them.

At HIMSS17, I look forward to reporting on notable advancements in these three areas and to putting the whole HIMSS experience in context of improved information flows and decision support for clinicians, researchers, and patients. 

See also this recent Firetalk video chat on chatbots and healthbots with Chuck Webster, MD and me:  



[2] TrendMD is a current client.

[3] See my blog on this topic:


Health IT Infrastructure Enables Clinical Decision Support within Workflow

“Infrastructure enables innovation” –Mignon Clyburn, FCC Commissioner 

I like this quote by Mignon Clyburn that Rob Havasy used in his presentation at the New England HIMSS National Health IT Week event last evening in Boston. People often balk at the effort and expense required for large infrastructure projects (remember, I’m from Boston and lived through the Big Dig!). Nonetheless, a strong reliable infrastructure is essential to establishing the basis for a vibrant and innovative ecosystem. 

Since attending my first national HIMSS meeting in 2010, this has been my consistent refrain: we need to establish foundational health IT infrastructure so that we can move on to disseminating information more efficiently and enabling advanced analytics. Large scale outcomes analysis and population health management simply aren’t feasible without a basic layer of data organization and management provided by open standards and interoperable systems. 

Much has been achieved in establishing the core record-keeping infrastructure. Currently, we’re making good progress in establishing interoperability standards for basic data exchange. Still, we need to go further than simple data exchange; the data that are exchanged have to be executable if we want to build real-time clinical decision support applications. In other words, we need a higher level of data interoperability that includes sufficient metadata to enable real-time integration into analytics systems for population health management analysis, diagnostic support systems, and the like.
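As a hedged illustration of the difference between simple exchange and executable data, compare a free-text lab result with one that carries standard codes. The layout below follows the public FHIR Observation resource; 2339-0 is the LOINC code for blood glucose and mg/dL its UCUM unit, while the checking function is my own sketch, not part of any standard:

```python
# A lab result as bare text vs. as interoperable, "executable" data.
# Only the coded version carries enough metadata for an analytics or
# decision-support system to act on it without human mediation.

bare_text = "glucose 95"   # a human can read this; software cannot safely act on it

coded_observation = {
    "resourceType": "Observation",
    "code": {"coding": [{"system": "http://loinc.org", "code": "2339-0",
                         "display": "Glucose [Mass/volume] in Blood"}]},
    "valueQuantity": {"value": 95, "unit": "mg/dL",
                      "system": "http://unitsofmeasure.org", "code": "mg/dL"},
}

def is_machine_interpretable(obs):
    """True when the result carries standard codes a downstream system can act on."""
    coding = obs.get("code", {}).get("coding", [{}])
    return (coding[0].get("system") == "http://loinc.org"
            and "code" in obs.get("valueQuantity", {}))

print(is_machine_interpretable(coded_observation))  # True
```

The extra fields look like overhead, but they are exactly the metadata layer that lets real-time population health analysis and diagnostic support systems consume data from many institutions without bespoke mappings.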

CDS hooks

One of the initiatives in the health IT standards domain that I find promising is the CDS (clinical decision support) Hooks effort spearheaded by Josh Mandel, MD, a health informatics researcher at Harvard Medical School & Boston Children’s Hospital[1]. CDS Hooks works within the SMART on FHIR ecosystem to send notifications of information sources that may be of value to the user in real time. Users don’t have to know in advance that resources are available; instead relevant resources are presented within the user’s workflow for them to consult at their option.
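As a sketch of what this looks like on the wire, here is the shape of the response a CDS Hooks service might return when the EHR fires a patient-view hook. The "cards" structure follows the public CDS Hooks specification; the service name, card content, and URL are invented for illustration:

```python
# Sketch of the JSON a CDS Hooks service returns when an EHR fires a
# "patient-view" hook. The EHR renders each card inside the clinician's
# workflow; the clinician can follow the link or ignore it.
import json

def patient_view_service(request):
    """Return one informational card pointing to a relevant resource."""
    card = {
        "summary": "New hypertension guideline may apply to this patient",
        "indicator": "info",                              # severity of the card
        "source": {"label": "Example Evidence Service"},  # hypothetical publisher
        "links": [{"label": "View guideline summary",
                   "type": "absolute",
                   "url": "https://example.org/guideline"}],
    }
    return {"cards": [card]}

response = patient_view_service({"hook": "patient-view", "patientId": "123"})
print(json.dumps(response, indent=2))
```

The appeal of the design is that the publisher of the evidence resource only has to stand up a small service like this one; the EHR decides when to fire the hook, so relevant resources surface without the user searching for them.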

For the most part, CDS resources have been important reference sources for academic and medical researchers, but their usage by practicing clinicians has remained limited. To move from being “nice to have” reference sources to truly achieving the goal of “making the right decisions as easy as possible to come by, and as easy as possible to execute”[2], clinical decision support tools need to be embedded in the workflow of the clinician, patient, or other decision maker. There are still a lot of interoperability issues to work out, but I plan to watch developments in CDS Hooks and to encourage publishers of evidence-based databases and other resources to explore how they can connect their resources to the SMART on FHIR ecosystem.

Delivering the right information to the right person at the right time in the right format via the right channel (the 5 rights of clinical decision support) enables better decisions and supports improved information flows to all stakeholders, including patients. Advancements in core health IT infrastructure and improved interoperability standards will help make these 5 rights an everyday practice. That’s why #IHeartHIT.


[1] This interview with Josh in Healthcare Informatics provides a useful introduction to CDS Hooks:

[2] Jonathan Teich, MD quoted in, June 14, 2006.