If I said you were illiterate in your own culture or native language, you would likely feel insulted…take umbrage. You might brood and mull over the notion and the extent of its possibilities, as if some small voice were telling you that I might be right.
You might even be moved to quietly reflect on it, to learn how it might be limiting you, and whether additional effort and change are needed on your part…to investigate and learn: to grow.
The practice of medicine is arguably described by and subject to social constructs—healthcare is often cited as an example of a complex sociotechnical system—with a culture, a language (medspeak), and difficult problems to solve.
There is talk of healthcare being deemed a basic human right, and some believe that a newly crafted social contract is merited by the advent of artificial intelligence—given its human-like potential to be biased, make bad decisions, or be used for nefarious purposes.
In recent framings of data and digital transformation in healthcare (much as the founding fathers framed the Constitution), the path to unbiased equity is kindled by data democracy—democratization being the table stakes for progress along a virtuous path—furthering the social notions surrounding artificial intelligence.
Moreover, to gird and comport ourselves, there are myriad dictionaries, books, journal articles, and online materials in which to look things up. Science progresses, driving a continual need to “keep up” lest any knowledge gap grow too great…breaching into illiteracy.
In some cases, the issue is not being well-read as a physician, but being capably enabled for the various uses and implementations of the necessary knowledge—with what’s necessary being defined largely by the culture surrounding safety, efficacy, efficiency, and privacy. And so, the verge between necessary knowledge and implementation of a new “tool” is a space—a thin wedge—for literacy, training, rehearsal, and testing. Unfortunately, as legacy culture has been built on the failing chassis of fee-for-service healthcare, there are a lot of “solutions” vying for and trying to enter that thin wedge.
So why then is the “blog-giest” recommending data literacy as the next foundational strategic chess move for the adoption of advanced analytics and, eventually, AI? To those who practiced medicine for entire careers with their expert minds, paper and pen, along with fax machines, this AI stuff doesn’t seem…necessary.
The seemingly inexorable advancement of technology creates a tandem discomfort that must be managed either by a tailspin into obsolescence (deny, dismiss, delay) or by additive skilling to resolve the discomfort—for, as Wilbur Wright noted, “It is possible to fly without motors, but not without knowledge and skill.”
Though there was resistance to the implementation of percussion and the stethoscope in the days of Laennec, those technologies were perceived as eventually reducing, not amplifying, certain diagnostic biases of the day—and they certainly were not attended by calls for social contracts and data democratization (though they were perceived to impinge on patients’ physical privacy, given the mores and medical culture of the time).
Tee up some of the more recent technology candidates of the past 15 years—smart stethoscopes, portable pulse oximetry, smartphone-based ultrasound, home-based monitoring (telemetry and EKGs), and accelerometry, to name a few—and we are moving toward a data exigency in which failure to implement, blissful naivete, or data illiteracy will be perceived as negligent—and might actually be: particularly if a technology, via the “retrospectoscope,” could be shown to have flagged an adverse event early enough to intervene.
Given the arc subtended by artificial intelligence—broadly impacting both low-level repetitive process automation with “digital workers” and the many more complex, high-level use cases “north” of those, where discernment and classification (pretty much the bailiwick of medicine) are necessary, scaling out across many systems and workflows (i.e., this ain’t your granddaddy’s stethoscope)—the argument for literacy is more compelling than with many other technologies, which do not discern, classify, or decide apart from significant human interaction in those choices.
Some level of literacy is needed for the subjects of a realm (or culture) to align on consensus social behaviors and collective responsibilities—a social contract. There are organizations to help with literacy in artificial intelligence; ABAIM is a great example.
The time to study up is upon us. Stay literate, my friends.