For nearly 10 years, Tuck professor M. Eric Johnson has prowled the Internet searching for data leaks from supposedly secure systems. He first did this for the banking industry, and he found so many customer account numbers and pieces of internal correspondence that he was called to testify before Congress on the problem. Three years ago, Johnson, the Benjamin Ames Kimball Professor of the Science of Administration and director of Tuck’s Center for Digital Strategies, turned his attention to the health care sector and discovered huge piles of sensitive patient data readily available on public peer-to-peer networks.
Johnson’s latest health care research pivots from ringing alarm bells to crafting solutions. His first significant finding is that poor software usability is one root cause of data leaks. A second line of inquiry, still a work in progress, indicates that investments in information security pay off best when made proactively and voluntarily. The stakes are high: breached data costs the industry $6 billion per year, a sum piled onto an already astronomical health care tab, so Johnson’s work comes at a crucial time.
“Usability” is a term of art in the computer software business. It’s a measure of how intuitive, helpful, and just plain easy a particular program is to use. Apple’s iTunes, for example, might score high marks for usability. On the other hand, if manipulating a piece of software is laborious, confusing, and frustrating—you know it when you see it—its usability suffers. The problem for the health care industry, which is highly fragmented and has a wide range of user sophistication, is that “when software is not usable, we create workarounds,” says Johnson. And workarounds—transferring data from a hospital’s proprietary billing software to an Excel spreadsheet, for instance—put data in a portable format that can easily leak out to the Internet in various ways.
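To make the workaround concrete, consider a minimal sketch of how a few lines of code can move records out of an enterprise system and into a portable file. It is purely illustrative; the database, table, and column names are invented for the example and do not come from Johnson’s study.

```python
# Hypothetical illustration of a usability "workaround": a few lines
# of Python pull billing records out of an enterprise system and into
# a portable spreadsheet file. Database, table, and column names are
# invented for the example.
import csv
import sqlite3

# Stand-in for a hospital's proprietary billing system.
conn = sqlite3.connect("hospital_billing.db")
rows = conn.execute(
    "SELECT patient_name, diagnosis, insurance_id, balance FROM billing"
)

# One loop later, protected health information sits in a plain CSV on
# someone's desktop: easy to email, sync, or share by accident.
with open("billing_export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["patient_name", "diagnosis", "insurance_id", "balance"])
    writer.writerows(rows)
```

From there, the export is portable in exactly the sense Johnson warns about: it can travel by e-mail, by USB drive, or through a misconfigured file-sharing client.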
Johnson identified two categories of data put at risk by the usability problem. The first he calls “born-vulnerable”: data created in an insecure format, such as a nurse’s notes typed into a Word document. The second is “moved-vulnerable” data, information that began in an enterprise system but was transferred to a more usable form. The most common types of moved-vulnerable data, according to Johnson’s research, are spreadsheets built from human resources, operational, research-and-analysis, and financial data. In the latter category, Johnson found spreadsheets from a Texas hospital chain, leaked to peer-to-peer networks, that contained 20,000 patient files, each with 82 fields of information such as diagnoses, insurance numbers, addresses, and phone numbers. “Intensely personal stuff,” Johnson says, “but information that could be used to drive a lot of fraud, too.”
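Files like those are exactly what a scan of public file-sharing networks can turn up. The sketch below is a hypothetical illustration of that kind of search, not Johnson’s actual methodology; the folder path and identifier patterns are assumptions made for the example.

```python
# Hypothetical sketch of flagging "moved-vulnerable" spreadsheet exports:
# scan a shared folder for CSV files whose contents match patterns for
# sensitive identifiers. Paths and patterns are assumptions for the
# example, not Johnson's actual methodology.
import csv
import re
from pathlib import Path

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")          # U.S. Social Security numbers
PHONE = re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b")  # U.S. phone numbers

def looks_sensitive(path: Path, sample_rows: int = 50) -> bool:
    """Return True if the first rows of a CSV contain identifier-like values."""
    with path.open(newline="", errors="ignore") as f:
        for i, row in enumerate(csv.reader(f)):
            if i >= sample_rows:
                break
            text = ",".join(row)
            if SSN.search(text) or PHONE.search(text):
                return True
    return False

# Scan the kind of folder a file-sharing client might expose publicly.
for csv_file in Path("shared_folder").glob("*.csv"):
    if looks_sensitive(csv_file):
        print(f"possible leak of patient data: {csv_file}")
```

A real analysis would need to be far more careful about false positives, but the sketch makes the underlying point: exported spreadsheets are both easy to find and easy to recognize.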
Leaks of born-vulnerable files may be high in number, but each usually exposes just one patient. Moved-vulnerable leaks are less common but more dangerous: a recent leak from Health Net, for example, affected two million people.
If the answer to these breaches is better usability, then the entire health care industry, from the solo-practice pediatrician to hospitals and health insurance giants, needs to come up with better software. That will be neither easy nor cheap. Johnson’s other related research addresses this point. He examined the effects of information security investments made both before and after data breaches, and compared investments undertaken after internal decision-making with those imposed by external mandates. The results indicate that investments in information security work best when made proactively (before a breach) and voluntarily (not forced by a regulatory regime). Why don’t mandated information security regulations work? Because, Johnson says, they aren’t holistic in scope, and they often solve last year’s problems in a one-size-fits-all way that can’t easily be applied across the spectrum of the health care industry’s players.
Thankfully, Johnson asserts, the federal government appears to have created a framework for effective information security innovation through the 2009 Health Information Technology for Economic and Clinical Health Act (HITECH). The law not only imposes fines for violations of the privacy and security rules of the Health Insurance Portability and Accountability Act (HIPAA); beginning in 2011, it also makes physicians and hospitals eligible for significant incentive payments if they demonstrate “meaningful use” of electronic health records. These financial encouragements stand the best chance, Johnson says, of getting the health care industry to adopt more usable, and thus more secure, information technology practices.
Even after HITECH passed, Johnson found just as much private health care data on peer-to-peer networks as he had before. But he hasn’t lost hope. “I think it will improve slowly,” he says, “but it will be slowly, and by that I mean a decade.”