The Story Behind Meducation
When handwritten notes on a prescription bottle lead to increased access for millions of patients nationwide.
by Elena Langdon
Pivotal moments in one’s personal life don’t often lead to improved outcomes for millions of people, but in the story below, that’s just what happened. The spark was connected to a close family member and stemmed from a common immigrant experience. It also considered language and culture from the get-go. The result was increased meaningful language access for millions of patients in the United States, with a side of technology and a lot of innovation.
When Dr. Charles Lee was in medical school in California, he experienced one of those moments that would change everything for him professionally. And luckily for the millions of language-marginalized patients in the US, this moment also led to an innovative and effective way for patients to access clear prescription and medical device information.
During his medical training he went to visit his grandmother, who lived in Los Angeles’ Koreatown. He saw four pill bottles on her kitchen counter and, curious to see what she was taking, noticed one had a piece of paper taped to it, handwritten in Korean script he couldn’t read. He asked her about it, and she told him a story that is not uncommon among immigrants.
“She said she goes to a doctor who’s actually Korean, but a lot of what he says still goes over her head, and he gives her a prescription that she knows to take to the pharmacist,” Lee said. “She goes to the pharmacy, the pharmacist doesn’t talk to her and gives her pill bottles written in English she can’t read. So what she does is have somebody in her apartment complex write it on a piece of paper, and that’s what she tapes to the outside of the pill bottle. So I thought, ‘This can’t possibly be the way we do this! With around 25 million people who can’t speak English very well, and we expect them just to figure it out?’”
At the time, Dr. Lee talked to the FDA. He said they didn’t have any plans to develop mandates around prescription instructions in other languages because they were still struggling with English.
“To me, it’s just common sense that better health outcomes start with better informed patients,” he said in a 2013 interview. And health information is “often written in high-grade reading levels using medical jargon, and often only available in English. If it is available in another language, it’s usually only in Spanish.”
Dr. Lee became determined to help solve this problem with technology.
The Institute of Medicine (IOM) estimates that medication errors account for about 7,000 deaths annually and cost approximately $2 billion to the nation as a whole. Medication errors were also found to account for over 700,000 injuries each year. Language barriers have been found to be a significant cause of some of these deaths and injuries.
As a refresher for those of you unfamiliar with US language requirements in healthcare, know that while there are federal laws and guidelines for providing meaningful language access, the mandates are unfunded and not actively enforced. Title VI of the 1964 Civil Rights Act forbids discrimination based on national origin, and Executive Order 13166, signed in 2000 by President Bill Clinton, specifies that “national origin” implies language. The Department of Health and Human Services’ (HHS) Culturally and Linguistically Appropriate Services (CLAS) Standards, which a healthcare organization must meet to obtain and maintain its accreditation with the Joint Commission, specify what meaningful access is and make recommendations about how to meet the standards.
Some states, such as Massachusetts, Oregon, and California, have additional laws, but none of them require insurers to reimburse for the often expensive services, which include written translation, bilingual providers, and “qualified” interpreters. (Yes, the terms are vague and do not define who can provide the services or how they should be tested or trained.)
In a fascinating December conversation, Dr. Lee spoke about all the different projects he has developed throughout the years, which he did even as he continued to practice medicine. In 2005 he received four National Institutes of Health (NIH) grants to develop language communication tools for medical professionals. “We did a lot of different things to see what would work,” he said.
He started with ProLingua, a bedside communicator for nursing staff who have access to interpreters but typically don’t call them, especially for short non-medical exchanges. It was a verbal communication device with which patients could respond to simple questions about going to the bathroom or needing water, in their own language. The questions and answers were pre-recorded. Yet because it required dedicated hardware, so-called “COWs” (computers on wheels, or computers on rolling carts), and hospitals vary widely in their setups and use of hardware, ProLingua proved tough to sell. “After a lot of R&D and market research we found that the commercialization of building up a marketplace for that was much more limited and the sales process was complicated.”
After that came a discharge instruction translation program in eight different languages, to help patients continue their care at home. This project met with the same fate as ProLingua, but Dr. Lee hasn’t lost hope it will be picked up in the future. A third project, AMLETA (Automated Multi-Language Emergency Triage Assistor), focused on emergency triage situations. “There were several cases where people would come to the emergency room with [for example] chest pain, they didn’t speak English, so they had to wait for the interpreter, and there was one hospital […] where people were walking by a person and didn’t know that he was dead.” The idea behind AMLETA was to prevent such tragedies, in which language-marginalized patients are not triaged in time. If providers can quickly assess a patient’s major symptom and determine whether immediate treatment is needed, a lot of unnecessary suffering can be avoided. Most emergency rooms have interpreters on call for at least a few languages, but this does not mean that one will be available for every patient in every language immediately, and even over-the-phone interpreting (OPI) cannot be counted on 100% of the time. A pre-translated set of questions and answers to quickly assess the level of threat to a patient’s life seems like a dream plan, but it too was met with low marketplace demand.
The final project Dr. Lee developed was both closer to his original reason for looking into how to increase language access in healthcare and more successful in the marketplace. The difference? In litigious America, the project took off for the reason that many initiatives do when an unfunded mandate like Title VI is involved: a civil rights complaint by a Spanish speaker. The complaint was against Medco, a large pharmacy management service that had some of the industry’s biggest clients, like CVS, Rite Aid, and Costco. In 2009, Medco and New York’s Attorney General settled, and the company agreed to a series of actions to promote language access across its broad client base and among language-marginalized patients. That’s right: Because of a lawsuit in one state, millions of patients nationwide would soon have access to prescription instructions in their preferred language. For Dr. Lee, the time had finally come.
So what’s the best approach to centralizing and streamlining AE reporting for staying ahead of authority deadlines and maximizing safety-signal insights?
Companies can take several practical steps. The first step is agreeing on a central capability and process with the right support from a specialist-managed service provider. This provider should have not just appropriate language skills across all markets, but also the safety/PV experience to drive change in how the company handles and uses incoming AE information.
Additionally, companies often rely on cleaning up data after the fact, pulling it into data lakes to perform analysis and extract meaningful insights. But the ideal approach should be to achieve cleaner, more centralized, reliable data earlier in the AE data capture process, putting the organization in a better position for success — both in regulatory compliance and business intelligence.
Pharmaceutical companies should also invest in stronger and more consistent connectivity between dispersed operations and information silos. One option here might be to build a portal through which translations of AE reports are submitted and managed by a central service, which pushes this content directly into a central drug safety database for ongoing analysis. As long as there is a global, centralized view, this kind of development can help to streamline processes, introduce greater consistency and speed of turnaround, and reduce human touch points as information is collated and processed.
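To make the portal idea above concrete, here is a minimal sketch in Python. All names (AEReport, SafetyDatabase, submit_translation) are hypothetical illustrations, not part of any real pharmacovigilance system; the in-memory list stands in for a central drug safety database.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AEReport:
    report_id: str
    market: str            # country/region where the event was reported
    source_language: str
    translated_text: str   # output of the central translation service
    received: date

@dataclass
class SafetyDatabase:
    """Stand-in for a central drug safety database."""
    reports: list = field(default_factory=list)

    def ingest(self, report: AEReport) -> None:
        self.reports.append(report)

    def by_market(self, market: str) -> list:
        # A global, centralized view lets analysts slice by market.
        return [r for r in self.reports if r.market == market]

def submit_translation(db: SafetyDatabase, report: AEReport) -> None:
    """Portal endpoint: push translated AE content straight into the DB."""
    db.ingest(report)

db = SafetyDatabase()
submit_translation(db, AEReport("AE-001", "JP", "ja",
                                "Patient reported dizziness.", date(2024, 1, 5)))
submit_translation(db, AEReport("AE-002", "DE", "de",
                                "Mild rash after second dose.", date(2024, 1, 6)))
print(len(db.reports))          # 2
print(len(db.by_market("JP")))  # 1
```

The point of the design is that translation and ingestion happen through one central path, so every market’s reports land in the same structured store rather than in local silos.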
With regard to translation, centralizing resources could help with retaining experts in languages subject to high-volume demand, creating economies of scale. Specialized language service providers should do this anyway, and having experts on standby rather than buying in ad-hoc support can cut AE report translation costs by as much as seven-eighths.
Another benefit of standardizing and streamlining approaches to global AE report management is the potential for using artificial intelligence/machine learning (AI/ML). It is difficult to apply this kind of technology to scribbled notes jotted down by busy medics, or the wide variety of formats and languages through which AE information is submitted. But the greater the consistency in the record, the greater the scope for repetitive pattern recognition. At the very least, this could help with trend identification because the same terms and phrases could be picked up in translated reports.
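As an illustration of the simplest form of this, once reports are translated into a consistent language and format, even plain term counting can surface candidate trends. The sample reports, watched terms, and threshold below are invented for the sketch.

```python
from collections import Counter
import re

# Hypothetical translated AE report texts, already normalized to English.
reports = [
    "Patient reported severe headache and nausea after dose increase.",
    "Mild nausea, resolved within two days.",
    "Recurring headache; no other symptoms noted.",
]

# Invented watch list of symptom terms.
WATCHED_TERMS = {"headache", "nausea", "rash", "dizziness"}

def term_frequencies(texts):
    counts = Counter()
    for text in texts:
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        counts.update(tokens & WATCHED_TERMS)  # count each term once per report
    return counts

def flag_trends(counts, min_reports=2):
    # Flag any term appearing in at least min_reports reports.
    return sorted(t for t, n in counts.items() if n >= min_reports)

freqs = term_frequencies(reports)
print(flag_trends(freqs))  # ['headache', 'nausea']
```

Real PV systems would use standardized vocabularies such as MedDRA terms rather than raw word matching, but the principle is the same: consistency in the record makes the pattern countable.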
The scope for near real-time AI/ML-enabled trend capture is likely to grow, including voice-to-text capture, as digital/online/mobile AE reporting increases. This can provide more clarity in event recording — but only as long as information is captured and processed centrally, enabling a 360-degree view of global input. And the earlier and more reliable the trend information that’s captured, the greater the potential for making timely improvements to drug formulations, thus enhancing patient safety and improving clinical outcomes.
The final area where AI/ML technology has great potential is in processing content from threads in public online forums and published global medical safety literature. Together, these form a large and growing element of PV, and the input exists in all languages and character sets. This all needs to be interpreted, cross-referenced, and included in PV reports. Although “crawler” software can easily find and filter this content, key information still needs to be translated, examined, and amalgamated. This is labor-intensive work that can add up over the course of a year. A million words of such content is not uncommon for larger companies over a 12-month period.
The more that a single specialist resource can be tasked with centralizing and consistently managing the work process, the greater will be the scope for cost efficiencies — and also for breakthrough insights to be made in the field of drugs and medicines.
Elena Langdon is a staff writer for MultiLingual.