
A Solid Foundation for Structured Content Management

By Agnes Cwienczek

Pharma firms face a growing requirement to collect, manage, and publish data. Structured content management and authoring solutions can help. But many companies are trying to run before they can walk. They first need to get a handle on the huge volumes of unstructured data in many languages that are likely to be scattered across multiple, disparate systems.

Companies need to be smarter, faster, and more cost-efficient in the way they capture data and create critical content — from clinical trial study data to product labeling. Regulators are demanding increased safety and traceability data and stipulating increasingly stringent packaging and labeling requirements. At the same time, healthcare professionals and the public alike are calling for greater transparency around medical devices and pharmaceuticals.

Significant challenges lie ahead, and the road to effective multilingual data management is long and full of potholes. However, there are actions companies can take to smooth the path to effective information management:

1. Build in quick wins

To secure buy-in, look for some quick wins early on. It’s a good idea to start with a proof-of-concept in an area where documents are simpler. Choose factual rather than descriptive documents that are, preferably, in a single language. Chemistry, manufacturing, and controls (CMC) documents detailing drug composition may fall into this category, and they are usually written in a common language. Formula descriptions can easily be transferred to a table format with standard fields that describe a manufacturer or drug. Clinical study narratives — although historically written free-form — typically contain common standard text and tend to be generated in English as the default language.
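To make the idea of “standard fields” concrete, the sketch below shows one way a free-text formula description could be captured as typed fields instead. It is a minimal illustration only; the product, manufacturer, and field names are invented and do not come from any regulatory template.

```python
from dataclasses import dataclass, field

@dataclass
class Ingredient:
    name: str            # active substance or excipient (hypothetical)
    quantity_mg: float   # amount per dosage unit
    role: str            # "active" or "excipient"

@dataclass
class FormulationRecord:
    product_name: str
    manufacturer: str
    dosage_form: str
    ingredients: list[Ingredient] = field(default_factory=list)

# A hypothetical record: the same facts a free-text formula description
# would contain, but held in named, reusable fields.
record = FormulationRecord(
    product_name="Examplol 50 mg tablets",
    manufacturer="Example Pharma GmbH",
    dosage_form="film-coated tablet",
    ingredients=[
        Ingredient("examplol hydrochloride", 50.0, "active"),
        Ingredient("lactose monohydrate", 120.0, "excipient"),
    ],
)
```

Once the same facts live in named fields, the record can be validated, translated field by field, and reused across documents instead of being rewritten each time.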

Once companies have learned what works well and have discovered the benefits of using a standard, consistent format to capture and manage content, they can approach international labeling management in a structured manner, leveraging reusable master content.

Demonstrating specific measurable benefits in one area of the business can help justify the budget for further changes. It’s important, too, to get people on board and address their fears about change. Changing the way organizations manage content inevitably means disrupting the way people behave and work. Identifying corporate-level champions and creating a team to showcase successful initial use cases plays an important role in maintaining momentum.

2. Start small but keep the bigger picture in view

Companies should approach change with a clearly defined and focused business case in a single area. However, once benefits have been demonstrated, it’s important to move quickly towards the goal of creating a strong technical structure. Having a master data source capable of supporting current and future use of regulated product data from one end of a global organization to the other is a critical requirement.

Too often, project teams try to address an isolated problem by applying a new approach to data management, expecting that focus alone — and perhaps simple use of XML to publish the same information to different channels — will bring them their desired results. But unless they approach their goals in the context of a wider journey, such investment will likely see a poor and limited return. These goals should include creating a credible, accurate, up-to-date, and compliant master data source that can be continually and consistently relied upon across the organization.
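The sketch below pictures what “publishing the same information to different channels” via XML can look like in its simplest form, assuming the content already exists as a structured master record; the record, element names, and channels are hypothetical.

```python
import xml.etree.ElementTree as ET

# A hypothetical master record; in practice this would come from the
# organization's single source of regulated product data.
master = {"product": "Examplol 50 mg tablets",
          "strength": "50 mg",
          "indication": "hypertension"}

def to_channel_xml(record: dict, channel: str) -> str:
    """Render the same master record as XML for a given output channel."""
    root = ET.Element("label", attrib={"channel": channel})
    for key, value in record.items():
        ET.SubElement(root, key).text = value
    return ET.tostring(root, encoding="unicode")

# The same facts published twice, once per channel, without re-authoring.
print(to_channel_xml(master, "patient-leaflet"))
print(to_channel_xml(master, "regulatory-submission"))
```

The rendering step is the easy part; the wider journey the article describes is making sure the master record it reads from is trustworthy in the first place.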

3. Establish a standard dictionary

When undertaking the daunting task of transforming and retrofitting an organization’s existing content into new, structured templates, it is critical to create a standard dictionary for all content. A standard dictionary establishes set rules for referring to products and product data and defines content metadata that makes assets searchable and connectable to context.
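As a rough illustration, a standard dictionary can be as simple as a controlled vocabulary that resolves variant spellings to one preferred term and attaches searchable metadata to it. The terms and tags below are invented for the sketch.

```python
# A toy controlled vocabulary: variant spellings map to one preferred term,
# and each preferred term carries metadata that makes content searchable.
PREFERRED_TERMS = {
    "examplol hcl": "examplol hydrochloride",
    "examplol hydrochlorid": "examplol hydrochloride",   # German variant
    "examplol hydrochloride": "examplol hydrochloride",
}

TERM_METADATA = {
    "examplol hydrochloride": {
        "type": "active substance",
        "product_family": "Examplol",
        "languages": ["en", "de"],
    },
}

def normalize(term: str) -> str:
    """Resolve a free-text mention to its dictionary-preferred form."""
    return PREFERRED_TERMS.get(term.strip().lower(), term)

assert normalize("Examplol HCl") == "examplol hydrochloride"
```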

There is no getting away from the fact that this will be a vast undertaking that involves assessing content, de-duplicating repeat records, addressing subtle linguistic differences between versions of content, and so on. There are excellent tools that can help with this, by doing things like analyzing and comparing documents between countries and languages.

The positive impact of the change will begin to be felt in everyday activities once historic product data has been drawn down, cleaned up, transformed, and assigned a proper schema that maps relationships between data items, different-language versions of the same content, and so on. Additionally, processes must be put in place to ensure that ongoing information maintenance (edits, additions) adheres to the new structure.
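A minimal sketch of such a schema, assuming nothing more than a stable content identifier shared by a master item and its language variants, might look like this; the identifiers and texts are illustrative.

```python
from dataclasses import dataclass

@dataclass
class MasterContent:
    content_id: str       # stable key shared by all language versions
    element: str          # e.g. "indication statement"

@dataclass
class LanguageVariant:
    content_id: str       # links back to the master item
    language: str
    text: str

master = MasterContent("IND-0001", "indication statement")
variants = [
    LanguageVariant("IND-0001", "en", "For the treatment of hypertension."),
    LanguageVariant("IND-0001", "de", "Zur Behandlung von Bluthochdruck."),
]

# With an explicit key, every translation of the same statement can be
# found, compared, and updated together instead of drifting apart.
linked = [v for v in variants if v.content_id == master.content_id]
```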

4. Regularly review the direction of travel

Establishing common structures and templates for data helps put a hard stop to growing data complexity by imposing firmer controls over what data is captured from document authors, and how. Legacy formats and systems take time to sort out, but there comes a point when teams must stop creating and handling content “the old way.” By restricting data input to what regulators actually need, companies can start to curb the creation of free-form content.
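A toy example of that restriction: if the authoring template accepts only a fixed set of required fields, anything free-form is flagged at capture time. The field names below are invented, not taken from any specific regulation.

```python
# Only the fields the template requires are accepted, so free-form
# additions are rejected before they ever enter the master data source.
REQUIRED_FIELDS = {"product_name", "strength", "storage_conditions"}

def validate_entry(entry: dict) -> list[str]:
    """Return a list of problems; an empty list means the entry conforms."""
    problems = []
    for missing in REQUIRED_FIELDS - entry.keys():
        problems.append(f"missing required field: {missing}")
    for extra in entry.keys() - REQUIRED_FIELDS:
        problems.append(f"unexpected free-form field: {extra}")
    return problems

print(validate_entry({"product_name": "Examplol", "strength": "50 mg",
                      "marketing_blurb": "our best tablet yet"}))
```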

This helps keep everyone focused on building consistent, high-quality data with the potential for extensive re-use. As long as the data is kept up to date and accurate across its lifecycle it will remain trusted as a definitive information source.

5. Don’t expect overnight transformation

A high-quality, standardized global data structure is likely to take years to create. Ideally, it would be possible to analyze and transform content by tackling small sections at a time, comparing different sources to look for discrepancies or overlap. However, if respective systems and teams have captured data in different formats and with differing degrees of granularity, comparisons will require too much time. Other potential issues include data ownership. If content ownership has tended to exist at a document level rather than a source-data level, it may not be immediately obvious who should be driving any data transformation initiative.

It’s easy to underestimate the scope of what may be needed by an organization to put its house in order, especially after decades of working in fragmented ways with massive variation around how product-based data is captured and published. International firms will have amassed huge and highly dispersed volumes of data comprising multiple different formats. These will be of variable completeness and quality, and include considerable duplication and redundancy.

A roadmap to success

For any data transformation initiative to provide maximum long-term benefits, companies need to think in terms of building a roadmap to their desired destination, one that will help their operations run more smoothly once the way is clear. And this will take time, not least because there will be existing side roads in a poor state of repair and disused dead-end routes to be attended to as part of the new construction effort.

There are no shortcuts to structured, multilingual product information management. The pharmaceutical sector will benefit greatly from data-driven operational transformation, but only after completing the substantial groundwork required, which simply cannot happen overnight. 

Agnes Cwienczek is head of product management and consulting at Amplexor. Prior to joining Amplexor, Agnes worked at Merck in its global regulatory and quality assurance department. She received her master’s degree in information management from the University of Koblenz-Landau.
