
The Ideal Architecture for Digital Transformation

Know the ‘Current State’ of Your Business AND
‘Future State’ of Your Business in Real Time.

As covered in my last article, the first steps toward a digital transformation are to develop a strategy, get buy-in, and create a roadmap aligned with the business. Now, you’re ready to select a proof-of-concept pilot project. Typically, the pilot should be small enough to be completed over a 3-4 month period and use an Agile project management approach. An Agile approach promotes flexibility, collaboration, and continuous improvement throughout the project lifecycle—which is particularly important for the introduction of a new technology. In addition, smaller projects are more easily funded and quick wins can help justify bigger capital investments as the scope grows.

The process of digital transformation and the implementation of advanced technologies, such as Machine Learning (ML) and Artificial Intelligence (AI), require vast amounts of data. This data is not useful unless it can be interpreted in a meaningful way. Context adds meaning to raw data and transforms it into actionable information. We’ve all heard about the failures of Big Data projects due to lack of data contextualization.
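To make contextualization concrete, here is a minimal Python sketch contrasting a raw sensor value with a contextualized payload. The tag name, asset path, and field names are hypothetical, chosen only for illustration:

```python
# A bare number tells downstream systems almost nothing.
raw_reading = 72.4

# Adding context turns the same reading into actionable information.
# (Tag, asset path, and field names here are hypothetical.)
contextualized = {
    "value": 72.4,
    "units": "degC",
    "tag": "TT-101",                           # instrument tag
    "asset": "Enterprise/Site1/Bioreactor3",   # where it was measured
    "timestamp": "2024-05-01T14:32:00Z",       # when it was measured
    "quality": "GOOD",                         # whether it can be trusted
}

# A consumer can now interpret the reading without guesswork.
print(f"{contextualized['asset']}: "
      f"{contextualized['value']} {contextualized['units']}")
```

Any system that receives the second form, whether an ML pipeline or a dashboard, knows what the number means, where it came from, and whether to trust it.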

The ability to transmit and contextualize large amounts of data throughout the enterprise requires rethinking how OT networks are architected. Traditionally, the main focus has been to protect the data behind firewalls, the argument being that opening ports in the firewalls begins to compromise the integrity of existing cybersecurity systems. The most common OT network architecture is the vertically stacked Purdue model, where hardware and software solutions are siloed and vendor-locked. In this scenario, unlocking the data requires custom protocols and licenses, which is not a viable approach. A new architecture, better suited to large data volumes and enterprise-wide data buses, has therefore emerged: the Hub/Spoke model.


A Hub/Spoke architecture uses a lightweight protocol called MQTT, an efficient data transmission method called publish/subscribe (pub/sub), and data brokers that scale to high traffic volumes and provide redundancy. This network structure communicates seamlessly with cloud applications, which are already standardized on MQTT. A framework called the Unified Namespace (UNS) is superimposed on the Hub/Spoke network to further extend its capability as an enterprise-wide data hub. Together, a Hub/Spoke network and a UNS form the ideal infrastructure for an enterprise-wide data hub supporting Industry 4.0 and predictive analytics, like ML and AI. Hub/Spoke is the backbone; MQTT data brokers transport large amounts of data; and the UNS provides a contextualized framework for interoperability and seamless data access.
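To illustrate the pub/sub mechanics, here is a minimal in-memory sketch in Python. It is not MQTT itself (a real deployment would use an MQTT broker such as Mosquitto or HiveMQ, which adds QoS levels, retained messages, and clustered redundancy); it only shows the core idea: publishers and subscribers are decoupled through topics, and MQTT-style wildcards ('+' for one level, '#' for the remainder) let a single subscription cover many data sources. The topic names are hypothetical:

```python
def topic_matches(pattern, topic):
    """MQTT-style matching: '+' matches one level, '#' matches the rest."""
    p, t = pattern.split("/"), topic.split("/")
    for i, seg in enumerate(p):
        if seg == "#":
            return True
        if i >= len(t):
            return False
        if seg != "+" and seg != t[i]:
            return False
    return len(p) == len(t)

class Broker:
    """Toy in-memory hub. A production broker adds QoS, retained
    messages, persistence, and clustering for redundancy."""

    def __init__(self):
        self.subscriptions = []  # (pattern, callback) pairs

    def subscribe(self, pattern, callback):
        self.subscriptions.append((pattern, callback))

    def publish(self, topic, payload):
        # Publishers never address subscribers directly; the hub routes.
        for pattern, callback in self.subscriptions:
            if topic_matches(pattern, topic):
                callback(topic, payload)

broker = Broker()
received = []
# One subscription covers every cell on Line1 (hypothetical topics).
broker.subscribe("Site1/Line1/+/temperature",
                 lambda t, p: received.append((t, p)))
broker.publish("Site1/Line1/Cell2/temperature", 72.4)  # matched
broker.publish("Site1/Line2/Cell1/temperature", 68.0)  # not matched
```

Because publishers and subscribers only share topic names, adding a new consumer never requires touching the producing system, which is what lets this model scale across an enterprise.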


Ultimately, everything and everyone is plugged into the network. As a single source of truth, your UNS is a real-time virtual representation of the business, forming the basis for predictive analytics and actionable decisions. This real-time data feeds ML/AI: ML predicts future outcomes based on past patterns and the current state. The layers of the business are integrated and operate on data and information from all layers, in real time. Stakeholders know both the ‘current state’ and the ‘future state’ of the business in real time.

Now that you’ve been exposed to Hub/Spoke and UNS, you probably have some questions about your selection of a proof-of-concept pilot. Please reach out to me to start a dialog.

The Business Case for Digital Transformation

Unlock New Revenue Streams. Innovate Product Offerings.
Build Resilience Against Market Disruptions.

There are numerous white papers and surveys indicating that most CEOs believe data-driven transformation is crucial to their companies’ futures. Furthermore, the rise of generative AI (ChatGPT and other tools built on LLMs and ML) suggests that this technology is within practical reach. At the very least, manufacturers know they need to start investing in generative AI technologies in order to stay competitive.

The end-goal is that embedded AI and Machine Learning models will fuel predictive analytics and proactive decision-making—optimizing operations and maximizing quality, production, and yield.

We see and hear that companies do not have enough visibility into plant operations. For strategic planning and flexibility, a company needs to know the current state of its operations. Digital transformation empowers companies to optimize operations, enhance decision-making, and improve customer experiences through data-driven insights and automation. It enables companies to stay competitive in rapidly evolving markets by reducing costs, increasing efficiency, and accelerating time-to-market for new products and services. By leveraging advanced technologies, like AI and Machine Learning, businesses can unlock new revenue streams, innovate their product offerings, and build resilience against market disruptions.


Your first actions should be to invest in a clear strategy and vision aligned with your business and to secure executive sponsorship. Companies need a digital strategy to provide a clear roadmap for leveraging technology to achieve their business goals, ensuring alignment between digital initiatives and overall company objectives.

A well-defined digital strategy helps businesses navigate the complexities of digital transformation, prioritize investments, and address potential challenges like data security, integration, and change management. It enables companies to stay competitive by adapting to market trends, enhancing customer engagement, and continuously improving operational efficiency in an increasingly digital world.


Now that you understand the essentials to getting started on your Digital Transformation journey, I’ll next explain the ideal architecture needed for digital transformation to occur. In the meantime, please reach out to me with your comments or to start a dialog.

Paul Brodbeck to Present at Continuous Manufacturing Forum

September 17-19 | Princeton, NJ

Join us at the Continuous Manufacturing Forum, where Continua’s Paul Brodbeck will be speaking on Advances in Technology.

The Continuous Manufacturing Forum serves as a catalyst for compelling conversations, bringing together key opinion leaders in biotech and pharma. It unites executive leadership, directors, senior scientists, consultants, and a close-knit network of CDMOs, academic leaders, and equipment providers.

Paul Brodbeck
Chief Technologist
Continua

Augmenting an MES

This pharmaceutical customer has a large amount of data that needs to be contextualized and brought into their MES, but contextualization is not what an MES does best. Our project entails bringing that data into a Unified Namespace (UNS) ecosystem and then, once contextualized, back into the MES. If you want to use your data outside of the MES, for instance in an ERP or other business systems, the number of links in and out can grow exponentially. Our eventual goal is to get away from proprietary software solutions that sit in the middle of your stack. When you move to a UNS, you can use any software; it is easy to upgrade or switch, and the architecture is much more flexible. Contrary to what many people may think, security is better too. Each layer has a firewall and the data is well-protected; opening more ports inherently adds risk. By the nature of how a UNS works, getting IoT data out is less risky because you publish outbound with no need to open inbound ports. In fact, Gartner is now recommending this architecture in lieu of the traditional Purdue model.

Integrating Lab Automation Systems

Every pharmaceutical company uses manual laboratory automation systems, like the Nova FLEX cell culture analyzer, and frequently they need to perform calculations based on the results. Continua worked with this company to integrate their Nova FLEX data with their plant control system to automate the required actions based on the analysis results and to publish this data into the UNS for visibility across the business area.

Monitoring High-Value R&D Materials

This biotech company had already started its Unified Namespace (UNS) journey with a proof-of-concept project in mind, our recommended approach. As one of the most advanced companies in their industry, they wanted to use the UNS technology platform to move a manual process (monitoring and tracking its highly valuable R&D materials data throughout the earliest stages of drug development) to a cloud-based, contextualized, and accessible asset. UNS is so new that many companies do not yet understand it well, making a large-scale rollout too risky as a first step. Both the customer and Continua, while equally committed to UNS architecture, feel that these projects are best approached in an agile, iterative way, with engineering adjustments made to find the best path forward.

MES Data Informs Analytics

Manufacturing Execution Systems are a rich source of data for manufacturing analytics. They often provide information and context that are valuable for optimizing manufacturing and are not available in any other systems. However, extracting analytics data sets from these systems is challenging. Data generated by an MES is not structured and labeled for retrieval, but rather is organized in the context of manufacturing recipes which vary from site to site, product to product, batch to batch, and over time. The result is that many companies are not realizing the full value of their MES data to support analytics.

One way to attack this challenge is to hire a highly trained MES engineer to add code to export MES data and context in the format needed for analytics. While straightforward to implement, this approach is often time-consuming and expensive. It also requires constant upkeep and can result in MES and reporting performance issues. This approach can also add considerably more load on recipe authoring and testing, both initially and during lifecycle change management.

Another tactic is for the analytics tools to query the MES recipe directly. This approach can be faster to implement as it requires no recipe changes and can be tailored to the unique needs of the specific analytics platform. Unfortunately, because of the variability of recipe execution, changes to recipes over time, and limitations of the analytics software, these queries are often complex, brittle, and difficult to maintain. This solution can also result in duplicative efforts as custom queries must be crafted for each analytics tool. 

A third solution is to create a data presentation or abstraction layer, for example a Unified Namespace (UNS), as multiple systems want similar data and context from an MES. This third approach takes time and thought and may bring with it new systems and data flows—you will likely want to start with a pilot project. Done properly, the result can be a comprehensive and scalable solution that is robust and efficient to maintain. Depending on your situation, any of the above methods might be best.

At Continua, our focus is to understand what data you need and how you want to use it. That understanding then informs the solutions we recommend, whether it be a targeted point solution that addresses an immediate need or a broader, holistic solution to provide a foundation for a wide range of analytics goals.

Ideal Unified Namespace Pilots

Two ideal UNS pilots are Continuous Process Verification (CPV) and Machine Learning. CPV is a strong candidate because it involves comparing batches that may have been manufactured in different ways or at different locations. Customers ask, “How can I measure the quality of my product if the critical process parameters are measured differently?” Machine Learning is a great UNS example because you are bringing in data from multiple systems, and a UNS is a good way to feed those models.

Machine Learning has demonstrated the potential to extract insights and value from data; however, relative to other domains, manufacturing has been slow to realize this value. While pilot studies and proof of concept projects abound, scaling these solutions has been challenging. When asked, manufacturers cite inadequate data quality, poor data integration, and weak data governance as major reasons for the inability to easily scale solutions. Specifically, poor data integration and weak data governance are barriers that must be overcome to scale the integration of the multiple OT and IT systems required for deployment of Machine Learning solutions.

The concept of a Unified Namespace (UNS) directly addresses these barriers. A UNS draws on modern distributed technologies and architectures and combines them with hierarchical data naming and organization. The result is a centralized, standard presentation of OT data to enterprise systems. A UNS is a scalable approach that addresses data integration and governance issues, removing two of the main impediments to the wide deployment of Machine Learning models.
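As a rough sketch of the naming idea, the following Python shows how a hierarchical namespace gives enterprise systems one standard way to address plant data. The hierarchy paths are hypothetical, and a real UNS would sit on an MQTT broker with retained messages rather than an in-memory dictionary:

```python
class UnifiedNamespace:
    """Toy single-source-of-truth store keyed by hierarchical paths.
    Illustrates the naming/organization concept only, not a product."""

    def __init__(self):
        self._state = {}  # path -> last published value (current state)

    def publish(self, path, value):
        # Last value wins: the namespace presents current state, not history.
        self._state[path] = value

    def browse(self, prefix):
        """Return every node at or under a hierarchy prefix."""
        return {p: v for p, v in self._state.items()
                if p == prefix or p.startswith(prefix + "/")}

uns = UnifiedNamespace()
uns.publish("Acme/Site1/Fill/Line2/temperature", 72.4)  # hypothetical paths
uns.publish("Acme/Site1/Fill/Line2/pressure", 1.3)
uns.publish("Acme/Site2/Pack/Line1/speed", 110)

# An ML feature pipeline can pull a consistent slice of the plant
# without knowing which OT system produced each value:
line2 = uns.browse("Acme/Site1/Fill/Line2")
```

Because every consumer addresses data through the same hierarchy, adding a new analytics tool or ML model does not require a new point-to-point integration, which is precisely the scaling barrier described above.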

While UNS can enable wider deployment of Machine Learning in manufacturing, its value extends far beyond any single use case. Having a single source of truth for manufacturing data can reduce the cost, simplify the deployment, and improve the robustness of any analytics or reporting solution that depends on manufacturing data. UNS should be considered as a key component of any digital transformation strategy.

MES Maintains Chain of Identity

With Cell/Gene Therapy, the treatment must be provided ‘vein to vein’ within a limited period of time, with the same rigorous controls as any other biologic. This manufacturer, like others in the industry, faced additional process challenges in accommodating cell-count variations in the starting material. In addition, the customer’s autologous process requires maintaining a strict chain of identity across the supply chain, with no room for error.

In Cell/Gene Therapy, the uniqueness of each batch requires the process to be adjusted for every patient’s starting material. That’s where MES comes in: MES maintains control over a complex manual process; MES automatically tracks material usage to an individual patient lot and maintains the identity of that lot; and MES recipes help accommodate the varied growth timelines of varying starting materials.