Bioprocess development has long been recognized for its complexity, requiring rigorous experimentation, considerable time, and substantial resources. Yet a shift is underway.
With the advancement of data modeling and digital tools, the industry is now entering a new phase—one driven by predictive insights, accelerated timelines, and personalized therapeutics. At the heart of this transformation is process modeling: a discipline that is proving essential to biotech innovation.
From streamlining development to enabling scalable personalized medicine, process modeling is emerging as the backbone of digital bioprocess development. This concept is discussed in greater detail with Fabian Feidl in an episode of the Smart Biotech Scientist Podcast, hosted by David Brühlmann, founder of Brühlmann Consulting.
From Purpose to Platform: A Journey into Biotech and Modeling
The path to pioneering process modeling often begins with a deeply personal motivation. One such example is the experience of Fabian Feidl, whose career was shaped by early exposure to both the clinical side of patient care and the scientific side of biotherapeutics. After completing civilian service in a cancer rehabilitation clinic, he was inspired to study biotechnology, eventually earning bachelor's and master's degrees in pharmaceutical molecular biotechnology.
During a pivotal internship at Boehringer Ingelheim, Fabian encountered the striking concept of using genetically modified cells to produce therapeutic proteins. This sparked a passion that led to a research project at UCL, where he worked with robotic systems for downstream processes. The fascination with automation and high-throughput screening set the stage for his doctoral work at ETH Zurich, where he first encountered multivariate data analysis and process modeling.
Unlike traditional biotech curricula that emphasized experimental design, this exposure to modeling was a game changer. Later, during his MBA studies, he gained economic and managerial insight, further refining his understanding of return on investment and strategic implementation. This unique combination of experiences culminated in the founding of DataHow, where he now leads as CTO.
Why Process Modeling Matters More Than Ever
In many organizations, process modeling is still viewed with skepticism, often dismissed as overly technical or even unnecessary. But this perspective underestimates both the complexity of bioprocessing and the power of predictive tools.
Biotech processes rely on living systems, with a wide array of input variables—such as temperature, pH, and nutrients—and outputs, including product yield and critical quality attributes. These relationships are often nonlinear and difficult to decipher. Experiments are expensive and time-consuming; without modeling, navigating these interdependencies can feel like a trial-and-error process.
Process models allow teams to simulate outcomes, characterize design spaces, scale up processes, and even implement model-based real-time controls. By leveraging both experimental data and engineering knowledge, models reduce cost, accelerate development, and improve manufacturing outcomes.
Understanding the Difference: Data Analysis vs. Process Modeling
While both data analysis and modeling use data to generate insights, their objectives differ significantly. Data analysis focuses on exploring data to find trends, patterns, and answers to specific questions. It's retrospective in nature—what happened, and why?
Process modeling, by contrast, builds mathematical representations of the bioprocess itself. It connects inputs to outputs in a way that enables simulation and prediction. For instance, a model might describe how bioreactor temperature affects viable cell density, allowing scientists to forecast outcomes or back-calculate the conditions needed to achieve a specific glycosylation profile.
A powerful evolution of this concept is hybrid modeling, which integrates machine learning with engineering principles. Some process parts are described by known equations (like mass balances), while others rely on AI-driven pattern recognition. The result is a more robust, extrapolatable model that requires less data and performs more accurately than black-box approaches.
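As a minimal, purely illustrative sketch of the hybrid idea (not DataHow's actual implementation), the known mass balance can be integrated numerically while the specific growth rate comes from a data-driven component; all rate constants and ranges below are hypothetical:

```python
# Hybrid-model sketch: physics (mass balance) + data-driven rate term.

def mu_data_driven(glucose):
    """Stand-in for an ML component: a Monod-like growth rate 'learned'
    from data. In practice this would be, e.g., a trained Gaussian
    process or neural network; mu_max and k_s here are hypothetical."""
    mu_max, k_s = 0.04, 1.5  # 1/h and g/L (illustrative values)
    return mu_max * glucose / (k_s + glucose)

def simulate(x0=0.3, s0=20.0, y_xs=0.5, dt=0.5, hours=120):
    """Euler integration of the known mass balances:
    dX/dt = mu * X          (biomass growth)
    dS/dt = -mu * X / Y_xs  (substrate consumption)"""
    x, s = x0, s0
    for _ in range(int(hours / dt)):
        mu = mu_data_driven(s)
        x, s = x + mu * x * dt, max(s - mu * x / y_xs * dt, 0.0)
    return x, s

final_x, final_s = simulate()
```

Because the structural part (the mass balance) is fixed, the data-driven part only has to learn the rate term, which is one reason hybrid models tend to need less data and extrapolate better than fully black-box models.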
A Breakthrough in Personalized Medicine
Among the most promising applications of process modeling is in the field of personalized medicine. One breakthrough involved applying modeling to a CAR-T cell project, where it was possible to predict process outcomes across multiple passages based on donor-specific data and process conditions.
This capability enabled scientists to forecast whether a production run would succeed, allowing for better resource allocation and potentially sparing patients from ineffective or stressful treatment phases. Such predictive foresight could redefine how the industry approaches individual therapies, moving toward truly tailored treatments at scale.
From Raw Data to Action: Bridging the Decision Gap
Many biotech companies excel at data collection but struggle to make meaningful decisions from it. In too many cases, raw data accumulates without yielding clear insights or actionable outcomes. Process modeling addresses this gap.
The journey from raw data to actionable decisions begins with a well-designed experiment. Rather than relying on full factorial designs, more efficient sampling methods—like Latin hypercube sampling—can be used for nonlinear systems. Once data is collected, it undergoes transformation, outlier removal, and feature engineering before being used to train models.
But the real power comes after model training.
A validated model enables teams to conduct in silico experiments, making confident predictions and identifying areas where more data is needed. This creates an iterative loop between experimentation and modeling, where each informs the other, making decisions faster, more precise, and more strategic.
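The "identify where more data is needed" step can be sketched as a toy active-learning loop. This example assumes a single hypothetical factor (temperature) and uses disagreement across a bootstrap ensemble of simple line fits as a stand-in for model uncertainty:

```python
import random
import statistics

def fit_line(points):
    """Least-squares slope and intercept for a tiny 1-D dataset."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) or 1e-9
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    return slope, my - slope * mx

def next_condition(data, candidates, n_models=20, seed=0):
    """Propose the candidate run where bootstrap models disagree most."""
    rng = random.Random(seed)
    models = [fit_line([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    def spread(x):
        return statistics.pstdev([b * x + a for b, a in models])
    return max(candidates, key=spread)

# Two measured runs (temperature °C, titer g/L — hypothetical values);
# the loop proposes the candidate farthest from what the data supports.
data = [(31.0, 2.1), (33.0, 2.8)]
best = next_condition(data, [30.0, 32.0, 36.0])
```

Running the proposed experiment, adding the result to `data`, and refitting closes the loop: each round of modeling tells the lab where the next measurement buys the most information.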
Breaking Through Adoption Barriers
Despite the benefits, only a few biotech companies have embraced digital modeling tools. Several barriers stand in the way, falling into three broad categories: technical, financial, and organizational.
Organizational resistance is often the hardest to overcome. It begins with strategic decisions about whether to develop models in-house or work with external providers. Then comes the challenge of cross-functional communication—bridging the languages of data science, IT, biology, and business.
Digital advocates who champion learning, experimentation, and agile collaboration can be critical in overcoming resistance. But even on the technical side, challenges remain. Scientists and operators face a fragmented landscape of specialized tools, each with its own interface and data format. Poor integration, inconsistent APIs, and manual data transfers (often via Excel) create inefficiencies and errors.
Streamlining Bioprocess Intelligence
To address these challenges, platforms like DataHowLab aim to unify digital environments. Rather than replacing control systems or databases, they integrate with them, providing a user-friendly interface for model-driven decision-making.
By embedding tools such as hybrid modeling, transfer learning, and active learning, these platforms enable tailored workflows for unit operations, whether upstream bioreactors or downstream purification steps. The goal is to make advanced analytics accessible to scientists and operators—not just data experts—while simplifying the user experience.
Communicating Value: Overcoming Skepticism
One effective strategy for promoting adoption is to treat the model as a vehicle for shared knowledge. Models create a common language, enabling teams to explain their decisions, document their rationale, and transfer expertise across scales and locations.
One persistent misconception is that models obviate the need for wet-lab experiments. In reality, models reduce the number of experiments required, but validation remains essential. Another common reaction is to blame the model when it fails in unfamiliar regions. Yet such failures are themselves informative: they highlight knowledge gaps and guide the next experimental focus.
Investing in the Long Game
Adopting digital tools is not an overnight transformation. It's more like training for long-term fitness. Early results may be subtle, but consistency pays off. The process requires a sustained commitment, even in the face of setbacks.
Notably, an increasing number of digital-native professionals are entering leadership roles. These individuals understand the strategic value of data management and modeling and view digital infrastructure not as overhead but as a competitive edge. Companies that delay risk falling behind as the industry moves toward intelligent, data-driven operations.
The Rise of Large Language Models in Bioprocessing
One of the most exciting developments is the integration of large language models (LLMs) into bioprocessing platforms. By training LLMs on proprietary knowledge—such as manuals, publications, and webinars—tools can be built to answer complex queries in plain language.
Imagine asking, "What was the final titer of experiment X?" or "Plot the cell density profile where glucose fell below a threshold"—and getting instant results. This simplifies the workflow, shortens training time, and expands access to analytics.
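Under the hood, such a natural-language front end would translate questions like these into structured queries against the experiment database. A toy sketch of those underlying queries, using an entirely hypothetical data layout and invented experiment IDs:

```python
# Hypothetical experiment store: per-run metadata plus time-series readings.
experiments = {
    "EXP-001": {"final_titer": 3.2,                 # g/L
                "glucose": [22.0, 14.5, 6.1, 1.8],  # g/L over sampling days
                "vcd": [0.4, 2.1, 6.3, 9.8]},       # 1e6 cells/mL
    "EXP-002": {"final_titer": 2.6,
                "glucose": [22.0, 16.0, 9.4, 4.2],
                "vcd": [0.4, 1.9, 5.2, 8.1]},
}

def final_titer(exp_id):
    """Query behind 'What was the final titer of experiment X?'"""
    return experiments[exp_id]["final_titer"]

def runs_where_glucose_below(threshold):
    """Query behind 'find runs where glucose fell below a threshold';
    the front end would then plot the matching cell density profiles."""
    return [eid for eid, exp in experiments.items()
            if min(exp["glucose"]) < threshold]
```

The value of the LLM layer is that scientists never see these queries: they ask in plain language, and the system does the translation.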
Converging traditional modeling engines with generative AI opens up powerful new possibilities. While caution is needed to avoid errors and hallucinations, the potential is enormous. It's not just about convenience; it's about dramatically accelerating insight and action.
Toward Scalable Personalized Medicine
Personalized medicine presents a fundamental challenge: designing and executing a custom process for each patient, often with a single production opportunity. There is no time for trial and error.
By applying systematic transfer learning, models can learn from each case and apply those insights to future treatments. This vision entails deploying edge algorithms at decentralized sites, such as hospitals, guided by cloud-based intelligence. Over time, a self-improving system could emerge—one that adapts to patient needs with increasing accuracy and speed.
This is not science fiction. It's a future within reach, built on the foundation of process modeling and digital tools.
Getting Started: Advice for Smaller Companies
For small organizations or startups, the question often becomes: where do we begin?
The answer is: start now, start simple. Focus on low-hanging fruit rather than setting overly ambitious goals, such as building full real-time digital twins from day one. What matters most is laying the groundwork: collecting the right data, establishing a digital mindset, and building a solid foundation.
Smaller companies may have an advantage. Their agility and flexibility can enable faster progress and better adaptation. It's not about having massive resources—it's about having commitment and clarity of vision.
Final Remarks
Ultimately, the value of a model lies in its usability. Models are not meant to be theoretical showpieces. They should be as simple as necessary, intuitive, and directly connected to actionable decisions.
In the same way that a weather forecast helps plan your day without requiring a PhD in meteorology, a process model should guide bioprocess decisions without needing a PhD in data science. The focus should always be on insight and impact, not just accuracy.
With the right tools, mindset, and guidance, process modeling can become second nature to biotech professionals. And when that happens, the field will transform.
About Dr. Fabian Feidl
Dr. Fabian Feidl studied Molecular Biotechnology at the Technical University of Munich (Germany). After his Master's thesis, he carried out a research project at University College London (UK) before beginning his PhD in the Morbidelli Group at ETH Zurich (Switzerland).
In 2017, he co-founded DataHow AG and has since held the position of Chief Technology Officer. In 2024, he completed an International Executive MBA program at the University of St. Gallen in Switzerland.
Connect with Fabian Feidl on LinkedIn.
David Brühlmann is a strategic advisor who helps C-level biotech leaders reduce development and manufacturing costs to make life-saving therapies accessible to more patients worldwide.
He is also a biotech technology innovation coach, technology transfer leader, and host of the Smart Biotech Scientist podcast—the go-to podcast for biotech scientists who want to master biopharma CMC development and biomanufacturing.
Hear It From The Horse’s Mouth
Want to listen to the full interview? Go to Smart Biotech Scientist Podcast.
Want to hear more? Visit the podcast page and check out other episodes.
Do you wish to simplify your biologics drug development project? Contact Us
🧬 Stop second-guessing your CMC strategy. Our fast-track CMC roadmap assessment identifies critical gaps that could derail your timelines and gives you the clarity to build a submission package that regulators approve. Secure your assessment by joining the CMC Strategy Accelerator.

