Thursday, July 11, 2019

Part 5 of 6: Opportunities to accelerate projects

You may already know that the most commonly used noun in the English language is "time".  In today's world, many of us feel almost permanently under time pressure and talk about not having enough time for all kinds of things we'd like to do.  Not having time takes on a whole new meaning for patients with life-changing medical conditions, reminding those of us in chemical development and scale-up that opportunities to accelerate our work and the commercialization of new medicines should be grasped with both hands.

Achieving acceleration using modeling (e.g. Dynochem or Reaction Lab) is already well covered by extensive customer case studies in Dynochem Resources.  Acceleration using automation of modeling, and connection of modeling to other workflows, is the subject of this post.  In our core software development team, we have thought a lot about these future applications and taken steps to support their realization, providing a platform and the ‘hooks’ needed to link with other technologies.

A foundational capability is the ability to automatically generate and run a large number of virtual experiments.  We use parallel processing to execute the simulations, as illustrated in the short animation below.  The automation calls are exposed and may be 'scripted' and run by other programs (e.g. Python) as part of an integrated workflow.
Chemists and engineers can leverage automated generation and execution of a large set of virtual experiments with parallel processing and collation of results in convenient Excel tables and contour plots.
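As a rough sketch of the shape such a script might take, the Python fragment below sweeps two factors across a grid and runs the cases in parallel.  It is illustrative only: `run_virtual_experiment` is a hypothetical stand-in for a call into the simulation engine, not the actual Dynochem automation API, and the response it returns is a placeholder.

```python
# Illustrative sketch only: run_virtual_experiment stands in for a call into
# the simulation engine; the real automation calls exposed by Dynochem 5 /
# Reaction Lab will differ.
from itertools import product
from multiprocessing import Pool

def run_virtual_experiment(conditions):
    temperature_C, equiv_reagent = conditions
    # ... a call into the simulation engine would go here ...
    # Placeholder response so the sketch runs end to end:
    yield_pct = 100 - abs(temperature_C - 50) - 20 * abs(equiv_reagent - 1.2)
    return {"T (degC)": temperature_C, "equiv": equiv_reagent, "yield (%)": yield_pct}

if __name__ == "__main__":
    temperatures = [20, 35, 50, 65, 80]        # degC
    equivalents = [1.0, 1.1, 1.2, 1.3, 1.4]    # mol/mol
    grid = list(product(temperatures, equivalents))

    with Pool() as pool:                       # parallel execution of the cases
        results = pool.map(run_virtual_experiment, grid)

    # Results could now be collated into Excel tables or contour plots
    for row in results:
        print(row)
```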
Tasks involved in model building may also be scripted/automated in Dynochem 5 and Reaction Lab.  For example, area percent data may be entered in a model, a set of kinetic parameters fitted and many simulations carried out, all without human intervention.  Doing this requires some scripting/code at several stages in the workflow.  Cloud computing resources (Azure or AWS) may be used for execution, leveraging our cloud licensing.

For example, the animation below shows scripted fitting of three UA (heat transfer characterization) values to three solvent tests using Dynochem 5.  In a short time, the script fits the parameters needed for each of three liquid levels in a reactor.  (The ‘fit’ button is just for demo purposes; normally the fit would be launched from another scripted workflow process.)
Scripted parameter fitting is possible using new function calls built into Dynochem 5 and Reaction Lab; this example illustrates automated heat transfer characterization (UA) and the techniques are equally applicable to e.g. chemical kinetics.
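For readers curious about what such a fit involves, here is a minimal generic sketch using SciPy rather than the built-in Dynochem 5 function calls; the vessel contents, jacket temperature and cooling data are all assumed for illustration.

```python
# Minimal sketch of a UA fit using SciPy; Dynochem 5 exposes its own
# built-in function calls for this, which are not reproduced here.
import numpy as np
from scipy.optimize import curve_fit

T_JACKET = 20.0       # degC, assumed constant jacket temperature
T0 = 70.0             # degC, assumed initial batch temperature
M_CP = 5.0 * 4180.0   # J/K, assumed 5 kg of water-like contents

def cooling_curve(t, UA):
    # Analytical solution of the lumped energy balance m*cp*dT/dt = UA*(Tj - T)
    return T_JACKET + (T0 - T_JACKET) * np.exp(-UA * t / M_CP)

# Hypothetical measured data from one solvent test (time in s, T in degC)
t_data = np.array([0, 300, 600, 900, 1200, 1800, 2400])
T_data = np.array([70.0, 63.3, 57.5, 52.5, 48.2, 41.1, 35.9])

UA_fit, _ = curve_fit(cooling_curve, t_data, T_data, p0=[10.0])
print(f"Fitted UA = {UA_fit[0]:.1f} W/K")
```

In a scripted workflow this fit would simply be repeated per liquid level, with the resulting UA values stored as part of the equipment characterization.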
Additional opportunities exist in leveraging information from electronic lab notebooks (ELN) to create models for users that are already populated with features such as chemical structures and experimental data.  Moving beyond today's relatively crude self-optimizing reactor algorithms, customers are interested in closing the loop between modeling and experimentation, using model outputs to set up and execute the next experiment(s) in a fully automated loop, as sketched below.
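A toy sketch of what such a closed loop could look like follows.  Everything in it is a stand-in we invented for illustration (a quadratic yield surrogate and a simulated 'lab'); it shows the shape of the loop, not an existing Dynochem, ELN or robotics API.

```python
# Conceptual model-guided experiment loop: refit a model to all data so far,
# pick the next conditions the model predicts to be best, run, repeat.
import numpy as np

rng = np.random.default_rng(0)

def execute_experiment(T):
    """Simulated lab run: hidden true optimum at 62 degC, plus noise."""
    return 90.0 - 0.05 * (T - 62.0) ** 2 + rng.normal(0.0, 0.5)

def fit_model(temps, yields):
    """Fit a quadratic surrogate yield(T); returns polynomial coefficients."""
    return np.polyfit(temps, yields, deg=2)

def suggest_next_conditions(coeffs, lo=20.0, hi=90.0):
    """Pick the temperature the current model predicts to be best."""
    grid = np.linspace(lo, hi, 200)
    return grid[np.argmax(np.polyval(coeffs, grid))]

# Seed experiments, then close the loop for a few rounds
temps = [30.0, 50.0, 80.0]
yields = [execute_experiment(T) for T in temps]
for _ in range(5):
    model = fit_model(temps, yields)
    T_next = suggest_next_conditions(model)
    temps.append(T_next)
    yields.append(execute_experiment(T_next))
    print(f"ran at {T_next:.1f} degC -> {yields[-1]:.1f}% yield")
```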

Contact our support team if you'd like to discuss any of these applications further for use inside your organization.

Friday, June 7, 2019

Part 4 of 6: Where will the models come from?

If mechanistic modeling is to become a focal point in the project lifecycle, you have to address the question of where the models will come from.  In this context, by 'model' we mean i) the set of equations to be solved, ii) in executable form, with iii) initial values, iv) fitted parameter values where needed and v) experimental data to assess model accuracy.

Q: Who can create these models and when does it make sense for them to do so?
A: For tangible benefits, the creators and users should be the same practitioners/project teams that own and run the development projects, not specialists in an ivory tower who focus only on modeling.  Model development should occur before and during experimentation.  Modeling should not be a 'post-processing' activity that occurs too late to add value or after the time window for data collection has passed.

In Dynochem 5 and Reaction Lab, we have streamlined steps i) to v) so that this vision is achievable.  We include further notes on the individual steps below.

Steps i) to v) can be accomplished in a snap for chemical reactions using Reaction Lab.  The resulting model can be leveraged over and over during the project lifecycle.
Item i) may be clear and simple for certain common unit operations like heating/cooling and perhaps filtration; for many other operations, identifying which equations to solve may be iterative and challenging.  For some operations, like distillation, the equation set may be obvious yet unwieldy to write down for multi-component systems once the energy balance is included.  For models of chemical reactions, the set of elementary reactions will not become clear until the full cycle i)-v) has been repeated more than once by knowledgeable process chemists.

Unlike some other tools, we do not force users to populate 'matrices' just to define reactions and reaction orders (!)

Item ii) is an obstacle for practitioners who only have access to spreadsheets or specialized computing/coding environments.  These force the user to develop or select a specific solution method and run the risk of significant numerical integration inaccuracies.  Even then, simulations will lack interactivity and parameter estimation will require scripting or complex code.  Some 'high-end' engineering software tools present similar challenges, lacking comprehensive model libraries and forcing users to write custom models, delve into solution algorithms and confront challenges such as 'convergence' that feel highly tangential to project goals.
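To make that concrete: even a simple consecutive scheme A → B → C obliges a spreadsheet or coding user to lay out the ODEs and choose and configure an integrator themselves.  A hedged SciPy sketch, with illustrative rate constants:

```python
# What item ii) looks like when done by hand: the user must lay out the ODEs,
# pick a solver and set tolerances. Rate constants are illustrative only.
from scipy.integrate import solve_ivp

k1, k2 = 0.5, 0.2  # 1/h, assumed rate constants for A -> B -> C

def rhs(t, c):
    cA, cB, cC = c
    return [-k1 * cA, k1 * cA - k2 * cB, k2 * cB]

sol = solve_ivp(rhs, t_span=(0.0, 10.0), y0=[1.0, 0.0, 0.0],
                method="LSODA", rtol=1e-8, atol=1e-10)  # solver choices matter
print(f"After 10 h: A={sol.y[0, -1]:.3f}, B={sol.y[1, -1]:.3f}, "
      f"C={sol.y[2, -1]:.3f} mol/L")
```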

Item iii) should be easy for practitioners, and in practice it can be, if the software supports flexible units conversion (in and out of SI units) and contains supporting tools to provide initial estimates of physical properties and equipment characteristics.

Item iv) requires the model to be run many times and compared with experimental results.  Specialized algorithms are needed to minimize the gap between model predictions and experimental data.  When multiple parameters must be fitted to multiple responses in multiple experiments, this becomes close to impossible in a spreadsheet model or a general-purpose mathematical software environment.
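As an illustration of why, here is the shape of a simultaneous fit of two rate constants to two measured responses, stacked into one residual vector; the data and constants are synthetic, and this is a SciPy sketch rather than the built-in fitting in Dynochem 5 or Reaction Lab.  Residuals from additional experiments would simply be stacked the same way, which is where hand-rolled approaches become unmanageable.

```python
# Sketch of item iv): fit k1 and k2 of A -> B -> C to two measured responses
# at once by stacking residuals; synthetic data, illustrative only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_obs = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])           # h
cA_obs = np.array([1.00, 0.61, 0.37, 0.14, 0.05, 0.02])    # mol/L
cB_obs = np.array([0.00, 0.35, 0.50, 0.52, 0.42, 0.31])    # mol/L

def simulate(k):
    k1, k2 = k
    rhs = lambda t, c: [-k1 * c[0], k1 * c[0] - k2 * c[1], k2 * c[1]]
    sol = solve_ivp(rhs, (0.0, t_obs[-1]), [1.0, 0.0, 0.0],
                    t_eval=t_obs, rtol=1e-8, atol=1e-10)
    return sol.y

def residuals(k):
    cA, cB, _ = simulate(k)
    return np.concatenate([cA - cA_obs, cB - cB_obs])  # both responses at once

fit = least_squares(residuals, x0=[0.1, 0.1], bounds=(0.0, np.inf))
print(f"k1 = {fit.x[0]:.3f} 1/h, k2 = {fit.x[1]:.3f} 1/h")
```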

Item v) is mainly the province of the experimenter; once each experiment has been completed, it requires an easy mechanism for aggregating the data, with flexible units handling (including HPLC Area and Area%) being a major help.

And so to answer the question in the title of this post: you guessed it!  We expect the majority of chemical reaction and unit operation models in Pharma to continue to be developed using our tools in preference to home-made or overly complex environments.  As the volume of modeling activity grows with Industry 4.0 and related developments, we already see this trend becoming more pronounced, with many practitioners needing to use the same model over a project lifecycle, requiring speed and ease of use as well as accuracy and rigor.

Friday, May 10, 2019

Post 3 of 6: Central role of mechanistic modeling in Chemical Development

Chemical Development is a complex and challenging undertaking, involving a large effort from multi-disciplinary teams, sometimes battling Mother Nature, with compressed timelines and limited material for experimentation.  There is a broad spectrum of approaches to this challenge, including new lab instruments, use of robotics and automation, outsourcing certain types of development or operations, and use of statistical and mechanistic modeling.  Companies also experiment to find the best organizational structure for this function; frequently, separate departments specialize in Analytical (Chemistry) Development, Chemical (Process) Development, Technology Transfer and preparation of regulatory filings.  Collaboration among these groups helps achieve development goals.

Figure 1: A simplified representation of chemical development today, including the scale and locus of statistical and mechanistic modeling
Figure 1 is a much-simplified graphical representation of the activities involved.  There is a large reliance on experiments.  Groups involved in process definition and optimization are currently the main users of both statistical and mechanistic modeling.  Technology transfer increasingly involves working with external partners remotely.  Data search and gathering, including data integrity reviews and preparation of regulatory filings, are mostly manual processes.  The disparate nature of the activities and the need for specialization make them somewhat siloed, with risks of duplication and dilution of effort.  For example, an experimental program may be repeated if the first program missed some key information; or repeated by a CRO to answer new questions that have arisen; or repeated by a CMO in order to accomplish successful tech transfer.  Often none of these data are harnessed effectively and shared to answer future questions.

Leading companies are changing their approach to chemical development and bringing mechanistic process modeling on stream earlier and more centrally than before.  The idea is not new, but advances in a range of technologies (see earlier posts) and the momentum of 'Industry 4.0' are helping to fuel the transformation.  At a task level, using a model to design the right experiments reduces overall effort.  At a project level, the model provides a place to capture knowledge and reuse it in future.  At an organization level, modeling provides a structured, reusable and digital approach to information sharing and retrieval.  For example, questions can be answered in real time, without experimentation, interactively as they arise, even live in a meeting or web conference, sparing delays, speculation and doubts and allowing faster progress.

Figure 2: Future shape of chemical development activities, with mechanistic process models as the focal point for information capture and reuse.
The pieces in Figure 1 are rearranged in a natural way in Figure 2, as a cycle that captures and makes the most of the information generated during each chemical development activity, including modeling.  Additional items have been added to reflect technologies that are relatively new to Pharma, including continuous manufacturing and feedback process control; opportunities to apply either or both of these in chemical development or full-scale manufacturing can be evaluated using a mechanistic process model.  The mechanistic model therefore takes up a central position as the focal point of the new chemical development process.

It will take some time before Figure 2 reaches its full potential.  The throughput of models in chemical development organizations is already increasing as model building tools become easier to use and more prevalent.  We're delighted to be able to lead the way with Scale-up Suite.

Figure 2 also includes some great opportunities to automate workflows.  We'll discuss some of these in a later post.  

Wednesday, May 1, 2019

Post 2 of 6: A brief history

The Wall Street Journal ran an article in September 2003, entitled "New Prescription For Drug Makers: Update the Plants", comparing and contrasting pharma manufacturing techniques with other industries.  The subtitle ran, perhaps unfairly, "After Years of Neglect, Industry Focuses On Manufacturing; FDA Acts as a Catalyst".

Our DynoChem software entered the industry a few years earlier, the prototype having been developed as a dynamic simulator within Zeneca, so that users could "create a dynamic model without having to write differential equations".  We first proved that the software could be used to solve process development and manufacturing problems (e.g. with hydrogenations and exothermic additions), then rewrote the source code and began to add features that made modeling by non-specialists an everyday reality.

Many pharma industry leaders have recognized the potential for modeling to help modernize development and manufacturing.  One example is Dr Paul McKenzie and his leadership team at Bristol-Myers Squibb (BMS) at the time, who cited the Wall Street Journal piece in an invited AIChE Journal Perspectives article and in presentations like this one at the Council for Chemical Research (CCR) in December 2005 - you can get the full slide deck here.

Cover slide from presentation by Paul McKenzie of BMS at CCR Workshop on Process Analytical Technology (PAT), December 13, 2005, Rockville, MD
Today, while the landscape for data storage, sharing and visualization has moved ahead significantly, with the emergence of ELN, cloud and mobile, the chemical and engineering fundamentals of defining and executing a good manufacturing process remain the same:

Some capabilities required to develop robust and scalable processes, from the 2005 CCR presentation
Our Scale-up Suite extends these capabilities to more than 100 pharma development and manufacturing organizations worldwide, including 15 of the top 15 pharmaceutical companies.  This broad and growing base of users, armed with clean and modern user interfaces, calculation power and speed in Reaction Lab and Dynochem 5, provides a firm foundation for the next wave of industry transformation.

We're always delighted to hear what users think.  Here are some recent quotes you may not have seen yet:

  • "If you can book a flight on-line, you can use Dynochem utilities" [we like this especially because we hear that using some other tools is like learning to fly a plane]
  • "Our chemists are thoroughly enjoying the capabilities of Reaction Lab software and are quite thrilled with the tool".

In the next post, we will look at the increasingly central role of mechanistic modeling in process development.

Monday, April 29, 2019

Post 1 of 6: Exciting times in Chemical Development

It's an exciting time to be part of the Pharma industry's chemical development ecosystem, with new opportunities being created and adopted to accelerate development of new medicines.  This is the first in a short series of posts that will focus on the role of predictive, mechanistic modeling in the industry's transformation.

The much-talked-about 'Industry 4.0' phenomenon has led to the creation of awkward terms such as 'digitalization'; one positive consequence of the hype is that it has somewhat aligned the goals of senior managers, systems integrators, consulting companies and industry vendors.  We especially liked the review by Deloitte that uses the term 'exponential technologies' to group many of the developments underpinning current transformation opportunities:

Snapshot of exponential technologies covered in the Deloitte study, Exponential Technologies in Manufacturing
We'll be highlighting the role of digital design, simulation and integration, technologies that our customers have practiced on a growing scale for nearly twenty years.  We expect the rate of growth to increase quite sharply as new developments, like Reaction Lab, make adoption easier and as simulation is integrated with other developing technologies.

If the above whets your appetite, watch this space for the next piece in this series.

As always, customers can contact our support team to discuss immediate applications.

Friday, February 22, 2019

Dynochem 5 released as part of the new Scale-up Suite

In our 14 February webinar, Scale-up Systems was delighted to announce the release of Dynochem 5 as part of the new Scale-up Suite, which also includes Reaction Lab and Numero Chem.
Scale-up Suite includes Dynochem and new products Reaction Lab and Numero Chem
Reaction Lab: Kinetics meets ELN

This is the culmination of great work by our software development team, inspired by customer feedback and led by Dr Steve Hearn.

High-level details about the components in Scale-up Suite can be found at the new look scale-up.com website.  Members of scale-up.com can get more detail and access to the tools via the Dynochem Resources and Reaction Lab Resources websites.

We've started a program of weekly 30-minute webinars to talk through the new features and hope that customers and prospective customers can make some of those live (or watch the recordings) over the next month or two.

Your Dynochem 4 content will work in Dynochem 5 and you should plan to upgrade as soon as practicable.  Expect a host of improvements in speed, ease of use and accuracy, the latter especially for material properties.

Use the links at the side of this blog to explore more.  As always, we'd love to hear your feedback to support@scale-up.com.

Wednesday, October 24, 2018

List of AIChE 2018 Annual Meeting highlights available now

We've uploaded our customary list of highlights at this year's AIChE Annual Meeting, taking place next week in Pittsburgh (Oct 28-Nov 2).

There's a DynoChem Reception this year and we'll be hosting another rematch of Jeopardy there.  If you're from our user community or a potential new customer, you can register here for a ticket.


The list of talks featuring DynoChem contains 14 interesting topics, ranging from reactions, distillation, crystallization and drying to safety, stability and continuous processing/flow chemistry.  Presenting organizations include AbbVie, Dow, FDA, Hovione, Lilly, Merck, Pfizer and Zoetis.  Get the list here.

Our team on site this year includes Dr Andrew Bird, whom many of you will know from beneficial interactions during support cases.

If you're travelling to AIChE, we look forward to seeing you there.
