Wednesday, January 24, 2018

Run typical crystallization experimental design in silico using DynoChem

Faced with challenging timelines for crystallization process development, practitioners typically find themselves running a DOE (statistical design of experiments) and measuring end-point results to see which factors most affect the outcome (often PSD metrics such as D10, D50, D90 and span). Thermodynamic, scale-independent effects (like solubility) may be confounded with scale-dependent kinetic effects (like seed temperature and cooling rate or time) in these studies, making results harder to generalize and scale.

First-principles models of crystallization may never be quantitatively perfect - the phenomena are complex and measurement data are limited - but even a semi-quantitative first-principles kinetic model can inform and guide experimentation in a way that DOE or trial-and-error experimentation cannot, leading to a reduction in overall effort and a gain in process understanding, as long as the model is easy to build.

Scale-up predictions for crystallization are often based on maintaining similar agitation, with power per unit mass (or volume) a typical check, even if the geometry at scale is very different from the lab.  A first-principles approach considers additional factors, such as whether the solids are fully suspended or over-agitated, how well the heat transfer surface can remove heat and the mixing time associated with the incoming antisolvent feed.
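As a concrete illustration of the power-per-unit-volume check mentioned above, the short sketch below computes P/V for a lab reactor and back-calculates the plant agitation speed that matches it. All vessel dimensions, the power number and the density are invented examples, and the formula is the classic turbulent-regime correlation (P = Np ρ N³ D⁵), not a DynoChem calculation.

```python
def power_per_volume(Np, rho, N, D, V):
    """Ungassed stirred-tank power per unit volume (W/m^3) in the
    turbulent regime, P = Np * rho * N^3 * D^5.
    Np: power number (-), rho: density (kg/m^3), N: speed (1/s),
    D: impeller diameter (m), V: liquid volume (m^3)."""
    return Np * rho * N**3 * D**5 / V

# lab reactor: 0.5 L, 5 cm impeller at 400 rpm (invented numbers)
lab = power_per_volume(Np=0.8, rho=1000.0, N=400 / 60, D=0.05, V=0.0005)

# plant vessel: 4 m^3, 1 m impeller; solve for the speed matching lab P/V
Np_p, rho_p, D_p, V_p = 0.8, 1000.0, 1.0, 4.0
N_plant = (lab * V_p / (Np_p * rho_p * D_p**5)) ** (1 / 3)
plant = power_per_volume(Np_p, rho_p, N_plant, D_p, V_p)

print(f"lab P/V = {lab:.0f} W/m^3 -> matching plant speed = {N_plant*60:.0f} rpm")
```

Matching P/V alone says nothing about solids suspension or feed-point mixing time, which is why the additional first-principles checks in the post matter.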

The DynoChem crystallization library and the associated online training exercises and utilities show how to integrate all of these factors by designing focused experiments and making quick calculations to obtain thermodynamic, kinetic and vessel performance data separately, before integrating these to optimize and scale process performance.

Users can easily perform an automated in-silico version of the typical lab DOE in minutes, with 'virtual experiments' reflecting performance of the scaled-up process.  Even if the results are not fully quantitative, users learn about the sensitivities and robustness of their process as well as its scale-dependence.  This heightened awareness alone may be sufficient to resolve problems that arise later in development and scale-up, in a calm and rational manner.  Some sample results of a virtual DOE are given below by way of example.

Heat-map of in-silico DOE at plant scale agitation conditions, showing the effects of four typical factors on D50
The largest D50 is obtained in this case with the highest seeding temperature, lowest seed loading and longest addition (phase 1) time. Cooling time (phase 2) has a weak effect over the range considered.
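For readers who want a feel for what an automated virtual DOE involves, here is a minimal Python sketch: a full factorial over the four factors discussed, evaluated against a made-up surrogate `d50_model` function whose coefficients are chosen only to mimic the trends described above. In the real workflow each 'virtual experiment' would instead call a fitted first-principles kinetic model.

```python
from itertools import product

def d50_model(seed_T, seed_load, t_add, t_cool):
    """Made-up surrogate for D50 (microns): rises with seeding temperature
    and addition time, falls with seed loading; cooling time is weak."""
    return 40 + 0.8 * seed_T - 6.0 * seed_load + 0.15 * t_add + 0.01 * t_cool

levels = {
    "seed_T": [40, 50, 60],      # seeding temperature, degC
    "seed_load": [1, 2, 5],      # seed loading, wt%
    "t_add": [60, 120, 240],     # antisolvent addition (phase 1) time, min
    "t_cool": [60, 120, 240],    # cooling (phase 2) time, min
}

# full factorial: every combination of levels is one 'virtual experiment'
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
results = [(run, d50_model(**run)) for run in runs]
best_run, best_d50 = max(results, key=lambda pair: pair[1])
print(f"{len(runs)} virtual experiments; largest D50 = {best_d50:.1f} um at {best_run}")
```

Scanning 81 combinations like this takes milliseconds once a kinetic model exists, which is the point of running the DOE in silico rather than in the lab.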
Click here to learn how to apply these tools.

Thursday, December 21, 2017

Congratulations to Dr Jake Albrecht of BMS: Winner of AIChE QbD for Drug Substance Award, 2017

At AIChE Annual Meetings, Monday night is Awards night for the Pharma community, represented by PD2M.  This year in Minneapolis the award for Excellence in QbD for Drug Substance process development and scale-up went to Dr Jake Albrecht of Bristol-Myers Squibb.  Congratulations, Jake!

Winners are selected using a blinded judging panel selected by the Awards Chair, currently Bob Yule of GSK.  Awards criteria are:
  • Requires contributions to the state of the art in the public domain (e.g. presentations, articles, publications, best practices)
  • Winner may be in Industry, Academia, Regulatory or other relevant working environment
  • Winner may be from any nation, working at any location
  • There are no age or experience limits
  • Preference is given to work that features chemical engineering
Jake was nominated by colleagues for:
  • his innovative application of modeling methodologies and statistics to enable quality by design process development
  • his authorship of one of the most downloaded papers in Computers and Chemical Engineering (2012-2013), “Estimating reaction model parameter uncertainty with Markov Chain Monte Carlo”
  • his leadership and exemplary efforts to promote increasing adoption of modeling and statistical approaches by scientists within BMS and without
  • his leadership in AIChE/PD2M through presentations, chairing meeting sessions, leading annual meeting programming and serving on the PD2M Steering Team
Scale-up Systems was delighted to be involved at the AIChE Annual Meeting this year in our continued sponsorship of this prize.  Some photos and video from the night made it onto our Facebook page and more should appear soon on the PD2M website.

Jake is also a DynoChem power user and delivered a guest webinar in 2013 on connecting DynoChem to other programs, such as MATLAB.

Wednesday, November 29, 2017

November 2017 DynoChem Crystallization Toolbox Upgrade

We're delighted that the number of DynoChem users getting value from our crystallization tools continues to grow strongly and we're grateful for the feedback and feature requests they provide to help us improve the tools.

New features released this November include:
  • One-click conversion of kinetic model into predictor of the shape of the PSD
  • High-resolution tracking of the distribution shape, to minimize error*
  • Extended reporting and plotting of PSD shape.

Practitioners unaware of crystallization fundamentals sometimes crystallize too fast, with little attention to the rate of desupersaturation.  For such a rushed process, even when seeded (2%), the operating lines might look like the picture on the left below (Figure 1). A more experienced practitioner might operate the crystallization as shown on the right (Figure 3):

The particles produced by these alternatives differ greatly in size.  The rushed crystallization leads to a multimodal distribution (red in Figure 2) with low average size, due to seeded growth and separate nucleation events during both antisolvent addition and natural cooling.  These crystals will be difficult to filter and forward-process.

More gradual addition, with attention to crystallization kinetics and both the addition and cooling rates, leads to larger crystals (blue in Figure 2) and a tighter distribution that can be further enhanced by optimizing seed loading, seeding temperature and the operating profiles.

From November 2017, these types of scenarios can be set up, illustrated and reported in minutes using the DynoChem Crystallization Toolbox.

* We have implemented high resolution finite volume discretisation of the CSD, using the Koren flux limiter.
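For the numerically curious, the sketch below shows the idea behind the footnote: one conservative finite-volume step for size-independent crystal growth (∂n/∂t = −G ∂n/∂L) with face values corrected by the Koren flux limiter, which keeps the distribution sharp without spurious oscillations. The grid, seed distribution and growth rate are arbitrary illustrative values; this is not the DynoChem implementation.

```python
import numpy as np

def koren(r):
    """Koren flux limiter: phi(r) = max(0, min(2r, (1 + 2r)/3, 2))."""
    return np.maximum(0.0, np.minimum(np.minimum(2.0 * r, (1.0 + 2.0 * r) / 3.0), 2.0))

def grow_step(n, G, dL, dt):
    """One explicit step of dn/dt = -G dn/dL (G > 0) on a uniform size grid,
    conservative finite-volume form with limited face values."""
    eps = 1e-12
    npad = np.concatenate(([0.0, 0.0], n, [0.0]))       # ghost cells
    r = (npad[1:-1] - npad[:-2]) / (npad[2:] - npad[1:-1] + eps)
    face = npad[1:-1] + 0.5 * koren(r) * (npad[2:] - npad[1:-1])
    flux = G * face                                      # flux at each cell face
    return n - dt / dL * (flux[1:] - flux[:-1])

L = np.linspace(0.0, 200e-6, 201)                        # size grid, m (dL = 1 um)
dL = L[1] - L[0]
n0 = np.exp(-((L - 30e-6) / 5e-6) ** 2)                  # Gaussian seed distribution
n, G, dt = n0.copy(), 1e-7, 2.0                          # growth 0.1 um/s, CFL = 0.2
for _ in range(50):                                      # 100 s of growth
    n = grow_step(n, G, dL, dt)
print(f"peak moved from 30 um to {L[np.argmax(n)]*1e6:.0f} um")
```

The conservative form means crystal number is preserved exactly by the interior fluxes, so any change in total counts comes only from the domain boundaries.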

Wednesday, October 25, 2017

Simulating PFRs for flow chemistry under transient upset conditions

Readers of this blog will be aware of our RTD utility that helps characterize continuous manufacturing (CM) equipment trains and also simulate the impact of process disturbances, in the absence of chemical reactions.  Pharma CM processes typically have several layers of controls to help ensure that off-spec material is diverted when necessary and as far as possible that disturbances are minimized and detected early. 

For regulatory filings or other purposes, from time to time it may be necessary to simulate transient/upset conditions in chemically reacting systems (e.g. making drug substance intermediates or final API) to understand the additional chemical effects and to define boundaries for acceptable levels of input variation.  We have been exploring such cases and the most effective way to model them in DynoChem.  Some interesting DC Simulator plots are shown below to illustrate when and for how long such upsets might affect the exit CQA (blue) and impurity level (green) from an example PFR (average residence time 30 minutes) with a ‘typical’ side-reaction.

Simulation of plug flow reactor with significant and frequent fluctuations in four input variables. These unusually large variations, if left unchecked, would lead in this example to a breach of the CQA limit (high impurity) twice during a 3 hour operating period.

Simulation of plug flow reactor with a feed pump failure at 90 minutes, lasting for 30 minutes.  In addition to reducing product output, depending on which feed pump fails, this may lead to a temporary increase in impurity level until the feed is restored.
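A simple way to reproduce this kind of transient study outside any particular tool is to approximate the PFR as a cascade of stirred tanks with first-order reactions A → B (product) and B → C (impurity). The sketch below assumes, purely for illustration, that the pump failure halves throughput between 90 and 120 minutes, doubling the residence time so that over-reaction raises the exit impurity temporarily; all kinetic and flow values are invented.

```python
import numpy as np

# PFR approximated as 20 tanks in series; A -> B (product), B -> C (impurity)
N_TANKS, TAU = 20, 30.0                  # number of tanks, residence time (min)
k1, k2 = 0.5, 0.02                       # rate constants, 1/min (invented)
dt, t_end = 0.01, 240.0                  # time step and horizon, min
tau_i = TAU / N_TANKS

cA, cB, cC = (np.zeros(N_TANKS) for _ in range(3))
history, t = [], 0.0
while t < t_end:
    # assumed failure mode: throughput halves between t = 90 and 120 min,
    # doubling the residence time while feed composition is unchanged
    f = 0.5 if 90.0 <= t < 120.0 else 1.0
    tau = tau_i / f
    inA = np.concatenate(([1.0], cA[:-1]))    # feed: 1 mol/L of A
    inB = np.concatenate(([0.0], cB[:-1]))
    inC = np.concatenate(([0.0], cC[:-1]))
    rA, rB = k1 * cA, k2 * cB                 # reaction rates before updating
    cA = cA + dt * ((inA - cA) / tau - rA)
    cB = cB + dt * ((inB - cB) / tau + rA - rB)
    cC = cC + dt * ((inC - cC) / tau + rB)
    t += dt
    history.append((t, cB[-1], cC[-1]))

hist = np.array(history)
imp_before = hist[np.argmin(np.abs(hist[:, 0] - 89.0)), 2]   # exit impurity pre-upset
imp_peak = hist[(hist[:, 0] >= 90.0) & (hist[:, 0] <= 180.0), 2].max()
print(f"exit impurity: {imp_before:.2f} before the upset, peak {imp_peak:.2f} after")
```

The exit impurity does not react instantly to the upset: the disturbance must propagate through the reactor, which is why the plots above show the effect lasting well beyond the failure window.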

Tuesday, October 3, 2017

DOE has "virtually no role at all" in Lyophilization

We've been working away for a little while now with a group of customers to develop improved models for Lyophilization.  The fruits of these labours are available as the current Lyo model in DynoChem Resources.  This handles multi-component (e.g. water, acetic acid) freezing (a rate-based approach to solid-liquid equilibrium, SLE) and sublimation (a rate-based approach to solid-vapour equilibrium, SVE), with pressure-dependent heat transfer, radiation and a sublimation rate that depends on the thickness of the dry product layer.  You can obtain a predictive model for your system using this template and a few key experiments.
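To illustrate one element of such a model, the toy sketch below integrates a primary-drying sublimation flux in which the product resistance grows linearly with dry-layer thickness, the feature highlighted above. Heat transfer coupling is ignored and every parameter value is invented for illustration, so this is not the DynoChem Lyo template.

```python
def sublimation_flux(P_ice, P_ch, R0, A1, l_dry):
    """Vapour mass flux (kg/m^2/s) through the dry layer; the product
    resistance Rp grows linearly with dry-layer thickness l_dry."""
    Rp = R0 + A1 * l_dry             # resistance, Pa.s.m^2/kg
    return (P_ice - P_ch) / Rp

# invented parameters: ice front near -30 C, modest chamber pressure
P_ice, P_ch = 38.0, 10.0             # vapour pressures, Pa
R0, A1 = 5.0e4, 2.0e7                # resistance at zero thickness, growth term
rho_ice, fill_depth = 919.0, 0.005   # ice density kg/m^3, 5 mm frozen layer

l_dry, t, dt = 0.0, 0.0, 10.0        # dry thickness (m), time (s), step (s)
while l_dry < fill_depth:
    flux = sublimation_flux(P_ice, P_ch, R0, A1, l_dry)
    l_dry += flux / rho_ice * dt     # sublimation front recedes
    t += dt
print(f"estimated primary drying time: {t/3600:.1f} h")
```

Because the resistance rises as the dry layer builds, the flux falls steadily through primary drying, one reason end-of-drying detection and scale-up both need a rate-based model rather than a single average flux.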

In researching the field while putting this model together, among Mike Pikal's excellent writings we found this useful presentation from a meeting in Bologna, 2012 [The Scientific Basis of QbD: Developing a Scientifically Sound Formulation and Optimizing the Lyophilization Process] and our favourite slide from the deck is reproduced below.

We are used to delivering this message in the context of characterizing, optimizing and scaling other unit operations (e.g. reactions, crystallization) and it is no surprise to see that the same principles hold for Lyo.

Download the model to simulate Lyophilization, fit parameters, predict scale-up and optimize. Download the full slide deck for a good introduction to Lyo.

Wednesday, August 23, 2017

Finding the rate law / reaction mechanism: exercise shows the way

We highly recommend that chemists and engineers involved in kinetic modeling take our dedicated exercise that focuses on determining the correct rate law.

In DynoChem's Fitting window, it is easy to quickly try different parameter fitting options and especially to select different groups of parameters to fit. When confronted with new data, models can be adapted and further developed, in this case to better capture the reaction mechanism.

This exercise takes you through that workflow using the Menschutkin reaction of 3,4-dimethoxybenzyl bromide and 3-chloro-pyridine:

A handful of well-controlled experiments with sampling, combined with use of the DynoChem Fitting window, allows the single-line reaction to be broken out into a series of elementary steps that better represent the chemistry.  On this foundation, users build a model suitable for reaction optimization and scale-up, saving unnecessary experiments and providing a sound basis for process decisions.
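The spirit of rate-law discrimination can be shown outside DynoChem with a generic equimolar second-order coupling A + B → C and synthetic sampled data: a second-order rate law makes 1/c linear in time, while a first-order law would make ln(c) linear, so comparing the linearity of the two transforms points to the correct mechanism. The data below are simulated with a small invented noise term, not taken from the exercise.

```python
import numpy as np

# synthetic samples for equimolar A + B -> C: c0 = 0.5 M, k = 0.08 L/mol/min
t_data = np.array([0.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])
k_true, c0 = 0.08, 0.5
c_data = c0 / (1 + k_true * c0 * t_data)        # exact second-order decay
c_data = c_data * (1 + 0.01 * np.sin(t_data))   # ~1% 'measurement noise'

def r2(y):
    """R^2 of a straight-line fit of y against t_data."""
    resid = y - np.polyval(np.polyfit(t_data, y, 1), t_data)
    return 1 - np.sum(resid**2) / np.sum((y - y.mean()) ** 2)

r2_first = r2(np.log(c_data))     # first order would make ln(c) linear in t
r2_second = r2(1.0 / c_data)      # second order makes 1/c linear in t
k_fit = np.polyfit(t_data, 1.0 / c_data, 1)[0]  # slope of 1/c vs t equals k
print(f"R2 first-order {r2_first:.3f}, second-order {r2_second:.4f}; k = {k_fit:.3f}")
```

Real mechanisms are rarely settled by a single linearisation, which is why the exercise goes further, breaking the reaction into elementary steps and fitting groups of parameters against several experiments.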

Go on - take 20 minutes to give it a try.  Then share the link with your colleagues so they can start saving time on their development projects.

Saturday, July 15, 2017

How to check the mole balance in your HPLC data and build better kinetic models

We've posted before on the topic of fitting chemical kinetics to HPLC data. Some good experiment planning and design can make this much faster, easier and more informative than a retrospective 'hope for the best' attempt to fit kinetics to experiments coming out of an empirical DOE.

Once the data have been collected from one or two experiments, it's time to check the mole balance. That means checking that your mental model of the chemistry taking place (e.g. A>B>C), to which your DynoChem model will rigorously adhere, is consistent with the data you have collected. There's a nice exercise in DC Resources to take you through this step by step, using chemistry inspired by a reaction on which Mark Hughes and colleagues at GSK have published and presented.

The exercise starts with HPLC area (not area percent) and, after correcting for relative responses, leads directly to a new insight into the reaction, even before the first simulation has been run.  When the modeling and experiments are done alongside each other, such early insight shapes subsequent experiments and makes them more valuable while reducing their number.
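As a hypothetical illustration of the mole balance check for an A > B > C sequence: convert raw HPLC areas to relative moles using relative response factors (RRFs), then watch the total across samples. The areas and RRFs below are invented, and in this made-up data set the total drifts upward, exactly the kind of early signal (a wrong RRF or an unobserved species) the exercise teaches you to spot.

```python
# raw HPLC areas (not area %), one value per sampled time point (invented)
areas = {
    "A": [1000.0, 620.0, 310.0, 95.0],
    "B": [0.0, 410.0, 640.0, 690.0],
    "C": [0.0, 45.0, 160.0, 330.0],
}
# relative response factors: relative moles per unit area, normalised to A = 1
rrf = {"A": 1.0, "B": 0.85, "C": 1.3}

mole_totals = []
for i in range(len(areas["A"])):
    moles = {s: areas[s][i] * rrf[s] for s in areas}  # area -> relative moles
    mole_totals.append(sum(moles.values()))

# for a closed A > B > C sequence the total should stay flat;
# a drifting total flags a wrong RRF or a species missing from the model
for i, total in enumerate(mole_totals):
    print(f"sample {i}: relative moles = {total:.0f} "
          f"({100 * total / mole_totals[0]:.0f}% of initial)")
```

Running this check before any fitting costs minutes and prevents forcing a rigorous kinetic model to chase data that cannot balance.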

We encourage you to take the exercise to learn this important skill and how to build better, more rigorous and more reliable kinetic models.
