© Bioproduction Group. All Rights Reserved.
WHITEPAPER
BIOMANUFACTURING DEBOTTLENECKING AND PROCESS OPTIMIZATION
INTRODUCTION
There are two major unresolved challenges associated with optimizing biomanufacturing operations today. The first is variability: how to understand and improve processes when there is significant variation in process times throughout all unit operations. The second is complexity: modern biomanufacturing facilities are complex and interconnected, with piping segments, transfer panels and valve arrays as well as WFI and other shared resource constraints. This complexity is growing with the need to process higher (and more variable) titers, additional products, and a lack of process standardization.
In such an environment, debottlenecking is becoming increasingly important as a means of quantitative process optimization. The technique allows biomanufacturing facilities to run new products with minimal retrofits, and also increase the run rate of existing legacy products without significant regulatory impact. Debottlenecking therefore allows a biomanufacturing facility to significantly extend its useful life, by making the facility more flexible and efficient. Retrofits – when judiciously applied to the correct areas of the facility – can be an order of magnitude less expensive than building new capacity, and can be made in 1 year or less (versus 4-6 years for a new facility).
In this whitepaper we discuss the impact of variability and complexity on the debottlenecking and optimization process. We show historical approaches to debottlenecking based on resource utilization and why they can lead to incorrect assessments of the real bottlenecks in biomanufacturing facilities. We look at alternatives to collecting data solely from subject matter expert (SME) opinion, which may be a biased estimator of facility performance. Bio-G has observed that SMEs in biomanufacturing facilities have detailed knowledge of one area of the process, but often find it difficult to agree on the bottlenecks between multiple unit operations.
The “gold standard” for debottlenecking methodology today is based around perturbing cycle times or resources in a discrete event simulation model and observing the impact on some performance indicator like cycle time, labor, or throughput. This involves collecting data from process historians (DeltaV, Rockwell, PI, etc.) and performing sensitivity analysis on those cycle times. Debottlenecking is important in both resource-constrained facilities (where increasing throughput has a positive return on investment) and throughput-constrained facilities – where we are seeking to make the same amount of product with increased efficiency and at lower cost. In either case, debottlenecking provides a quantitative method of accurately finding the critical processes in the facility and assessing potential improvements to overall facility performance.
FIGURE 1: SUBJECT MATTER EXPERTS ARE NOTORIOUSLY BAD AT COMPARING FACILITY PERFORMANCE IN THEIR OWN AREA TO THAT IN AREAS OF THE FACILITY THEY ARE LESS FAMILIAR WITH.
INTRODUCTION TO DEBOTTLENECKING
Debottlenecking is the process of improving efficiency by finding the rate-limiting steps in a facility and fixing them. Addressing these rate-limiting steps will improve performance, while changing other (non-rate-limiting) steps will have no impact on performance. Thus, while the complete manufacturing system may have several hundred key activities, only a very small number of them (typically less than a dozen) define the run rate of the facility.
To understand this, consider a simple two-stage production process. In the first stage, a single bioreactor produces material, is cleaned, steamed and prepped for the next batch – a process that takes 300 hours. In the second stage, a single downstream train processes the material produced by the production bioreactor, a process that takes 72 hours.
Since each unit operation can only process one batch at a time, the maximum ‘velocity’, or processing rate, of batches through the first step is one batch every 300 hours (about 12.5 days). The processing rate for the downstream is one batch every 72 hours, or about 3 days – so for the remaining 9.5 days, the purification skid will sit waiting for the bioreactor to complete. Any improvement to the purification train (say, by reducing its cycle time from 72 to 60 hours) will have no impact on throughput or run rate, since the “rate limiting step”, or bottleneck, is the fermentation area. Since the bottleneck is the upstream fermentation process, fixing this step – by either adding additional equipment in parallel or decreasing the cycle time from 300 hours to some smaller number – is the only way to improve throughput.
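The arithmetic above can be sketched as a short calculation. This is a minimal illustration of the serial-line logic (the function and variable names are ours, not Bio-G's):

```python
# Sketch: throughput of the two-stage serial process from Figure 2.
# In a serial line, the run rate is set by its slowest step (the bottleneck).

FERMENTATION_HOURS = 300   # produce + clean + steam + prep
PURIFICATION_HOURS = 72

def batches_per_week(step_hours):
    """Maximum processing rate of a single-unit step, in batches/week."""
    return 7 * 24 / step_hours

line_rate = min(batches_per_week(FERMENTATION_HOURS),
                batches_per_week(PURIFICATION_HOURS))

idle_hours = FERMENTATION_HOURS - PURIFICATION_HOURS  # purification waits
print(f"Line rate: {line_rate:.2f} batches/week")
print(f"Purification idle per batch: {idle_hours} h (~{idle_hours/24:.1f} days)")

# Improving the non-bottleneck step (72 h -> 60 h) changes nothing:
assert min(batches_per_week(300), batches_per_week(60)) == line_rate
```

The final assertion makes the whitepaper's point directly: speeding up purification leaves the line rate untouched, because fermentation still gates every batch.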
Debottlenecking is typically thought of as a two-step process. First, we identify the rate-limiting steps amongst the hundreds or thousands of potential resources and activities in the facility (bottleneck identification). Second, we make changes to those rate-limiting steps – either by adding equipment, reducing cycle time, or some other means – to improve the process (bottleneck alleviation). Both are discussed below.
DATA, COMPLEXITY AND VARIABILITY IN BIOMANUFACTURING
In Figure 3 we show data from a biomanufacturer for CIP times from 2002 to 2008. The data is collected directly from the Manufacturing Execution System, or MES, to provide an unbiased estimate of facility performance. (Bio-G’s software directly integrates with most common control systems in biotech facilities.) Note that there is significant variability in this process step, which takes between 3 and 6 hours. There is also process drift, i.e. the amount of time that it takes to do the activity is not constant from year to year.
FIGURE 2 A TWO-STAGE SERIAL PRODUCTION PROCESS.
“Debottlenecking was critical since it gave our facility a non-biased list of issues that we needed to work on. It wasn’t the loudest engineer that got the most attention: the most important issue was.” Manager, Upstream, Large Scale Biomanufacturing Facility
FERMENTATION (300 HOURS)
PURIFICATION (72 HOURS)
This data suggests that using a single number – say, the average time – to do planning and optimization may not identify the correct bottleneck. Our experience in this area has been that running the same model, with and without variability, produces different bottlenecks. Such an analysis is particularly concerning since finding and fixing bottlenecks can be a capital-intensive process, if retrofitting a validated facility. “Fixing” the wrong area of the facility will not improve run rate, no matter how much improvement in that area is made.
Another issue that must be overcome when doing debottlenecking and process optimization is process complexity. In Figure 4 we show a map of the places in one facility where people interact with the process. A delay in the availability of an FTE, or a piece of shared equipment like a CIP skid, transfer line, valve array or transfer panel, or even WFI or utilities systems, will delay the process. Such times are “hidden” since they are not normally considered in cycle time calculations. These interactions make it difficult even for highly trained operators to understand where hidden cycle times and other waiting periods will cause the facility to slow down.
Bio-G’s software uses discrete event simulation and real-time data feeds to manage both the issues of variability and complexity. Discrete event simulators incorporate variability in their formulation, as well as the constraints around how an activity can start. (Just like an automation system, an activity will not start unless the prerequisite conditions are present). This approach allows very accurate models of biopharmaceutical facilities to be produced, that incorporate variability and “hidden” waiting times.
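A toy version of such a simulator can be hand-rolled in a few lines. This is only a sketch of the general discrete-event idea under our own assumptions (two serial single-unit steps, Gaussian cycle-time variability), not Bio-G's engine:

```python
import random

# Minimal discrete-event sketch: two serial steps, each a single unit.
# As in an automation system, an activity starts only when its
# prerequisites are met: material ready AND the unit free.
random.seed(1)

def simulate(n_batches, ferm_mean, purif_mean, cv=0.0):
    ferm_free = 0.0    # time the fermenter is next available
    purif_free = 0.0   # time the purification skid is next available
    done = 0.0
    for _ in range(n_batches):
        ferm_t = random.gauss(ferm_mean, cv * ferm_mean) if cv else ferm_mean
        purif_t = random.gauss(purif_mean, cv * purif_mean) if cv else purif_mean
        ferm_done = ferm_free + ferm_t
        ferm_free = ferm_done
        start_purif = max(ferm_done, purif_free)  # hidden waiting lives here
        done = start_purif + purif_t
        purif_free = done
    return done  # campaign makespan in hours

det = simulate(20, 300, 72)                       # deterministic cycle times
var = sum(simulate(20, 300, 72, cv=0.15) for _ in range(200)) / 200
print(f"Deterministic campaign: {det:.0f} h; with 15% CV: {var:.0f} h on average")
```

Because starts are gated by a `max()` over prerequisites, variability can only add waiting, never remove it, which is why a deterministic average-time model tends to be optimistic.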
“We never thought variability was important, till we spent $500K on improvements... and got no improvement. It turns out variability was the reason why.” VP and GM of 60KL Biomanufacturing Facility
FIGURE 3: CIP TIMES, 2002 TO 2008
FIGURE 4: PLACES WHERE FTES INTERACT WITH THE PROCESS.
[Figure 3 axes: dates, January 2002 through 2008, vs. CIP time in hours (2–7).]
BOTTLENECK IDENTIFICATION USING RESOURCE UTILIZATION?
A traditional approach to identifying bottlenecks, popular in the 1960s, focused attention on the resources in the facility with the highest utilization. The theory was that the resources with the highest utilization are the “busiest”, and therefore the ones where improvements would have the most impact.
However, such an approach is not guaranteed to locate the real bottlenecks in a facility, and appears from empirical evidence to be particularly ill-suited to biomanufacturing facilities. Consider as an example a facility with one large buffer preparation tank that is used to prepare buffer for the first two chromatography steps in downstream purification.
Because the tank is only used in the first two high-volume steps of the process, its overall utilization is low. We show the utilization periods of the tank in Figure 6 below.
However, using the tank for two sequential chromatography steps may mean that the second step has to wait for the tank to be cleaned and buffer re-prepped. If this process takes longer than the duration of the Protein-A chromatography step, it may cause a delay in the start of the cation exchange step. Thus, even though the tank has an overall utilization of 20%, it delays production and is therefore a bottleneck.
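The arithmetic behind this example can be sketched with hypothetical numbers (the 8-hour Pro-A step, 12-hour tank turnaround and 120-hour cycle below are illustrative, not data from the facility described):

```python
# Sketch: low utilization does not mean "not a bottleneck".
# Hypothetical numbers: the tank preps buffer for Pro-A, is cleaned and
# re-prepped for CEX, and the Pro-A step runs shorter than that turnaround.

PROA_HOURS = 8          # Protein-A chromatography step (illustrative)
TANK_TURNAROUND = 12    # CIP + buffer re-prep between the two steps
CYCLE_HOURS = 120       # overall process cycle the tank sits inside

tank_busy = 2 * TANK_TURNAROUND           # two prep campaigns per cycle
utilization = tank_busy / CYCLE_HOURS     # 24 / 120 = 20%
cex_delay = max(0, TANK_TURNAROUND - PROA_HOURS)  # CEX waits 4 h

print(f"Tank utilization: {utilization:.0%}, CEX start delayed by {cex_delay} h")
```

With these numbers the tank is idle 80% of the time yet still pushes back the cation exchange start by 4 hours every batch, exactly the situation a utilization ranking would miss.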
BOTTLENECK DETECTION
As the example above shows, a simplistic utilization-based approach does not find the true process constraints in a biomanufacturing facility. The current “gold standard” is to use a simulation model to perturb – or make controlled changes to – a model of the facility and observe the results on some metric. One popular approach is to reduce operation times in each process step and observe the impact on throughput. By simulating a facility where a particular process step (say, a CIP duration) takes zero time, we can examine the effect of having that activity “for free”. By repeating this experiment for each and every activity in the facility, we can understand the impact of a potential project to reduce any activity’s cycle time. Such a sensitivity analysis can correctly identify even complex bottlenecks, since it makes an actual change to a model of the system and observes the impact of that change.
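The zero-time perturbation loop can be sketched as follows. The simple serial-line `simulate_throughput` below stands in for a full facility model, and the activities and durations are hypothetical:

```python
# Sketch of the perturbation experiment: zero out each activity's duration
# in turn, re-simulate, and rank activities by the throughput gained.

base_durations = {"media prep": 24, "fermentation": 300,
                  "harvest": 12, "purification": 72, "cip": 6}

def simulate_throughput(durations, horizon_hours=2400):
    # Stand-in model: serial single-unit line, rate set by the slowest step.
    return horizon_hours / max(durations.values())  # batches per horizon

baseline = simulate_throughput(base_durations)
impact = {}
for activity in base_durations:
    perturbed = dict(base_durations)
    perturbed[activity] = 0                 # the activity is now "for free"
    impact[activity] = simulate_throughput(perturbed) - baseline

# Sort like the spike chart: most impactful activities on the right.
for activity, gain in sorted(impact.items(), key=lambda kv: kv[1]):
    print(f"{activity:>14}: +{gain:.2f} batches")
```

In this toy model only zeroing fermentation moves the metric; every other experiment produces a flat spike, mirroring the shape of Figure 7.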
FIGURE 5: BUFFER PREP TANK SUPPORTS THE FIRST TWO PURIFICATION STEPS, PRO-A AND A CATION EXCHANGE STEP.
FIGURE 6: UTILIZATION OF A BUFFER PREP TANK (BUSY AND IDLE PERIODS).
[Figure 5 shows the purification train – Pro-A, CEX, VF, AEX, UF/DF – supported by Buffer Prep Tank A. Figure 6 shows the tank’s busy and idle periods: 20% utilization.]
“We tried other software and got the wrong answers. It turns out, software designed for widget manufacturing or troop rotations just doesn’t cut it in biotech.” Analyst, Industrial Engineering group, top-10 biopharmaceutical company
In Figure 7 we show the output from such a sensitivity analysis. In this analysis, throughput was the metric chosen – so a higher number indicates better performance. Each point in the graph represents an activity, whose cycle time has been reduced to zero hours, and the resulting throughput observed over a 100-day campaign. The activities, or experiments, have been sorted with the most impactful on the right and the least impactful on the left.
This analysis shows that of the several hundred activities in this manufacturing facility, only a very small percentage actually improve throughput. In Figure 8 below we show detail of the four activities that impacted throughput in this model (those on the far right in Figure 7). This approach allows very rapid bottleneck detection and is guaranteed to find real process constraints in a system since it simulates the actual impact of a change to each constraint in turn.
BOTTLENECK EVALUATION AND ALLEVIATION
After identifying bottlenecks, bottleneck alleviation becomes the central focus for analysis. The perturbation analysis above found that the key constraint in the process is the cycle time of the production bioreactor; however, the proposed change (reducing the fermentation time to 0 hours) is not a feasible engineering solution. Fortunately, it may be sufficient to improve the cycle time of the production bioreactor by a smaller amount to achieve the desired result.
In order to quantify how much change is required before an activity or resource is no longer “on the critical path”, we can use a similar form of sensitivity analysis that makes changes to one or a small number of activities or resources but does so at a higher level of granularity. For example, we could choose to reduce the bioreactor’s setup time by 1, 2, 3,… etc. hours and observe the impact of each change on the throughput or cycle time.
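Such a sweep can be sketched as below. The numbers are hypothetical: in this toy model a 250-hour purification constraint takes over once fermentation drops beneath it:

```python
# Sketch: sweep one bottleneck activity at finer granularity to find how
# much reduction is enough to move it off the critical path.

def throughput(ferm_hours, horizon=2400):
    # Hypothetical: a 250 h downstream constraint becomes binding once
    # fermentation drops below it.
    return horizon / max(ferm_hours, 250)

for cut in range(0, 101, 10):        # reduce fermentation by 0..100 hours
    ferm = 300 - cut
    print(f"-{cut:3d} h -> {throughput(ferm):.2f} batches per campaign")
# Beyond a 50 h cut, fermentation is no longer the constraint and further
# reduction buys nothing.
```

The flat tail of the sweep is the quantitative answer to "how much change is required": any investment past the crossover point is wasted.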
FIGURE 8: DETAIL OF PARETO SPIKE CHART.
FIGURE 7: A SPIKE CHART SHOWING THE KEY BOTTLENECKS IN THE FACILITY.
[Figure 7 chart: key performance indicator (mean throughput, 44–50) for each activity’s zero-time experiment, sorted left to right by impact; several hundred activities are flat, with spikes at the far right. Figure 8 detail: the impactful activities are BH T302 CIP, 10. Final UF/DF, Chrome Skid 01 Post Use, 01. ProA Pre-Use, 02. ProA, and 03B. Production Fermentation.]
Such an approach gives engineering teams valuable quantitative information, since it makes explicit the tradeoff between the key performance indicator, or KPI, and the level of change. In the graph above, a 6% reduction in cycle time improves throughput from 44.1 to 45.0 kg – but any further reduction increases throughput by only a little more (0.1-0.6 kg). If 45.0 kg were sufficient throughput, we would not pay any additional money to further reduce the cycle time. This is critical information for evaluating how significant facility modifications and investments will need to be.
ITERATIVE DEBOTTLENECKING
The approach of bottleneck detection, evaluation and removal presented above is typically repeated iteratively in a biomanufacturing facility to provide a series of incremental improvements. As we identify and fix bottlenecks, these items move off the critical path and other activities or resources become the rate-limiting steps.
The process of incrementally debottlenecking a facility – where we find the first bottleneck and fix it, then move on to the second and so on – produces what is known as a ‘debottlenecking pathway’. Such a pathway is a sequence of steps required to fix bottlenecks, along with changes in KPI that are achieved by each of the steps. We outline the results from one such analysis below in Figure 10.
In the debottlenecking pathway shown, a sequence of three engineering changes is made. Each debottlenecking iteration increases the run rate by a di"erent amount (and in general, multiple engineering changes may be required in di"erent areas of the facility to increase the run rate). It is important to note that the order of the pathway is important, i.e. in the diagram above moving to disposables without reducing bu"er prep times will yield no improvement. The pathway must be executed in the order shown to achieve the target run rate.
KEY
PER
FOR
MA
NC
E IN
DIC
ATO
R –
(M
EAN
) TH
RO
UG
HPU
T
45.0
45.2
45.4
45.6
45.8
44.4
44.6
44.8
44.2
44.0-100 -80 -60 -40 -20 0
% CHANGED
6% change,1 run improvement
50% change, 0.6 runadditional benefit
Baseline Run Rate
THR
OU
GH
PUT
(RU
NS
PER
WEE
K)
BASELINE ITERATION 1 ITERATION 2 ITERATION 3
Reduce bu!er prep times
by 10%
Move small scale preps
to disposables
Add another upstream CIP skid
0
1
2
3
FIGURE 9: LEVEL OF CHANGE REQUIRED TO ACHIEVE A TARGET RUN-RATE.
FIGURE 10: DEBOTTLENECKING PATHWAY
“Debottlenecking comes down to this; where do we focus, and how much do we need to change. No more “back of the envelope” calculations and second guessing.” Shift supervisor, bulk biopharmaceutical manufacturing facility
BIOMANUFACTURING DEBOTTLENECKING AND PROCESS OPTIMIZATION
7© Bioproduction Group. All Rights Reserved.
MULTI-FACTOR DESIGN OF EXPERIMENTWhile the approach outlined above is very successful in identifying and fixing bottlenecks in biomanufacturing facilities using an iterative approach, design-of-experiments can also yield similar results and are more suited for highly complex facilities. Design-of-experiments make multiple simultaneous changes to di"erent aspects of a facility, and observe the results of those changes.
In the earlier analysis, we saw that shortening Pro-A and Fermentation cycle times were likely to have the greatest impact on improving run rates. Now we want to see how a combination of improvements of these factors can help us achieve the targets we desire. In Figures 11 and 12, we examine the e"ect of reducing the duration of both of these two possible activities on the critical path simultaneously. Achieving a throughput of 51kg could be achieved through reducing fermentation time alone by 60%. The same result could also be achieved by reducing fermentation time by only 5%, if the Protein A cycle time is also reduced by 10%. As such, this approach can be used to understand the tradeo"s between various possible improvements, and combinations of improvements, which do not aggregate together in a simple manner.
One of the issues with design-of-experiment is that it produces a combinatorial number of scenarios that must be examined. For example, examining two possible factors (say, fermentation time and purification time) requires four scenarios while examining three factors requires 8, four factors requires 16, etc.. The advantage of such an approach is that it allows the explicit tradeo" between factors to be understood, provided the modeling engine can evaluate the scenarios in an automated manner. Bio-G’s Real-time Modeling System typically runs thousands or tens of thousands of such scenarios as part of a single debottlenecking exercise. In general, many factors are first analyzed for potential e"ect (using a screening design), and then a more detailed analysis is performed.
FIGURE 11: SURFACE PLOT COMPARING IMPROVEMENTS IN PRODUCTION FERMENTATION AND PROTEIN-A CYCLE TIME.
FIGURE 12: HEAT MAP COMPARING IMPROVEMENTS IN PRODUCTION FERMENTATION AND PROTEIN-A CYCLE TIME.
“Complex analysis, powerful engine, simple outputs. That’s what Bio-G’s software is. We love design-of-experiments, and that’s not something I ever dreamed I’d say.” Bio-G Customer Advisory Board Meeting
BIOMANUFACTURING DEBOTTLENECKING AND PROCESS OPTIMIZATION
7© Bioproduction Group. All Rights Reserved.
FEEDBACK Please provide your feedback at http://www.zoomerang.com/Survey/WEB22AYVM3VLFC
FURTHER READINGJohnston, Zhang (2009). Garbage In, Garbage Out: The Case for More Accurate Process Modeling in Manufacturing Economics, Biopharm International, 22:8
MORE INFORMATIONBIOPRODUCTION GROUP [email protected] WWW.BIO-G.COM
THE FUTURE OF BIOMANUFACTURING OPERATIONSBiomanufacturing facilities are being asked to produce increasingly high titer products, in shorter campaigns with a larger mix of products. The challenge in such an environment is to understand what the real capacity of the facility is, where the critical process constraints are, and what can be done to improve those processes.
The best-in-class biomanufacturing facilities Bio-G works with understand that bottlenecks are dynamic and move, as a facility evolves. Repeating debottlenecking analyses monthly, aligns manufacturing and operational excellence groups around a single goal. Those on the manufacturing floor, managers and the plant head can focus on the key areas of the process that are critical, and they avoid spending time, energy and investment in areas which do not impact performance. This allows for decision-making on the basis of quantitative data from models that incorporate the variability and complexity of the facility. The approach delivers continual improvement guided by factual observation, rather than SME opinion. This means biomanufacturing facilities that implement debottlenecking and process optimization using the methodologies described in this whitepaper, will consistently out-perform those that use more traditional approaches.