After upstream bioprocessing, such as developing a cell line or microorganism that can produce an intended product, a bioprocessor turns to downstream steps to capture and purify that product, as well as subject it to quality control (QC). Upstream and downstream processes both contain bottlenecks; in some cases, upstream improvements can even exacerbate downstream ones. GEN asked several experts to describe the most important downstream bottlenecks and how they address them.
“The first step is to identify the bottleneck and its root cause,” write Cytiva’s Silke Steinhilber, PhD, director of product management, filtration, hardware, and consumables, and Ruth de la Fuente Sanz, MBA, senior product manager of marketing, in a joint email. Cytiva is a global provider of technologies and services related to the development and manufacture of therapeutics, with hubs in Marlborough, MA; Amersham, U.K.; Uppsala, Sweden; and Shanghai, China.
“The most common bottlenecks emerge where complexity intersects with speed—typically in purification and QC testing,” says Dexter Evans, MBA, vice president, global operations, biosciences, at Thermo Fisher Scientific, which is a global provider of scientific instrumentation, reagents and consumables, and software and services, headquartered in Waltham, MA. “These challenges often stem from factors such as workforce availability, material flow, variability in product quality, or equipment performance.”
Downstream bottlenecks often stem from one of two constraints: capacity or throughput. “If it is a capacity constraint due to an increase in demand, one possibility is to perform a manufacturing capacity constraint analysis to determine where we would need to add or reorganize resources at the site,” note Steinhilber and de la Fuente. “If it is a throughput constraint from an increased titer in upstream, from process development, or a new product, then the team would need to look into the process bottlenecks and either scale up with a larger amount of resin or filter area, or increase the number of cycles, scale out, or explore the use of newer resins and filters with higher capacity.”
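To make that throughput math concrete, here is a minimal sketch, with hypothetical numbers and an illustrative helper function, of the kind of back-of-the-envelope sizing such a team might run: given a harvest volume, a titer, and a resin’s dynamic binding capacity, how many capture cycles does one batch need?

import math

def capture_cycles(harvest_volume_l, titer_g_per_l, dbc_g_per_l_resin,
                   column_volume_l, safety_factor=0.8):
    """Estimate chromatography cycles needed to process one harvest.

    dbc_g_per_l_resin is the resin's dynamic binding capacity (g product per
    L resin), derated by a safety factor so the column is not run to breakthrough.
    """
    product_mass_g = harvest_volume_l * titer_g_per_l
    capacity_per_cycle_g = column_volume_l * dbc_g_per_l_resin * safety_factor
    return math.ceil(product_mass_g / capacity_per_cycle_g)

# Hypothetical example: titer rises from 3 g/L to 5 g/L on a 2,000 L harvest,
# processed on a 40 L column of resin that binds ~50 g of product per L of resin.
print(capture_cycles(2000, 3.0, 50.0, 40.0))  # 4 cycles per batch
print(capture_cycles(2000, 5.0, 50.0, 40.0))  # 7 cycles per batch

In this toy case, the upstream improvement alone pushes the capture step from four cycles to seven per batch, which is exactly the kind of shift that forces a choice between more resin, more cycles, or higher-capacity consumables.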
Overall, a downstream bottleneck can slow a process or even stop it. “We treat these as operational signals—such as an accumulation of work-in-process—that prompt immediate response,” says Evans. “Once the affected step is stabilized, we conduct a root-cause investigation to help prevent recurrence.” As he adds, “Our approach focuses on viewing the system holistically—identifying constraints and restoring flow.”
Production depends on purification
Various biotherapies, such as gene therapies and RNA vaccines, depend on plasmid DNA (pDNA), a piece of double-stranded DNA joined end to end to form a circle. Many species of bacteria, most often E. coli, can be used to make pDNA. So, bacteria transformed to carry the desired pDNA are grown, but then the pDNA must be isolated.
“Growing is easy, because bacteria are good at that,” says Angelica Meyer, PhD, director of manufacturing, science, and technology at Aldevron, which is a global contract development and manufacturing organization (CDMO). “Purification is the biggest challenge.”
Somehow, a bioprocessor needs to get the pDNA out of the bacterial cells. Chemicals can be used to disrupt the cell wall, or pressure can be applied, and “everything explodes out,” Meyer says. Either way, the process produces a mishmash of pieces of cell wall and all of the molecular machinery from inside the cells, including the desired pDNA.
“We want to make sure—most importantly—that what leaves our shop is good and clean and doesn’t have immune-triggering molecules, like endotoxins,” Meyer says.
Scientists at Aldevron remove much of the cellular debris with one chromatographic process, and then a second chromatographic process further purifies the pDNA. Those steps, though, also remove some of the product. That’s a bottleneck, because a good recovery rate after all purification steps is currently around 50%. Meyer would like to see that increase to a recovery rate in the 80 or 90% range. “I think that that’s realistic, and that’s extremely achievable,” she says. The ultimate goal, according to Meyer, is: “Being able to make more product quicker, cheaper, faster, and keeping it at high recoveries at all steps is going to contribute to driving costs down.”
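One way to see why those last percentage points matter: overall recovery is the product of every step’s yield, so losses compound. A toy calculation, using hypothetical per-step yields rather than Aldevron’s actual figures, illustrates the gap Meyer describes.

def overall_recovery(step_yields):
    """Overall recovery is the product of the yields of every purification step."""
    total = 1.0
    for step_yield in step_yields:
        total *= step_yield
    return total

# Three purification steps at ~80% yield each already land near 50% overall...
print(f"{overall_recovery([0.80, 0.80, 0.80]):.0%}")  # 51%
# ...while roughly 95% per step is needed to reach the 80-90% range Meyer targets.
print(f"{overall_recovery([0.95, 0.95, 0.95]):.0%}")  # 86%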

Recovering as much product as possible, making it as pure as possible, running it at high throughput, and keeping costs down, though, is very challenging. “Even though we have a really robust process, the materials that we use in that process are very expensive, and it’s not one material,” Meyer says. “It’s hundreds of materials.” That creates hundreds of opportunities to run into a bottleneck. “So, we’re tying together the entire logistics chain to make all of this possible, and our suppliers have to make sure that they’re getting stuff to us on time,” Meyer says.
If one supplier somewhere in the chain fails to deliver a material on time, the entire purification process could stop. “We have a very, very large warehouse where we keep as much as we can on hand, but we can’t anticipate everything,” Meyer says.
No matter how a bioprocessor purifies a product, “we develop a robust process so that we are guaranteed to make purity really high,” Meyer says.
Maps and more reveal opportunities
In addition to advances in purification tools and processes, some bioprocessors turn to other techniques.
For example, a value stream map (VSM) is a diagram of a process, from start to finish, that points out the steps that add value and the ones that don’t. At Thermo Fisher, for example, VSM “and the theory of constraints provide insight into where effort may not be translating into customer value,” Evans explains. “We routinely use VSM to analyze workflows and identify opportunities for improvement.” In parallel, says Evans, “we leverage tools like throughput operating maps during daily accountability meetings to detect and address process disruptions in near real-time.”
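As a rough illustration of what a value stream map and the theory of constraints quantify, consider the sketch below. The step names, times, and capacities are hypothetical, and the code is only a toy version of the idea: compare value-added time to total lead time, and find the step whose capacity caps the whole line.

from dataclasses import dataclass

@dataclass
class Step:
    name: str
    process_time_h: float            # value-added time per batch
    wait_time_h: float               # queue/handover time before the step
    capacity_batches_per_week: float

steps = [
    Step("capture chromatography", 8, 12, 10),
    Step("polish chromatography", 6, 24, 8),
    Step("ultrafiltration/diafiltration", 5, 6, 12),
    Step("QC release testing", 10, 72, 5),
]

lead_time = sum(s.process_time_h + s.wait_time_h for s in steps)
value_added = sum(s.process_time_h for s in steps)
constraint = min(steps, key=lambda s: s.capacity_batches_per_week)

print(f"Process cycle efficiency: {value_added / lead_time:.0%}")  # ~20%
print(f"Constraint step: {constraint.name}")                       # QC release testing

In this made-up example, most of the lead time is waiting rather than processing, and the lowest-capacity step (QC release testing) sets the pace of the whole process, which is where the theory of constraints says improvement effort should go first.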
As Cytiva’s two experts point out: “Value stream mapping can be useful for any process that has multiple handovers, because it provides valuable information on time spent waiting between teams, non-added value time, and where waste can be eliminated.”
VSMs make up just one approach to reducing downstream bottlenecks. Another tool is a time and motion study (TMS), often just called a time study or a motion study, which is pretty much just what it sounds like: looking at movements involved in a process and how long they take.
A TMS can play a role in a business strategy known as lean management, which aims to continually optimize processes by reducing waste and maximizing efficiency. At the heart of lean management is Gemba, a Japanese word for “the actual place.” In bioprocessing, for example, Gemba could be a QC lab.
“Any lean tool that provides Gemba insights is a good tool to use in a manufacturing environment,” Steinhilber and de la Fuente say. “Time and motion studies can be used to build accurate schedules, identify capacity bottlenecks or tasks with a high risk of error, and help standardize the work.”
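A toy example, again with hypothetical tasks and timings, shows the kind of summary a time and motion study feeds into scheduling and standardization decisions: long average times point to capacity bottlenecks, and wide spreads point to tasks worth standardizing or automating.

from statistics import mean, stdev

# Minutes observed for each manual task across several production runs (hypothetical)
observations = {
    "buffer preparation":    [42, 45, 39, 51],
    "manual column packing": [95, 120, 88, 130],
    "in-process sampling":   [14, 15, 13, 16],
}

for task, times in observations.items():
    print(f"{task:22s} mean={mean(times):6.1f} min  spread={stdev(times):5.1f} min")

# Long mean times feed capacity planning; wide spreads flag candidates for
# standardized work or automation.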
Although Thermo Fisher looks at insights from time-and-motion studies to support process planning, the company’s “approach to addressing bottlenecks focuses on systemic signals rather than stopwatch metrics,” Evans says. “Bottlenecks are usually self-evident through disruptions in flow, and we often combine operator feedback with process data to understand the root cause.” Emphasizing Thermo Fisher’s holistic approach to addressing downstream bottlenecks, Evans adds, “We prioritize structural solutions tied to staffing, material, quality, and equipment—not just isolated time measurements.”
Putting plans in action
The wide variety of potential bottlenecks in downstream bioprocessing forces scientists to find creative solutions. In that effort, a lean-management strategy keeps companies focused on continually improving processes.
One fairly common bottleneck arises from manually packing chromatography columns, because it is time-consuming. Moreover, manually packed columns can decrease reproducibility, efficiency, and yields. “Setting up an automated column-packing procedure reduces this bottleneck,” Steinhilber and de la Fuente say. “It can be implemented quickly, reduces the risk for failure, and frees up time for the operators.”
Such a conversion from manual to automated packing, though, is only easy if the technology is readily available, and it is. As Steinhilber and de la Fuente explain: “Cytiva is offering a column packing solution with the ÄKTA process CFG, AxiChrom columns, AxiChrom Column Controller, and a BioProcess Resin Mixer to streamline the packing procedure and increase reproducibility and yields.”
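Whether a column is packed manually or automatically, packing quality is commonly judged with a tracer pulse test. The sketch below uses hypothetical numbers and standard textbook formulas for plate count, HETP, and peak asymmetry; it is illustrative only, not a description of any vendor’s procedure.

def plate_count(retention_time_s, width_half_height_s):
    """Plates from a tracer pulse (half-height method): N = 5.54 * (tR / w_half)^2."""
    return 5.54 * (retention_time_s / width_half_height_s) ** 2

def hetp_cm(bed_height_cm, plates):
    """Height equivalent to a theoretical plate: lower is better."""
    return bed_height_cm / plates

def asymmetry(leading_width_s, tailing_width_s):
    """Asymmetry factor at 10% peak height: As = tailing / leading half-width."""
    return tailing_width_s / leading_width_s

# Hypothetical pulse-test readings on a 20 cm bed
N = plate_count(retention_time_s=300, width_half_height_s=20)
print(f"Plates: {N:.0f}, HETP: {hetp_cm(20, N):.3f} cm, As: {asymmetry(8, 10):.2f}")

Consistent plate counts and asymmetry factors from column to column are one practical measure of the reproducibility gains that automated packing is meant to deliver.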
Recently, an unexpected surge in demand created a downstream bottleneck in a bioproduction process at Thermo Fisher. “Work orders began accumulating at a critical production step,” Evans says. “Through the use of VSM and the theory of constraints, we identified that the issue stemmed from resource limitations and downstream-testing inefficiencies.” To resolve this bottleneck, Evans says, Thermo Fisher transitioned “from a batch-based approach to a more dynamic, team-driven flow model, which increased throughput and enabled more consistent coordination with quality testing.”
Consistency is crucial when battling any downstream bottleneck in a bioprocess. That is, the process must run robustly and churn out the same pure product run after run. The bioprocessing industry, though, is still far from consistent, because advances in therapeutics never stop. So, downstream bottlenecks won’t stop either, but experts in this field keep coming to work, taking on new challenges, and finding solutions—just what the entire healthcare system needs.