
What Is Quality by Control (QbC) and How It Works

Updated 04 Mar 2026 · 14 min read
Dobrin Kolarov Healthcare Business Analyst
Key takeaways
  1. QbC is QbD made active. QbD defines the operating space; QbC keeps the process inside it under real manufacturing conditions.
  2. Three control layers do distinct work. Layer 0 stabilises equipment. Layer 1 measures and corrects CQAs in real time. Layer 2 predicts, optimises, and estimates unmeasured quality attributes.
  3. Unified data is a prerequisite, not a nice-to-have. Control models trained on fragmented sources carry blind spots that produce wrong corrections when conditions shift.
  4. Model dependency is the critical vulnerability. Layer 2 performance degrades outside the conditions it was trained on. Model maintenance is a continuous, GMP-regulated operational burden.
  5. Applicability is not universal. Continuous processes are the strongest candidates. Batch processes with short cycles and low variability offer a weaker return on the investment required.

Quality by Control (QbC) represents an advanced manufacturing paradigm that extends the Quality by Design (QbD) framework through active, closed-loop process control. Rather than relying on end-of-batch testing, QbC employs real-time measurements of critical quality attributes (CQAs) to continuously adjust process parameters and prevent deviations before they become defects. This article provides an evidence-based overview of QbC’s core principles, technical architecture, regulatory context, and demonstrated benefits alongside a frank assessment of its substantial implementation challenges. QbC is not universally applicable, and its successful deployment requires deep technical expertise, significant capital investment, and careful regulatory navigation.

What is Quality by Control (QbC)?

The pharmaceutical industry’s traditional approach to manufacturing quality has long relied on end-of-batch testing: samples are collected after production, analysed in off-line laboratories, and batches are either released or rejected based on results that arrive hours or days after manufacturing is complete. While this approach satisfies baseline regulatory requirements, it carries significant disadvantages: quality problems are discovered too late to correct, rejected batches waste materials and capacity, and the process provides no mechanism for real-time correction of process drift.

Process Analytical Technology (PAT) guidance from the FDA, issued in 2004, marked an early inflection point, encouraging manufacturers to deploy inline, online, and at-line measurement tools to gain real-time process understanding.[1] The subsequent ICH Q8–Q11 guidelines formalised the Quality by Design framework, requiring pharmaceutical developers to identify critical quality attributes, map their relationships to process parameters, and define a scientifically justified design space.[2,3]

Quality by Control emerges from this foundation as its active control counterpart: while QbD defines where to operate, QbC ensures the process actually stays there, even under raw material variability, equipment drift, and environmental perturbation. This article examines what QbC is, how it is implemented, where it has demonstrated value, and where its limitations must be taken seriously.

From Quality by Design to Quality by Control


Quality by Design: the static foundation

Quality by Design, formalised in pharmaceutical development through ICH Q8 (Pharmaceutical Development), Q9 (Quality Risk Management), Q10 (Pharmaceutical Quality System), and Q11 (Development and Manufacture of Drug Substance), establishes that product quality should be designed in from the outset rather than tested in at the end.[2,3]
The framework requires manufacturers to define a Quality Target Product Profile (QTPP), identify CQAs and the Critical Material Attributes (CMAs) and Critical Process Parameters (CPPs) that influence them, and characterise their relationships, typically via Design of Experiments (DoE).

The output of a QbD exercise is a Design Space: the multidimensional combination of CMAs and CPPs within which the product reliably meets its CQA targets. Operations within this space do not ordinarily require regulatory reporting. QbD is thus essentially a risk-based, knowledge-intensive map of the manufacturing process.
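
In the simplest case, a design space can be sketched in code as a set of per-parameter acceptance ranges with a membership check. The parameter names and limits below are purely illustrative; real design spaces are usually multivariate, with interactions that box constraints cannot capture:

```python
# Minimal sketch of a design space as per-parameter acceptable ranges.
# Parameter names and limits are illustrative, not from any real filing.
DESIGN_SPACE = {
    "granulation_moisture_pct": (1.5, 3.0),
    "compression_force_kN": (8.0, 14.0),
    "blend_speed_rpm": (150, 250),
}

def in_design_space(operating_point: dict) -> bool:
    """Return True if every critical parameter lies within its range."""
    return all(
        lo <= operating_point[name] <= hi
        for name, (lo, hi) in DESIGN_SPACE.items()
    )

ok = in_design_space({"granulation_moisture_pct": 2.1,
                      "compression_force_kN": 10.5,
                      "blend_speed_rpm": 200})    # all parameters in range
bad = in_design_space({"granulation_moisture_pct": 3.4,
                       "compression_force_kN": 10.5,
                       "blend_speed_rpm": 200})   # moisture above its limit
```

The point of the sketch is the asymmetry QbC addresses: the check tells you whether a point is inside the space, but nothing in QbD itself keeps the process there.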

The limitation QbC addresses

QbD’s design space is, however, a static construct. Once defined, it assumes that the process will be operated within it, but this assumption does not hold automatically in a real manufacturing environment. Raw material properties vary between suppliers and batches, equipment performance drifts over time, and environmental conditions (temperature, humidity) fluctuate. Without active control, processes may drift outside their design space without detection until the next at-line or off-line measurement.[4]

QbC addresses this gap directly. QbC is a systematic, quantitative approach to designing and implementing active control architectures that maintain process operation within the design space under closed-loop conditions.[5]

Where QbD provides the map, QbC provides the navigation system.

Technical Architecture of QbC

QbC implementations in pharmaceutical manufacturing typically follow the ISA-95 Enterprise–Control System Integration Standard, which organises control functions into a hierarchy of layers.[6] Three layers are most relevant to pharmaceutical QbC:

Layer 0: Unit operation controls
Function: maintains basic equipment setpoints.
Primary tools: programmable logic controllers (PLCs).
Controls or enables: temperature, pressure, and motor speed at individual unit operations (blenders, presses, dryers).

Layer 1: Real-time CQA monitoring
Function: measures CQAs inline and feeds feedback controllers.
Primary tools: NIR spectroscopy, Raman spectroscopy, laser diffraction.
Controls or enables: blend uniformity, moisture content, particle size, API concentration; enables real-time release testing (RTRT).

Layer 2: Model-based control and optimisation
Function: uses process models to predict CQA trajectories and compute optimal control moves.
Primary tools: MPC, RTO, soft sensors, digital twins.
Controls or enables: multivariable process interactions and constraints; throughput optimisation; estimation of unmeasured CQAs.

Key distinction: QbC is not simply the deployment of PAT tools. It is the integration of real-time measurement with active feedback control: the ability to measure a deviation and automatically correct it, not just detect it.
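
As a rough illustration of what "measure and automatically correct" means at Layer 1, the following sketch runs a discrete PI loop in which an inline moisture reading adjusts dryer power. The gains, the one-line process model, and every number here are illustrative assumptions for demonstration, not a validated control design:

```python
# Minimal sketch of a Layer 1 feedback loop: an inline moisture reading
# (e.g. from an NIR probe) drives a discrete PI correction to dryer power.
# All gains and the toy process model are illustrative assumptions.

TARGET = 2.0          # moisture CQA setpoint, % w/w
KP, KI, DT = 5.0, 2.0, 1.0

moisture = 3.2        # process starts wet of target (raw material shift)
power_base = 50.0     # nominal dryer power, %
integral = 0.0
power = power_base

for _ in range(200):
    error = moisture - TARGET                       # measured CQA deviation
    integral += error * DT
    power = max(0.0, min(100.0, power_base + KP * error + KI * integral))
    # Toy first-order dryer: moisture relaxes toward an equilibrium that
    # falls as power rises; the 4.5 intercept mimics a feed disturbance.
    equilibrium = 4.5 - 0.04 * power
    moisture += DT * 0.05 * (equilibrium - moisture)
```

Run open-loop at the nominal 50% power, this toy process would settle near 2.5% moisture; with the loop closed, the integral term walks power up toward roughly 62.5% and holds the CQA at target despite the disturbance.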

The data problem QbC doesn’t solve on its own

QbC’s control architecture is only as good as the data feeding it. In practice, that’s where most implementations run into trouble before the first feedback loop closes.

A pharmaceutical site generates process data from historians, MES, LIMS, ERP systems, and inline sensors. Each was built for a specific purpose, validated in isolation, and governed by its own data structure. The result is disconnected environments, each technically compliant, collectively unusable for real-time multivariate control.

Layer 2 model predictive control and soft sensing require simultaneous access to process parameters, quality measurements, material attributes, and equipment state. That data doesn’t live in one place. When a control model pulls from fragmented sources, it’s working with an approximation—and approximations make poor inputs for closed-loop corrections.

What unified data actually enables

Data unification in GxP isn’t a one-time migration. It’s the ongoing, compliant aggregation of data from every system involved in production—with full audit trail, ALCOA+ integrity, and real-time queryability across sources.

At Layer 1, PAT measurements need time-alignment with upstream parameters to distinguish a genuine CQA deviation from sensor drift. Without integrated data, those look identical. At Layer 2, MPC models trained on historian data alone, without material attribute records or batch genealogy, carry blind spots that produce wrong control moves exactly when conditions shift.
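
The time-alignment step can be illustrated with a minimal "as-of" join that pairs each PAT reading with the most recent upstream parameter sample at or before it. Stream names, timestamps, and values are invented for the example:

```python
# Sketch of the time-alignment step: pair each inline PAT reading with the
# most recent upstream parameter sample at or before it (an "as-of" join).
# Timestamps are seconds into the run; all names and values are invented.
import bisect

upstream = [(0, 41.8), (30, 42.1), (60, 44.9), (90, 45.2)]  # (t, granulator torque)
pat = [(12, 2.01), (47, 2.03), (71, 2.40)]                  # (t, NIR moisture, %)

up_times = [t for t, _ in upstream]

def align(pat_readings):
    """Attach to each PAT reading the latest upstream value at or before it."""
    out = []
    for t, cqa in pat_readings:
        i = bisect.bisect_right(up_times, t) - 1  # latest upstream index <= t
        out.append((t, cqa, upstream[i][1] if i >= 0 else None))
    return out

aligned = align(pat)
# The moisture jump at t=71 lines up with the torque step at t=60, pointing
# to an upstream cause rather than sensor drift.
```

Without this pairing, the moisture excursion and a drifting sensor produce the same trace; with it, the upstream correlate is visible in the same record.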

One documented ultrafiltration case is instructive. Over 100 parameters spread across Oracle, DeltaV, and LIMS were involved in a recirculation problem affecting nearly half of all batches. Centralising and aligning those sources was the first step. Only then could multivariate analysis isolate the single variable driving the deviation. The data infrastructure wasn't a supporting detail; it was the prerequisite.

Where this leaves implementation teams

Pharma’s silo problem isn’t accidental. Validation requirements and system-specific access controls have historically made cross-system integration risky, so each system was validated in isolation. The consequence is an environment where the data exists but can’t be used across systems without significant effort.

A GxP-compliant data integration layer, one that ingests from historians, MES, QMS, ERP, and sensors simultaneously and makes that data queryable in real time, isn’t optional infrastructure for QbC. It’s what the control system stands on. Without it, Layer 2 models train on incomplete histories, Layer 1 controllers can’t separate signal from noise, and the model maintenance burden discussed below becomes unmanageable.

Organisations rightly focus QbC planning on PAT selection, model design, and regulatory strategy. The data integration question needs to sit alongside those—not after them.

QbC Benefits and Use Cases

Continuous tablet press

One study demonstrated a QbC implementation on a continuous rotary tablet press, integrating MPC with steady-state data reconciliation.[11] The closed-loop system significantly reduced tablet weight variation and controlled main compression force more tightly than open-loop operation, even under deliberate perturbations in powder compressibility. This study provided one of the clearest early demonstrations that MPC-based QbC delivers measurable quality improvement in a GMP-relevant unit operation.

Integrated filtration–drying

Another study developed a multilayered QbC framework for a continuous carousel performing integrated filtration and drying of crystallisation slurries.[12] Benchmarked against open-loop operation, the closed-loop system increased the proportion of acceptable-quality product by approximately 40% under normal operating conditions and by up to 600% under abnormal conditions. The model-based Layer 2 contributed an additional 10–30% yield improvement over simpler Layer 1 control – a compelling quantitative argument for the investment in advanced modelling.

Other reported benefits

  • Reduced batch rejection rates and raw material waste through real-time deviation correction
  • Faster technology transfer and scale-up, as mathematical process models replace empirical batch-by-batch learning[4]
  • Reduced cycle times through real-time release, eliminating post-production quarantine periods pending laboratory results[8]
  • Greater process resilience to raw material variability, important as pharmaceutical supply chains span multiple global suppliers[5]

Claims of specific time-to-market improvements (e.g., frequently cited figures of 20–30%) should be treated cautiously, as they aggregate across very different process types and company contexts and are not consistently supported in peer-reviewed literature.

Critical Assessment: What QbC Does Not Resolve

QbC is a genuinely significant technical advance, but several substantive limitations and risks are frequently under-discussed in promotional literature. Practitioners should assess these carefully before committing to implementation.

Applicability is primarily to continuous manufacturing

QbC’s control architecture is most naturally suited to continuous manufacturing processes, where material flows steadily and real-time feedback can act on it continuously. The majority of pharmaceutical solid dosage manufacturing globally remains batch-based. Retrofitting sophisticated closed-loop control to legacy batch processes is technically feasible but considerably more complex, and the value proposition is less clear-cut where batch cycles are short and process variability is already well characterised.[13]

Model dependency is the critical vulnerability

Layer 2 QbC depends entirely on process models for MPC, RTO, and soft sensors that must accurately represent the real system. Models are calibrated against historical data and are valid within the envelope of conditions they were trained on. When the process encounters conditions outside that envelope (a new raw material supplier, equipment replacement, seasonal environmental changes), models can produce confidently incorrect predictions, potentially driving the control system away from, rather than toward, the target CQAs.[9]
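
A minimal guard against this failure mode is to record the range each input variable covered during calibration and flag anything outside it as extrapolation. The variable names below are illustrative, and production systems typically use multivariate statistics (e.g. Hotelling's T²) rather than per-variable boxes:

```python
# Sketch of a per-variable training-envelope check: flag inputs that fall
# outside the range the control model was calibrated on, so predictions
# there are treated as extrapolation rather than trusted control inputs.
# Variable names and data are illustrative.

training_data = {
    "api_particle_size_um": [45.0, 52.0, 48.5, 50.1, 47.2],
    "ambient_humidity_pct": [30.0, 35.5, 33.2, 31.8, 34.0],
}

# The envelope is simply the min/max each variable covered in calibration.
envelope = {name: (min(vals), max(vals)) for name, vals in training_data.items()}

def extrapolating(inputs: dict) -> list:
    """Return the variables for which the model would be extrapolating."""
    return [name for name, (lo, hi) in envelope.items()
            if not lo <= inputs[name] <= hi]

# A new supplier shifts particle size outside anything the model has seen:
flags = extrapolating({"api_particle_size_um": 63.0,
                       "ambient_humidity_pct": 32.0})
```

A non-empty flag list is exactly the situation described above: the model may still return a confident number, but the control system should treat it as unreliable.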

PAT tool reliability in practice

Inline spectroscopic tools (NIR, Raman) are subject to sensor drift, fouling, window deposition, and calibration degradation. In a conventional QbD context, a failed PAT measurement means a delayed result; in a QbC context, it means the feedback control loop loses its primary input signal. Robust fault detection and failsafe logic, including clearly defined responses to sensor failure, must be engineered from the outset and are often underspecified in early implementations.
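
One simple form of such failsafe logic is a stuck-sensor check: if a reading shows essentially no variation over a window, the input is declared unhealthy and the controller can hold its last output rather than act on a dead signal. The threshold and window length below are illustrative assumptions:

```python
# Sketch of failsafe logic around a PAT input: a stuck reading (near-zero
# variation over a rolling window) is treated as a sensor fault. The window
# length and tolerance are illustrative assumptions.
from collections import deque

WINDOW, STUCK_TOL = 5, 1e-6

class GuardedInput:
    def __init__(self):
        self.recent = deque(maxlen=WINDOW)

    def update(self, reading: float) -> bool:
        """Record a reading; return True if the sensor still looks healthy."""
        self.recent.append(reading)
        if len(self.recent) < WINDOW:
            return True                      # not enough history to judge yet
        return max(self.recent) - min(self.recent) > STUCK_TOL

guard = GuardedInput()
healthy = [guard.update(r) for r in
           [2.01, 2.03, 1.99, 2.02, 2.02, 2.02, 2.02, 2.02]]
# The trailing run of identical values trips the stuck-sensor check.
```

A real implementation would pair checks like this (stuck value, drift, out-of-range, spectral quality) with an explicit fallback policy: hold last output, revert to a conservative setpoint, or alarm for operator intervention.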

Regulatory validation burden

The ICH Q8–Q11 guidelines provide a conceptual foundation for QbC, and both the FDA and EMA have expressed support for continuous manufacturing and advanced process control. However, the regulatory pathway for approving a real-time adaptive control system, particularly one whose algorithms may be updated over time, remains more complex than the supportive regulatory language sometimes implies. Each MPC model update or control algorithm revision may require a Prior Approval Supplement or equivalent, depending on jurisdiction and the nature of the change. Manufacturers should engage regulators early and plan for extended validation timelines.

Expertise and cost barriers are substantial

Implementing QbC requires simultaneous competence in process engineering, advanced control theory, spectroscopic methods, data science, GMP validation, and regulatory affairs: a combination that is scarce and expensive. For small and mid-sized manufacturers, the investment in personnel, technology, and validation may not be recoverable at their production volumes. Shared technology platforms and contract development organisations with QbC expertise are emerging, but the market remains immature.

QbC Regulatory Context


Regulation / guidance | Issuing body | Relevance to QbC
ICH Q8 (R2) | ICH | Provides the QbD framework, including the Design Space, within which QbC operates
ICH Q9 (R1) | ICH | Quality Risk Management tools for identifying and mitigating risks in process control systems
ICH Q10 | ICH | Pharmaceutical Quality System; includes expectations for continual improvement and process monitoring that QbC supports
PAT Guidance (2004) | FDA | Foundational document supporting inline measurement and real-time process understanding
Emerging Technology Applications Guidance (2015) | FDA | Establishes a pre-submission engagement pathway for manufacturers developing novel approaches, including QbC
Reflection Paper on Continuous Manufacturing (2021) | EMA | Addresses real-time release testing and control strategy requirements for continuous processes
ISA-88 / ISA-95 | ISA | Not regulatory requirements, but widely accepted architectural frameworks that regulators recognise and that simplify control system documentation in submission packages

What to consider when implementing QbC?

Organisations considering QbC implementation should address the following in sequence:

Process suitability assessment

Not every process is a candidate for QbC. Continuous processes with well-characterised, measurable CQAs and sufficient production volume to justify the investment are the strongest candidates. Batch processes with short cycle times and inherently low variability may offer poor ROI.

Measurement system validation

PAT methods must be validated for accuracy, precision, robustness, and fault detection before they can anchor a closed-loop control strategy. Method transfer across sites or equipment generations adds further validation burden.

Model development and qualification

Control models must be developed with sufficient data coverage of the expected operating space, qualified against independent test sets, and accompanied by clear retraining and update procedures. The model’s domain of validity must be explicitly documented.
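
One simple shape such an update procedure can take: compare each soft-sensor prediction against the corresponding off-line laboratory result and flag the model for review when a rolling mean absolute error exceeds a pre-defined limit. Both the limit and the window below are illustrative assumptions:

```python
# Sketch of ongoing model performance monitoring: each soft-sensor
# prediction is logged against the matching off-line lab result, and the
# model is flagged for review when the rolling mean absolute error (MAE)
# exceeds a pre-defined limit. Limit and window are illustrative.
from collections import deque

MAE_LIMIT, WINDOW = 0.15, 4

errors = deque(maxlen=WINDOW)

def record(pred: float, lab: float) -> bool:
    """Log a prediction/lab pair; return True if the model needs review."""
    errors.append(abs(pred - lab))
    return len(errors) == WINDOW and sum(errors) / WINDOW > MAE_LIMIT

# Predictions start tracking the lab values, then diverge:
pairs = [(2.00, 2.02), (2.05, 2.01), (2.10, 1.98), (2.44, 2.00), (2.45, 2.01)]
review = [record(p, l) for p, l in pairs]
```

In a GMP setting the trigger itself, the limit, and the response (retrain, revalidate, or revert to a conservative control mode) would all be documented as part of the model lifecycle procedure.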

Regulatory strategy

Engagement with regulatory agencies through emerging technology programmes (FDA) or scientific advice procedures (EMA) before submission is strongly advisable. The control strategy documentation must clearly articulate how the QbC system maintains the Design Space.

Operational readiness

Staff must be trained not only in routine operation but in failure mode diagnosis — including what to do when a sensor fails, a model diverges, or the control system requests a correction outside physical equipment limits.

Conclusion

Quality by Control moves pharmaceutical manufacturing from reactive testing to active quality assurance. The case evidence supports real improvements in product consistency, yield, and release cycle time, particularly in continuous manufacturing contexts.

But the control architecture only performs if the data underneath it is unified.
Layer 1 feedback controllers need time-aligned, cross-system data to distinguish signal from noise.
Layer 2 models need complete process histories, spanning historians, MES, LIMS, and material attributes, to predict accurately and correct reliably.
Fragmented data doesn’t just slow implementation. It undermines the control logic itself.

QbC is most applicable to continuous processes with well-characterised CQAs and sufficient volume to justify the investment. For others, a rigorous QbD foundation remains the right starting point and a viable path toward more active control as data infrastructure and technical capabilities mature.

Every manufacturing environment is different. The right control architecture depends on your process maturity, data infrastructure, and regulatory position. If you’re evaluating where QbC fits in your operations or what it would take to get there, we are here to support you.
Let’s map your current state and build a path forward together.

Frequently Asked Questions (FAQ)

What's the practical difference between QbD and QbC?

QbD characterises the process and defines a design space—the combination of material attributes and process parameters within which the product reliably meets quality targets. QbC actively keeps the process inside that space during production. QbD is the map. QbC is what ensures you stay on it when raw materials shift, equipment drifts, or environmental conditions change.

Does QbC require a continuous manufacturing process?

Not strictly, but it’s where QbC is most effective. Continuous processes allow feedback control to act on material as it flows, making closed-loop correction practical. Retrofitting QbC to legacy batch processes is feasible but considerably more complex, and the value case is weaker where batch cycles are short and variability is already well controlled.

Why does data unification matter for QbC specifically?

Layer 2 control models require simultaneous access to process parameters, material attributes, batch genealogy, and quality measurements—data that typically lives across historians, MES, LIMS, and ERP systems. A model trained on incomplete or misaligned data carries blind spots. When process conditions shift, those blind spots produce confidently wrong control moves. Unified, time-aligned data isn’t infrastructure support for QbC—it’s what makes the control logic valid.

What happens when a PAT sensor fails in a QbC system?

In a conventional QbD setup, a failed PAT measurement delays a result. In a QbC system, it removes the primary input to the feedback control loop. Robust fault detection and clearly defined failsafe responses must be engineered in from the start and not added later. This is one of the most frequently underspecified elements in early QbC implementations.

References

  1. U.S. Food and Drug Administration. Guidance for Industry: PAT - A Framework for Innovative Pharmaceutical Development, Manufacturing, and Quality Assurance. FDA, Rockville, MD, September 2004. Available at: https://www.fda.gov/media/71012/download
  2. International Council for Harmonisation. ICH Q8(R2): Pharmaceutical Development. Geneva: ICH, 2009.
  3. International Council for Harmonisation. ICH Q9(R1): Quality Risk Management. Geneva: ICH, 2023; ICH Q10: Pharmaceutical Quality System. Geneva: ICH, 2008; ICH Q11: Development and Manufacture of Drug Substances. Geneva: ICH, 2012.
  4. Lee, S.L., et al. Modernizing Pharmaceutical Manufacturing: from Batch to Continuous Production. Journal of Pharmaceutical Innovation, 2015, 10(3), 191–199. https://doi.org/10.1007/s12247-015-9215-8
  5. Gerogiorgis, D.I., & Barton, P.I. Steady-state optimisation of a continuous pharmaceutical process. Computer Aided Chemical Engineering, 2009, 26, 927–931. See also: Gerogiorgis, D.I. Systematic process design and operational optimisation methods for pharmaceutical manufacturing — A review. Organic Process Research & Development, 2024.
  6. ISA-95: Enterprise-Control System Integration Standard. International Society of Automation, Research Triangle Park, NC. Available at: https://www.isa.org/standards-and-publications/isa-standards/isa-standards-committees/isa95
  7. De Beer, T., et al. Near infrared and Raman spectroscopy for the in-process monitoring of pharmaceutical production processes. International Journal of Pharmaceutics, 2011, 417(1–2), 32–47. https://doi.org/10.1016/j.ijpharm.2011.02.022
  8. European Medicines Agency. Reflection Paper on the Use of Real-Time Release Testing and Parametric Release in the Manufacture of Medicinal Products (EMA/827082/2012 Rev. 1). EMA, Amsterdam, 2021.
  9. Qin, S.J., & Badgwell, T.A. A survey of industrial model predictive control technology. Control Engineering Practice, 2003, 11(7), 733–764. https://doi.org/10.1016/S0967-0661(02)00186-7
  10. Bhalode, P., & Ierapetritou, M. Using residence time distribution for process characterization and mass flow analysis in continuous powder processing operations. International Journal of Pharmaceutics, 2020, 584, 119417. https://doi.org/10.1016/j.ijpharm.2020.119417
  11. Rehrl, J., et al. Model predictive control of continuous pharmaceutical manufacturing: A model-based approach. International Journal of Pharmaceutics, 2017, 519(1–2), 283–295. https://doi.org/10.1016/j.ijpharm.2017.01.044
  12. Ottoboni, S., et al. Development of a Quality by Control framework for pharmaceutical continuous manufacturing: a case study of a continuous carousel for filtration-drying. Chemical Engineering Research and Design, 2020, 160, 106–119. https://doi.org/10.1016/j.cherd.2020.05.018
  13. Plumb, K. Continuous Processing in the Pharmaceutical Industry — Changing the Mind Set. Chemical Engineering Research and Design, 2005, 83(A6), 730–738. https://doi.org/10.1205/cherd.04359
  14. U.S. Food and Drug Administration. Guidance for Industry: Advancement of Emerging Technology Applications to Modernize the Pharmaceutical Manufacturing Base. FDA, Silver Spring, MD, December 2015. Available at: https://www.fda.gov/media/92649/download

 


Dobrin Kolarov

Healthcare business analyst with expertise in marketing and business development, holding an MPharm degree. He specialises in creating and executing communication strategies that make digital health solutions and pharmaceutical technologies clear, accessible, and resonant for their audiences.

