Most pharma quality teams spend 3–6 months writing IQ/OQ/PQ documents for a machine vision system they just bought. Most of that time goes into producing documentation that a good vendor should have provided on delivery day.
If your company manufactures pharmaceuticals, cosmetics, or medical devices, and you've just approved a machine vision inspection system, you're now staring at a regulatory mountain: Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ). The scope feels vague. The timelines slip. Your validation consultant quotes you six figures and six months.
This guide cuts through the noise. We'll walk through what IQ/OQ/PQ actually means for a vision system, what documents you need (and who should provide them), and how smart vendor selection reduces your validation timeline from months to weeks.
Reduce Your Vision System Validation Timeline
Optomech supplies full IQ/OQ/PQ documentation packages with every pharma-specification system. Ask us what's included.
What IQ/OQ/PQ Actually Means for a Vision System
The textbook definitions of IQ, OQ, and PQ are familiar to anyone who's worked in pharma QA. But vision systems are different from your tablet press or filling line. Here's what these qualifications actually look like when applied to an automated inspection system.
IQ (Installation Qualification): Proving the System is Installed Correctly
IQ is your proof that the vision system arrived as specified and is installed where and how it should be.
For a vision system, IQ verification includes:
- Camera positioning — Verify that cameras are mounted at the specified distances, angles, and heights from the product. Document the setup with photographs.
- Lighting configuration — Ring lights, backlight, or coaxial LED: confirm type, intensity (lux levels), and alignment. Photo-document each lighting setup.
- PLC/control system interface — Verify communication between the vision system and your production line controller. Check I/O mappings (triggers, alarms, rejection signals).
- Air supply and pressure — If the system uses pneumatic rejection gates, verify supply pressure and flow rates meet specifications.
- Enclosure rating and environmental conditions — Confirm IP class rating is suitable for your environment (dust, humidity, temperature). Record baseline environmental data.
- Software version documentation — Record the exact firmware/software versions installed. This becomes critical when you validate algorithm updates later.
IQ is where many teams stumble. A camera installed 2 mm too close to a high-speed line means the field of view is wrong, and every downstream OQ and PQ test fails. An LED that's dimmer than specified means your defect detection threshold is unreliable. IQ is not glamorous, but it's foundational.
OQ (Operational Qualification): Proving the System Works as Designed
OQ answers the question: "Does this system operate consistently within the range of conditions we expect it to encounter?"
For vision systems, OQ typically includes:
- Detection sensitivity across operating speeds — Run test samples at minimum line speed, nominal speed, and maximum speed. Verify defect detection rate does not degrade at higher speeds.
- Alarm and response verification — Trigger a simulated defect (marked test bottle, deliberately off-label, etc.) at various speeds and line positions. Confirm the system raises an alarm and the rejection mechanism activates within specified timing.
- Rejection gate timing — Measure the delay between defect detection and physical rejection. Verify it's consistent and doesn't cause bottle jams or misses.
- False reject rate at baseline conditions — Using known good product, run extended trials to establish the baseline false rejection rate. A good vision system should be below 1–2% with proper lighting and tuning.
- Algorithm stability testing — Run the same test over 8+ hours to verify the algorithm doesn't drift due to ambient light changes, temperature variation, or software drift.
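The false-reject baseline math above is simple but worth making explicit. A short sketch, using hypothetical trial numbers (a real OQ would record these per run, per speed):

```python
# Illustrative only: estimate the baseline false reject rate from an OQ trial
# on known-good product. The trial numbers here are hypothetical examples.

def false_reject_rate(units_run: int, units_rejected: int) -> float:
    """False reject rate as a fraction, from a trial on known-good product."""
    if units_run <= 0:
        raise ValueError("units_run must be positive")
    return units_rejected / units_run

# Example OQ trial: 10,000 known-good bottles, 142 wrongly rejected.
rate = false_reject_rate(10_000, 142)
print(f"Baseline false reject rate: {rate:.2%}")

# Acceptance check against the OQ criterion (e.g. at or below 2%)
assert rate <= 0.02, "False reject rate exceeds OQ acceptance criterion"
```

Run this kind of check at each tested line speed: if the rate is 1.4% at nominal speed but 4% at maximum speed, that is exactly the speed-sensitivity finding OQ exists to surface.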
OQ is your proof that the system is capable of doing the job consistently. If OQ shows that false rejects spike when ambient light changes, that's a signal to either improve environmental controls or recalibrate the lighting setup. This is detection work, not acceptance work.
PQ (Performance Qualification): Proving the System Works on Your Real Product
PQ is the final validation: "Does this system detect actual defects in our actual product at actual production speed, over extended runs?"
PQ typically includes:
- 3 consecutive production runs — Each run should process a predetermined number of units (often 5,000–10,000 units, depending on production speed and defect frequency).
- Documented detection rate — Record every defect detected and rejected. Cross-check against your own 100% manual inspection to confirm the system's sensitivity. If your manual inspection finds defects the vision system missed, you have a problem to solve before release.
- Statistical acceptance criteria — Establish what "acceptable" performance looks like. For example: "System must achieve ≥98% detection rate on critical defects, ≤2% false reject rate, with 3 consecutive runs meeting these thresholds."
- Batch traceability — Document which batches/lots were inspected, when, by which operator, and the results. This becomes your audit trail if a defective product escapes downstream.
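The acceptance logic described above (both thresholds met, on all three consecutive runs) can be sketched in a few lines. The run data below is hypothetical; real PQ data comes from your production runs and the cross-check against 100% manual inspection:

```python
# Sketch of the statistical acceptance check for PQ: >= 98% detection of
# critical defects, <= 2% false rejects, met by 3 consecutive runs.
# All run data below is hypothetical.

from dataclasses import dataclass

@dataclass
class PQRun:
    units: int              # units processed in the run
    true_defects: int       # defects present (confirmed by manual inspection)
    defects_detected: int   # defects the vision system caught
    false_rejects: int      # good units wrongly rejected

def run_passes(run: PQRun, min_detection=0.98, max_false_reject=0.02) -> bool:
    detection_rate = run.defects_detected / run.true_defects
    false_reject_rate = run.false_rejects / run.units
    return detection_rate >= min_detection and false_reject_rate <= max_false_reject

runs = [
    PQRun(units=5000, true_defects=50, defects_detected=50, false_rejects=60),
    PQRun(units=5000, true_defects=48, defects_detected=48, false_rejects=75),
    PQRun(units=5000, true_defects=52, defects_detected=51, false_rejects=90),
]

# PQ passes only if every one of the three consecutive runs meets both thresholds.
print("PQ pass:", all(run_passes(r) for r in runs))
```

Note the "consecutive" requirement: two passing runs around one failing run means restarting the sequence, not averaging across runs.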
PQ is where the rubber meets the road. It's where you prove the system can do the job on real product under real conditions. If PQ reveals that false rejects are 8% because your product has natural color variation that the algorithm can't distinguish from actual defects, you now know you need a different algorithm, better lighting, or both.
Insight: Vision Systems Require Algorithm Ownership
Unlike a traditional pharmaceutical equipment qualification, vision system validation must include algorithm qualification. The "system" is not just hardware; it's the trained neural network or rule set that makes the defect/no-defect decision. As a regulated manufacturer, you must maintain records of algorithm version, training data, and validation testing. If the vendor updates the algorithm (to improve detection), you're technically running a new configuration that has to be validated. This is often overlooked and causes surprise requalification work.
The Documents You Actually Need
Now to the practical question: what paperwork do you actually need to submit to regulators (FDA, WHO-GMP, Schedule M, etc.), and what should your vendor provide?
| Document | Who Provides It | Typical Length |
|---|---|---|
| User Requirements Specification (URS) | End User (You) | 5–15 pages |
| Factory Acceptance Test (FAT) Protocol | Vendor | 10–20 pages |
| Site Acceptance Test (SAT) Protocol | Vendor + User | 5–10 pages |
| IQ Protocol + Executed Report | Vendor + QA Team | 15–25 pages |
| OQ Protocol + Executed Report | Vendor + QA Team | 20–40 pages |
| PQ Protocol + Executed Report | End User + Vendor Support | 25–50 pages |
| Traceability Matrix (Requirements → Test) | Both | 2–5 pages |
| Change Control Procedure | End User | Per Your SOP |
URS (User Requirements Specification): This is your document. It defines what you need the system to do. "Detect and reject bottles with missing labels at line speeds up to 120 bottles/minute, with ≤2% false reject rate, integrated with our existing PLC, and compliant with 21 CFR Part 11." A good URS drives everything downstream. Weak URS = weak IQ/OQ/PQ.
FAT Protocol: The vendor runs tests at their factory before shipment. Typically includes unit test results, imaging samples, speed/accuracy data, and sign-off that the system meets specification before it leaves the dock. Many companies skip FAT; don't. A FAT protocol costs the vendor a day or two of labor and saves you months of troubleshooting.
SAT Protocol: The vendor or their commissioning team visits your site and runs a quick functional acceptance test. Essentially a condensed FAT under your environmental conditions. Confirms the system works after installation and transportation.
IQ Protocol + Report: Your QA team, often with vendor support, executes a detailed checklist verifying every hardware specification. The "protocol" is the checklist (drafted pre-installation); the "report" is the filled-in checklist with photos, measurements, and sign-offs.
OQ Protocol + Report: Similar structure. The protocol lists every operational parameter to test (speed sensitivity, alarm timing, false reject rate, etc.); the report documents the test results, raw data, and pass/fail status. This is where most requalification effort lands if something goes wrong.
PQ Protocol + Report: Your final gate. Protocol defines acceptance criteria (≥98% detection, ≤2% false rejects) and which product batches to test; report documents the 3 consecutive production runs, defect detection logs, and final sign-off. This is your regulatory submission evidence.
Traceability Matrix: A spreadsheet mapping each user requirement (from the URS) to the test that verifies it. For example: "Requirement 3.2: Detect missing labels" → "OQ Test 2.1: False negative rate at 120 bpm." Regulators love this because it proves you tested everything you promised to test.
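In practice the traceability matrix is a spreadsheet, but the structure is simple enough to sketch as data. The requirement and test IDs below are illustrative, not taken from a real protocol:

```python
# Minimal sketch of a traceability matrix exported as CSV. In practice this
# lives in a spreadsheet; IDs and wording below are hypothetical examples.

import csv
import io

matrix = [
    ("URS 3.1", "Reject bottles with missing labels at up to 120 bpm", "OQ Test 2.1"),
    ("URS 3.2", "False reject rate <= 2% on known-good product",       "OQ Test 2.4"),
    ("URS 4.1", ">= 98% detection of critical defects in production",  "PQ Runs 1-3"),
    ("URS 5.1", "Tamper-evident audit trail of rejects and changes",   "OQ Test 5.2"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Requirement ID", "Requirement", "Verifying Test"])
writer.writerows(matrix)
print(buf.getvalue())
```

The useful discipline is the one-to-one check this format forces: any URS requirement with no test in the right-hand column is an untested promise, which is exactly what an auditor will find.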
Warning: Recipe and Reference Image Validation is Often Forgotten
Most vision systems allow you to adjust detection parameters or load custom "recipes" for different product variants (e.g., one recipe for 200 ml bottles, another for 500 ml). These recipes are not pre-qualified by the vendor. If you create or modify a recipe after validation, you've created a new configuration that has not been validated. Establish a procedure for recipe qualification and change control. Many companies miss this and run afoul of regulators who ask, "Who validated that recipe? Where's the testing?"
What Most People Get Wrong
After years of pharma validation work, two mistakes stand out:
Mistake #1: Validation Planned After Purchase
The typical sequence:
- Your operations team identifies a defect issue (label misalignment escaping to the market).
- They request a vision system from procurement.
- Procurement issues an RFQ based on budget, not validation requirement.
- A vendor is selected and the system is ordered.
- Only then does your QA team get involved and think, "Oh, we need to validate this."
The right approach is reversed:
- Define the problem and required system performance (URS).
- Ask vendors upfront: "What documentation do you provide for regulatory compliance? Do you have FAT protocols? IQ/OQ templates? 21 CFR Part 11 compliance matrices?"
- Include this in vendor evaluation and contract negotiation.
- Only then place the order, knowing that validation is baked into the project from day one.
The difference in timeline is dramatic. A vendor who says, "I'll provide FAT report, IQ/OQ protocols, and PQ execution support" compresses your validation by 8–12 weeks. A vendor who says, "You'll figure it out" adds 8–12 weeks of your QA team's time.
Mistake #2: Treating Vision System IQ/OQ/PQ Like Traditional Equipment Validation
A tablet press IQ/OQ/PQ looks like this:
- IQ: Verify the press is bolted down, electrical connections are correct, compressed air is piped in.
- OQ: Run the press at different speeds and pressures, verify weight uniformity and hardness meet specs.
- PQ: Run 3 batches of real tablets, verify the output still meets weight/hardness specs.
Vision systems require algorithm validation on top of hardware and operational validation:
- IQ: Verify camera position, lighting, PLC interface, and software version.
- OQ: Verify speed/accuracy stability, algorithm consistency under lighting/temperature variation, algorithm version sign-off.
- PQ: Verify real product detection, algorithm performance on representative defect types, reference image qualification.
If you copy-paste a traditional equipment IQ/OQ/PQ template and just change the equipment name, you'll miss critical vision-specific elements and end up redoing work when an auditor asks, "Where's your algorithm qualification?"
The 21 CFR Part 11 Intersection
If your company exports to the USA, or if you hold FDA approvals, 21 CFR Part 11 (Electronic Records; Electronic Signatures) applies to your vision system validation.
Here's what matters for OQ and PQ scope:
Audit Trail Requirements
Your vision system must create an electronic audit trail of:
- Defects detected and products rejected (timestamp, operator, reason).
- Changes to system configuration (recipe updates, threshold adjustments, algorithm version changes).
- System login/logout events.
Your OQ and PQ must verify that this audit trail is created, is immutable (cannot be retroactively edited), and is retained for the required period (typically 5+ years for pharma).
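One common technique for the "immutable" property is hash chaining: each audit record carries a hash of the previous record, so any retroactive edit breaks the chain and is detectable. The sketch below is a generic illustration of that idea, not a description of any specific vendor's audit trail and not a Part 11 compliance claim on its own:

```python
# Sketch of a tamper-evident (hash-chained) audit trail. Each entry embeds a
# SHA-256 hash of the previous entry; editing any past entry breaks the chain.
# Illustrative only; a real system pairs this with access control and retention.

import hashlib
import json

def append_entry(trail: list, event: dict) -> None:
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    trail.append({"event": event, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify_chain(trail: list) -> bool:
    prev_hash = "0" * 64
    for entry in trail:
        payload = json.dumps({"event": entry["event"], "prev": prev_hash},
                             sort_keys=True)
        if entry["prev"] != prev_hash or \
           entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

trail = []
append_entry(trail, {"ts": "2024-05-01T10:02:11", "op": "reject", "reason": "missing label"})
append_entry(trail, {"ts": "2024-05-01T10:15:40", "op": "recipe_change", "user": "qa01"})
print(verify_chain(trail))               # True: chain intact

trail[0]["event"]["reason"] = "edited"   # simulate a retroactive edit
print(verify_chain(trail))               # False: tampering detected
```

Your OQ audit-trail test is the operational version of `verify_chain`: attempt a retroactive edit through every available interface and confirm the system either prevents it or makes it detectable.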
Electronic Signatures and Access Control
If your QA team signs off on validation documents electronically (using DocuSign, Adobe Sign, etc.), or if the vision system itself requires electronic sign-off on parameter changes, 21 CFR Part 11 compliance is in scope. Your OQ should test:
- Only authorized personnel can modify system parameters or sign documents.
- Signatures are unique and cannot be forged or reused.
- The system logs who changed what and when.
Global Regulatory Parallels
If you also supply regulated markets outside the USA:
- WHO-GMP: Requires audit trail and traceability for critical control points. Vision system validation must include these elements.
- Schedule M (India): India's GMP requirements under the Drugs and Cosmetics Rules, applicable if you manufacture for or supply the Indian market. Similar to WHO-GMP; the vision system must maintain batch traceability and change records.
- EU GMP Annex 11 (for European exports): Similar to 21 CFR Part 11. Covers electronic records and risk-based validation.
Don't wait until you're preparing a regulatory submission to discover that your vision system's audit trail is incomplete. Build these requirements into your URS and validate them in OQ/PQ.
How Vendor Documentation Support Changes the Timeline
Let's talk money and time, because these are what matter to your CFO and operations director.
Scenario A: Vendor Provides Nothing
You receive the vision system. The vendor hands you a user manual and a bill of lading. Everything else is your problem.
- Your QA team must draft URS, IQ/OQ/PQ protocols from scratch.
- You must hire a validation consultant or allocate internal resources to execute IQ/OQ/PQ testing.
- Timeline: 16–24 weeks (4–6 months).
- Cost: $50,000–$150,000 (internal labor + consulting fees).
Scenario B: Vendor Provides Pre-Drafted Templates and Support
The vendor supplies:
- FAT protocol and executed report (already signed off at factory).
- IQ/OQ protocols tailored to your specific system model.
- 21 CFR Part 11 compliance matrix (showing how the system meets requirements).
- Camera calibration certificates and lighting spec sheets.
- On-site commissioning support to execute IQ and SAT.
With these in hand:
- Your QA team focuses on URS and PQ; IQ/OQ templates are 70% done on arrival.
- Vendor provides on-site support, reducing trial-and-error in IQ setup.
- Timeline: 6–10 weeks (1.5–2.5 months).
- Cost: $15,000–$35,000 (internal labor + vendor commissioning fee, if not bundled).
The difference: 8–12 weeks saved, $35,000–$115,000 saved. This is why vendor evaluation matters. A vendor who invests in good documentation packages saves you a quarter's worth of QA time.
Practical Takeaway: What to Ask Vendors Right Now
If you're in procurement or QA, and you're evaluating vision systems, ask these questions:
- "Do you provide IQ and OQ protocols?" Look for pre-drafted, model-specific templates, not generic boilerplate. Good vendors have these written before you sign the purchase order.
- "Do you provide executed FAT reports?" If they say, "We'll do FAT if you pay extra," that's a red flag. FAT should be standard. A clean FAT report is worth weeks of troubleshooting on your site.
- "Do you have a 21 CFR Part 11 compliance declaration or matrix?" If they don't know what you're talking about, that's also a red flag. Even if you're not FDA-regulated today, your product pipeline may change.
- "Who provides ongoing algorithm support if we need to update the system post-validation?" Algorithm updates are inevitable (performance improvements, new defect types). Know upfront whether the vendor owns this or if you're on your own.
- "Do you have reference materials or case studies showing other pharma customers' validation documentation?" This gives you confidence that they've done this before and aren't making it up as they go.
If a vendor hesitates, equivocates, or says, "You'll need to hire a consultant to handle validation," the validation work lands squarely on your QA team. Budget 6 months and $100k+. If they say, "Here's our documentation package; here's our commissioning timeline; here's our support model," you're looking at 2–3 months and $30k.
Practical Summary: Your IQ/OQ/PQ Roadmap
- Start with a strong URS. Invest time here. A good URS drives every decision downstream.
- Evaluate vendors on documentation and support, not just price. A $50k system with $0 validation support becomes a $150k project. A $55k system with full validation support becomes a $35k project.
- Secure FAT and IQ/OQ templates before system delivery. These are the long pole in the tent. Don't wait until the system arrives to start writing.
- Allocate 2–3 weeks of QA team time for on-site IQ execution and 4–6 weeks for OQ testing. This is hands-on work that can't be outsourced (though vendor support speeds it up).
- Plan for algorithm qualification. Document software versions, validate any recipe/parameter changes, and build change control into your post-validation SOP.
- Build audit trail and electronic signature compliance into OQ test scope if you're FDA or export regulated. Don't discover 21 CFR Part 11 gaps at audit time.
- Budget 8–12 weeks if the vendor provides support; 16–24 weeks if you're starting from scratch.
Frequently Asked Questions
How long does IQ/OQ/PQ validation of a vision system take?
It depends on vendor support. With a responsive vendor who provides templates and commissioning support: 8–12 weeks. Without vendor support: 16–24 weeks. This includes URS drafting, IQ/OQ protocol execution, and 3 production runs for PQ. The critical path is usually OQ (identifying the right defect detection parameters under your real environmental conditions), not the paperwork.
What happens if the vendor updates the algorithm after validation?
Technically, you've qualified a specific algorithm version. If the vendor releases a new version (e.g., improved neural network for better detection), you have two options: (1) Change control it — run partial OQ/PQ retesting to confirm the new algorithm performs as well or better than the old one, or (2) Revalidate fully if the algorithm change is significant. Either way, this is a regulated change, and you need documentation. Many companies overlook this and run unvalidated configurations. Ask the vendor upfront about their algorithm update policy and how they support post-validation algorithm changes.
Can we skip PQ if OQ passes?
No. OQ proves the system works under controlled lab conditions. PQ proves it works on your real product, at your real speed, in your real environment, over extended production runs. OQ might show 98% detection with artificial defects, but PQ might reveal that your natural product color variation is confusing the algorithm, leading to high false rejects. Regulators expect to see PQ results, especially for critical defects. It's non-negotiable for pharma.
Do we need to revalidate when we add a new product variant or bottle size?
Yes, if it's a materially different product (e.g., 200 ml vs. 500 ml bottle). At minimum, you need to execute a new set of OQ and PQ tests for that bottle size. If the recipes are similar enough (same label area, same defect types, same line speed), you might get away with partial revalidation (abbreviated OQ, then PQ). But don't assume the old recipe works for the new size—test it. Create a recipe qualification SOP that defines when you need to revalidate and when you can get away with a brief validation update. This is a change control issue.