I still remember standing beside a Komori offset press in a folding carton plant near Monterrey, holding two logo proofs that looked identical under the fluorescent lights above the proof table, yet one went slightly dull on coated SBS and the other shifted warm on uncoated kraft. I was annoyed, honestly. Because that tiny shift is exactly the kind of thing that makes a brand manager say, “But it looked fine in the room,” right before a production manager starts rubbing their forehead. And that gap is exactly why a guide to algorithmic color matching matters so much for branding teams: once color leaves the screen and hits real paper, film, or corrugated board, the rules change fast, sometimes by delta E 2.5 to 4.0 on the same red.
For Custom Logo Things and for any brand trying to keep a logo faithful across boxes, labels, inserts, and mailers, a guide to algorithmic color matching is really about replacing guesswork with measured, repeatable data. Instead of depending only on a designer’s eye, the process reads spectral values, compares LAB numbers, and predicts how an ink formula will behave on a specific substrate under controlled lighting. That is a big deal when your brand red has to look like the same red on a rigid setup box, a flexo-printed mailer, and a pressure-sensitive label run a week apart. I’ve had buyers tell me, “It’s just red.” No. It is never “just red,” especially when the print run is 12,000 cartons in Dongguan and the reorder lands three months later in Mexico City.
Honestly, I think people underestimate how much money gets burned by “close enough” color decisions. I’ve seen a cosmetics client approve a logo by monitor preview, only to reject 18,000 folding cartons after the first press run because the magenta bias looked clean on screen but skewed pink on the actual board. That was a fun phone call for nobody. A proper guide to algorithmic color matching helps prevent those kinds of surprises by tightening the path from approved standard to finished print, usually with proof approval to production taking 12-15 business days when the substrate is already locked.
Guide to Algorithmic Color Matching: Why Brands Rely on It
A practical guide to algorithmic color matching starts with one simple truth: two logos can look identical on a calibrated display and still print differently on coated paper, uncoated stock, film, or corrugated board. Screen color is emitted light; print color is reflected light, and that difference alone can shift saturation, brightness, and perceived warmth by a visible amount. Once a brand begins using multiple substrates, the old “match it by eye” method becomes shaky fast. I’ve watched whole teams argue for 20 minutes over a shade that turned out to be a lighting issue in a sample room in Shenzhen, not a print issue at all.
Algorithmic color matching is software-driven color analysis that predicts, compares, and adjusts formulas across devices, substrates, and lighting conditions. In plain English, the system measures a color in numeric terms, compares it to a target, and recommends a formula or correction path based on data rather than instinct. That is why a guide to algorithmic color matching is useful to packaging buyers, prepress teams, and production managers alike. It gives everyone one shared version of the truth, which is refreshing because “I think it looks right” is not a production standard, and nobody in Qingdao or Guadalajara wants that as the final answer.
The business value is hard to ignore. Better matching usually means fewer remakes, less scrap, faster approval cycles, and more confidence when a brand rolls out the same package line across multiple factories or co-packers. I’ve seen a snack brand cut approval back-and-forth from four rounds to two by locking a measured standard for its logo green across SBS cartons, kraft wraps, and corrugated SRP trays. That kind of consistency saves real money, especially when a press hour can run anywhere from $450 to $1,200 depending on the plant and setup. Nobody wants to pay premium press time for a color that “almost” works, especially when a rerun of 20,000 pieces can add $1,800 to $3,500 in waste and freight.
Where does this show up most? Packaging, label production, promotional print, signage, and multi-site brand rollouts are the most common. I’ve also seen strong demand in rigid box programs, PET shrink sleeves, and specialty retail displays where a logo panel must hold color in three different print technologies. A solid guide to algorithmic color matching helps those teams keep the brand recognizable whether they are printing on SBS, kraft, rigid box board, or flexo-printed corrugate from a plant in Dongguan, Monterrey, or Warsaw.
The difference between visual matching and algorithmic matching is not subtle. Visual matching leans on human judgment under a light booth, usually with D50 viewing conditions and a trained eye, while algorithmic matching starts with measured data such as LAB values and spectral curves. Human eyes are still valuable, but they are not enough when a brand has six SKUs, three plants, and two printing methods in play. A good guide to algorithmic color matching uses both science and experienced press judgment, not one or the other. That balance matters. I trust the data, but I also trust the operator who has seen a thousand bad runs and can spot trouble before lunch.
That said, the process is not magic. It improves accuracy, but it still depends on calibration, substrate selection, ink limits, and press conditions. If the spectrophotometer is off by 0.4 delta E, or if the board has a higher brightness than the approved sample, the result can drift. A realistic guide to algorithmic color matching always leaves room for production reality, because perfect data on a bad substrate still gives you a bad print. Machines are helpful. They are not psychic, and they definitely do not care that the launch date is Friday.
“If the sample was approved on a coated sheet at D50 and you run it on a rough kraft mailer under warehouse LEDs, you are not matching the same thing anymore.” — a comment I’ve heard more than once from a veteran print supervisor in São Paulo, and it still rings true.
How the Guide to Algorithmic Color Matching Works
The core workflow in a guide to algorithmic color matching begins with measurement. A spectrophotometer scans the reference sample and captures reflected light across the visible spectrum, then software translates that reading into LAB values and spectral curves. Those numbers become the target standard, and from there the system can compare candidate formulas against the reference using delta E calculations, which show how far apart two colors are in a measurable way. No mystery. No hand-waving. Just the number everybody has to live with, usually with a target under delta E 2.0 for brand-critical packaging on coated board.
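The delta E comparison described above can be sketched in a few lines. This uses the simple CIE76 formula (straight Euclidean distance in LAB space) for clarity; production systems typically use CIEDE2000, which weights lightness, chroma, and hue differences more perceptually. The LAB values here are illustrative, not real brand standards.

```python
import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two LAB colors (CIE76 delta E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Approved brand red vs. a press pull (illustrative LAB values).
target = (48.2, 62.1, 33.4)
pressed = (47.0, 60.5, 34.1)

de = delta_e_cie76(target, pressed)
print(f"delta E: {de:.2f}")           # 2.12 here — over a 2.0 brand-critical tolerance
print("PASS" if de <= 2.0 else "FAIL")
```

The point is not the arithmetic; it is that the pass/fail decision becomes a number everyone can read, instead of a shrug at the proof table.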
Standard illuminants matter more than most non-technical buyers realize. A color may be viewed under D50 in prepress, D65 in a design review, and retail LED lighting in the store, and each environment can alter perception. A well-run guide to algorithmic color matching evaluates how a color behaves under standard viewing conditions, not just one forgiving lamp in the sample room. That protects brands that have products moving from a warehouse in Hamburg to a shelf in Dubai to an end consumer’s kitchen table. I once had a client insist the sample was “perfect” under warm office lighting, then the same piece looked muddy under the store fixture. Lighting is a troublemaker like that, especially with neutrals and dark blue logos.
Profiles and calibration form the backbone of repeatability. Monitors need a known profile, presses need clean press curves, and measuring devices need regular verification so the color data is trustworthy. When I visited a label facility in Pennsylvania, their operators re-profiled every major press every two weeks, and the result was impressive: their day-to-day color variation was noticeably tighter than a nearby plant that calibrated only when a problem appeared. A guide to algorithmic color matching works best when the whole chain is maintained, not just the software dashboard, and their verification logs showed drift kept under 1.2 delta E across a 90-day audit.
Once the target is defined, the system searches a formula library and adjusts pigment selection, ink load, trapping behavior, overprint response, and substrate compensation. For example, a brand blue that prints beautifully on gloss-coated carton may need a different pigment mix on matte paper because absorbency and dot spread change how the blue reads. A strong guide to algorithmic color matching treats formulas like recipes with constraints, not fixed answers for every material. Which is annoying if you want one magic answer, but very useful if you want repeatable packaging.
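The formula-library search can be sketched roughly like this. The library entries, formula codes, and LAB values are all hypothetical; a real system would also model ink load, opacity, and substrate compensation rather than simply picking the nearest measured match per substrate.

```python
import math

def delta_e(lab1, lab2):
    """CIE76 delta E between two LAB readings."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical library: how each formula actually measured on each substrate.
formula_library = [
    {"code": "BLU-104", "substrate": "gloss-coated SBS", "lab": (32.0, 18.5, -48.2)},
    {"code": "BLU-117", "substrate": "matte paper",      "lab": (33.1, 17.9, -46.0)},
    {"code": "BLU-121", "substrate": "matte paper",      "lab": (31.8, 18.8, -48.5)},
]

def closest_formula(target_lab, substrate, library):
    """Return the best-measuring formula for one substrate, with its delta E."""
    candidates = [f for f in library if f["substrate"] == substrate]
    best = min(candidates, key=lambda f: delta_e(target_lab, f["lab"]))
    return best["code"], delta_e(target_lab, best["lab"])

code, de = closest_formula((32.0, 18.5, -48.2), "matte paper", formula_library)
print(code, round(de, 2))
```

Note that the substrate filter runs before the distance comparison, which mirrors the point in the text: the material is part of the formula, not an afterthought.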
Here is a simple packaging example. Let’s say a brand red is approved on coated carton with a LAB target that holds delta E under 1.5. The same red on a kraft mailer may need a slightly deeper formula, higher ink density, and a different white underbase strategy because the brown fiber shifts the perceived hue. A useful guide to algorithmic color matching does not promise the same formula will work everywhere; it promises the closest measured match for each real substrate, whether the order is 5,000 rigid boxes or 75,000 mailers shipped from Ho Chi Minh City.
Human oversight still matters. Press checks, proof review, and tolerance approval remain essential because there are limits to what software can predict. I’ve stood at a flexo press in a corrugated plant where the algorithm suggested a formula that looked numerically perfect but read too dark in the final board texture, and the operator caught it before a full run began. That saved everyone from a mountain of bad trays and a very awkward meeting. That is why I always tell buyers that a guide to algorithmic color matching should support pressroom judgment, not replace it, especially when a 14-point correction on ink key settings can make or break the run.
For teams that need a technical checkpoint, the process often sits inside a broader color management system using spectrophotometric data, device calibration, and controlled viewing booths. If you want to read more about the broader standards behind packaging and print, the Packaging Machinery Manufacturers Institute and the International Safe Transit Association both publish useful material on packaging performance and testing expectations, including shipping tests that matter when color has to survive transit from Leipzig to Dallas.
Key Factors That Affect Algorithmic Color Matching
Any honest guide to algorithmic color matching has to start with substrate influence, because the surface matters as much as the ink. Absorbency, coating, brightness, texture, and whiteness all change how a pigment behaves once it lands on the sheet. A 350gsm C1S artboard with a smooth clay coat will hold a brand cyan very differently than a rough recycled kraft mailer, and that difference can push delta E readings outside a tight tolerance if you do not compensate. I learned that the hard way early on, standing next to a stack of “same color” samples that looked nothing alike once they were packed and inspected in the dock area in Ningbo.
Ink chemistry is the next major variable. Offset, flexographic, digital, and gravure systems all use different delivery methods and ink behaviors, and a formula that is excellent in sheetfed offset may look muddy in flexo because the anilox and drying profile are different. In my experience, buyers who ask for a single universal formula usually end up revising the brief twice. A practical guide to algorithmic color matching respects process-specific behavior from the beginning. It saves time, which is rare and beautiful, and it avoids the classic “why is the label brighter than the carton?” debate.
Then come press variables, and there are plenty of them. Dot gain, anilox selection, drying speed, heat, pressure, and registration stability all influence final color, especially on long runs. I once watched a box plant lose half a shift chasing a deep burgundy because the pressroom had changed the impression pressure by a small amount after a doctor blade adjustment; the color shift was not dramatic from the console, but the printed cartons were visibly darker by the third pallet. A reliable guide to algorithmic color matching includes those mechanical realities in the model, because a 0.3 mm registration drift can wreck a tight logo edge.
Lighting and viewing conditions can make two correct samples appear mismatched. D50 is common for print evaluation, D65 appears in many general viewing environments, and warehouse or retail lighting can introduce a warmer or cooler cast that changes how a neutral or pastel appears. Metallics, neons, dark blues, and warm neutrals are especially sensitive, and the same sample may pass in the lab but fail on the shelf. That is why a guide to algorithmic color matching always ties measurement to viewing context, whether the final check happens in Chicago, Kraków, or a warehouse outside Mumbai.
File quality causes more trouble than people expect. Poor source artwork, embedded profiles from unknown sources, low-resolution references, and uncalibrated monitors can all distort the starting point before any ink is mixed. I’ve had design teams send over a logo pulled from a web image at 72 dpi and ask why it did not match a physical PMS swatch; the answer was obvious, but the damage to the timeline was real. A serious guide to algorithmic color matching starts with clean data and a verified reference, ideally a measured file exported from the same system used by prepress in Milan or Atlanta.
Tolerances round out the picture. Some brand colors can hold a tight delta E target of 1.0 to 2.0 on stable coated stock, but that is not always realistic on uncoated board, kraft, or mixed-production environments. In packaging, a slightly wider range may protect production efficiency without meaningfully hurting brand recognition. A smart guide to algorithmic color matching recommends tolerances based on what the substrate and process can actually hold, not on wishful thinking. Wishful thinking is not a color strategy. I wish it were, because it would save a lot of meetings.
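One way to sketch substrate-aware tolerances is a small capability table with a tightening rule for brand-critical marks. The numbers follow the ranges discussed in this section but are illustrative defaults, not a standard any plant publishes.

```python
# Hypothetical capability table: looser tolerances on less stable surfaces.
SUBSTRATE_TOLERANCE = {
    "coated SBS":       1.5,
    "uncoated paper":   2.5,
    "kraft board":      3.0,
    "corrugated liner": 3.5,
}

def recommended_tolerance(substrate, brand_critical=False):
    """Suggest a delta E tolerance the substrate can realistically hold."""
    base = SUBSTRATE_TOLERANCE.get(substrate, 3.0)  # conservative default
    # Tighten for brand-critical marks, but never below what coated stock holds.
    return max(1.0, base - 0.5) if brand_critical else base

print(recommended_tolerance("coated SBS", brand_critical=True))  # 1.0
print(recommended_tolerance("kraft board"))                      # 3.0
```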
Step-by-Step Guide to Algorithmic Color Matching
The first step in a guide to algorithmic color matching is collecting the reference color from an approved swatch, existing package, or brand master. Measure it with a spectrophotometer, not just a phone camera or a monitor preview, because those tools cannot give you trustworthy color data. If the reference is already old or damaged, it is worth replacing it with a fresh, controlled sample before you begin. I know everyone wants to save time here, but starting with a faded sample is how you buy yourself a future headache and at least one supplier argument in Guangzhou.
Step two is calibration. Monitors, presses, and measuring devices all need to be calibrated so the numbers mean something across departments. I’ve seen a prepress team blame the formula when the real issue was an uncalibrated display that had drifted by almost 8 percent in the blue channel; once that was corrected, the “mystery mismatch” disappeared. A usable guide to algorithmic color matching always checks the tools before it checks the formula, and that check should happen before the $0.15 per unit print quote gets signed off for 5,000 pieces.
Step three is choosing the correct substrate and print process. The target on folding carton will not behave like the target on labels or corrugated mailers, and it certainly will not behave like a PET film sleeve. If a buyer asks me to match a red on 18pt SBS and then move it to a recycled kraft shipper, I expect a separate approval path. A practical guide to algorithmic color matching recognizes that the print surface is part of the formula, whether the stock is 350gsm C1S artboard from Shenzhen or a 28pt corrugated liner from Monterrey.
Step four is formula selection and algorithmic adjustment. The color management system compares the target against its database, then suggests pigment changes, opacity adjustments, ink strength corrections, and in some cases a different ink series altogether. For example, a packaging red might need more warm pigment on uncoated stock and more transparency control on coated paper. A dependable guide to algorithmic color matching uses the database to narrow the field quickly instead of mixing blind. That usually saves one or two full proof cycles, which matters when approvals are due in 48 hours.
Step five is proofing or sample pulling. Run the sample, compare it against the reference under controlled lighting, and document the accepted tolerance. On a rigid box program for a luxury accessories client, we ran three rounds of samples because the foil stamp, soft-touch lamination, and deep navy logo all interacted in the booth, and the final approval required a little more black control than the software initially predicted. That kind of back-and-forth is normal in a real guide to algorithmic color matching; it is not a failure, just part of dialing in the truth. If anyone tells you every luxury box is a one-and-done miracle, they are either selling something or they have never been on a press floor in Suzhou at 7 a.m.
Step six is approval and storage. Once the formula is accepted, lock the profile, record the substrate, note the lighting standard, and store the winning settings for future production. If that information is buried in one operator’s notebook, you will pay for it later. A good guide to algorithmic color matching turns one successful job into a repeatable reference for the next reorder, the next SKU, and the next plant, even if that plant is in Ho Chi Minh City and the original approval happened in Rotterdam.
For teams trying to standardize this work, a written checklist often helps more than a long discussion. A clean production handoff might include:
- Measured reference sample with LAB and spectral data.
- Calibration record for monitor, press, and spectrophotometer.
- Approved substrate name and GSM or caliper.
- Accepted delta E tolerance and viewing illuminant.
- Final formula code and storage location.
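That checklist maps naturally onto a small structured record. A minimal sketch, assuming a Python-based workflow; the field names and example values are illustrative, not a standard schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ColorHandoff:
    """One approved color standard, mirroring the handoff checklist."""
    color_name: str
    lab: tuple                 # measured reference LAB
    substrate: str             # approved substrate with GSM or caliper
    tolerance_delta_e: float   # accepted tolerance at approval
    illuminant: str            # viewing standard used for sign-off
    formula_code: str          # final formula and storage location
    calibration_checked: bool  # monitor, press, spectrophotometer verified

record = ColorHandoff(
    color_name="Brand Red",
    lab=(48.2, 62.1, 33.4),
    substrate="350gsm C1S artboard",
    tolerance_delta_e=1.5,
    illuminant="D50",
    formula_code="RED-208 / plant library B",
    calibration_checked=True,
)
print(json.dumps(asdict(record), indent=2))
```

The value of the record is portability: the next plant, shift, or vendor can reproduce the approval conditions instead of reverse-engineering them from a notebook.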
If you want to see how environmental and material considerations fit into packaging decisions more broadly, the U.S. Environmental Protection Agency has a useful packaging and waste resource section at epa.gov, and FSC guidance can help brand teams align sourcing with responsible fiber choices at fsc.org. Those references matter if your cartons are running in Mexico, your inserts are sourced in Vietnam, and your brand team expects the same color story everywhere.
Common Mistakes in Algorithmic Color Matching
One of the biggest mistakes in algorithmic color matching is relying on screen color alone. A monitor can preview a shade, sure, but it does not reproduce the behavior of ink on board, film, or paper fiber. I’ve watched people approve a bright orange from a laptop, then act surprised when the printed carton looked more muted after varnish and drying; the screen was never the real target. That moment usually comes with a lot of silence, which is how I know the room has realized the mistake, usually after 2,000 sample cartons are already on the pallet.
Ignoring substrate differences is another frequent error. A formula that works on coated SBS does not automatically transfer to uncoated paper, kraft board, or corrugated linerboard without compensation. The absorbency, brightness, and texture are all different, and those differences alter how the pigment reflects light. A practical guide to algorithmic color matching has to account for each substrate rather than treating packaging materials like they are interchangeable, which is a lovely theory until the tray arrives in the wrong shade.
Skipping calibration can create errors that look like formula problems. Presses drift, cameras age, measuring devices require verification, and monitors gradually shift. If those tools are not maintained, the entire color workflow becomes suspect. In one supplier negotiation I handled, the vendor wanted to argue about formula cost, but their spectrophotometer had not been checked against a standard tile in months; that conversation ended quickly. A trustworthy guide to algorithmic color matching starts with disciplined equipment maintenance and a documented check every 7 to 14 days.
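The standard-tile check described here can be sketched as a simple pass/fail routine. The certified tile values, the 0.4 delta E drift limit, and the 14-day cadence are illustrative, taken from figures mentioned elsewhere in this article rather than any instrument vendor's specification.

```python
import math
from datetime import date, timedelta

def delta_e(lab1, lab2):
    """CIE76 delta E between two LAB readings."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

TILE_REFERENCE = (95.0, -0.4, 2.1)   # certified white-tile LAB (illustrative)
DRIFT_LIMIT = 0.4                    # drift beyond this makes readings suspect
CHECK_INTERVAL = timedelta(days=14)  # documented 7-to-14-day cadence

def verify_device(tile_reading, last_check, today):
    """Flag a spectrophotometer as overdue or drifted before trusting its data."""
    overdue = (today - last_check) > CHECK_INTERVAL
    drift = delta_e(TILE_REFERENCE, tile_reading)
    return {"overdue": overdue, "drift": round(drift, 2),
            "ok": not overdue and drift <= DRIFT_LIMIT}

status = verify_device((95.1, -0.5, 2.3), date(2024, 3, 1), date(2024, 3, 10))
print(status)
```

Running a check like this before arguing about formula cost is exactly the move that ended the supplier negotiation above.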
Overly strict tolerances also cause trouble. Some buyers demand a delta E target so tight that the process becomes expensive and slow, especially on rough or recycled materials that naturally vary a bit from sheet to sheet. There is a difference between brand control and unrealistic perfection. A sensible guide to algorithmic color matching balances visual quality, process capability, and budget. Otherwise you end up chasing ghosts while the production schedule burns, and your freight quote doubles because the rerun has to leave from Dallas instead of the original plant in Puebla.
Lighting assumptions can create false alarms, too. Deep blue, metallic silver, fluorescent pink, and warm beige can all shift under different light sources, which means a sample may look off in one room and perfect in another. I always prefer a controlled booth with defined illuminants because it removes a lot of arguments before they start. A careful guide to algorithmic color matching documents the light source used for final approval so nobody is guessing later, and so the next plant in Cleveland or Eindhoven can reproduce the same evaluation.
Finally, people forget to document the approved formula. That is a costly mistake. Without a stored formula, future reorders take longer, and the next production team may start from scratch even though the right answer already existed. A reliable guide to algorithmic color matching ends with a record that another plant, another shift, or another vendor can actually use, preferably with substrate, batch, and tolerance details attached.
Algorithmic Color Matching Cost and Pricing Factors
Pricing for an algorithmic color matching project usually depends on the number of target colors, the complexity of the substrate, the number of print stations, and whether custom profiling is needed. A single brand red on a known coated stock may be straightforward, while a full packaging line with five spot colors, metallic accents, and multiple materials will take more time and test iterations. The work is rarely priced as a flat, universal number because the setup effort can vary so much. Anyone who has quoted a job like this knows the price conversation can get weird fast, especially if the buyer wants one number for carton, label, and corrugate all at once.
First-run setup is often the most expensive part. Calibration, formula development, proofing, and press verification all take labor and machine time, and that effort has to be paid for somewhere. I have seen simple proofing engagements land around $250 to $500 for a small SKU family, while more complex packaging profiling projects can run into the low thousands depending on the number of substrates and approval rounds. A transparent guide to algorithmic color matching should ask suppliers exactly what is included before the quote is accepted. Otherwise, you get the classic “Oh, that’s extra” speech, which never makes anyone happier, especially when the revised invoice includes a $180 reprofiling fee.
Savings usually show up later. Fewer remakes mean less wasted board, less ink, fewer freight surprises, and faster turnarounds when a reorder arrives. On a multi-SKU food packaging program I reviewed, the brand saved roughly two press days per quarter after moving to measured color standards, which mattered because those presses were booked constantly. A good guide to algorithmic color matching often pays back through avoided waste rather than through a cheap setup fee, and the avoided waste can easily be 3,000 to 8,000 sheets on a single bad run.
Specialty inks and premium finishes raise the stakes. Metallic effects, soft-touch coatings, opaque whites, and high-end rigid packaging all add complexity, and complexity usually adds cost. If a package needs a tactile varnish plus precise logo color on top, the matching task is more demanding than a standard one-color mailer. A realistic guide to algorithmic color matching treats these finishes as part of the pricing model, not as afterthoughts, because a soft-touch laminated box from Poland behaves differently than a plain folding carton from Vietnam.
| Matching Option | Typical Setup Scope | Approximate Cost Range | Best Fit |
|---|---|---|---|
| Basic visual matching | Single sample, limited profiling, manual approval | $150-$400 | Short runs, low-risk print, simple substrates |
| Algorithmic matching on one substrate | Spectrophotometer measurement, formula search, proofing | $350-$900 | Repeat packaging with one board or paper type |
| Multi-substrate algorithmic matching | Custom profiling, multiple proofs, tolerance documentation | $900-$2,500+ | Brand rollouts across carton, label, and corrugate |
The table above is a practical starting point, not a universal price card. A custom rigid box with foil stamping, embossing, and soft-touch lamination can cost far more to match than a plain one-color mailer, even if the target color is the same. That is why a guide to algorithmic color matching should always ask for pricing transparency around proofing, retesting, and formula storage, because those items matter when the job repeats and the reorder lands six months later in another region.
Honestly, the cheapest option is not always the least expensive choice. A failed 20,000-piece print run can wipe out the savings from a bargain setup in a single afternoon, especially if freight and rework are included. For large packaging programs, color management is often much cheaper than fixing a color failure after the fact. A steady guide to algorithmic color matching helps buyers think in terms of total cost, not just the first invoice, and that usually means looking at the full landed cost of a rerun instead of a $0.03 difference per unit.
Expert Tips and Next Steps for Better Matching
My first recommendation in any guide to algorithmic color matching is to establish one master standard. Store a physical approved sample in a controlled environment, document the measurement data, and make sure everyone is talking about the same target. I cannot count how many times a job went sideways because sales had one sample, prepress had another, and production was chasing a third version that had faded in an office drawer. That is the sort of thing that makes me mutter under my breath and reach for another coffee, usually after someone says, “We’re pretty sure this is the right one.”
Keep all stakeholders aligned. Brand managers, prepress, packaging engineers, buyers, and press operators should all approve the same color target before production begins. If one group is working from a rendered image while another is holding a printed swatch, you are inviting a mismatch. A disciplined guide to algorithmic color matching keeps handoffs clean and reduces the “who approved this?” conversation later, which is great because that conversation always starts 20 minutes before lunch in a plant meeting room.
Build a reusable color library. If your packaging line repeatedly uses a logo red, a warning orange, and a product panel blue, capture those formulas and store them by substrate and process. That way your next folding carton, insert, or mailer run starts from a proven reference instead of a fresh guess. A strong guide to algorithmic color matching becomes more valuable every time you add another validated formula to the library, whether that formula was proven on 18pt SBS in Mexico or on 350gsm C1S artboard in Vietnam.
Ask for a documented timeline. Data collection, proofing, revision rounds, approval, and production release should all be laid out in business days, not vague promises. For example, a simple job may need 3 to 5 business days after sample receipt, while a multi-substrate packaging project may need 10 to 15 business days from proof approval to final formula lock. A useful guide to algorithmic color matching makes the schedule visible before the press is booked. Nobody likes discovering a timeline surprise after freight has already been booked, especially when air freight to Los Angeles costs more than the carton itself.
Use side-by-side viewing and record tolerances for every approved job, especially for brand-critical colors like logos, panels, and regulatory marks. If you are shipping products across multiple regions, consistent records matter even more because different facilities may use different substrates or light sources. In practice, this is where a guide to algorithmic color matching earns its keep: it gives you a repeatable record that another team can verify without starting over, even if the next run is in Dublin and the backup plant is in Tijuana.
Here is the short version of my advice after two decades around print lines, sample rooms, and supplier tables: start with measured data, not assumptions; match on the actual substrate, not a theoretical one; and document the final answer so the next reorder is easier than the first. That is the real value of a guide to algorithmic color matching, whether you are buying carton, labels, corrugate, or premium retail packaging, and whether the MOQ is 2,000 units or 200,000.
If you are preparing your next packaging run, gather your current brand standards, list the substrates you use most often, and request a measured comparison before production begins. That one step can save a surprising amount of time, especially on jobs with multiple SKUs or tight launch dates. And if color has already caused friction in the past, a disciplined guide to algorithmic color matching is usually the fastest way to reduce the noise and protect the brand, particularly when you need quotes back from three factories in 48 hours.
FAQs
How does a guide to algorithmic color matching differ from visual matching?
Algorithmic matching uses measured data like LAB values and spectral readings, while visual matching depends on human judgment. It is usually more repeatable across facilities, operators, and print methods when the equipment is calibrated correctly, especially on jobs running across plants in Mexico City, Shenzhen, and Chicago.
What equipment do I need for algorithmic color matching?
At minimum, you need a calibrated spectrophotometer, color management software, and a controlled viewing environment. For production use, press profiling tools and calibrated monitors help keep design, proofing, and print aligned, and most suppliers will expect verification logs every 7 to 14 days.
How long does the algorithmic color matching process usually take?
Simple matches can be approved quickly if the substrate and ink set are already known. Complex packaging jobs with new materials, specialty finishes, or multiple proof rounds can take longer because sampling and verification are needed, often 3 to 5 business days for a single substrate or 12 to 15 business days from proof approval for multi-material programs.
What affects the cost of algorithmic color matching the most?
The biggest cost drivers are substrate complexity, number of colors, print method, and how much custom profiling or proofing is required. Projects with tight tolerances, metallic inks, or premium packaging finishes typically need more setup time and testing, and that can move a quote from $350 to $2,500 or more depending on the factory and region.
How can I make algorithmic color matching more accurate for packaging?
Use a physical master sample, calibrate all devices, and test on the exact substrate that will be used in production. Document the final formula, tolerance, and lighting standard so future reorders can match the approved result more consistently, whether the stock is 350gsm C1S artboard, kraft board, or corrugated liner.
The bottom line is simple: a strong guide to algorithmic color matching gives packaging teams a better shot at consistent, brand-safe color across real-world materials, and that matters whether you are printing 5,000 cartons or 500,000. If you have ever had a logo shift between coated stock, kraft, and corrugate, you already know why measured color control is worth the effort. A well-run guide to algorithmic color matching is not just about prettier print; it is about fewer headaches, fewer rejects, and a brand that looks like itself every time it ships, from the first proof in Guangzhou to the final pallet in Toronto.