March 26, 2026 · 4 min read

A/B Testing QR Codes — Optimize Scan Rates with Data

How to A/B test QR code designs, placements, and calls-to-action using dynamic codes. Includes real test examples with results.


Most marketers A/B test their email subject lines, their landing pages, their ad copy — but then slap a QR code on printed material with zero testing. That's leaving scans on the table. QR codes are testable just like any other marketing element, and the results can be dramatic.

What You Can A/B Test

There are more variables than you'd think:

  • Design — color, shape, logo placement, style (rounded vs square modules)
  • Size — 3cm vs 5cm vs 8cm
  • Placement — top-right vs bottom-center vs next to the CTA text
  • Call-to-action text — "Scan for details" vs "Scan to save 20%" vs "Watch the demo"
  • Destination — video landing page vs product page vs discount code page
  • Frame/border — with decorative frame vs naked code

How Dynamic Codes Make A/B Testing Easy

With static QR codes, you'd need to generate two different codes, print them separately, distribute them to different groups, and somehow track both. Painful.

Dynamic QR codes simplify this massively. You create two dynamic codes pointing to the same destination but with different UTM parameters. Each code gets its own scan counter and analytics. Print version A on half your flyers and version B on the other half. Done.

Alternatively, if you're testing the destination rather than the design, you can use a single QR code and rotate the destination URL partway through the campaign.
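
Concretely, the two tagged destinations can be built with nothing but the standard library. This is a minimal sketch: the `example.com` URL and the parameter values are placeholders, and only `utm_content` differs between the two versions.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_url(base_url: str, version: str, campaign: str) -> str:
    """Append UTM parameters to a destination URL, preserving any existing query string."""
    parts = urlparse(base_url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": "qr",
        "utm_medium": "print",
        "utm_campaign": campaign,
        "utm_content": f"version-{version}",
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# One destination, two tracked variants -- one per dynamic QR code.
url_a = tag_url("https://example.com/offer", "a", "spring-flyer")
url_b = tag_url("https://example.com/offer", "b", "spring-flyer")
```

Each dynamic code then points at its own tagged URL, so scans separate cleanly in both the QR dashboard and your web analytics.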

Real Test: CTA Text Matters More Than Design

A chain of 12 fitness studios ran this test on their front-desk tent cards:

  • Version A: plain black-and-white QR code. Text: "Scan QR Code"
  • Version B: same plain QR code. Text: "Scan to Get Your Free Class Pass"

Same code design, same placement, same size. Only the CTA text changed.

Results over 30 days across all locations:

  • Version A: 189 scans
  • Version B: 612 scans
  • Version B outperformed Version A by 224%

The takeaway is clear: people don't scan QR codes for the novelty. They need a reason. The more specific and valuable the CTA, the higher the scan rate.
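
The lift figure above is just the relative difference between the two scan counts, which is worth computing the same way every time so tests stay comparable:

```python
def relative_lift(control: int, variant: int) -> float:
    """Percentage by which the variant outperformed the control."""
    return (variant / control - 1) * 100

# Fitness-studio test: 189 scans (A) vs 612 scans (B)
print(round(relative_lift(189, 612)))  # → 224
```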

Real Test: Color and Branding Lift

An e-commerce brand tested two versions on their product packaging:

  • Version A: standard black-on-white QR code, no logo
  • Version B: brand-colored QR code (deep blue modules, white background) with small logo in center

Results across 5,000 units per version:

  • Version A: 312 scans (6.2%)
  • Version B: 418 scans (8.4%)
  • Branded version lifted scan rate by 34%

Branding doesn't just look better — it builds trust. Customers are more willing to scan a code that looks intentional and professional.

Setting Up Your Test

Here's the step-by-step:

  1. Pick one variable to test — don't change three things at once
  2. Create two dynamic QR codes at QRMax with different tracking names
  3. Tag destination URLs with distinct UTMs (e.g., utm_content=version-a and utm_content=version-b)
  4. Split your distribution evenly — if printing 1,000 flyers, 500 get version A and 500 get version B
  5. Run for at least 2 weeks — QR scan data needs time to accumulate
  6. Compare scan rates in your QRMax dashboard and conversion rates in Google Analytics
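
Steps 4 and 5 are easier to reason about with a quick planning calculation. This sketch assumes an even split and an illustrative 3% scan rate (your real baseline will vary); the small expected scan count per version is exactly why the test needs to run for weeks:

```python
def plan_split(total_units: int, expected_scan_rate: float) -> dict:
    """Split a print run evenly between two versions and estimate scans per version."""
    per_version = total_units // 2
    return {
        "units_per_version": per_version,
        "expected_scans_per_version": round(per_version * expected_scan_rate),
    }

# 1,000 flyers at an assumed 3% scan rate
plan = plan_split(1000, 0.03)
print(plan)  # → {'units_per_version': 500, 'expected_scans_per_version': 15}
```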

Sample Sizes and Statistical Significance

QR code tests typically need larger sample sizes than digital A/B tests because scan rates are lower. Rule of thumb: you want at least 100 scans per variation before drawing conclusions. At a 3% scan rate, that means distributing roughly 3,500 impressions per version.

For quick-and-dirty testing, 50 scans per variation gives you directional signal, not certainty. Good enough for a food truck testing two tent card designs, not rigorous enough for a $50,000 print campaign.
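
The 100-scan rule of thumb can be checked more formally with a pooled two-proportion z-test. This is ordinary statistics applied to the dashboard numbers, not a feature of any QR platform; the sketch below uses only the standard library and returns a two-sided p-value:

```python
import math

def two_proportion_p_value(scans_a: int, n_a: int, scans_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two scan rates (pooled z-test)."""
    p_a, p_b = scans_a / n_a, scans_b / n_b
    p_pool = (scans_a + scans_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Convert |z| to a two-sided p-value via the complementary error function.
    return math.erfc(abs(z) / math.sqrt(2))

# Packaging test from above: 312/5,000 vs 418/5,000
p = two_proportion_p_value(312, 5000, 418, 5000)
```

For that packaging test, p comes out far below 0.05, so the 34% lift is very unlikely to be noise.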

Variables Ranked by Impact

Based on aggregated test data from marketing campaigns:

Variable                 Typical Lift When Optimized
CTA text                 50-300%
Placement on material    30-100%
Size                     20-60%
Design/branding          15-40%
Frame/border             5-15%

CTA text is almost always the highest-impact variable. Test it first.

Iterative Testing

Don't stop at one test. A/B testing works best as a cycle:

  1. Test CTA text → find the winner
  2. Test placement using the winning CTA → find the winner
  3. Test size using the winning CTA + placement → find the winner
  4. Deploy the optimized combination

Each round compounds. A 200% lift from CTA + 40% lift from placement + 30% from size can multiply into a dramatically better-performing QR campaign than where you started.
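
Successive lifts combine multiplicatively, not additively. Using the example percentages above:

```python
def compound_lift(*lifts_pct: float) -> float:
    """Combine successive relative lifts multiplicatively into one overall lift."""
    total = 1.0
    for lift in lifts_pct:
        total *= 1 + lift / 100
    return (total - 1) * 100

# 200% (CTA) + 40% (placement) + 30% (size)
print(round(compound_lift(200, 40, 30)))  # → 446
```

Three stacked wins turn into roughly a 446% overall lift over the original baseline.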