Eye-One iO optimal patch size

Hi, the standard test charts that come with ProfileMaker for the i1 iO have a patch size of 6 mm x 7 mm.

But npes.org/pdf/cgats05tool.pdf specifies that the minimum should be the aperture of the measuring device plus 2 mm on each side. The Eye-One Pro has an aperture of 4.3 mm, +2 mm on both sides (since it’s a ring) = roughly 8.4 mm.
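For what it’s worth, that rule boils down to simple arithmetic; here is a minimal sketch, using the 4.3 mm aperture figure quoted above (published aperture figures for the i1Pro vary slightly between sources):

```python
# Minimum patch dimension under the rule quoted above: the patch must extend
# at least 2 mm beyond the sampling aperture on every side.
def min_patch_side(aperture_mm: float, margin_mm: float = 2.0) -> float:
    """Smallest patch dimension that keeps the margin on both sides of the aperture."""
    return aperture_mm + 2 * margin_mm

print(min_patch_side(4.3))  # -> 8.3 mm, i.e. roughly the 8.4 mm figure above
```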

Why, then, do the standard X-Rite charts not stick to the standard, and what patch size is optimal, given that wasting expensive paper is not an option?

Bill Atkinson (who makes his own targets) uses 8 mm x 8.5 mm patches for the Eye-One iO, but I read that he also measures some charts in patch mode for VIP clients.

BTW, ProfileMaker does not allow making an 8.4 mm x 8.4 mm target; the closest is 9 mm x 8.5 mm.

So for a single A4 sheet that means:

9 mm x 8.5 mm = 616 patches
6.5 mm x 7.3 mm = 988 patches (I’m using this now)
6 mm x 7 mm = 1107 patches (default for i1 iO)
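A rough sanity check on those counts (a minimal sketch; the usable-area figures are my assumptions for an A4 sheet laid out in landscape on the iO table, not ProfileMaker’s documented layout):

```python
# Estimate how many whole patches of a given size tile the usable area of an
# A4 sheet in landscape (297 x 210 mm) after assumed margins and header space.
USABLE_W_MM = 250.0  # assumed usable width
USABLE_H_MM = 190.0  # assumed usable height

def patch_count(patch_w_mm: float, patch_h_mm: float) -> int:
    """Whole patches that fit in the assumed usable area."""
    cols = int(USABLE_W_MM // patch_w_mm)
    rows = int(USABLE_H_MM // patch_h_mm)
    return cols * rows

for w, h in [(9.0, 8.5), (6.5, 7.3), (6.0, 7.0)]:
    print(f"{w} x {h} mm -> ~{patch_count(w, h)} patches")
# prints ~594, ~988 and ~1107 - in the same ballpark as the ProfileMaker
# counts above, so those numbers are essentially just tiling geometry.
```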

Paper/chart positioning on the i1 iO table:
Positioning area: 32 x 23 cm / 12.6 x 9 in (width x height)
Media thickness: max. 10 mm / 0.39 in
Patch size: min. 6 mm x 7 mm (width x height)

I am having trouble finding the exact reference you’re referring to in the PDF you linked to. Much of it relates to translucent material - is that what you are working with? A lot would also depend on what instrument you are using and how careful the operator is.

You could take a pragmatic approach to this question and determine how small a patch size you can get away with. Take multiple measurements of a single target with small patches, bring them into a tool like MeasureTool, ColorThink Pro or Maxwell, and see how consistent the different measurements are. Then you would have a better idea of what can work on your actual equipment. A robotic arm like the iO’s would tend (one would think) to be slightly less accurate than an i1Pro run along a carefully placed ruler. But then there is also some talk that the heat of a hand will affect the measurement of a hand-held i1Pro.
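If you would rather script that comparison than eyeball it in MeasureTool, a minimal sketch of the repeatability check could look like this. It assumes the two scans have been exported as plain CSV files of per-patch Lab values in the same order (the file names and format are placeholders) and uses the colour-science package for CIEDE2000:

```python
# Repeatability check: read the same chart twice, export both scans as Lab,
# then look at the per-patch CIEDE2000 differences between the two runs.
import numpy as np
import colour  # pip install colour-science

scan_1 = np.loadtxt("scan_1_lab.csv", delimiter=",")  # shape (n_patches, 3)
scan_2 = np.loadtxt("scan_2_lab.csv", delimiter=",")  # same patch order assumed

de2000 = colour.delta_E(scan_1, scan_2, method="CIE 2000")

print(f"average dE2000: {de2000.mean():.2f}")
print(f"maximum dE2000: {de2000.max():.2f}")
```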

“…irradiated area of the specimen shall be greater than the sampling aperture, and its boundary shall lie at least 2 mm beyond the boundary of the sampling aperture.”

I was thinking the robotic arm can’t be even slightly less accurate than a human hand - there would be no point in using one if that were true.

Like I said, I’m using 6.5 mm x 7.3 mm = 988 patches.

2° observer
D50

delta E 2000
Total = 0.09
Best 90% = 0.08
Worst 10% = 0.26

Sigma
Total = 0.08
Best 90% = 0.03
Worst 10% = 0.15

Maximum
Total = 0.91
Best 90% = 0.15
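(For anyone reproducing figures like these outside MeasureTool, here is a minimal sketch of how I read the “Total / Best 90% / Worst 10%” labels - as the mean over all patches, over the lowest 90%, and over the highest 10% of the per-patch dE values. That interpretation is mine, not a documented X-Rite formula.)

```python
import numpy as np

def summarize(de: np.ndarray) -> dict:
    """Summary statistics over per-patch colour differences.

    'Best 90%' / 'Worst 10%' are taken as the mean of the lowest 90% and the
    highest 10% of the values - an assumption about what the labels mean.
    """
    de_sorted = np.sort(de)
    cut = int(round(0.9 * de.size))
    return {
        "total_mean": de.mean(),
        "sigma": de.std(),
        "best_90_mean": de_sorted[:cut].mean(),
        "worst_10_mean": de_sorted[cut:].mean(),
        "maximum": de.max(),
    }

# e.g. summarize(de2000) on the per-patch array from the repeatability
# comparison sketched earlier in the thread.
```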

Would you increase the patch size?

These numbers are pretty good. There does not seem to be a problem there. I would expect that if you were to increase your patch size, you would not get any better readings.

I also think the numbers are good, and reducing the number of patches will not make a better profile (for a single A4 sheet).

I also can’t understand why, when you measure by hand with a ruler, the aperture of the i1Pro touches the paper, but when the device is moved by the i1 iO the aperture is about 1.5 mm above the paper.

If the aperture is not touching the paper, shouldn’t that give worse readings than measuring by hand? The numbers say it doesn’t. So why then require the +2 mm around the measurement device’s aperture as an ISO 5-4 standard?

Furthermore, I don’t understand why the standard test charts made for the i1 iO by X-Rite, such as:

TC9.18 RGB
IT8.7-3 CMYK

all use 6 mm x 7 mm patches. Why not 8.4 mm x 8.4 mm, as the ISO 5-4 standard would specify for an i1Pro with a 4.3 mm aperture?

Shouldn’t a target that is held up as a “standard” be quality oriented?

I would think that GretagMacbeth, & now X-Rite, would & should (do?) know their own creation better than anyone else. Gretag invented/created/designed the i1 & therefore one would guess that the designers know what the most suitable size for their device is, & I would just about bet my house on it that they have done extensive testing to discover the ideal patch size.

The ISO specification is just that - a specification, not an absolute “this is the ONLY way to do this & no-one should ever question what we decide. Just follow us blindly”.

I have tested this myself with my i1 Rev A & using X-Rite ColorPort 1.54 I can safely take my patch size down to 7x7mm (width x height) without causing any mis-read rows/patches. If I take it down to 7x6mm, the rows are too difficult to scan, as it is very tricky to get the backboard ruler into exactly the correct position for the row so that none of the patches from the row above or below the one I’m trying to scan get (mis)read. I constantly have to repeat the rows, sometimes taking 3 or 4+ tries before the row will be read without errors. The size also affects the speed at which I can scan each row, but I understand the newer i1 devices, such as Rev D, are much less sensitive to this & can be used to scan rows a decent bit faster than my old Rev A.

Incidentally, the i1 iO is much faster than anything that can be done by hand, no matter which i1 is being used. To me this suggests that the robotic arm of the iO is (obviously?) much more accurate than hand scanning. And I always read my targets a minimum of 2 times to make sure everything is consistent between measurements, & more often I measure each target 3+ times, again for accuracy & to average the measurements.

I agree with Pat - simply try it out for yourself using your own hardware/software & see what works best for you. ISO creates standards for the entire planet, not one person, so what might be best for them may not be best for you, me or anyone else. It also doesn’t mean that, just because they have created a standard, what they say is the absolute most accurate way something can be performed.

What do you do when 2 standards organisations create conflicting standards - who do you follow or trust? What if ISO says 8.4x8.4mm but NIST says 7x7mm & the manufacturer says anything from 5x4mm to 10x12mm? These are all fictitious numbers, but hopefully you understand what I’m saying.