Evaluate Profile Proofing

I created a print profile using an i1Pro 2. I wanted to try the Evaluate Profile Proofing function as part of learning CTP. I found the post explaining how to get the measurement and reference data from the profile, dropped all three files into ColorSmarts, and got the results.

What I don’t understand is that the reference data and the measurement data appear to be the same, yet I get a dE difference in the worksheet. Is there a simple explanation of how this works, or a good reference that explains it? It seems I am measuring the profile against the same data that made it.

Thanks,
-tony

This is the post I made on Luminous Landscape.

Digging into CTP, I am trying to understand the Evaluate Profile Proofing function. I made a custom print profile (i1Pro 2) and saved the measurements as a CGATS file; this file is the source for both the measurement data and the reference data (I’m using the ColorSmarts UI to automate the process). Initially I wondered how there could be any dE at all (there is, albeit small: 0.0 - 0.93), since the reference data and measurement data are the very bits that made the profile. Then I thought about it some more and came to this conclusion:

The Evaluate Profile Proofing is measuring the error from round-tripping through the PCS due to quantization.

Can someone confirm this? Is there a good reference someone can share with me?
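
To make the round-trip idea concrete, here is a minimal sketch in Python (an illustration of the quantization point, not anything ColorThink actually computes): encode a Lab value at a finite precision, decode it again, and measure the dE*76 that the rounding alone introduces. The 8-bit encoding is chosen purely for illustration; real profile tables are typically 16-bit, and table interpolation and smoothing add further error.

```python
import math

def encode_lab_8bit(L, a, b):
    # Quantize Lab to an 8-bit encoding (the same scheme littleCMS uses for
    # TYPE_Lab_8): L* 0-100 -> 0-255, a*/b* -128..127 -> 0-255.
    return (round(L * 255 / 100), round(a + 128), round(b + 128))

def decode_lab_8bit(L8, a8, b8):
    return (L8 * 100 / 255, a8 - 128, b8 - 128)

def delta_e76(lab1, lab2):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

lab = (53.37, 12.64, -40.18)                   # arbitrary example patch
roundtrip = decode_lab_8bit(*encode_lab_8bit(*lab))
print(round(delta_e76(lab, roundtrip), 3))     # small but non-zero dE
```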


The replies I got basically agreed with my conclusion, but suggested that I post here to possibly get more comments from Steve.

You’re close. There is such a thing as using ColorThink to round trip through a profile, but this is sort of half of that.

Instead, this procedure guesses what colors you’ll end up with: it uses your profile to predict what Lab values you’ll get when you run your device values (i.e. RGB) through the profile … and then compares those predicted Lab values with the actual Lab values from the measurement.

This of course relates only to the proofing direction of a profile, that is, how accurate your soft-proofing will be.
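
As a rough, hypothetical illustration of that procedure (a sketch, not ColorThink’s implementation), the Python below uses Pillow’s ImageCms (littleCMS) to run device RGB through a printer profile in the proofing direction and then compares the predicted Lab against the measured Lab with a simple dE*76. The profile filename and the patches list are placeholders; in practice the RGB values and the measured Lab come from the CGATS measurement file.

```python
import math
from PIL import Image, ImageCms

printer = ImageCms.ImageCmsProfile("printer.icc")   # hypothetical path to the custom profile
lab_pcs = ImageCms.createProfile("LAB")

# Device RGB -> Lab, i.e. the soft-proofing direction of the profile.
# renderingIntent=1 is relative colorimetric in littleCMS numbering.
to_lab = ImageCms.buildTransform(printer, lab_pcs, "RGB", "LAB", renderingIntent=1)

def predict_lab(rgb):
    # Run one RGB patch through the profile and decode Pillow's 8-bit Lab
    # (this 8-bit path adds a little rounding of its own, so the numbers
    # are approximate).
    im = Image.new("RGB", (1, 1), rgb)
    L8, a8, b8 = ImageCms.applyTransform(im, to_lab).getpixel((0, 0))
    return (L8 * 100 / 255, a8 - 128, b8 - 128)

def delta_e76(lab1, lab2):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Placeholder patch list: (device RGB, measured Lab) pairs from the CGATS file.
patches = [((255, 255, 255), (96.5, -0.2, 1.1))]
for rgb, measured in patches:
    predicted = predict_lab(rgb)
    print(rgb, "dE76 =", round(delta_e76(predicted, measured), 2))
```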

There is a section in the manual that talks about the function of the ColorSmarts Guide:
www.colorwiki.com/wiki/ColorThink_Pro_- … arts_Guide

And here is the section just on this topic:
colorwiki.com/wiki/Evaluate_Profile_Proofing

It is not quite so simple to get quantifiable numbers for the quality of the printing direction of a profile. The closest we could come up with is this procedure for visually analyzing the quality of the printing direction of a profile:
youtu.be/sTxaHhMYSYE?list=PL064AE0D476E8CC58

But back to your overall question: yes, the small differences you see are there because of rounding errors in the creation of a profile, and also because of the smoothing function of a profile. A good profiling engine will naturally introduce some intelligent smoothing so that image gradients have smooth transitions of color. A bad profile, and also a good profile that has to do a lot of work to yank the color around to where it is supposed to be, will likely show larger numbers in this test.
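
If it helps to reduce the per-patch numbers from this test to something comparable between profiles, a simple summary such as the sketch below can be used; the example values are placeholders in the 0.0 - 0.93 range mentioned above, and the statistics are not any official ColorThink metric.

```python
import statistics

def summarize(delta_es):
    # Report mean, maximum, and an approximate 95th-percentile dE for a
    # set of per-patch differences.
    ds = sorted(delta_es)
    return {
        "mean": round(statistics.mean(ds), 2),
        "max": ds[-1],
        "p95": ds[int(0.95 * (len(ds) - 1))],
    }

# Placeholder values, roughly the range reported in the post above.
print(summarize([0.0, 0.12, 0.25, 0.31, 0.44, 0.58, 0.71, 0.93]))
```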

Great question. Thanks for asking it.