I’ve done a fair bit of reading on color management but wanted to get some feedback on a workflow issue I’m having.
I have a Dell 30" high-gamut LCD display and a 17" MacBook Pro display, both of which have been calibrated and profiled with a Gretag Eye-One. I use Lightroom to manage and adjust RAW photos taken with my D70, and CS3 for more complex editing. I use AdobeRGB as my working color space in CS3.
Most of my work I share online, so I save images with an sRGB profile. For years I used dual Viewsonic CRT monitors, which had a gamut only slightly larger than sRGB, so when I was editing in AdobeRGB the color clipping that occurred when saving to sRGB was not too significant, as the display itself could not show much more than sRGB anyway.
The problem I am running into now is with my high-gamut Dell monitor. If you look at the gamut plot of this display, it is virtually identical to AdobeRGB. This is great, as you see a lot more of your image, but I’m finding it problematic when I save my images to sRGB, as the color shift is so significant that the original color edits are vastly affected.
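To put a rough number on that shift: a fully saturated AdobeRGB green already lands well outside sRGB. A minimal sketch in plain Python, using the commonly published D65 AdobeRGB-to-XYZ and XYZ-to-sRGB matrices (the matrix values are assumptions from standard references, not anything measured on these displays):

```python
# Convert a linear AdobeRGB color to linear sRGB via CIE XYZ (D65)
# to see how far outside sRGB a saturated AdobeRGB color falls.

ADOBE_TO_XYZ = [
    (0.5767309, 0.1855540, 0.1881852),
    (0.2973769, 0.6273491, 0.0752741),
    (0.0270343, 0.0706872, 0.9911085),
]
XYZ_TO_SRGB = [
    ( 3.2404542, -1.5371385, -0.4985314),
    (-0.9692660,  1.8760108,  0.0415560),
    ( 0.0556434, -0.2040259,  1.0572252),
]

def mul(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(row[i] * v[i] for i in range(3)) for row in m)

# Fully saturated AdobeRGB green, linear light:
green = (0.0, 1.0, 0.0)
srgb = mul(XYZ_TO_SRGB, mul(ADOBE_TO_XYZ, green))
print(srgb)  # roughly (-0.40, 1.00, -0.04): the R and B components
             # are negative, i.e. unrepresentable in sRGB, so a
             # straight conversion has to clip or remap them
```

This is exactly the clipping that was barely visible on the CRTs but is now obvious on a display that can actually show those colors.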
When I used my old CRTs, or when I use my MacBook Pro display, whose gamut is slightly smaller than sRGB, the workflow is relatively simple: throughout the entire process, what you see and what you export is, for all intents and purposes, within the range of sRGB.
So my question is: what is the best way to deal with this? The only solution I can come up with is to export from Lightroom to CS3, soft proof the image against MonitorRGB in CS3, and tweak the colors before saving out to a JPG with an embedded sRGB profile. Obviously this is far from ideal.
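For what it’s worth, the convert-and-embed step itself can be scripted outside Photoshop. A rough sketch using Pillow’s ImageCms module (the AdobeRGB .icc path in the comment is a placeholder you would point at your own profile; the demo uses the built-in sRGB profile as the source so the snippet is self-contained):

```python
from PIL import Image, ImageCms

def to_srgb(im, src_profile):
    """Convert im from src_profile to sRGB. The default rendering
    intent (perceptual) remaps out-of-gamut colors into range rather
    than hard-clipping them."""
    srgb = ImageCms.createProfile("sRGB")
    return ImageCms.profileToProfile(im, src_profile, srgb, outputMode="RGB")

# Demo with the built-in sRGB profile as the source. In a real
# workflow you would pass the path to your AdobeRGB profile instead,
# e.g. to_srgb(im, "/path/to/AdobeRGB1998.icc") (hypothetical path).
im = Image.new("RGB", (4, 4), (200, 30, 30))
out = to_srgb(im, ImageCms.createProfile("sRGB"))
```

When saving the result as a JPEG, Pillow’s `save` accepts an `icc_profile` keyword for embedding the sRGB profile bytes, which covers the “embedded sRGB profile” part of the export.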
Any thoughts or suggestions would be greatly appreciated.