Will sRGB profile on aRGB monitor give me banding?

Hi guys,

Will I lose color resolution (and possibly get banding) if I use sRGB (via a software profile) on a monitor with, e.g., an Adobe RGB color space?


At 1:12 PM -0800 3/2/09, fkj wrote:

I’m not sure what “via a software profile” means…

If you mean to assign the sRGB profile to a display that has an Adobe RGB gamut then hmmm…

I suspect you will see the following things:

  • oversaturated color. Your application will convert color into the sRGB gamut, and those RGB values will then be sent directly to the screen. The aRGB screen will display them as if they were aRGB values, and the colors will appear more saturated than intended. A simulation in Photoshop would be to take an sRGB image and assign aRGB to it.

  • clipped color. When your application converts to your display profile, it will convert to the sRGB profile. Chances are it will use the relative colorimetric rendering intent (as that’s the only one available in sRGB). Since the gamut of sRGB is smaller than Adobe RGB’s, unnecessary clipping will occur. It will also be oversaturated, as mentioned above…
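The oversaturation can be sketched numerically (a toy example of mine, not numbers from the posters above): express the Adobe RGB green primary in sRGB coordinates using the standard D65 RGB-to-XYZ matrices. The red and blue components come out negative, i.e. the panel’s green lies outside sRGB, which is also why the clipping in the second point occurs.

```python
# Standard D65 matrices (rounded to 4 decimals).
SRGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
               (0.2126, 0.7152, 0.0722),
               (0.0193, 0.1192, 0.9505)]

XYZ_TO_SRGB = [( 3.2406, -1.5372, -0.4986),
               (-0.9689,  1.8758,  0.0415),
               ( 0.0557, -0.2040,  1.0570)]

ARGB_TO_XYZ = [(0.5767, 0.1856, 0.1882),
               (0.2973, 0.6274, 0.0753),
               (0.0270, 0.0707, 0.9911)]

def apply(m, v):
    """3x3 matrix times a 3-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

# Linear-light pure green (0, 1, 0) sent straight to the panel is displayed
# as the *panel's* green, i.e. the Adobe RGB green primary.  Express that
# primary in sRGB coordinates:
argb_green_xyz = apply(ARGB_TO_XYZ, (0.0, 1.0, 0.0))
in_srgb = apply(XYZ_TO_SRGB, argb_green_xyz)
print(in_srgb)  # red and blue are negative: outside the sRGB gamut
```

So an sRGB value sent unconverted to the aRGB panel is reproduced with a green that sRGB literally cannot represent, which is the oversaturation described above.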

hope this helps



I must admit that I am a bit of a rookie when it comes to color management. But from what I understand (and please correct me if I’m wrong), if I have an aRGB monitor, I can force it to show only sRGB via a software profile (whether this is done in the graphics card LUT or in the application itself, I don’t know). I want to do this because the majority of people use sRGB, and I want to see what they see.

Since the sRGB color space is smaller than, and contained within, aRGB, this should be possible. But I suspect that I will lose color resolution: the 3×8 bits sent to the monitor cover aRGB, and I only use sRGB, which must give me a lower color resolution than if I used an sRGB monitor.
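To put a rough number on that intuition (my own back-of-envelope sketch, not a figure from this thread): grid-sample the 8-bit Adobe RGB code cube and count how many codes describe a color that lies inside sRGB at all. Only those codes remain usable once you restrict yourself to sRGB.

```python
# D65 matrices (rounded to 4 decimals).
ARGB_TO_XYZ = [(0.5767, 0.1856, 0.1882),
               (0.2973, 0.6274, 0.0753),
               (0.0270, 0.0707, 0.9911)]

XYZ_TO_SRGB = [( 3.2406, -1.5372, -0.4986),
               (-0.9689,  1.8758,  0.0415),
               ( 0.0557, -0.2040,  1.0570)]

def apply(m, v):
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

inside = total = 0
for r in range(0, 256, 17):          # coarse grid over the 8-bit cube
    for g in range(0, 256, 17):
        for b in range(0, 256, 17):
            # Decode with the Adobe RGB tone curve (gamma 2.19921875),
            # then express the color in linear sRGB coordinates.
            lin = tuple((c / 255.0) ** 2.19921875 for c in (r, g, b))
            srgb = apply(XYZ_TO_SRGB, apply(ARGB_TO_XYZ, lin))
            total += 1
            inside += all(-1e-6 <= ch <= 1.0 + 1e-6 for ch in srgb)

frac = inside / total
print(f"{frac:.0%} of the Adobe RGB code cube falls inside sRGB")
```

The exact percentage depends on the matrices and sampling, but it is well below 100%, which is the sense in which an aRGB-scaled 8-bit signal gives you fewer usable steps for sRGB content.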

I am asking this because I’m looking for a new monitor, and most non-TN panels have wider-than-sRGB gamuts. But I think I will be using sRGB 98% of the time, which makes me wonder if I should go for the now-standard wide-gamut monitors or insist on an sRGB one.


I understand your question completely. But in most cases, you wouldn’t have anything to worry about. It’s the ones who are trying to define the aRGB gamut with only the 256 steps (per color) in their graphics cards who would be on the lookout for banding issues. For this reason, most of the high-gamut displays I’ve seen put built-in 10-bit or higher graphics processors into the display itself to help reduce the chance of banding.
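The banding mechanism can be sketched with a little arithmetic (a toy model of mine, not from the posters above), just along the gray axis: re-encode the 256 sRGB gray levels for a gamma-2.2, Adobe-RGB-style signal and count how many distinct codes survive at 8 bits versus 10 bits.

```python
def srgb_decode(code):
    """8-bit sRGB code -> linear light (standard sRGB transfer curve)."""
    s = code / 255.0
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def gamma22_encode(lin, bits):
    """Linear light -> n-bit code using the Adobe RGB tone curve."""
    top = (1 << bits) - 1
    return round(top * lin ** (1 / 2.19921875))

# Push all 256 sRGB gray levels through an 8-bit and a 10-bit pipeline.
out8  = {gamma22_encode(srgb_decode(c), 8)  for c in range(256)}
out10 = {gamma22_encode(srgb_decode(c), 10) for c in range(256)}
print(len(out8), len(out10))  # some levels collapse at 8 bits, none at 10
```

With only 8 bits on the wire, a handful of distinct sRGB levels round to the same output code (visible as banding in smooth gradients); a 10-bit internal LUT, as in the displays mentioned above, has enough headroom that all 256 input levels survive.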

In your case, you would most likely be telling the software to emulate the smaller sRGB space and it will then have 256 steps to define that color space. That’s the way it works with the Eizo CG222W for example. In the case of an HP Dreamcolor display with LED backlighting, you can specify the actual color temperature of the backlight. So you can literally make it into an sRGB display at the backlight level.

Now, you should be aware that there are many good quality displays that are still of the sRGB variety. Call Rick and ask him what’s available these days. 877-265-6743

But if the monitor is an aRGB monitor and cannot be “changed” into an sRGB monitor via e.g. a LUT, then I can tell the software on my computer to restrict the colors to sRGB, and hence it cannot use the full signal range (256 steps per color). Let me explain what I mean: when the monitor receives e.g. (R,G,B)=(0,255,0), it will show pure green, the green primary for aRGB, which is outside sRGB. So instead the computer must send e.g. (0,220,5) to the monitor to have it display the sRGB green primary, and thus it cannot use the full signal range (256 steps per color), because that would take it outside the sRGB color space.

If the monitor has a LUT, then it could be “transformed” into a “true” sRGB monitor, in which (0,255,0) would display the sRGB green primary: the monitor would look up (0,255,0), get e.g. (0,220,5), and send that to the panel instead.
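What such a LUT would have to compute (really a matrix-plus-curves transform, since the red, green, and blue channels interact) can be sketched as follows. This is my own arithmetic with the standard D65 matrices, so the exact numbers differ from the illustrative (0,220,5) guessed above:

```python
SRGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
               (0.2126, 0.7152, 0.0722),
               (0.0193, 0.1192, 0.9505)]

XYZ_TO_ARGB = [( 2.0416, -0.5650, -0.3447),
               (-0.9692,  1.8760,  0.0416),
               ( 0.0134, -0.1184,  1.0152)]

def apply(m, v):
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def srgb_decode(code):
    s = code / 255.0
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def srgb_to_panel(rgb):
    """8-bit sRGB code triple -> 8-bit drive values for an Adobe RGB panel."""
    lin = [srgb_decode(c) for c in rgb]                    # sRGB tone curve
    argb = apply(XYZ_TO_ARGB, apply(SRGB_TO_XYZ, lin))     # change of primaries
    return tuple(round(255 * max(0.0, min(1.0, ch)) ** (1 / 2.19921875))
                 for ch in argb)                           # aRGB tone curve

print(srgb_to_panel((0, 255, 0)))  # -> (144, 255, 60), not (0, 255, 0)
```

Note that sRGB green needs a fair amount of the panel’s red and blue mixed in, which is exactly why a per-channel (1D) LUT is not enough and monitor-side emulation uses a matrix or 3D LUT.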

Does this make sense, or am I totally off my rocker?