From: Lukas S. <som...@gm...> - 2020-08-29 16:12:48
Hi Martí. Thanks for the feedback! If I understand correctly, then …

1.) My first approach (relying on unbounded mode, and filtering out RGB colors
whose components are not between 0…1) is not reliable. It works with
LittleCMS’ built-in sRGB profile, but can break on other, LUT-based profiles,
because those always deliver RGB values between 0…1.

2.) The approach using a proofing transform is not reliable either, because
its out-of-gamut detection will fail on LUT-based profiles.

Did I understand you correctly there?
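(For reference, the check I currently do for 1.) is essentially the following.
It is a minimal sketch using the same transform as in my first mail further
below; the variable names and the example Lab value are only for illustration,
and the test of course only makes sense as long as the transform really runs
in unbounded mode.)

cmsHPROFILE labProfileHandle = cmsCreateLab4Profile(NULL);
cmsHPROFILE rgbProfileHandle = cmsCreate_sRGBProfile();
cmsHTRANSFORM labToRgb = cmsCreateTransform(
    labProfileHandle, TYPE_Lab_DBL, // input: Lab as doubles
    rgbProfileHandle, TYPE_RGB_DBL, // output: RGB as doubles
    INTENT_ABSOLUTE_COLORIMETRIC,
    0);
cmsCIELab lab = {50.0, 60.0, 40.0}; // just an example value
cmsFloat64Number rgb[3];
cmsDoTransform(labToRgb, &lab, rgb, 1);
// In unbounded mode, out-of-gamut colours come back with at least one
// component outside 0…1, so this is the test I rely on:
const bool inGamut =
    rgb[0] >= 0 && rgb[0] <= 1 &&
    rgb[1] >= 0 && rgb[1] <= 1 &&
    rgb[2] >= 0 && rgb[2] <= 1;
// (cmsDeleteTransform() and cmsCloseProfile() omitted for brevity.)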
In the meantime, I've tried your suggestion with the GBD functions. This is my
code:

cmsContext ContextID = cmsCreateContext(NULL, NULL);
cmsHANDLE h_gamutBoundaryDescriptor = cmsGBDAlloc(ContextID);
qreal h;
qreal s;
qreal v;
constexpr qreal step = 1;
cmsCIELab lab;
QColor color;
// Note: QColor::fromHsv() takes int arguments, so fractional h, s, v values
// get truncated.
for (h = 0; h < 359; h += step) {
    s = 255;
    for (v = 1; v <= 254; v += step) {
        lab = colorSpace->colorLab(QColor::fromHsv(h, s, v));
        cmsGDBAddPoint(h_gamutBoundaryDescriptor, &lab);
    }
    v = 255;
    for (s = 1; s <= 254; s += step) {
        lab = colorSpace->colorLab(QColor::fromHsv(h, s, v));
        cmsGDBAddPoint(h_gamutBoundaryDescriptor, &lab);
    }
}
cmsGDBCompute(
    h_gamutBoundaryDescriptor,
    0 // According to the LittleCMS documentation, dwFlags is reserved (unused); set it to 0.
);

It uses (misuses?) the HSV model to walk through the outer shell of the RGB
gamut. HSV is converted to RGB by Qt’s built-in functions. Then, this is used
by cmsGDBAddPoint. The result is however quite surprising:
https://github.com/sommerluk/perceptualcolor/raw/1ca8c9ce1575e5c2715184f162b3021aebdbab63/other/gbd.png

When I raise the precision (using a step of 0.1 instead of 1.0), thus adding
more points, the result is still similar. And it gets really slow. What am I
doing wrong here?
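Or would it be better to skip HSV entirely and do the coarse sampling directly
in the RGB device space, converting each sample to Lab with an ordinary
transform, as you describe? I imagine something like the following minimal,
untested sketch (the grid size and the variable names are just placeholders,
and I picked INTENT_RELATIVE_COLORIMETRIC only because you mention it for
normal usage):

cmsHPROFILE rgbProfile = cmsCreate_sRGBProfile(); // or any other RGB profile
cmsHPROFILE labProfile = cmsCreateLab4Profile(NULL);
cmsHTRANSFORM rgbToLab = cmsCreateTransform(
    rgbProfile, TYPE_RGB_DBL,
    labProfile, TYPE_Lab_DBL,
    INTENT_RELATIVE_COLORIMETRIC,
    0);
cmsHANDLE gbd = cmsGBDAlloc(NULL);
cmsFloat64Number rgb[3];
cmsCIELab lab;
const int steps = 17; // coarse grid: 17 × 17 × 17 samples
for (int r = 0; r < steps; ++r) {
    for (int g = 0; g < steps; ++g) {
        for (int b = 0; b < steps; ++b) {
            rgb[0] = r / (steps - 1.0);
            rgb[1] = g / (steps - 1.0);
            rgb[2] = b / (steps - 1.0);
            cmsDoTransform(rgbToLab, rgb, &lab, 1);
            cmsGDBAddPoint(gbd, &lab);
        }
    }
}
cmsGDBCompute(gbd, 0);
// Afterwards, individual Lab points could presumably be tested like this:
cmsCIELab candidate = {50.0, 60.0, 40.0}; // just an example value
cmsBool isInGamut = cmsGDBCheckPoint(gbd, &candidate);
// (cmsGBDFree(), cmsDeleteTransform() and cmsCloseProfile() omitted.)

Would that be the intended way to use the cmsGBDxxx API?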
Best regards

Lukas

PS: In the meantime, I've uploaded a small video showing the color selector in
action (with the code that relies on the unbounded LittleCMS mode):
https://github.com/sommerluk/perceptualcolor/blob/31defb5aec031e9e84f4b89702788fc128750896/other/video.mkv?raw=true

On Tue, 18 Aug 2020 at 08:20, Marti Maria <mar...@li...> wrote:

> Hello Lukas,
>
> In the last months, I've started to develop a colour picker widget based
> on the LCh model:
>
> Great!
>
> What is the best way to know the gamut of a given colour space?
>
> Gamut is a 3-D thing, because we have three kinds of color receptors in
> the eye, so the best way to represent a gamut is by using a 3D solid.
>
> What I try to get is this:
>
> This may work to some extent on very regular color spaces, but it is not
> suitable for real device spaces. Slices at different L* may exhibit very
> different shapes on certain device spaces, like prepress CMYK, for example.
>
> In general, there is no way to know the gamut of a profile, and this is
> why there is a special tag to include the gamut. But nonetheless, you can
> use some tools to approximate the gamut. Please keep in mind that the
> behaviour of ICC profiles is to map colours inside the gamut, no matter
> which intent. Absolute colorimetric often clips the colours, which in the
> end is a sort of gamut mapping. So, the only reliable way is to use the
> inverse direction: from the device values, obtain the Lab values and then
> build the gamut hull.
>
> The code you sent may work on matrix-shaper profiles, but will not work
> on CLUT-based profiles.
>
> An option would be to use Jan Morovic’s segment maxima gamut hull
> descriptor (see the cmsGBDxxx routines). You need to do a coarse sampling
> of the device space, RGB or CMYK, get the Lab values, populate the GBD,
> and after computing it you can check individual points. In the cmssm.c
> source code, there is (commented out) a routine to dump the 3D solid as
> VRML. I guess converting that to OpenGL would be easy, and this would be
> a very accurate representation of the gamut. Then you can slice this
> volume at any L* to show the shape.
>
> Otherwise, INTENT_ABSOLUTE_COLORIMETRIC is ok, but in V4 ICC it is the
> same as INTENT_RELATIVE_COLORIMETRIC. If you want to distinguish between
> D50 and D65 white points, you have to use INTENT_ABSOLUTE_COLORIMETRIC
> with the adaptation state set to zero (unadapted). The effect of this is
> only to displace the gamut volume in chromaticity space, and it is only
> useful when comparing spot colours side by side. If you are using this
> for normal usage, chromatic adaptation applies, and it is better to use
> INTENT_RELATIVE_COLORIMETRIC or to set the adaptation state to 1.0 (fully
> adapted).
>
> Regards
> Martí
>
>
> On 16 Aug 2020, at 10:11, Lukas Sommer <som...@gm...> wrote:
>
> Hello.
>
> In the last months, I've started to develop a colour picker widget based
> on the LCh model: https://github.com/sommerluk/perceptualcolor
>
> It displays a given colour space's gamut (sRGB or something else) within
> LCh diagrams, to provide a perceptually uniform colour picker with a nice
> UI. It's under the MIT licence.
>
> However, I'm new to colour management and don't have much experience with
> it. It would be great if I could get some help here.
>
> 1.)
> ==========
>
> What is the best way to know the gamut of a given colour space? What I
> try to get is this:
> https://raw.githubusercontent.com/sommerluk/perceptualcolor/master/other/expected.png
>
> The colour blob in the middle is a cut through the LCh model at a given
> lightness (the gamut is the sRGB gamut here, but it should also work with
> other gamuts). The code works like this:
> – Create a buffer with a Lab image
> – Transform it with LittleCMS like this:
> cmsHPROFILE labProfileHandle = cmsCreateLab4Profile(NULL);
> cmsHPROFILE rgbProfileHandle = cmsCreate_sRGBProfile();
> m_transformLabToRgbHandle = cmsCreateTransform(
>     labProfileHandle, // input profile handle
>     TYPE_Lab_DBL, // input buffer format
>     rgbProfileHandle, // output profile handle
>     TYPE_RGB_DBL, // output buffer format
>     INTENT_ABSOLUTE_COLORIMETRIC, // rendering intent
>     0 // flags
> );
> – Test for each pixel whether one of the R, G or B components is outside
> the range 0–1. If so, treat it as transparent. If not, actually paint it
> on the screen.
>
> This approach works. However, there are also problems.
>
> a) If I understand the paper “Unbounded Color Engines” correctly, then
> LittleCMS might switch to bounded mode if the profile does not support
> unbounded mode. So comparing the output channels to the range 0–1 would
> be useless, and the approach might not be reliable.
>
> b) It is slow, and does not provide proofing.
>
> So I tried something different:
> cmsHTRANSFORM xform = cmsCreateProofingTransform(
>     labProfileHandle, // Input profile handle
>     TYPE_Lab_FLT, // InputFormat
>     rgbProfileHandle, // Output
>     TYPE_BGRA_8, // OutputFormat
>     rgbProfileHandle, // Proofing
>     INTENT_ABSOLUTE_COLORIMETRIC, // Intent
>     INTENT_ABSOLUTE_COLORIMETRIC, // ProofingIntent
>     (cmsFLAGS_SOFTPROOFING|cmsFLAGS_GAMUTCHECK) // dwFlags
> );
> It relies directly on LittleCMS for the out-of-gamut detection. It seems
> that out-of-gamut colours are automatically transparent. The problem is
> that the out-of-gamut detection seems to be far less exact than with the
> other method. Therefore, the gamut image has a very strange shape:
> https://raw.githubusercontent.com/sommerluk/perceptualcolor/master/other/actual.png
>
> What would be the recommended approach here?
>
> 2.)
> ==========
> I've used INTENT_ABSOLUTE_COLORIMETRIC as rendering intent. Is this the
> correct choice for this use case?
>
> 3.)
> ==========
> The built-in Lab profile has D50, but the built-in sRGB profile has D65.
> Does this matter? Or can I safely ignore that?
>
> Best regards
>
> Lukas Sommer
> _______________________________________________
> Lcms-user mailing list
> Lcm...@li...
> https://lists.sourceforge.net/lists/listinfo/lcms-user