I was wondering if anyone has dealt with this or would have a solution:
I have a manufacturing company that deals with creating items of different colors, but each product has a specific color tone it must meet or it could be found as defective. Currently, they compare newly manufactured goods against a previously “passed” finished-good to make sure the colors are consistent. They’ve asked for our applications to have a color block that displays the true color of the finished good for comparison.
My dilemma is: how can you control different devices' displays and color settings so they all use a uniform color profile? Is it possible? Any suggestions?
First of all, you can't realistically provide a precise color value on screen that will match a real-world object.
There are so many steps between rgb( 1.0, 1.0, 1.0, 1.0 ) and actually seeing that value on screen, and each step in the chain can introduce variance.
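To make one of those steps concrete: before the OS and panel get involved, each channel value passes through the sRGB transfer ("gamma") curve. A sketch in Python for illustration (the constants come from the sRGB specification; the math is the same in any language):

```python
def srgb_encode(linear: float) -> float:
    """Encode a linear-light channel value (0..1) to sRGB (0..1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded: float) -> float:
    """Decode an sRGB-encoded channel value back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# A mid-grey of 50% linear light is stored as roughly 0.735 in sRGB;
# a small error anywhere in this encode/decode round trip shifts the color.
print(round(srgb_encode(0.5), 3))
```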
That being said, there are products out there that will attempt to color-calibrate screens. Provided your users are using one that your company has tested and recommended, you can reduce the variance from the OS to the screen (but it still may never be precise).
The next step is to ensure that the workflow in your software is color-profile aware. This would probably require more from Xojo, as they currently do not provide access to specific color profiles when creating colors or working with images.
We faced a similar issue with our printing software, when people would complain that images looked different on screen than they did when printed. Some people would understand in the end; others wouldn't.
As Sam mentioned, you will need to make sure the screens are calibrated accurately (more than likely with a colorimeter).
If the manufacturing company can provide you with the CMYK colour values and an ICC Profile from their output device you might be able to convert the colours to RGB using the MBS LCMS2 plugin. In this scenario you would more than likely use the manufacturer’s ICC Profile as the source profile and the monitor profile (created by the colorimeter) as the destination profile.
This idea is off the top of my head so untested but I think it will work for very basic inks. If they are using more specialised inks (for example, reflective or metallic) then it may be extremely difficult to do this.
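If you only need a rough on-screen preview before going the full ICC route, there is also the naive formula-based CMYK-to-RGB conversion. A Python sketch for illustration (no profiles involved, so this is an approximation only, not the colorimetric match an LCMS2-based transform would give):

```python
def cmyk_to_rgb(c: float, m: float, y: float, k: float) -> tuple:
    """Naive CMYK (0..1) to RGB (0..255) conversion -- no ICC profile,
    so treat the result as a rough preview, not a colorimetric match."""
    r = 255 * (1 - c) * (1 - k)
    g = 255 * (1 - m) * (1 - k)
    b = 255 * (1 - y) * (1 - k)
    return round(r), round(g), round(b)

print(cmyk_to_rgb(0.0, 1.0, 1.0, 0.0))  # magenta + yellow ink -> (255, 0, 0)
```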
Try impossible. This goes for the fluorescent colors as well.
Monitor gamut is but a small slice of the visual range of what human eyes can see. If they have a specific Pantone color for their product, a monitor may not be able to reproduce the colors at all.
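You can actually check this numerically: convert a measured Lab value toward linear sRGB, and if any channel lands outside 0..1, the monitor cannot reproduce that color. A Python sketch for illustration (standard D65 white point and sRGB matrix constants; the tolerance absorbs matrix rounding):

```python
def lab_to_linear_srgb(L: float, a: float, b: float) -> tuple:
    """Convert CIE Lab (D65) to linear sRGB channel values."""
    def f_inv(t):
        return t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
    fy = (L + 16) / 116
    x = 0.95047 * f_inv(fy + a / 500)   # D65 reference white
    y = 1.00000 * f_inv(fy)
    z = 1.08883 * f_inv(fy - b / 200)
    return (
        3.2406 * x - 1.5372 * y - 0.4986 * z,   # linear R
        -0.9689 * x + 1.8758 * y + 0.0415 * z,  # linear G
        0.0557 * x - 0.2040 * y + 1.0570 * z,   # linear B
    )

def in_srgb_gamut(L: float, a: float, b: float, eps: float = 0.005) -> bool:
    """True if the Lab color fits inside the sRGB gamut."""
    return all(-eps <= ch <= 1 + eps for ch in lab_to_linear_srgb(L, a, b))

print(in_srgb_gamut(50, 0, 0))       # neutral grey: True
print(in_srgb_gamut(50, 100, -100))  # vivid purple-blue: False
```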
I did a bit of this years ago when I worked in the printing industry. A quality color calibration system could cost the company hundreds if not thousands of dollars to install, and it requires vigilance on the part of the users. Changes in the environment (whether the office lights are on or off, whether there's light coming in from outside) affect the eye's perception of color, while simply adjusting the brightness of the screen can completely invalidate the calibration.
Thanks, everyone. They do have specific Pantone colors to identify the products' desired colors. And while they rely on human evaluation today, I do feel it's an inconsistent method. And moving it to a software system would, in my opinion, add another level of ambiguity, since it'd still be human evaluation between a screen and a finished material rather than between two finished materials in person. It sounds like you all confirm my beliefs. I recommended using Pantone (PMS) scanners to get a value to determine if it's a valid match, which removes both the device variation and the human factors.
I’ll do some further research into Pantone (as I’m not very well versed on the subject), but can Pantone variations have a calculable difference like RGB? If I expect Pantone Red and get Pantone Light Red, I can tell the Pantone color is different, but because of all the variations of Red, can I tell how far off it is on the Pantone scale? Some way to show how far the color has deviated from the expected?
To put this into context: they're given a material that's already colored. It goes through thermal processing, potentially changing its chemical composition, but the color is supposed to retain its properties. So they want to compare the starting material's color against a "passed" completed material previously manufactured, as well as both of those values against a newly manufactured material, to make sure the Pantones are consistent or deviate only slightly.
My 2c since this is similar to some software I work with.
It's impossible to get it 'right' - there is no 'right'.
The reason is that color on a monitor is formed from emitted light, whereas the color of an object is the effect of reflected light.
The way the light is formed is different: Pantone cannot be fully described in RGB terms.
Compare a color slide photograph with a 6x4 picture.
Compare a television image with a book.
Or any 20 televisions/ monitors in a shop.
As others have said, what your brain sees will vary according to whether the curtains are open, the color of the walls, the calibration of the screen…
And don't get me started on what happens if you try to print something to paper!
I agree with most of the replies so far, though perhaps not so much with the “it can’t be done” parts.
Color matching can be done between screens and physical items but it’s a non-trivial matter. It’s something that my company CHROMiX does for customers.
In order to achieve a screen-to-print match you’ll need the following things set up:
the screen should be calibrated and profiled
the application (perhaps using the OS) should use the profile to convert its color values through the screen’s profile
the color values should be constrained to the gamut of the print prior to the above step
the lighting used to illuminate the print should be controlled and at the same brightness and white point as the display
Sound complicated? It can be, and that’s just for comparing to print. If you’re comparing to manufactured goods, then it can be a bit more involved and in either case there are screen gamut issues to consider.
That said, the eye is quite good at relative color differences. If you display before and after colors on the same screen, absolute color accuracy may not be required. In fact, it’s interesting how often absolute color accuracy is not required.
A good test / demonstration for lighting is to display a white document on screen and hold up a white sheet of paper to the screen. If the two don’t match in brightness and color, adding images or color swatches will only confuse you…
For a quick lay-person description about the basic idea behind color management, take a look at The Color of Toast
I suspect that if you were to use a color spectrophotometer you'd be able to measure the difference between the two.
Something to remember… especially if they are counting on a reference object. All pigment colors fade over time, some faster than others depending on the type of light and other environmental factors they are exposed to. Make sure your customer swaps out the reference object once in a while for a brand new one.
Indeed. It’s called delta-E and is a measurement of color difference. 1 unit of dE is supposed to be the minimal amount of color shift perceptible to the human eye. It’s a bit more involved than that but overall dE values of 2-4 are typically considered acceptable and values over 8-10 are typically considered unacceptable.
Feel free to contact me directly if you need advice for your customer.