iMac monitor vs iPad for color accuracy
Is anyone willing to make some generalizations? I've used all sorts of calibration devices, and the calibrations drift so fast that I found I got better results with the regular iMac calibration. I'm always having problems with skin tones looking too warm compared to the back of the camera, which I've always attributed to Lightroom not rendering Canon accurately. But I sometimes connect to my iMac from my iPad remotely, and I just did it while sitting next to the iMac and could see that the iMac was very yellow compared to the iPad. I then tried editing a few photos on the iMac while viewing them on the iPad, and my initial impression is that it's far more accurate. I did the Apple calibration, and when it got to the white balance step I matched the iMac to the iPad as closely as I could, so at this point it's basically a question of seeing how it goes - except I'm sure there are people here far more familiar with the color accuracy of the two devices, and I'm curious about that. 27-inch late-2009 iMac vs. iPad 4.

Jun 10 17 03:40 am

I found the Apple Thunderbolt display too colorful for me - oversaturated and lacking shadow detail, being on the contrasty side. It was awful for printing and gave everything on the screen that saturated HDR look. While in Hollywood, one of the colorists told me to look at Eizo screens instead of Apple's. He had the same issue with the Apple being too saturated and lacking shadow detail, and said he was replacing his studio's Apple screens with Eizo 4K monitors. An expensive alternative to Apple, though. There is an app that uses the X-Rite i1Display Pro and claims to bring your color closer to their standard - or so they say - but I tried it on my Android tablet and it still looks odd in the reds. I also have an iPad Air 2 and just tried it on that, but the reds and yellows seem dull, and the yellows seem to bleed into the blacks, both warming them up and lowering the iPad's contrast.
I guess you're right in that the Apple does seem warmer overall for some reason, even using the X-Rite calibration tool. My iPad's Image Gallery photos look duller now compared to the originals, with a hint of yellow too. The ColorChecker chart doesn't look right with ColorTRUE running. On the Android, I leave it switched off, as I prefer the stock setting.

Jun 10 17 08:28 am

You can use the most precise measurement instrument with your monitor, but that per se doesn't make the monitor better or more accurate. Device calibration makes sense only for devices which:

1. Have the necessary hardware to be calibrated properly
2. Have the necessary hardware to maintain the calibration settings for a fairly long term
3. Have the necessary hardware to maintain the calibration settings when the environment changes (e.g. room temperature, voltage)
4. Have the necessary hardware to maintain uniformity over the viewing area

Without these you can make all kinds of color profiles with all kinds of software, but in 30 minutes they will be irrelevant, because the device electronics simply cannot maintain the accuracy. Obviously it makes no sense to expect a pocket device to have the same electronics as a $5k professional monitor, even if it has a wide gamut.

Jun 10 17 10:09 am

I'll make a few generalizations, sure. Canon cameras are always red. Always. Most also have more saturated reds, which trend toward magenta. Apple screens, especially the portable ones, are usually magenta by default. iOS devices are often yellow too. So between that and the camera, you're looking at pretty much every kind of warm tint. Portable devices very rarely calibrate properly. The reason is that they're not made for it; they're made to be bright, contrasty, and colorful, and to make movies look good - all under various lighting. And possibly for battery life.
The battery bit doesn't seem like a big deal until you consider that displaying 10-240 instead of 0-255 saves a tiny bit of processing power - not much, but enough to claim another few minutes of battery life, and every bit helps when the brightness and contrast are jacked up. And then you may as well use a cheaper panel, since you don't have 0-255 anyway. You can calibrate these devices only so far. At a certain point the design becomes the limiting factor - like trying to turn a pair of scissors into two knives. Two good rules: devices that make movies and games look awesome are almost always worse for accuracy, and the potential accuracy of devices usually runs (worst to best) portables -> laptops -> workstations -> full-sized monitors.

Jun 10 17 01:12 pm

Zack Zoll wrote:
Interesting. Mine is black.

Jun 10 17 01:27 pm

anchev wrote:
That's a great theory, and it would be true if every sensor were designed the same.

Jun 10 17 04:01 pm

Zack Zoll wrote:
It is not theory but how the technology works, and how one should use it in order to avoid confusion. Canon has a shift to red / away from cyan due to software, yes. I own a Canon and I work with raw files from all kinds of Canon bodies. I haven't noticed any red shift in the raw data, so yes - it may be due to the software you (and others) use. Personally I work in UniWB. But red is more saturated because the sensors they use are better at - more sensitive to, less balanced in, whatever you want to call it - red. The red-to-magenta shift... I'm not smart enough to say why that is. Different sensors have different color matrices, but that is normally corrected during raw conversion. If the raw converter doesn't do it properly, that is a software issue. In any case, if you shoot a color target (e.g. a ColorChecker) you can profile the image accurately.

But it's not all software, no.
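Zack's 10-240 vs 0-255 point from earlier can be made concrete: squeezing the full 8-bit range into a narrower one leaves fewer distinct tonal levels, and no later software step gets them back. A toy Python sketch (the mapping is hypothetical, purely for illustration):

```python
# Map full-range 8-bit values (0-255) into a limited range (10-240),
# as a display pipeline with a restricted output range might.

def limit_range(v, lo=10, hi=240):
    # Scale the 0-255 input into the lo-hi output range.
    return lo + round(v * (hi - lo) / 255)

levels = {limit_range(v) for v in range(256)}
# The 256 possible inputs collapse onto at most 231 outputs (10..240),
# so some tonal steps are lost for good.
```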
Whether you have a Bayer or non-Bayer pattern, sensor or screen, each 'block' is divided into sub-pixels that actually hold the colour data. If one of those colours is slightly stronger than the rest, you can fix it via software. If it's much stronger, you can't.

Of course the color filters on sensors differ from camera to camera, and that's why there is a different color matrix, which the raw converter should handle. Obviously color is not entirely a matter of software. I am rather saying that the concept of color is simply inapplicable before raw conversion, because before demosaicing all that is there is three separate black-and-white channels. The sensor knows only light, converted to voltage, converted to 0s and 1s by the ADC for each separate R, G, B channel. In that sense, "saturation" in that process can only mean signal saturation, i.e. exposure clipping, which is not the same thing as color saturation - that has an entirely different meaning. Obviously lenses and lens filters can influence the ratio between the channels due to chromatic shifts, but raw data still carries no actual color until it is converted. And as I said, you can profile your image for each lighting situation, each body, each lens, etc. by shooting a color target. After you do that for a few cameras you will know that there is no such thing as "good color", especially without profiling. There is only good raw data, in the sense of properly exposed. The rest is post-production. Ditto for monitors.

If all it took were software, then why isn't there an app to make my Droid look like an Eizo? For that matter, why does Eizo make more than one model?

Monitors are a different matter. The problem with so-called "calibration" is that people rarely understand that in most cases all they are doing is profiling, which is a software thing.
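To illustrate the profiling-vs-calibration distinction: a display profile is essentially software - for example, per-channel 1D lookup tables applied to the video card's output. A toy Python sketch (the gamma numbers are made up) showing that such a remap can only reshape or discard the levels the panel already has, never add range or stability the hardware lacks:

```python
# Toy sketch of display "profiling": a per-channel 1D lookup table (LUT)
# remaps the 8-bit values the video card sends to the panel.
# The gamma values below are invented for illustration.

def build_lut(native_gamma=1.8, target_gamma=2.2):
    # For each input level, pick the output level at which the panel's
    # native response lands on the desired target curve.
    lut = []
    for v in range(256):
        x = v / 255.0
        corrected = x ** (target_gamma / native_gamma)
        lut.append(round(corrected * 255))
    return lut

lut = build_lut()
# The remap is pure software: it can bend the tone curve, but it can
# only reuse or skip levels the panel already produces - some input
# levels collide onto the same output, so fewer than 256 remain.
distinct = len(set(lut))
```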
Surely tuning the 0s and 1s which a video card outputs by means of an ICC profile doesn't calibrate a monitor that has no hardware calibration controls, and doesn't make it accurate or stable. Personally, I experimented some years ago with a spectrophotometer on some low-end monitors, and the results after creating a color profile were visually far worse than without one - they simply don't have the physical qualities for it.

Jun 10 17 04:48 pm

While I respect that in many ways your knowledge is vastly superior to my own, I cannot agree with someone who won't admit there are flaws in the system. Colour film has existed for almost a century, and it's still not 100% accurate, even if shot at tested speeds and under ideal WB conditions. I suspect that your ability to fix problems is such that you have forgotten those problems even existed in the first place.

Jun 10 17 05:59 pm

Zack Zoll wrote:
What do you mean by flaws in the system? If you could clarify, I would know what I am supposed to admit (or not).

Colour film has existed for almost a century, and it's still not 100% accurate, even if shot at tested speeds and under ideal WB conditions.

Let's not forget that accuracy is defined by the acceptable delta (error) between a measured and a reference value. Film is a very different process from digital. Being chemical and involving more stages makes it more prone to influence by various factors, so accuracy and precision become much more difficult to control. It is really not a matter of a century but of physics.

I suspect that your ability to fix problems is such that you have forgotten those problems even existed in the first place.

Fixing problems is part of my work, so it is impossible to forget them.

Jun 11 17 02:01 am

Zack Zoll wrote:
That doesn't match my specific experience.

Jun 12 17 06:51 am

anchev wrote:
I think he's talking about what happens when you open a Canon raw photo in Lightroom.
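The accuracy definition above - an acceptable delta between measured and reference values - is usually quantified as a ΔE difference in CIELAB. A minimal sketch of the simple CIE76 formula (the Lab values are invented for illustration):

```python
import math

# CIE76 color difference: Euclidean distance between two CIELAB values.
# Rule of thumb: dE below roughly 2 is barely perceptible.

def delta_e76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (50.0, 10.0, 10.0)   # hypothetical patch as it should measure
measured  = (53.0, 14.0, 10.0)   # hypothetical reading from the screen

de = delta_e76(reference, measured)  # -> 5.0, a clearly visible error
```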
Jun 12 17 06:59 am

Although I really like Apple products and use a Mac Pro, MacBook Pro, and iPhone, their monitors are far from accurate when we talk about color or contrast. I opted for a Dell WA2709 that can be calibrated and is very budget-friendly, because I can't justify the expense of a top-of-the-line monitor but want a very good (if not perfect) way to control the color of my images. Coupled with EyeOne software, at about $1k altogether, I have found this setup well worth it over the years.

Jun 12 17 07:20 am

Mikey McMichaels wrote:
Yes, I understand what is meant perceptually. And here there are two factors. I do have a question that can probably only be answered in a ballpark way - what's the effect of ambient light temperature on your perception of a monitor's colors?

In general you would like your room light temperature to match the temperature of your monitor; otherwise not only may your perception be influenced, but your eyes will also tire quickly.

I know brightness will be a variable, but what if you have a monitor and you don't recalibrate for the ambient light - say, daylight through windows vs. tungsten at night?

If you change any monitor setting (brightness, temperature, or anything else) you must recalibrate, unless your monitor can maintain built-in hardware calibration presets.

Are you going to make things cooler if your ambient is tungsten compared to if it's daylight, or do you go the same direction as the ambient light?

I do the second, but I don't recommend it unless you have a monitor that can be hardware-calibrated at 4000K. Ideally, if you do things by the book, you would have a lab room with 5000K room light, CRI 100, and neutral gray walls. In any case, make sure that your environment and screen brightness are neither too low nor too high overall, because that will cause quick eye fatigue, which is also a factor in color perception. Never work in a dark room; that is very bad practice.
Jun 12 17 09:49 am

anchev wrote:
In general you would like your room light temperature to match the temperature of your monitor... If you change any monitor setting you must recalibrate, unless your monitor can maintain built-in hardware calibration presets.

I appreciate your answers guiding me toward the ideal. Unfortunately, for a lot of reasons, I'm not going to be able to pursue that right now.

Jun 12 17 09:58 pm

Mikey McMichaels wrote:
1. Keep your eyes

Jun 13 17 01:26 am

Yes, in fact that explains some things I was experiencing years ago - totally unrelated, but from using monitors in low light. It also explains why I can't stand looking at laminated menus in dark restaurants. It seems like the simple answer is that ambient light color has a masking effect which makes you add more of that temperature. One of the big problems I have is windows behind me, which create glare on my monitor. For a while I angled the monitor up, but I eventually got a blackout curtain for the window. I haven't done much editing since making the post, but I can see that using the iPad as a WB reference was a good choice - the whites on the iMac are clearly whiter than before. I'm sure I'm still far from the ideal, but I think that was a huge problem. If you have more information on monitoring environments and other issues that affect perception, I'd be interested.
There's an audio plugin company that has made a timer for ear fatigue. It runs until you reset it by taking a two-minute break. The premise is that if the timer has hit zero, your ears are not in a state for making critical decisions.

Jun 15 17 10:43 am

Mikey McMichaels wrote:
Well, I wouldn't use those words, as a mask implies something like a gobo, and that is not the same thing (at least to my mind).

I haven't done much editing since making the post, but I can see that using the iPad as a WB reference was a good choice. The whites on the iMac are clearly whiter than before.

I don't know what "whiter" means - brighter? Less yellow? Less blue? It really makes no sense verbally. You see, what we call white (or rather gray) is a particular mix of wavelengths. A monitor in general is not a good reference for white balance, because the spectrum of its light source has a low CRI. The monitor also flickers. As discussed previously, mobile devices are even worse. So for a reference you need something stable.

If you have more information on monitoring environments and other issues that affect perception, I'd be interested.

You can refer to Wikipedia as a starting source to learn about color management, and find your way to more info as you need it.

Jun 15 17 11:08 am

I have an Asus 29" monitor that looks amazing (it cost $430 with the exact same specs as the Apple Thunderbolt and Cinema Displays, but matte), and I've never used a color calibrator on it. That said, I still edit mostly on my 15" MacBook Pro, lol. I don't have any issues with it (maybe a hair off in the highlights and shadows). Unless you work primarily for print, I kind of laugh at the monitors some photographers get just for web use - some spend thousands! Because no matter how accurate YOUR monitor is, someone else is using a cheap Dell for their internet browsing, or a Samsung phone, which isn't as good as an iPhone's color accuracy (shots fired). Someone will always see it as too contrasty or saturated, or not saturated enough.
Technology gets better every year, and the color spaces are almost identical. I use the standard color space on everything, and on web viewing on people's devices I'm usually happy with the results. It's not so far off that I need to custom-calibrate everything and spend the money that a web developer or graphic designer who deals with printed material would. If it is print, I'll convert the color space. Why do photographers make it so complicated? :p

Jun 15 17 06:56 pm

Glamour Alternative wrote:
In colorimetry and color management there is no such thing as "looks amazing" or "I don't have issues with it". There is only measurement, and without measurement you cannot possibly evaluate the accuracy of your device. That would be like saying "my car is very fast" without a speedometer. It may be something that gives you pleasure personally, but for everyone else it is meaningless.

Unless you work primarily for print, I kind of laugh at the monitors some photographers get just for web use - some spend thousands! Because no matter how accurate YOUR monitor is, someone else is using a cheap Dell for their internet browsing, or a Samsung phone, which isn't as good as an iPhone's color accuracy (shots fired). Someone will always see it as too contrasty or saturated, or not saturated enough.

"A Samsung phone" and "an iPhone" really mean nothing. There are many models, and they are not identical.

Technology gets better every year, and the color spaces are almost identical.

Just because a device can reproduce colors within a particular color space does not mean it will reproduce them accurately.

I use the standard color space on everything, and on web viewing on people's devices I'm usually happy with the results.

There is no such thing as "the standard color space". The only color space considered standard (in the sense of a reference) is CIELAB, and you cannot possibly use that "on everything".
It's not so far off that I need to custom-calibrate everything and spend the money that a web developer or graphic designer who deals with printed material would. If it is print, I'll convert the color space.

You seem to assume that print per se is something super accurate - it is not. Converting your working color space has absolutely nothing to do with accuracy of reproduction.

Why do photographers make it so complicated? :p

It is not photographers who make it complicated. Color *is* a complicated matter, and quite a technical one. It is not something you just plug and play. When you work professionally, and your work is supposed to meet higher criteria, you have to understand the process and use it properly, not merely fall for the latest product marketing - at least if you are serious about what you do.

Jun 16 17 02:01 am

Apple's iDevices are factory-set for sRGB color. Some do better than others. The last couple of iPad Pros even adjust the screen's color to accommodate ambient light. Not sure how I feel about that.

Jun 16 17 10:22 am

tenrocK photo wrote:
This is very true. The OP's 2009 iMac screen will be inferior to other screens in general. I own a 27-inch Apple display, and the color range and brightness downright suck compared to my 27-inch NEC MultiSync. Plus, the Apple display doesn't have any manual controls for color correction, so even if you had a screen calibrator it wouldn't do you much good.

Jun 16 17 11:21 am

Kris Krieg wrote:
People keep repeating this mantra. Please understand that wide gamut has nothing to do with accuracy. AMOLED phone screens are wide gamut, but that doesn't mean they are accurate.

Jun 16 17 11:46 am

anchev wrote:
True, but a larger color gamut is more capable of being color accurate. It's my personal opinion that a photographer will be better off looking for a monitor that is close to 100% AdobeRGB as opposed to just 100% sRGB.
Especially if that photographer has any hopes of getting their work printed in a magazine or other offset-press application.

Jun 16 17 12:44 pm

Kris Krieg wrote:
Not necessarily. The ability of a quantity to take a wider range of values normally means it is more error-prone. Look at it this way: if you have a monitor which can physically output only two values, say "black" (e.g. 5 cd/m2) and "white" (e.g. 100 cd/m2), your error is limited to those two values. For similar reasons (and others, of course), high-end monitors (like the Eizo CG line) have a huge palette from which the actual working palette is chosen during calibration. In that sense they have intermediate values which they are able to produce, and that makes them even more accurate. A pocket AMOLED obviously doesn't have this.

It's my personal opinion that a photographer will be better off looking for a monitor that is close to 100% AdobeRGB as opposed to just 100% sRGB. Especially if that photographer has any hopes of getting their work printed in a magazine or other offset-press application.

A wider gamut actually means that the device is able to produce more saturated colors. CMYK (which, as you know, a print press uses) is in general not that saturated (note that it is smaller than AdobeRGB).

I've just been reading about Apple's new P3 color gamut (on the MacBook Pros), which seems like a shift in Apple's emphasis from the graphic design community to the video editing community. As a still photographer, I think I would still purchase a monitor specified for AdobeRGB rather than a P3 monitor. It appears that the new LG UltraFine 5K monitor (27MD5KA) covers 99% of the P3 gamut but only 92% of AdobeRGB. Both gamuts are large, but P3 is shifted toward a different region of color.
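On the gamut question: a quick way to see that AdobeRGB can hold colors sRGB cannot is to convert a fully saturated AdobeRGB primary through XYZ into linear sRGB and check for out-of-range components. A sketch using the published RGB-to-XYZ matrices for the two spaces (D65 white point); this is illustrative, not a color-management implementation:

```python
# Linear AdobeRGB -> XYZ and XYZ -> linear sRGB matrices (D65),
# rounded to four decimals.
ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def adobe_to_srgb_linear(rgb):
    return mat_vec(XYZ_TO_SRGB, mat_vec(ADOBE_TO_XYZ, rgb))

# Pure AdobeRGB green: its red (and blue) sRGB components come out
# negative, i.e. the color lies outside the sRGB gamut entirely.
srgb = adobe_to_srgb_linear([0.0, 1.0, 0.0])
out_of_gamut = any(c < 0 or c > 1 for c in srgb)
```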
You may want to read this:

All this talk of color ranges might be too detailed for this thread, but I think a photographer who is serious about their craft should be working with a monitor capable of reproducing accurate color targeted toward a print medium.

The problem with that is that you cannot possibly target a print medium without receiving an ICC profile from the print press. And considering what I wrote above, you will very rarely get one. There are printers who don't even know what an ICC profile is.

A photographer should understand the differences between monitors and not use some off-the-shelf Apple display or cheap Samsung display purchased at Best Buy.

Yes, understanding is important. But I would say: understanding beyond and without brands and marketing. After all, the way color works and what it means to us was not invented by Apple or Samsung.

Jun 16 17 02:45 pm