New LCD monitor

I finally entered the 21st century and bought an LCD monitor. There’s a lot of debate among photographers about color depth, gamut, and the like with LCD monitors. Most people agree that CRT monitors, properly calibrated, have better, more accurate color and uniform brightness than LCDs. These issues were on my mind as I shopped for an LCD. The trouble with CRTs, of course, is that they are ginormous, power-hungry beasts with color convergence and geometry issues—if you can even find a shop with a decent selection. My old 21″, 72-pound, 1998-vintage Cornerstone CRT was showing its age.

I ended up buying a 20.1″ Samsung SyncMaster 204B. It’s a 24-bit panel, which means it’s only capable of showing 16.7 million colors (rather than 32-bit “true” color). It’s also true that brightness is not completely uniform across the entire panel, and that brightness and color shift slightly as you look at the screen from different positions.
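
(If you’re wondering where the 16.7 million comes from, it’s just the arithmetic of 8 bits per channel: 256 shades each of red, green, and blue gives 256 × 256 × 256 = 2^24 = 16,777,216 distinct colors.)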

But does it really matter? In my experience, practically, no. When you are seated in front of the monitor, those issues evaporate. If you sit on the floor looking up at your desk or you like to operate your computer from two feet to either side then you might have problems. Photographs on my LCD, to my eye, look just as good as they did on my CRT, with one caveat: because the digital image from the LCD is so sharp and crisp, I see JPG compression artifacts everywhere that I never noticed before on my analog CRT.

My impression is that a lot of the problems people talk about when they discuss the merits of CRTs over LCDs are ghosts from the LCD’s past. In early models, all of these problems were much more pronounced, and the LCD gained a bad reputation which has stuck but which is now much less deserved. Modern LCD panels are getting better and better, and for most photographers they will do the job just fine.

By John Watson

John Watson is the original founder of Photodoto. If you're interested in what John has been up to, you can browse his personal blog.

6 comments

  1. To be fair, what lighting conditions did you look at your monitor under? Natural daylight through a window, fluorescent tubes in the ceiling, or incandescent bulbs in a lamp on your desk?

    Did you do a hardware calibration (via an Eye-One, Spyder, etc.) of your monitor? If you didn’t, and you don’t know the color temperature of the lights in the surrounding area, your monitor could be giving you false readings and/or you may not see all the detail in areas like shadows, or highlights could appear blown out. Most monitors, especially LCDs, come out of the box way too bright and with too much contrast.

    I highly recommend that people shopping for LCDs read http://www.shootsmarter.com/monitorcentral.html and, if you are specifically looking for one to do color-correct photo editing on, http://www.shootsmarter.com/infocenter/wc041.html (requires you to join; it is free, and they do not spam you).

  2. I do eyeball calibration. 🙂 Prints from my lab look exactly like the images on my CRT. And images on my LCD look almost exactly like they do on my CRT. That’s good enough for me. Monitors ship in nuclear-demo-mode so the brightness did need to be turned down significantly.

  3. I’m not a color scientist, but I’m surrounded by them.

    A comparison of consumer-grade CRT vs. consumer-grade LCD is going to favor the CRT out of the box if you’re looking for color accuracy (though most people don’t) and LCD if you’re looking for sharpness (which most people do).

    I know of one CGI house which is desperately seeking new LCDs for their animators because their current CRTs are expected to go belly up in the middle of production of their next feature. They currently leave their CRTs on 24×7 (again, for maximum color accuracy) and the power savings would be huge in switching to LCDs.

  4. Just thought I’d mention that a 24-bit LCD panel *is* “true” colour. 32-bit colour is almost always RGBA, which means that 8 bits are used for each of the three colour channels and 8 bits for the alpha channel. Since LCDs only have to display the final composed image, they don’t need the extra 8 bits for the alpha.

    In short, a 32-bit video card works with a 24-bit display (there’s a rough sketch of the bit layout after these comments).

    I understand, though, that the next generation of displays uses 10 bits per channel, giving a 30-bit panel. If the video card supports it, I’m sure this would lead to better colour.

  5. Dave, the reason for the desperate search for a replacement is that Sony shut down production of all the tubes used in Trinitron monitors in Japan more than a year and a half ago.

    A company looking for the best of the best, like the CGI house you speak of, should be looking at http://www.eizo.com/.

  6. > because the digital image from the LCD is so sharp and crisp,
    > I see JPG compression artifacts everywhere that I never noticed
    > before on my analog CRT.

    Too funny! I noticed *exactly* the same thing when I switched from a 12-year-old 17″ CRT to a new Dell UltraSharp 19″ LCD. The digital noise was much more apparent.

    I’d held off changing from a CRT for the exact same reason as you: the expectation that colors wouldn’t be as accurate. But I think the new LCD makes most of my shots *look a lot better* than they did on the CRT! Sweet!

    Next thing for me will be to buy a calibration tool and compare the LCD with prints.
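
To make the bit-depth discussion in comment 4 concrete, here’s a minimal sketch in Python (the values and names are illustrative only, not from any particular library) of how a 32-bit RGBA pixel carries an alpha byte that the panel never displays, and why 8 bits per channel works out to about 16.7 million colors while 10 bits per channel is just over a billion:

    # Purely illustrative: pack one 8-bit-per-channel RGBA pixel into a 32-bit value,
    # then keep only the 24 RGB bits that an 8-bit-per-channel panel actually shows.
    def pack_rgba(r, g, b, a):
        return (a << 24) | (r << 16) | (g << 8) | b

    pixel = pack_rgba(200, 120, 64, 255)   # the 32 bits the video card stores
    rgb_only = pixel & 0xFFFFFF            # the 24 bits the display needs

    print(f"32-bit RGBA value: 0x{pixel:08X}")       # 0xFFC87840
    print(f"24-bit RGB value:  0x{rgb_only:06X}")    # 0xC87840
    print(f"Colors at 8 bits/channel:  {2**24:,}")   # 16,777,216
    print(f"Colors at 10 bits/channel: {2**30:,}")   # 1,073,741,824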
