Calin - Friday, May 7, 2010 - link
Nice review, and now that you mention it, I remember the nice sounds the CRT made when starting, stopping, and changing graphics modes. I'm looking forward to more articles from you.
QueBert - Friday, May 7, 2010 - link
That's what a REAL monitor does. My 22" Sony CRT will crush any LCD out there (literally). When I get on an LCD and turn it on to no sound, I get pissed. When you turn a CRT on, it lets you know it means business with the *BWHHH* it makes, and the click when you change resolutions is another thing missing from LCDs. *HUGS his CRT* Nice and warm too :D
pjconoso - Friday, May 7, 2010 - link
Now that is one manly comment right there!
niva - Monday, May 10, 2010 - link
Right on, also real monitors (like a pair of 21" CRTs) give you a suntan. Only your closest friends will know that you never step outside of your house during the day.
Brian Klug - Friday, May 7, 2010 - link
Thanks for the encouragement guys, I've got another monitor on my desk right now I'm working on as well, and the Spyder 3 just came in ;) There's a bunch more coming!
-Brian
Soulkeeper - Friday, May 7, 2010 - link
Still looking for an LCD that won't be a downgrade from 10yr old CRT technology. 1920x1080 for a 24" display doesn't get me excited.
dieselJosh - Friday, May 7, 2010 - link
For the $339 price, one can acquire one or more flagship CRT units, such as the FW900 or A7217A. Any guesses for when flat panel displays will be able to compete with such CRTs' performance for gaming?
Zingam - Friday, May 7, 2010 - link
I have an older 22 inch version that looks the same. I love it. The only thing that bothers me is the frame! Can't they make the screen and the frame flat? The frame with that shape collects an unbelievable amount of dust. The dust itself isn't the problem - the cleaning is!
blueeyesm - Friday, May 7, 2010 - link
The problem isn't that it collects dust easily; it's that there's too much dust in that room. Run the vacuum more often.
TechnicalWord - Friday, May 7, 2010 - link
With AMD video cards you can get rid of black borders all around as follows: bring up the latest CCC, go to Desktops & Displays, RIGHT-CLICK ON THE DISPLAY ICON AT THE BOTTOM under "Please select a display" and choose Configure, then select Scaling Options and set Underscan-Overscan to 0%.
strikeback03 - Wednesday, May 12, 2010 - link
Thank you very much! That has been bothering me since I built my HTPC in January. Wonder why AMD set it that way by default.
Stokestack - Friday, May 7, 2010 - link
Bring back common sense. Glossy screens are asinine. GJ on that at least, Dell.
quiksilvr - Friday, May 7, 2010 - link
Glossy screens are asinine on a laptop you take outdoors. Monitors are usually indoors and it makes sense for them to have some glatte or moss (half gloss, half matte, as seen on LCD TVs) to it. Full gloss really depends on the lighting of where it is.
chromatix - Friday, May 7, 2010 - link
Time was a "green screen" meant just that - a text terminal with green phosphor on the front and nothing else. Nowadays you only see them attached to obsolete mainframes.jonyah - Friday, May 7, 2010 - link
I have two of the G2410s (the non-H version) that I got with a discount code from dell.com for only $200 each, after shipping. I guess I got really lucky, because I watched the price and discounts daily hoping to get more and it never got down that low again. A $140 price hike just for an adjustable stand isn't worth it, though. Get these down below $250 and it would be worth it. The screen is really nice, though I wish they'd do a 27" LED monitor and/or up the res to 1920x1200 or higher. I do miss that extra 120 pixels of height on this screen.
casteve - Friday, May 7, 2010 - link
I have two of the non-H as well. I missed the $200 sale, but got them for ~$250 last fall. Definitely NOT the monitor you want for professional graphics design, but great for mixed-use productivity, casual streaming/movies, and gaming (esp. at $250 or less). The auto mode was a little psychotic (brightness would vary in constant room lighting), so I moved to standard mode and manual settings. 15-17W consumption.
BernardP - Friday, May 7, 2010 - link
Nvidia GeForce video drivers have the "Create Custom Resolution" and "Use NVIDIA Scaling" options that allow (with digital output) creating and scaling any custom or missing standard resolution at the correct aspect ratio. The trick is that scaling is done in the video card, while a native-resolution signal is sent to the monitor. In essence, the monitor is displaying at its native resolution and doesn't "know" it is showing a lower custom resolution. For example, I find it more comfortable to use a custom 1536x960 resolution on a 24 inch (16:10) monitor.
For my parents' setup, I have created a custom 1080x864 resolution that is comfortably bigger for their old eyes while respecting the 5:4 ratio of their 19 inch LCD.
It's too bad ATI is not offering these options.
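As a rough sketch of the arithmetic behind those custom modes (illustrative Python, not tied to any driver API; the rounding to multiples of 8 is just an assumed mode-granularity convention, not something from NVIDIA's documentation): scale the native width, then derive the height from the native aspect ratio so nothing gets stretched.

def custom_mode(native_w, native_h, scale):
    # Round the scaled width to a multiple of 8 (assumed granularity), then
    # derive the height from the native aspect ratio so the image is never stretched.
    w = round(native_w * scale / 8) * 8
    h = round(w * native_h / native_w)
    return w, h

# The two examples mentioned above:
print(custom_mode(1920, 1200, 0.80))     # (1536, 960) on a 24" 16:10 panel
print(custom_mode(1280, 1024, 0.84375))  # (1080, 864) on a 19" 5:4 panel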
Guspaz - Friday, May 7, 2010 - link
I did try that; I have a G2410 (hey, I bought it because it was on sale for dirt cheap at the time) and it's useless for 4:3 games. The G2410 has the annoying tendency to stretch *ANY* 4:3 resolution that I've tried up to wide. WarCraft 3 doesn't look so hot.
I did mess about with the nVidia drivers to try to create a custom resolution that would let me run 4:3 games at actual 4:3, but didn't get anywhere.
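For what it's worth, the 4:3 case is the same arithmetic in the other direction: pick the largest 4:3 mode that fits the panel and let the GPU (not the monitor) add the pillarbox bars. Whether the G2410 actually honors that depends on the driver's scaling setting, so treat this as a sketch with a hypothetical helper, not a guaranteed fix.

def largest_4_3_mode(panel_w, panel_h):
    # Largest 4:3 rectangle that fits inside the panel; the GPU would add the
    # side bars when aspect-ratio scaling is selected in the driver.
    w, h = panel_h * 4 // 3, panel_h
    if w > panel_w:                 # unusually wide panels: limit by width instead
        w, h = panel_w, panel_w * 3 // 4
    return w, h

print(largest_4_3_mode(1920, 1080))  # (1440, 1080)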
aftlizard - Friday, May 7, 2010 - link
I think it is unacceptable for a new monitor, even if it is just an update of an existing model, not to have HDMI. I use monitors now for my laptop at home, and my laptop, like many others, does not have DVI out but rather VGA and HDMI. I cannot use VGA if I am going to watch HDCP movies or use anything that requires HDCP. Sure, I could purchase an HDMI-DVI converter, but why not just add that extra HDMI port to give everybody the chance to use their connections straight out of the box, without having to purchase an adapter or look for a software solution.
"unless you’re using an HDMI to DVI cable, you should be running the LCD at native resolution."I am currently using two HDMI to DVI cables in my HT set-up. HTPC (DVI) to receiver (HDMI) and receiver (HDMI) to old Sony XBR 1080i (DVI). This is all HDCP comliant so I don't have any issues. Since I'm moving very far away I will be leaving my tried and true Sony and in the interim between moving and getting a NEW TV I picked u a rather cheap OLD TV (Dell W3000). This older model does not have HDMI either so I will still be using an HDMI to DVI cable. Is this a bad thing?
I'm running at 720p out of the PC on this 13//x768 display. Will this be a problem? Should I set the PC to the native rez? Thanks for the article BTW. Great job. I have loved my CRTs (the XBR plus a behemoth 2048x1536 NEC model for my PC) and hung on to them more because of the bargains that they were. I gladly traded the exertion required to lift them for the cheap price, and the better PQ was just a bonus. But the prospect of moving them has lessened their appeal, so I will be tube-free from now on.
tno
quiksilvr - Friday, May 7, 2010 - link
I'll state what I stated before when I commented on the release of the G2410 and heard about these "letdowns".
Ahem...
If there are no speakers, WHY DO YOU NEED HDMI? It has HDCP-compliant DVI-D. Just spend the whopping $5 and get the DVI-D to HDMI adapter. It's not like this thing is 2560x1400. It's a run-of-the-mill 1080p screen.
aftlizard - Friday, May 7, 2010 - link
Why do I need HDMI if I don't need audio? Because I don't want to spend the extra money if I don't have to. If I already have the cable, or want to spend money on a cable, I can find HDMI cables cheaper than a converter or even an HDMI-DVI cable.
james.jwb - Friday, May 7, 2010 - link
Next monitor upgrade: 24-inch or larger, IPS or better, 120 Hz performance.
When that comes I'll be happy ;)
JarredWalton - Friday, May 7, 2010 - link
I don't know that anyone has done IPS (or PVA/MVA for that matter) at anything above 60Hz. I could be wrong, I freely admit this, but virtually every 120Hz display I've heard about was a TN panel. Blech. When you consider the pixel response times, though, it starts to make a bit of sense. 120Hz should be doable with the 6ms IPS panels, but the 2ms TN panels might switch a bit faster. (Note that in our testing, even the 2ms TN panels still show ghosting across 2-3 frames.) Anyway, I'm with you. I'd like 120Hz IPS, with a high resolution 30" panel. LOL. I think I'd need quad-link DVI to do that.
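To put the response-time argument in numbers: the refresh period is just 1000 ms divided by the refresh rate, and the question is whether a panel can finish a transition inside one period. A quick back-of-the-envelope check in Python (the 6 ms and 2 ms figures are quoted panel specs, not measurements):

# Frame period vs. quoted pixel response time (spec numbers, not measurements).
for hz in (60, 120):
    period_ms = 1000 / hz
    for name, response_ms in (("6 ms IPS", 6), ("2 ms TN", 2)):
        fits = response_ms <= period_ms
        print(f"{hz} Hz (period {period_ms:.2f} ms): {name} completes within one frame: {fits}")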
james.jwb - Friday, May 7, 2010 - link
DisplayPort should do it?
For off-gridders like myself this is sooooo awesome. 13 watts! Holy crap, that's such low power consumption I can't feel bad about draining down my batteries!
Brian Klug - Saturday, May 8, 2010 - link
Out of curiosity, are you on a PV system or something else? I'm moving into a place that's entirely solar powered with a 1:1 PV offset. I'm glad there's someone out there excited by the prospect of a low-power monitor; I mean, 11 watts is still impressive to me as well. Especially considering the brightness you get for that amount of power.
-Brian Klug
Porksmuggler - Saturday, May 8, 2010 - link
Thanks for putting the panel type TN on the first page. $339 for a TN with no HDMI, no thanks. The 1080p resolution isn't puzzling at all, it's all about saving on panel cost. Anyone feel like doing the math on how long you would have to use this "green" LCD to make up for the sticker bloat?
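A rough sketch of that math, with every input an assumption chosen for illustration: a ~$90 premium over a comparable ~$250 non-LED 24" panel, ~20 W of average savings, 8 hours of use a day, and $0.12/kWh electricity.

# Back-of-the-envelope payback for the "green" premium; all inputs are assumed.
premium_usd = 339 - 250      # assumed price gap vs. a ~$250 non-LED alternative
watts_saved = 20             # assumed average savings vs. a CCFL-backlit 24" panel
hours_per_day = 8
usd_per_kwh = 0.12

kwh_per_year = watts_saved * hours_per_day * 365 / 1000
usd_per_year = kwh_per_year * usd_per_kwh
print(f"~{kwh_per_year:.0f} kWh saved per year, about ${usd_per_year:.2f}/year")
print(f"payback in roughly {premium_usd / usd_per_year:.0f} years")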
strikeback03 - Monday, May 10, 2010 - link
Using the native software packages for both, I liked the i1D2 better than the Spyder3. But when my i1D2 died after 8 months and xRite/Pantone wouldn't do anything for me, I wasn't about to buy another.
Brian Klug - Monday, May 10, 2010 - link
Oh man, it died? That sucks. I hope ours doesn't give out. Thus far I agree with you - the i1D2 is producing better looking/more consistent results than the Spyder 3 subjectively, but I haven't been really good about testing, just initial messing around.
Hmmz, this could be a review of its own... ;)
-Brian
strikeback03 - Monday, May 10, 2010 - link
I don't care overly much about the actual power consumption, but would like a display that draws less power just to keep the heat down in the summer. My Gateway FPD2485W draws about 100W regardless of brightness setting and is quite toasty.
ctsrutland - Monday, May 10, 2010 - link
If you really want to tell us how green it is, you also need to tell us how much CO2 was generated during its manufacture in comparison to other screens. How much oil was used in its making? Does it have a thinner plastic shell to reduce oil use? You also need to tell us whether the finished article is particularly easy to recycle in some way. Does it have a longer warranty than normal - if I can keep it in use for more years before replacing it, then it would be greener. Does it have components that are easier to fix? It's much greener to fix faulty things than to chuck them out and buy a new one. Has Dell undertaken to make spares available for more years to help with this? Don't suppose so...
dragunover - Monday, May 10, 2010 - link
Mine supports like 150 or so hertz at that resolution... NEC MultiSync 90
ReaM - Tuesday, May 11, 2010 - link
THANK YOU SO MUCH FOR TESTING THIS ONE!!! I've had it in the shopping cart at an online shop for a month now. Very low power consumption!
strikeback03 - Wednesday, May 12, 2010 - link
Yeah, after about 8 months it wanted to calibrate everything to a very green hue, and would flat out refuse to even run the calibration on my laptop at settings I had previously used (and had a screenshot of). It was stored in a drawer with the clip-on diffuser cover on when not in use. I know hardware goes bad; the lack of support is what disturbed me.
AllenP - Sunday, May 16, 2010 - link
Concerning the processing and input lag:
TOTALLY AWESOME that you are adapting this new CRT comparison technique. :) This is by far one of the most important elements of a display to me, so I'm happy to see a solid site like AnandTech giving some solid data.
But two things about this test struck my eye:
1) It seems a little strange that you're using an HDMI to DVI adapter instead of just going straight from a DVI output on the graphics card, but it shouldn't make a difference anyway.
2) WOW. 9ms total latency is VERY low for an LCD. I usually find around 30+ms when looking around (which is totally unacceptable to me). This is a nice monitor -- I'm looking forward to seeing some more data to verify that your test methods are solid. Man, that's low latency. :)
The human eye can see approx. 85Hz out of the rods (greyscale) and 60Hz out of the cones (colour). So this means that the center of your vision sees at around 60Hz and 85Hz at the edges (due to a lot more cones in the center).
I assume that this would mean that we can therefore discern between 16.66ms and 11.76ms of lag. (Please correct me if my assumption is wrong... I'm sure I'm a bit off on that) 9ms is nicely below that threshold, which is quite impressive for an LCD.
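Spelling that arithmetic out: a rate in Hz corresponds to a period of 1000 ms divided by the frequency, which is where the 16.66 ms and 11.76 ms thresholds come from. A tiny check in Python, taking the 60 Hz and 85 Hz figures above as the commenter's premises rather than established vision science:

# Convert the claimed perceptual rates into periods and compare with the
# measured 9 ms total latency; the 60/85 Hz figures come from the comment above.
measured_latency_ms = 9
for label, hz in (("cones (centre, colour)", 60), ("rods (periphery, greyscale)", 85)):
    period_ms = 1000 / hz
    print(f"{label}: {period_ms:.2f} ms per cycle; 9 ms is below this: {measured_latency_ms < period_ms}")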