24 Comments

  • guidryp - Monday, September 10, 2018 - link

    Nano-IPS is obviously LG's marketing for quantum-dot backlight filtering, same as Samsung's QLED.
  • zodiacfml - Tuesday, September 11, 2018 - link

    But I haven't seen a technical review of that. LG also doesn't seem as proud of it as Samsung is of QLED, whose products match the price of LG's OLEDs.
    I have the LG TV with the Nano; the effect is subtle, but it has very good viewing angles for an LCD.
  • a5cent - Tuesday, September 11, 2018 - link

    This is technically incorrect.

    LEDs don't emit the same amount of light across their entire color spectrum (too much blue, not enough red and green). This limits the colors a typical monitor panel can reproduce to around 20% of what the human eye can see. Nano-IPS and QLED both attempt to improve upon this, but do so in very different ways.

    Nano-IPS (which has absolutely nothing to do with the IPS TFT technology) is a marketing term for what is more accurately called KSF LED. This works by capping the panel's backlight LEDs with films made of fluorescent nanomaterials that convert a portion of the bluish wavelengths into red and green wavelengths. The chemical formula for the film that enhances reddish wavelengths is K2SiF6:Mn4+, which is the origin of the widely used acronym “KSF”.

    For QLED, Samsung takes a flexible and clear substrate and coats it with a matrix of green and red dots made of photo- and electroluminescent nano-particles. One pair of red and green dots corresponds to one pixel. This is the quantum dot film, which is mounted in front of the backlight. Each quantum dot is able to convert bluish wavelengths into more or less red and/or green wavelengths (as desired) on a pixel-by-pixel basis.

    Both technologies work in very different ways and perform differently. They are definitely not the same.
  • HStewart - Monday, September 10, 2018 - link

    Physically these monitors look the same as my LG 34U88 - same ports and resolution. I've been happy with it, but I have yet to use the USB hub on my system. Cool software on it - great for website creation stuff.
  • Morawka - Monday, September 10, 2018 - link

    This isn't really HDR with only 400 nits max brightness. It's faux HDR.
  • HStewart - Monday, September 10, 2018 - link

    Plus, the 3440 x 1440 resolution isn't full 4K, so 4K HDR videos may not work.
  • GreenReaper - Tuesday, September 11, 2018 - link

    Well, they'll be scaled, and if you were expecting literally pixel-sharp rendering, you won't get it. But this isn't a problem for *most* video, largely because the heavy compression used means they can't reliably convey that level of detail in any case.

    I watch Full HD videos on a 1680x1050 monitor as well as on a Full HD tablet, and while they both have their benefits, all things being equal I think the lower-resolution but larger monitor wins out.
  • Valantar - Monday, September 10, 2018 - link

    Even if >$1000 with only HDR400 is a let-down, this is getting dangerously close to "actually worth the upgrade" territory. _Almost_ every feature I want. Now if only they could ditch the external power brick ...
  • EnzoFX - Monday, September 10, 2018 - link

    Almost worth it, as in a good jump from the typical widescreen panel? I have a u88 and I'd upgrade to this once the price comes down, if only for the broader range of refresh rates. The faux HDR might be a plus as well.

    This is basically a doubling of refresh rate for an IPS panel at this res, right?
  • Valantar - Tuesday, September 11, 2018 - link

    For me "worth the upgrade" would pretty much mean ticking every "wanted feature" box, considering the cost of these panels. I have an old Dell U2711, which means that I could either pay less for a "comparable" panel (same size and resolution, worse gamut and accuracy), pay the same as I did in 2011-ish for a roughly equal panel (27" 1440p 60Hz IPS, but smaller case/stand, more efficient and with a less grainy coating, possibly FreeSync - but still not good enough for me to replace a fully functional monitor), or pay a lot more for something giving me an actual improvement. For that improvement to be worth the (very significant) outlay, I'd need more than a few extra boxes ticked. FreeSync >100Hz, 1440p 21:9, VESA mount, USB hub, good gamut and accuracy, preferably HDR, ~1800R curved, and I'm _really_ reluctant to add a power brick to the snake's nest of wires beneath my desk. I'd be willing to pay for that at some point, but until that monitor exists, I'll likely stick with what I have.
  • Lolimaster - Monday, September 10, 2018 - link

    Everything on this monitor is just worthless when the panel is a 400-nit IPS, so you end up watching an even lighter grey whenever you want to display "black".

    VA
    VA-FALD
    OLED
    nothing else
  • C@mM! - Tuesday, September 11, 2018 - link

    Do remember this is a gaming line, and VA panels are notorious for smearing. ULMB can clear this up, but without 1000-nit panels, the hit to brightness is usually too much for most.
  • Valantar - Tuesday, September 11, 2018 - link

    OLED is garbage for PC use, entirely unsuited to static UI elements. Burn-in would significantly shorten the lifespan of that monitor, regardless of any steps taken to alleviate it. I agree that the superior contrast of VA panels would be better, but VA panels have issues of their own. It'd need to be a _really_ good VA panel.
  • R3MF - Tuesday, September 11, 2018 - link

    I'm pretty sure the F has DisplayPort 1.4, not 1.2 like the G-Sync model...?
  • Patongo - Tuesday, September 11, 2018 - link

    Pretty sure it's 1.4, as 144 Hz would exceed the DP 1.2 bandwidth.
  • u.of.ipod - Tuesday, September 11, 2018 - link

    I agree, the FreeSync version is DP1.4
  • a5cent - Tuesday, September 11, 2018 - link

    Yes. The spec sheet provided in the article is incorrect.

    @Anton Shilov likely received junk info from LG, who so far have been unable to provide accurate specs for this product.

    G = DP1.2 G-SYNC
    F = DP1.4 FreeSync 2

    The F and G versions both use the same LM340UW5 panel, with a *native* 144Hz refresh rate. However, the G version uses nVidia's DP1.2 G-SYNC module, which bottlenecks it to 120Hz.

    Unfortunately, for nVidia's (so-called premium) DP1.2 G-SYNC module, the nominally supported refresh rate at 3440*1440 is 100 Hz. Achieving 120 Hz requires an OC.

    In summary, neither the F nor the G version overclocks the panel. The G version only overclocks the G-SYNC module.

    The lack of DP1.4 also means the G version can't support HDR and will not be DisplayHDR 400 rated like the F version.

    The G version could have been the F's equal if nVidia had engineered a decent DP1.4 G-SYNC HDR module that deserved to be called "premium" (rather than the overpriced, actively cooled FPGA crap they have now).
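
    Rough numbers, for anyone who wants to check the 120 Hz vs 144 Hz difference. This is only a back-of-the-envelope sketch in Python; it assumes 8 bpc RGB and roughly 5% reduced-blanking overhead, and real timings vary slightly:

    def required_gbps(h, v, hz, bpc, blanking=1.05):
        # data rate = active pixels * refresh * bits per pixel, plus a rough blanking overhead
        return h * v * hz * (bpc * 3) * blanking / 1e9

    DP12_GBPS = 17.28  # HBR2, 4 lanes, after 8b/10b encoding
    DP14_GBPS = 25.92  # HBR3, 4 lanes, after 8b/10b encoding

    for hz in (100, 120, 144):
        need = required_gbps(3440, 1440, hz, 8)
        print(hz, round(need, 1), need <= DP12_GBPS, need <= DP14_GBPS)
    # ~12.5 Gbps @ 100 Hz and ~15.0 Gbps @ 120 Hz fit within DP1.2;
    # ~18.0 Gbps @ 144 Hz (and more with 10-bit HDR) needs DP1.4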
  • Drazick - Tuesday, September 11, 2018 - link

    Give us 32" with 2560x1600 (16:10) resolution.
    Can we get that LG / Dell?
  • Valantar - Tuesday, September 11, 2018 - link

    >30" below 4K resolution is likely not seeing much development these days.
  • nerd1 - Tuesday, September 11, 2018 - link

    The 16:10 panel is DEAD. Just get a 34" 4K display instead.
  • DanNeely - Tuesday, September 11, 2018 - link

    4K is 16:9 and would be 32". 34" monitors only come in 21:9 ultrawide, which is roughly a 3" downgrade in vertical space vs a 30" 2560x1600, and the increased vertical space is generally why people want 16:10. You'd need to go up to ~41" in a 21:9 ultrawide to match the height, or ~60" in 32:9. AFAIK neither of those options exists yet.
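
    A quick sketch of that geometry, if anyone wants to verify the numbers (this assumes exact 16:10, 21:9 and 32:9 ratios, so real panels will differ slightly):

    import math

    def panel_height(diagonal_in, aspect_w, aspect_h):
        # vertical height in inches, from diagonal size and aspect ratio
        return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

    def diagonal_for_height(height_in, aspect_w, aspect_h):
        # diagonal needed to reach a given height at a given aspect ratio
        return height_in * math.hypot(aspect_w, aspect_h) / aspect_h

    h_1610 = panel_height(30, 16, 10)  # ~15.9" tall (30" 16:10)
    h_219 = panel_height(34, 21, 9)    # ~13.4" tall (34" 21:9)
    print(round(h_1610 - h_219, 1))                       # ~2.5" less vertical space
    print(round(diagonal_for_height(h_1610, 21, 9), 1))   # ~40.4" 21:9 to match the height
    print(round(diagonal_for_height(h_1610, 32, 9), 1))   # ~58.7" in 32:9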
  • Ray_Tracing - Wednesday, September 12, 2018 - link

    What is really needed at this juncture is for the 38" monitors to increase in refresh rate and response times. All of the 38" models use LG panels with a native response time of 8 ms. They claim 5 ms in fast mode, but by all accounts that isn't working out well. These 38s are 3840x1600, which is a sweet spot for pixel density at around 111 PPI. None of the 38s have G-Sync, and I believe none have FreeSync either. There are only four of them (Dell, LG, ASUS, Acer). There have been accounts of ghosting and banding during video playback, with videos to prove it. That would be a concern for games as well, and that's what I would be using the monitor for.

    I agree that the 3" is a downgrade in vertical space on the 34, and thus I will not be interested in the 34 at all. I really do like the idea of 3840x1600 because I am currently using a 27" ASUS 16:9 2K and the visuals are stunning in games and normal desktop usage. I want as close to that resolution (or slightly above 2K) as I can get. If they could get G-Sync, 4 ms (native), 144 Hz refresh, a good number of nits, a decent stand, thin bezels, and good benchmarks out of a 38" (37.5" true size), I would be ecstatic. Right now, however, it's a frustrating waiting game. I'm guessing Q2 2019 before I see anything like that.
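
    For reference, that ~111 figure is just the diagonal pixel count over the 37.5" viewable diagonal (rough check):

    import math

    # 3840 x 1600 pixels across a 37.5" viewable diagonal
    ppi = math.hypot(3840, 1600) / 37.5
    print(round(ppi, 1))  # ~110.9 PPI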
  • sadsteve - Friday, September 14, 2018 - link

    Um, why are there LEDs on the back of the sets? I usually only see the back of my monitor when I'm first setting it up, or in the unlikely event that I'm dusting back there.
  • a5cent - Wednesday, September 19, 2018 - link

    The point is for the lights to reflect off the wall behind the monitor. It will make any professional office setting look like a super cheap strip club.

    It's a check-box "feature" for 12-year-old "gamerz". I can't figure out why, but since all OEMs are adding these garish lights, I assume it boosts sales somehow.
