49 Comments
austinsguitar - Friday, August 31, 2018 - link
unless this is 70" plus... 8k is just plain dumb tbh.
Stochastic - Friday, August 31, 2018 - link
I think it could have a place in very large displays that are being used as PC monitors or for massive TVs, but 8K is going to remain super niche for a long time. It's only just now that we have the computing power and bandwidth to do 4k60 on most devices, and 4K120 is still a ways off.
Valantar - Saturday, September 1, 2018 - link
For PC monitors? I suppose, but they'd either be for people doing photo work and similar intensely detailed visual work, or in huge sizes to fit a lot of information at once. 54" to replace 4 27" 4K monitors? Given that 4K is challenging to use at 100% scaling at 27", that would be the lower useful limit of such a use case.
As for "massive TVs", no. There's no reason to get a TV so large that you have to move your head to see all of it, which means there's a limit to how large a TV most people can fit in their actual living rooms - and to how much people are willing to make TV placement the chief determinant of their interior design. For 8K to be useful (or even noticeable) for TVs, you'd need to be sitting very close to a very large TV, which is neither comfortable nor useful. Heck, at normal viewing distances the vast majority of people can't tell the difference between FHD and 4k on ~60" TVs. To make 8k a noticeable upgrade over 4k, you'd need a TV approaching 200" (which would be ~2.5m or 8' 2" tall - that's floor-to-ceiling in most homes, if not more).
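A quick sanity check on those numbers (a rough sketch - the 1-arcminute-per-pixel threshold is a simplistic 20/20-vision rule of thumb, not a full acuity model):

```python
import math

# Rule of thumb: 20/20 vision resolves about 1 arcminute per pixel.
ARCMIN_RAD = math.radians(1 / 60)

def screen_width_m(diagonal_in, aspect=(16, 9)):
    """Width in metres of a flat screen given its diagonal in inches."""
    w, h = aspect
    return diagonal_in * 0.0254 * w / math.hypot(w, h)

def max_useful_distance_m(diagonal_in, h_pixels):
    """Farthest distance at which one pixel still subtends >= 1 arcminute."""
    pitch = screen_width_m(diagonal_in) / h_pixels
    return pitch / ARCMIN_RAD

for size in (60, 100, 200):
    print(f'{size}" 8K resolvable out to ~{max_useful_distance_m(size, 7680):.2f} m')
# Roughly 0.6 m, 1.0 m and 2.0 m - i.e. only a ~200" panel pays off at sofa distance.
```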
PixyMisa - Sunday, September 2, 2018 - link
With a computer monitor, you can look more closely at something, so a 32" 8K monitor (like the one Dell sells already) makes perfect sense.
With a TV you're supposed to take in the whole image, which is different.
niva - Tuesday, September 4, 2018 - link
That 80 inch display will be like 4 of the 40 inch 4k screens. These types of displays will be very useful in a number of fields, such as anything that displays engineering data/telemetry, financial systems/market tracking, or visual content creation.
I'm personally not going to be able to afford one of these anytime soon, but this continues the march forward in technology.
surt - Monday, September 3, 2018 - link
You have to envision a future in which at _least_ one whole wall in your home is a walk-up touch display. It needs to have a resolution that makes sense at touch distance and wall scale.
Manch - Tuesday, September 4, 2018 - link
I have a 50" (49.5" viewable... damn TV specs) 4k TV I use as a monitor. It does 4k@60/HDR, so good enough for me. I previously had 3 24" 1080P Dells. I sit a bit back from the screen too.
Pixel pitch is marginally bigger, so readability is the same. Great for games/CAD/visual work.
Still, I can see where an 8k monitor would be beneficial. NOT for gaming IMO. For some of the work I do, it would be nice.
Fallen Kell - Friday, August 31, 2018 - link
Agreed. Even then, if you are sitting 6 or more feet away, you still will not see the difference. I also wonder if the TV supports HDMI 2.2. If it doesn't, it is just another pass on this generation, since it can't receive a source with 8k+HDR+Deep Color to take advantage of the display. It is like having an 800+ HP Ferrari engine in a car that only has a single fixed gear that will redline the engine well before taking full advantage of its capabilities.
nathanddrews - Friday, August 31, 2018 - link
I think you are confusing HDMI 2.1 (which is the latest HDMI spec, featuring up to 48Gbps) and HDCP 2.2 (copy protection). But yeah, unless they state otherwise, assume that it does NOT support HDMI 2.1, so you'll be limited to 4K60 input. I hope that's not the case...
nevcairiel - Saturday, September 1, 2018 - link
Ultra high resolution TVs like that often have multiple inputs you can drive simultaneously. Of course, that's another big headache to solve. The first gen Sharp 8K screens had quadruple HDMI inputs that you could run to deliver four 4K images that make up an 8K image.
Mitch89 - Monday, September 3, 2018 - link
That is such a hacky setup that nothing will support when 8K adoption genuinely starts.
It's like the early 4K TVs that shipped with HDMI 1.4, so you couldn't display 4K at 50/60p.
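The link bandwidth problem is easy to put in numbers; a rough sketch of raw (uncompressed) pixel rates, ignoring blanking intervals and link-layer encoding overhead (real HDMI budgets are somewhat tighter):

```python
def raw_video_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed RGB pixel data rate in Gbit/s (no blanking, no link overhead)."""
    return width * height * fps * bits_per_channel * channels / 1e9

print(raw_video_gbps(3840, 2160, 60, 8))   # 4K60 8-bit: ~11.9 Gbps
print(raw_video_gbps(3840, 2160, 60, 10))  # 4K60 10-bit HDR: ~14.9 Gbps
print(raw_video_gbps(7680, 4320, 60, 10))  # 8K60 10-bit HDR: ~59.7 Gbps
# 8K60 HDR exceeds even HDMI 2.1's 48 Gbps without DSC or chroma subsampling,
# which is why quad-4K-input hacks showed up on first-gen 8K panels.
```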
Manch - Tuesday, September 4, 2018 - link
I get what you're saying, but that's a horrible analogy. If you're able to run the engine to redline, then with that single gear you've already used the engine to its full advantage. Redline is rarely, if ever, the peak of an engine's power band, so you've actually run past it. ;)
JoeyJoJo123 - Friday, August 31, 2018 - link
It's really not. For people who like to play back older content, 640x480 scales 9x to 5760x4320 on a 7680x4320 display - exactly filling the screen height with zero interpolation. Every display resolution up until 8k needs to interpolate (to fill the screen) probably the most common video resolution people own, the quintessential video standard of DVDs.
I get what you're going to say next though:
>Are you really touting playback of DVD format video on 8k displays as a "feature"?!
In most respects, yeah, most people don't care, won't notice, but I do appreciate the option of having 8k as the one resolution where every smaller resolution can be scaled up without interpolation. 8k is also basically the point where I think resolution wouldn't be appreciably better anymore, even to desktop users looking at a screen from 2 - 3 ft. I could tell the difference in font rendering in 1080p -> 1440p -> 4k.
A distant dream now would be an Open Source Scan Converter (OSSC) that can take in any video game console (modern or old) and do an integer scale conversion of it up to 8k (with a powerful enough FPGA) in real time/no lag, and it'd work for upscaling popular content like the 480p consoles - PS2, GCN, etc. Even if such a device existed now, without 8k as an option, you're not going to get an integer scale that fits the screen exactly for 480p on 4k. Likewise, with 8k and better contrast and lower input lag, you get even closer to not needing a PVM/BVM high-end CRT to get a retro look. While there are CRT-like filters, the problem is that even at 4k, the resolution isn't fine enough to properly imitate the CRT shadow mask. You can darken every 3rd line or so to imitate the scanline look, but the shadow mask is a really fine detail that can't really be emulated right now with 4k.
I would hope 8k would really be the last point of increasing resolution, since we'd be hitting a point where anything higher doesn't really matter, and more focus would be put on achieving better contrast, out-of-box calibration, off-axis color accuracy, input lag, display response times, etc.
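The integer-scaling argument is straightforward to verify; a small sketch comparing how common legacy resolutions fit into 4K versus 8K:

```python
def best_integer_scale(src, dst):
    """Largest whole-number factor by which src (w, h) fits inside dst (w, h)."""
    (sw, sh), (dw, dh) = src, dst
    return min(dw // sw, dh // sh)

displays = {"4K": (3840, 2160), "8K": (7680, 4320)}
sources = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
for name, res in sources.items():
    for dname, disp in displays.items():
        k = best_integer_scale(res, disp)
        fills = res[0] * k == disp[0] or res[1] * k == disp[1]
        print(name, dname, f"{k}x", "fills an axis exactly" if fills else "leaves borders")
# 480p only gets an exact fit on 8K (9x, filling the full 4320-pixel height);
# on 4K the best integer scale is 4x with large borders, or non-integer scaling.
```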
austinsguitar - Friday, August 31, 2018 - link
Do you realize how insanely difficult it is to produce an image in 8k? Even upscaled video will be a pain to run quietly because of the GPU noise. I understand what you mean about older games, that's really great... but unless we are talking quality, it's unimportant. Upscaling old consoles on your 8k tv... really man?
Kamus - Friday, August 31, 2018 - link
"even upscaled video will be a pain to run quietly because of the gpu noise."
Sure... if you're using ancient hardware. My mom's notebook can play 8k video using only 5 watts (no video card, just the built-in Intel graphics), and she bought that last year....
All it takes for 8k to "take off" is for Netflix, amazon, etc. to flip a switch and you're off to the races. The built in TV decoder should do the rest with no effort.
A lot of content is already recorded in 8k, and places like youtube have had 8k 60 with HDR video available for quite a while.
Now, as for the usability of 8k on a small living room... that's a different thing altogether. You probably wouldn't see any benefit unless it was a gigantic, curved screen that would envelop your whole FoV.
Used to be people whined a lot about the lack of 4k and HDR content... but all it took was for Netflix and Amazon to flip a switch, and now most of the new content is in both 4k and HDR. The same thing would be possible with 8k video at a 60 - 90 Mbps bitrate.
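For a sense of scale, here's what a constant 60 - 90 Mbps stream amounts to per hour (simple arithmetic, ignoring container overhead and variable-bitrate savings):

```python
def gb_per_hour(mbps):
    """Data volume of a constant-bitrate stream in gigabytes per hour."""
    return mbps * 3600 / 8 / 1000

print(gb_per_hour(60))  # 27.0 GB per hour
print(gb_per_hour(90))  # 40.5 GB per hour
```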
CheapSushi - Friday, August 31, 2018 - link
These are probably the same people who said they couldn't tell the difference between 720p and 1080p and it was a total waste of money in every possible way. They're grumpy when others do something they wouldn't or can't afford.
boeush - Friday, August 31, 2018 - link
"all it took was for netflix and amazon to flip a switch, and now most of the new content is in both 4k and HDR. The same thing would be possible with 8k video with a 60 - 90 mbps bitrate."
Of course, then there's the pathetic reality that at least in the USA, with very few exceptions, most people's "broadband" Internet runs at about 0.1x that raw bandwidth (without even taking any communications overhead, or sharing of bandwidth between different data streams, into account...) Maybe in more developed countries, streamed 8k content could be a viable option in the near term. But not in the pitifully backward U.S. "free market"...
cmdrdredd - Sunday, September 2, 2018 - link
This is misleading. You can't say "a lot of content is already recorded in 8k" without the caveat that digital intermediates are still very often finished at 2k. So the content is then scaled up from there. The source doesn't matter if the finished content is below even 4k.
gfkBill - Sunday, September 2, 2018 - link
You can't even say (with a straight face) "a lot of content is already recorded in 8k", never mind caveats about DIs done at 2K or 1080 :D
cmdrdredd - Monday, September 3, 2018 - link
True, a lot of movies are filmed at resolutions below 4k, and I'm surprised when a movie is filmed entirely at 6k. Though you have movies like Avengers: Infinity War, filmed entirely at 6.5k and formatted for IMAX, yet the movie is finished at 2k and scaled up. Like, what the heck is that?
Santoval - Monday, September 3, 2018 - link
I frankly wonder what sort of notebook your mom uses - could you let us know? Oh, and please specify the fps, memory and CPU utilization while playing 8K video (presumably Youtube short clips). My ~3 year old Intel iGPU-only notebook can also do it, but at 2 or 3 fps, and it has a nasty habit of crashing after a while, because the memory (4 GB) wants to exceed 100% utilization.
By the way, you are grossly underestimating the entire 8K chain, from source to screen, with your "flipping a switch" and "the TV decoder can do the rest with no effort" remarks. 8K cameras have been in the (professional cinematography) market only in the last couple of years - to my knowledge only from Panavision (DXL & DXL2) and RED (Weapon Helium, Dragon and recently Monstro).
While films are usually shot at 8K with these cameras, 100% of the time they are mastered at a 4K or sometimes even a mere 2K digital intermediate. This downsampling is done to increase clarity and sharpness, to reduce file sizes, and due to the simple fact that no cinema or TV network on the planet (barring some tests by NHK in Japan) projects or streams at 8K at the moment.
8K Youtube short clips are at 8K simply because they do not have to be mastered (i.e. go through an extensive post-production stage of editing, visual effects, color grading, etc).
Netflix and others, in turn, when films and TV shows start being mastered at 8K (I don't believe there is even a single such content, globally), will need to heavily upgrade their upstream bandwidth, by renegotiating contract terms with the likes of Verizon and co, and increase their storage capacity to host the much heavier content.
So no, it is *really* not as simple as "flipping a switch".
blppt - Tuesday, September 4, 2018 - link
I am also curious about this Intel GPU that fully accelerates 8k video - as far as I know, the only GPUs on the market that can fully accelerate Youtube vids above 4k are the Nvidia 10-series.
GreenReaper - Wednesday, September 5, 2018 - link
Indeed - Intel's Gen9 iGPU can only handle up to 4K decode for H.264/5 and VP9: https://en.wikichip.org/wiki/intel/microarchitectu...
It does in theory handle 16K MJPEG, but I doubt that's a format YouTube supports.
NikosD - Saturday, September 8, 2018 - link
Kaby Lake's Gen9.5 architecture, available since August 2016 for laptops, can do 8K for HEVC and VP9 (both 8b/10b) easily.
JoeyJoJo123 - Friday, August 31, 2018 - link
I get you, which is why I mentioned in my post that it was a "distant dream". It's not happening anytime in the near future.
Diji1 - Monday, September 3, 2018 - link
Why are you talking like buying an 8K TV to get the best 640x480 playback is crazy? Oh wait ...
EWP - Monday, September 3, 2018 - link
Make 640p Great Again!
The Chill Blueberry - Monday, September 3, 2018 - link
That's where AI comes in. A good AI can interpolate/upscale faster than number-crunching the entire 8K image.
austinsguitar - Friday, August 31, 2018 - link
Don't get me wrong, I'm a huge ps2 emulator guy. It would be beautiful to see, but it's going to be about 14 years before the technologies are able to keep up with 8k - like for newer games. Maybe even longer than that.
milkywayer - Monday, September 3, 2018 - link
Give it 4 years, you'll be seeing 8K TVs at Best Buy :)
lothar98 - Tuesday, September 4, 2018 - link
I would be more surprised to see Best Buy in 4 years.
Lolimaster - Saturday, September 1, 2018 - link
More than focusing on resolution, content needs to focus on actual video quality, which means better cameras, standardized quality for many lighting scenarios (real-life content), and starting to ditch 4-2-2 chroma for 4-4-4.
Murloc - Sunday, September 2, 2018 - link
You're talking about insanely niche stuff.
Azurael - Monday, September 3, 2018 - link
That's not really an advantage to the half of the planet for whom SD content is 720x576i50, though, is it?
edzieba - Friday, August 31, 2018 - link
Only with a naive measurement of acuity (ignoring everything except line-contrast acuity), and only with overly long viewing distances (i.e. too small viewing angles). Same 'problem' as when HDTV started rolling out and people 'couldn't see the difference', because they were sitting at the same distance from the same size SDTV set (whose recommended subtended angle was based on the capabilities of the older standard).
Lolimaster - Saturday, September 1, 2018 - link
Just give me a god damn 27-28" 2560x1440 120-144Hz VA 3000:1 FULL GLOSSY with no curve. All I ask.
Grizzly1991 - Saturday, September 1, 2018 - link
Don't hate just because you cannot afford it.
bug77 - Saturday, September 1, 2018 - link
This is incredible news! You mean I can now miss 8k content on 2nd generation hardware? I'll take two!
Senectus - Sunday, September 2, 2018 - link
Why 60, 70 and 80 inch? 8K is great, but these physical sizes are just completely wrong - too big for a PC monitor and too small for a TV.
8K would be amazing for a 46" PC monitor - it will have the same viewing area, and elements on the screen will be the same size, as 4 traditional 23" 1080p monitors (but with the benefit of no bezels), and it can be run at 200% Windows Scaling/macOS Retina mode for perfect HiDPI pixel doubling and super crisp text/icons/UI. That would be exactly 192 DPI, which is the same DPI as the Microsoft Surface Studio's screen, which is a beauty.
For an 8K TV, it will have to be much bigger than even 80 inches - I haven't done the exact calculations based on typical TV viewing distances, but I'm thinking at least 100", if not significantly more.
mr_tawan - Monday, September 3, 2018 - link
Can I ask, how large is your living room?Senectus - Monday, September 3, 2018 - link
I'm planning on getting an 86" 4K TV for my new apartment and the living room isn't really all that big. If only I could afford a 100"+ TV, I would get it and it would probably fit just fine. :)
When you go to the cinema and look at the screen, it will typically fill your entire field of view - which is great for immersion, and I haven't heard anyone complaining that cinema screens are too large. In fact, everybody loves a classic IMAX theatre.
But then for some reason, the majority of people get TVs that are significantly undersized for their rooms and the distance they would be viewing from, yet they're fine with them.
Now, I understand that large TVs are an expensive luxury and not everybody can afford one, but what I don't understand are the typical rationalisations such as that a large TV is pointless unless your living room is the size of a football field and you're sitting a hundred meters away from the TV.
An 80" screen won't even fill your entire field of view when sitting just 2m away from it (and the sofa in most living rooms that I've been to has been further away than that), so TVs can grow even more beyond the current sizes that are commonly deemed as being too large today. Just look at the history of TV sizes and what was considered as "humongous" just a decade or two ago is now just considered as mediocre or even small.
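Basic trigonometry backs this up (a sketch assuming a flat 16:9 panel; the ~200-degree figure for human binocular horizontal field of view is a commonly quoted rough value):

```python
import math

def horizontal_fov_deg(diagonal_in, distance_m, aspect=(16, 9)):
    """Horizontal angle in degrees that a flat screen subtends at a given distance."""
    w, h = aspect
    width_m = diagonal_in * 0.0254 * w / math.hypot(w, h)
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

print(round(horizontal_fov_deg(80, 2.0)))   # an 80" screen at 2 m: ~48 degrees
print(round(horizontal_fov_deg(200, 2.0)))  # a 200" screen at 2 m: ~96 degrees
# Both are well short of the ~200-degree binocular field of view,
# so neither comes close to "filling" your vision from the sofa.
```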
lothar98 - Tuesday, September 4, 2018 - link
"But then for some reason, the majority of people get TVs that are significantly undersized for their rooms and the distance they would be viewing from, yet they're fine with them."
That part made me laugh. Only one time in my life has my wife said I got a TV that was too small. I bet I am in the minority with that.
Manch - Tuesday, September 4, 2018 - link
Wouldn't 4 23" 1080p monitors be a 46" 4K?
Senectus - Tuesday, September 4, 2018 - link
Yes, 4x23" 1080p monitors would be 46" 4K, however at only standard (legacy) 96 DPI.
46" 8K would still give you the same working area/physical size of UI on the screen as that of 96 DPI, but at double the sharpness/clarity with 192 DPI - i.e. HiDPI/Retina.
This is perfect pixel doubling on both axes, i.e. 200% scaling, where one physical pixel at the legacy 96 DPI will be drawn by 4 smaller pixels at 192 DPI.
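The DPI arithmetic checks out; a minimal sketch:

```python
import math

def ppi(diagonal_in, h_pixels, v_pixels):
    """Pixels per inch of a flat panel from its diagonal and resolution."""
    return math.hypot(h_pixels, v_pixels) / diagonal_in

print(round(ppi(46, 3840, 2160)))  # 96 - four tiled 23" 1080p panels
print(round(ppi(46, 7680, 4320)))  # 192 - the same 46" area at 8K
```

Doubling the linear resolution at a fixed diagonal doubles the PPI, so each legacy 96 DPI pixel maps to a clean 2x2 block at 192 DPI.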
Manch - Wednesday, September 5, 2018 - link
Ah OK, gotcha. Yeah, I could see how that would be nice. I just replaced my 3 24" monitors with a 50" 4k and kept the pixel pitch relatively the same.
Oxford Guy - Monday, September 3, 2018 - link
"Meanwhile, it is unclear whether a wider color gamut means a better coverage for the BT.2020 color gamut by a new 10-bit panel, or usage of a 12-bit panel."
The bits used for processing do not really determine gamut. They determine resistance to banding. Yes, more colors are technically available with less banding, but the actual gamut (the color space) can be small even with 12-bit processing. It's just finer gradations of a smaller overall space.
One could have 32-bit processing within the very small sRGB color space, for instance.
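A tiny illustration of the distinction: bit depth only sets how many steps each channel gets, regardless of how large the color space those steps span is:

```python
def levels(bits):
    """Distinct code values per channel at a given bit depth."""
    return 2 ** bits

# Bit depth slices whatever gamut the panel has into finer steps;
# it does not enlarge the gamut itself.
for bits in (8, 10, 12):
    step = 1 / (levels(bits) - 1)
    print(f"{bits}-bit: {levels(bits)} levels, step {step:.5f} of the channel range")
```

The same 8/10/12-bit gradations apply equally to a narrow sRGB panel or a wide BT.2020 one - only the fineness of the banding changes.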
Oxford Guy - Monday, September 3, 2018 - link
Yay! 8K! Just what absolutely no consumer needs in a television.
(Used as a television, that is - not as a monitor viewed up close.)
twtech - Tuesday, September 4, 2018 - link
Too small for 8k. They should work on mainstreaming TVs over 100" first, and then they'll have a market for 8k.
V900 - Tuesday, September 4, 2018 - link
I'd love to know what kind of obscene hardware it uses in order to upscale to 8K/120 Hz?!?