We were talking about TVs recently in the office and pretty much everyone agreed that even 4K is overkill for most TVs at a reasonable viewing distance.
I got a 65-inch TV recently, and up close HD looks pretty bad, but at about 3 m away it's fine.
Just when 100"+ TVs are coming out and gaining traction. To put this into perspective, at 8K an upcoming 130" LCD TV would have 68 PPI, a 98" would have 90 PPI, and an 80" would be 110 PPI.
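For anyone who wants to sanity-check those figures, here's a quick back-of-the-envelope in Python (my own sketch, assuming a 16:9 8K panel, not anyone's official spec):

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch: diagonal pixel count divided by diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    for size in (80, 98, 130):
        print(f'{size}" 8K: {ppi(7680, 4320, size):.0f} PPI')
    # 80" 8K: 110 PPI / 98" 8K: 90 PPI / 130" 8K: 68 PPI

Notably, a 130" 8K panel works out to about the same pixel density as a 65" 4K one.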
Depending on whether you want a TV experience sitting further back, or "Cinema is Coming Home" as Sony's tagline goes, I believe there is room for something greater than 4K. Especially when industry trends suggest TV purchase sizes are increasing every year: 40" used to be big, then it became entry level, and now top-of-the-line lineups don't even offer anything below 50", with the median moving closer to 65".
80"+ prices will come down in the next 5 years as LCD takes over again from OLED. I don't understand why, but I also won't be surprised if the median size moves past 70".
In 2015 I wrote on AVSForum about how 8K makes zero sense from a codec, computation, network, transport, and TV perspective. However, I never imagined the median TV size would move up so quickly, and at the time I couldn't see how we could afford 100"+ TVs. Turns out I was dead wrong. TCL / CSOT will produce its first 130" TV in two years' time. The ultra-wealthy can already afford 160" to 220" MicroLED sets made out of many panels. There will be some 10% of the population who could afford an ultra-large screen size, and I am sure there is a market for premium 4K+ content.
There is definitely a future for 4K+ content and panels. I just hope we don't give up too early.
Yes, larger-format TVs are getting more affordable, but I don't think larger living rooms are catching up. Watching TV in general also feels like a dying trend. I would not be bullish on larger TVs getting popular outside a niche enthusiast market (with money to buy a mansion).
You mention AVSForum, so I'm sure you're watching Blu-rays (or, err, via a home backup/local streaming solution); of course it makes sense to you.
The median (and, for that matter, roughly the 99th-percentile) TV, as well as being 65", is being used with Netflix et al., though, and that content already looks worse than what you can buy on disc.
8K doesn't need to wait for TV sizes any more, right, but now it needs to wait for home internet speeds (and streaming-provider infrastructure/egress costs) for it to make sense.
The entire pipeline to provide for this is prohibitive. At what distance do you actually need to be from your TV for the resolution to max out for the retina? 4K was already a dubious proposition for most TVs people actually own.
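Roughly, using the usual ~1 arcminute of angular resolution for 20/20 vision (a simplification; real eyes and real content vary), the cutoff distances for a 65" set work out to something like this:

    import math

    ARCMIN = math.radians(1 / 60)  # ~1 arcminute, typical 20/20 acuity limit

    def retina_distance_m(diagonal_in, width_px, height_px):
        """Distance beyond which one pixel subtends less than 1 arcminute."""
        diagonal_px = math.hypot(width_px, height_px)
        screen_width_in = diagonal_in * width_px / diagonal_px
        pixel_pitch_m = (screen_width_in / width_px) * 0.0254  # inches -> metres
        return pixel_pitch_m / math.tan(ARCMIN)

    for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
        print(f'65" {label}: ~{retina_distance_m(65, w, h):.1f} m')
    # 65" 1080p: ~2.6 m / 65" 4K: ~1.3 m / 65" 8K: ~0.6 m

So on a typical 65" set you'd have to sit inside roughly 1.3 m for 4K to matter, and inside about 0.6 m for 8K; treat those as rough estimates, but they line up with the "dubious proposition" point.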
> In addition to being too expensive for many households, there has been virtually zero native 8K content available to make investing in an 8K display worthwhile.
I am jealous that you were even able to find a 55-inch 8K monitor. I couldn't find one.
For my computer monitor, I ended up going with a cartoonishly large 85-inch 8K. It was somewhat of an adventure getting it into my house, but once it was set up I absolutely love it.
I don't really see the point of 8K for anything but computer monitors, but it's absolutely great for that.
Do you think 65 inch would still be OK for monitor usage? I've been pondering doing that for years, but 65 inch is often easier to find for me in Europe.
>The other part is that you should have the entire picture in your eyesight w/o moving the neck.
I agree, but that is still the case here. The difference is that the "full picture" no longer occupies the whole monitor.
The number of windows and amount of content you'd arrange on a normal office monitor takes up about one third of my available space. I can arrange my windows in that space and not have to move my neck.
But at a glance, I can also see the contents of four other code files in my project, as well as my notes, the documentation, and the team chat.
Or if I want, I can also see twice the amount of code in any file by having my editor take up the full height of the center third of this monitor.
Basically the monitor goes from being the "full picture" to a canvas where you are free to create any collection of a "full picture" you want, and you can have the leftover building blocks visible on the sides, optionally.
I am sure that if you let all the knowledge workers in the world test this setup for a day, a vast majority of them would want to keep it. But since even 8K TVs are going away now, most will never know.
Curved gaming monitors costing more than my TV are being deployed everywhere lately for productivity work. Most people are used to 27" or 24" low-res monitors and they are getting an upgrade, but it's not a very good one.
Had the panels from 8K TVs been used in monitors and marketed to corporations, it would have been so much better!
Perfect for open offices too - no need for desk dividers if everyone is behind a huge screen! ;-)
I don't think it will work very well as a "normal" monitor (meaning placed at normal monitor distance on your desk).
My 55" is borderline too big already, and the main issue is actually the height. Tilting your head or rolling your eyes back to see the top gets noticeably uncomfortable pretty quickly.
I made a special mount so the lower edge rests on the desk surface, which basically solved that issue, but I don't think I could have made it work if it were any bigger.
Also, at 65" the pixel density is much lower, so you'd probably want it mounted further away. But if you do, the monitor will cover the same FOV as a smaller monitor anyway.
My dream is that someone starts making 8K 50" monitors with DisplayPort inputs (HDMI is a mess) and sells them for the same price as these TVs used to cost!
> Gaming was supposed to be one of the best drivers for 8K adoption.
While the step from 1080p/1440p to 4K is a visible difference, I don't think going from 4K to 8K would be visible, since the pixels are already invisible at 4K.
However the framerate drop would be very noticeable...
OTOH, AFAIK for VR headsets you may still want higher resolutions due to the much larger field of view.
> While the step from 1080p/1440p to 4K is a visible difference
I even doubt that. My experience is that on a 65" TV, 4K pixels become indistinguishable from 1080p beyond 3 meters. I even tested this with friends on The Mandalorian; we couldn't tell 4K and 1080p apart. So I just don't bother with 4K anymore.
Of course YMMV if you have a bigger screen, or a smaller room.
For reasonable bitrate/resolution pairs, both matter. Clean 1080p will beat bitrate-starved 4K, especially with modern upscaling techniques, but even reasonably compressed 4K will beat good 1080p because there's just more detail there. Unfortunately, many platforms mess with this relationship, like YouTube forcing you to upload in 4K to get better bitrates, when for many devices a higher-rate 1080p would be fine.
I'm curious: for the same Mb per second, how does the viewing quality of 4K compare to 1080p? I mean, 4K shouldn't be able to carry more detail per se given the same amount of data over the wire, but maybe the scaling and the way the artifacts end up can alter the perception?
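A crude way to see it (my own arithmetic; 15 Mbps and 24 fps picked arbitrarily, and real encoders don't spend bits uniformly): at the same bitrate, 4K has four times as many pixels to cover, so each pixel gets a quarter of the data.

    def bits_per_pixel(bitrate_mbps, width, height, fps=24.0):
        """Average bits available per pixel per frame at a given bitrate."""
        return bitrate_mbps * 1_000_000 / (width * height * fps)

    for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
        print(f"{label} @ 15 Mbps, 24 fps: {bits_per_pixel(15, w, h):.3f} bits/pixel")
    # 1080p @ 15 Mbps, 24 fps: 0.301 bits/pixel
    # 4K    @ 15 Mbps, 24 fps: 0.075 bits/pixel

Whether the extra resolution or the extra bit starvation wins is exactly the scaling/artifact trade-off being asked about.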
Yeah, I have a hard time believing that someone with normal eyesight wouldn't be able to tell 1080p and 4K Blu-rays apart. I just tested this on my TV; I have to get ridiculously far back before the difference isn't immediately obvious. This is without the HDR/DV layer, FWIW.
10 feet is pretty far back for all but the biggest screens, and at closer distances you certainly should be able to see a difference between 4K and 1080p.
There are so many tricks you can do as well; resolution was never really the issue, and sharpness and fidelity aren't the same as charming and aesthetically pleasing.
I usually still play at 1080p on my Steam box because my TV is like nine feet away and I cannot tell a difference between 1080p and 4K for gaming, and I would rather have the frames.
AAA games have had really bad performance issues for the last few years while not looking much better. If you wanna game in 8K you are gonna need something like a NASA supercomputer.
We can't render modern games at decent frame rates at 4K without going down the path of faking it with AI upscaling and frame generation.
There was no hope of actual 8K gaming any time soon even before the AI bubble wrecked the PC hardware market.
Attempting to render 33 million pixels per frame seems like utter madness, when 1080p is a mere 2 million and Doom/Quake were great with just 64000. Let's have more frames instead?
(Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
I don't see a future in which we play at 4K at top settings without AI upscaling/interpolation either. Even if it were theoretically possible, the performance budget developers have going forward will assume that frame generation and upscaling are used.
So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like someone lowering AA settings or whatever: something only done by very niche players, like the CS community does today, where some play at 4:3 resolutions and lower AA settings for maximum visibility, not fidelity.
Yeah, not only the huge required jump in raw fill rate, but to get the most out of a 4K TV you need higher detail models and textures and that means you also need a huge jump in VRAM, which never materialised.
> While the step from 1080p/1440p to 4K is a visible difference
It really isn't.
What you are likely seeing is HDR, which is on most (but not all!) 4K content. The HDR is a separate layer and unrelated to the resolution.
4K versions of films are usually newly restored with modern film scanning, as opposed to the aging masters created for the DVD era that were used to churn out first-generation Blu-rays.
The difference between a 4K UHD without HDR and a 1080p Blu-Ray that was recently remastered in 4K from the same source is basically imperceptible from any reasonable viewing distance.
The "visible difference" is mostly better source material, and HDR.
Of course people will convince themselves what they are seeing justifies the cost of the upgrade, just like the $200 audiophile outlet and $350 gold-plated videophile Ethernet cable makes the audio and video really "pop".
I know the thread is about TVs, but since gaming has come up, it's worth noting that at computer viewing distances the differences between 1080p/1440p and 4K really are very visible (though in my case I have a 4K monitor for media and a 1440p monitor for gaming, since there's zero chance I can run at 4K anyway).
Discussions about this are very tedious, because people have a hard time making the distinction between "being able to see the difference between 1080p/4K/8K content" and "being able to see the difference between 1080p/4K/8K panels".
I'm sure there's plenty of content (especially streaming content at mediocre bitrates) where people would be hard-pressed to tell the difference.
But I think if people went back to 1080p _panels_, they'd actually rather quickly notice how much worse the text rendering is, and that the UI looks off to them.
Moving up to 8K would definitely be a smaller step change in clarity than 1080p -> 4K, and many people wouldn't feel it's worth spending extra, but I don't think it would be literally indistinguishable.
Yep, we probably reached "CD quality" at 1080p, to be honest, i.e. a level beyond which the vast majority of people won't be able to perceive a quality difference. We definitely reached it at 4K at the size/distance of most TVs.
I had just recently been thinking of buying an 8K television to mount on the wall of my office to use as a huge monitor. Has anyone done this? Any recommended models?
Cinema theater projectors are rarely more than 4K. Many are still 2K. IMAX is sometimes 8K. The industry just doesn't see the need for content with more resolution.
I thought it was the other way around: that 8K is problematic because the chance of a dead pixel is so much higher, driving up the cost because of the higher ratio of ruined batches?
Hmm, could very well be. With DRAM I know error rates are roughly constant, so the actual per-bit rate goes down as capacity increases. Perhaps it's different for displays.
The last 8K TVs I saw for sale were below 600 USD in physical stores from name brands (LG). Mine was just under 1000 when I got it.
Why we can’t buy the same panels as monitors is a mystery to me.
> For my computer monitor, I ended up going with a cartoonishly large 85-inch 8K.
But if you have it wall-mounted at eye level or on a deep desk, you're likely okay.
Personally, I'd consider a screen that large a bad working area.
> My experience is that on a 65" TV, 4K pixels become indistinguishable from 1080p beyond 3 meters.
Without HDR the differences are negligible or imperceptible at a standard 10' viewing distance.
I'll take it one step further: a well-mastered 1080p Blu-Ray beats 4K streaming hands down every time.
I doubt I’m unique.
> Attempting to render 33 million pixels per frame seems like utter madness, when 1080p is a mere 2 million and Doom/Quake were great with just 64000.
Basically 400 MB per frame at 12 bytes/pixel (64-bit HDR RGBA + 32-bit depth/stencil)
vs the 64000 bytes that Doom had to fill...
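Quick check of those numbers (the pixel formats are my assumption for illustration, not from any particular engine):

    framebuffers = {
        "Doom (320x200, 8-bit palette)": (320, 200, 1),
        "1080p (RGBA8 + 32-bit depth/stencil)": (1920, 1080, 8),
        "8K (FP16 RGBA + 32-bit depth/stencil)": (7680, 4320, 12),
    }
    for name, (w, h, bytes_per_px) in framebuffers.items():
        pixels = w * h
        print(f"{name}: {pixels:,} px, {pixels * bytes_per_px / 1e6:.1f} MB per frame")
    # Doom: 64,000 px, 0.1 MB / 1080p: 2,073,600 px, 16.6 MB / 8K: 33,177,600 px, 398.1 MB

And that is per frame: at 60 fps that is roughly 24 GB/s of framebuffer traffic before any actual shading gets done.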
> I'm sure there's plenty of content (especially streaming content at mediocre bitrates) where people would be hard-pressed to tell the difference.
I mean, my local cable TV is sending crap that's way worse than 720p YouTube videos, and most people don't care at all.
I guess the primary benefit of an 8K display is that stuck or dead pixels are much less annoying than on a 4K panel of the same size.
I'm fine with 4K for my living room. Give me more HDR, less chroma subsampling, and less banding.