The TV industry concedes that the future may not be in 8K

(arstechnica.com)

30 points | by cxrlosfx 4 days ago

13 comments

  • Underqualified 2 hours ago
    We were talking about TVs recently in the office and pretty much everyone agreed that even 4K is overkill for most TVs at a reasonable viewing distance.

    I got a 65-inch TV recently, and up close HD looks pretty bad, but at about 3m away it's fine.

  • ksec 1 hour ago
    Just when 100"+ TVs are coming out and gaining traction. To put this into perspective, an upcoming 130" LCD TV at 8K would have 68 PPI, a 98" would have 90 PPI, and an 80" would have 110 PPI.
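
    As a quick sanity check of those figures (a minimal Python sketch, taking 8K as 7680x4320):

      import math

      def ppi(diag_inches, w_px=7680, h_px=4320):
          # pixels along the diagonal divided by the diagonal length in inches
          return math.hypot(w_px, h_px) / diag_inches

      for size in (130, 98, 80):
          print(f'{size}": {ppi(size):.0f} PPI')
      # -> 68, 90, and 110 PPI, matching the numbers above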

    Depending on whether you want a TV experience sitting further back, or "Cinema is Coming Home" as Sony's tagline puts it, I believe there is room for something greater than 4K. Especially when TV industry trends suggest purchase sizes are increasing every year: 40" used to be big, then it became entry level, and now top-of-the-line TVs don't even offer anything below 50", with the median moving closer to 65". 80"+ prices will come down in the next 5 years as LCD takes over again from OLED. I don't understand why, but I also won't be surprised if the median size moves past 70".

    In 2015 I wrote on AVSForum about how 8K made zero sense from a codec, computation, network, transport, and TV standpoint. However, I would never have imagined the median TV size moving up so quickly, and at the time I couldn't see how we could afford a 100"+ TV. Turns out I was dead wrong. TCL / CSOT will produce its first 130" TV in two years' time. The ultra wealthy can afford 160" to 220" MicroLED displays made out of many panels. There will be 10% of the population who could afford an ultra-large screen. And I am sure there is a market for premium 4K+ content.

    There is definitely a future for 4K+ content and panels. I just hope we don't give up too early.

    • cherioo 1 hour ago
      Yes, larger-format TVs are getting more affordable, but I don't think larger living rooms are catching up. Watching TV in general also feels like a dying trend. I would not be bullish on larger TVs getting popular outside a niche enthusiast market (with money to buy a mansion).
    • OJFord 1 hour ago
      You mention AVSForum, so I'm sure you're watching Blu-rays (or, err, a home backup/local streaming solution); of course it makes sense to you.

      The median (and roughly 99th-percentile, for that matter) TV, as well as being 65", is being used with Netflix et al. though, and that content already looks worse than what you can buy on disc.

      8K doesn't need to wait for TV sizes any more, right, but now it needs to wait for home internet speeds (and streaming-provider infrastructure/egress costs) to make sense.

    • kranke155 1 hour ago
      The entire pipeline to provide for this is prohibitive. At what distance do you actually need to be from your TV for the resolution to max out for the retina? 4K was already a dubious proposition for most TVs people actually own.
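
      As a rough answer, assuming the common one-arcminute visual-acuity rule of thumb (actual eyes and content vary), a minimal Python sketch of where each resolution "maxes out" on a 65" panel:

        import math

        ARCMIN = math.radians(1 / 60)  # ~0.000291 rad, a common acuity figure

        def retina_distance_m(diag_inches, w_px, h_px):
            # distance at which a single pixel subtends one arcminute
            pitch_m = (diag_inches * 0.0254) / math.hypot(w_px, h_px)
            return pitch_m / math.tan(ARCMIN)

        for label, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
            print(f'65" {label}: ~{retina_distance_m(65, w, h):.1f} m')
        # -> roughly 2.6 m, 1.3 m, and 0.6 m; beyond those distances the extra pixels are invisible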
    • kleiba 1 hour ago
      > In addition to being too expensive for many households, there has been virtually zero native 8K content available to make investing in an 8K display worthwhile.
  • Kon5ole 2 hours ago
    55-inch 8K TVs make for great monitors. Basically your whole field of view is a retina canvas for apps, equivalent to a 2x2 grid of 4K monitors.

    The last ones I saw for sale were below $600 in physical stores from name brands (LG). Mine was just under $1000 when I got it.

    Why we can’t buy the same panels as monitors is a mystery to me.

    • tombert 1 hour ago
      I am jealous that you were even able to find a 55-inch 8K monitor. I couldn't find one.

      For my computer monitor, I ended up going with a cartoonishly large 85-inch 8K. It was somewhat of an adventure getting it into my house, but now that it's set up I absolutely love it.

      I don't really see the point of 8K for anything but computer monitors, but it's absolutely great for that.

    • ttoinou 1 hour ago
      Do you think 65 inch would still be OK for monitor usage? I've been pondering doing that for years, but 65 inch is often easier to find for me in Europe.
      • elliotec 1 hour ago
        If you're sitting about a meter away, that's probably fine. Most desks aren't deep enough, and any closer would be a real problem for head movement.

        But if you have it wall-mounted at eye level, or have a deep desk, you're likely okay.

      • xxs 1 hour ago
        For a monitor, I prefer a higher refresh rate. The other part is that you should have the entire picture in your eyesight without moving your neck.

        Personally, I'd consider that large of a screen a bad working area.

        • Kon5ole 40 minutes ago
          >The other part is that you should have the entire picture in your eyesight without moving your neck.

          I agree, but that is still the case here. The difference is that the "full picture" no longer occupies the whole monitor.

          The number of windows and content you'd arrange on a normal office monitor takes up about one third of my available space. I can arrange my windows in that space and not have to move my neck.

          But at a glance, I can also see the contents of 4 other code files in my project, as well as my notes, the documentation, and the team chat.

          Or if I want, I can see twice the amount of code in any one file by having my editor take up the full height of the center third of the monitor.

          Basically the monitor goes from being the "full picture" to a canvas where you are free to create any collection of a "full picture" you want, and you can have the leftover building blocks visible on the sides, optionally.

          I am sure that if you let all the knowledge workers in the world test this setup for a day, a vast majority of them would want to keep it. But since even 8K TVs are going away now, most will never know.

          Curved gaming monitors costing more than my TV are being deployed everywhere lately for productivity work. Most people are used to 27" or 24" low-res monitors and they are getting an upgrade, but it's not a very good one.

          Had the panels from 8K TVs been used in monitors and marketed to corporations, it would have been so much better!

          Perfect for open offices too - no need for desk dividers if everyone is behind a huge screen! ;-)

      • Kon5ole 1 hour ago
        I don't think it will work very well as a "normal" monitor (meaning placed at normal monitor distance on your desk).

        My 55 is borderline too big already, and the main issue is actually the height. Tilting your head or rolling your eyes back to see the top gets noticeably uncomfortable pretty quickly.

        I made a special mount so the lower edge basically rests on the desk surface, which solved that issue, but I don't think I could have made it work if it was any bigger.

        Also at 65 the pixel density is much lower, so you'd probably want it mounted further away. But if you do, the monitor will cover the same FOV as a smaller monitor anyway.

        My dream is that someone starts making 8K 50" monitors with DisplayPort inputs (HDMI is a mess) and sells them for the same price as these TVs used to cost!

  • Aardwolf 2 hours ago
    > Gaming was supposed to be one of the best drivers for 8K adoption.

    While the step from 1080p/1440p to 4K is a visible difference, I don't think going from 4K to 8K would be visible, since the pixels are already invisible at 4K.

    However the framerate drop would be very noticeable...

    OTOH, afaik for VR headsets you may still want higher resolutions due to the much larger field of view.

    • Doublon 2 hours ago
      > While the step from 1080p/1440p to 4K is a visible difference

      I even doubt that. My experience is that on a 65" TV, 4K pixels become indistinguishable from 1080p beyond 3 meters. I even tested this with friends on The Mandalorian; we couldn't tell 4K and 1080p apart. So I just don't bother with 4K anymore.

      Of course YMMV if you have a bigger screen, or a smaller room.

      • alex43578 2 hours ago
        If your Mandalorian test was via streaming, that's also a huge factor. 4K streaming has very poor quality compared to 4K Blu-ray, for instance.
        • matsemann 2 hours ago
          Which is a point in itself: bitrate can matter more than resolution.
          • alex43578 1 hour ago
            For reasonable bitrate/resolution pairs, both matter. Clean 1080P will beat bitrate-starved 4K, especially with modern upscaling techniques, but even reasonable-compression 4K will beat good 1080P because there's just more detail there. Unfortunately, many platforms try to mess with this relationship, like YouTube effectively forcing 4K uploads to get better bitrates, when for many devices a higher-rate 1080P would be fine.
            • matsemann 1 hour ago
              I'm curious: for the same Mb per second, how does the viewing quality of 4K compare to 1080p? I mean, 4K shouldn't be able to carry more detail per se given the same amount of data over the wire, but maybe the scaling and the way the artifacts end up can alter the perception?
        • JasonADrury 1 hour ago
          Yeah, I have a hard time believing that someone with normal eyesight wouldn't be able to tell 1080p and 4K Blu-rays apart. I just tested this on my TV; I have to get ridiculously far back before the difference isn't immediately obvious. This is without the HDR/DV layer, FWIW.
          • tokyobreakfast 1 hour ago
            Try comparing a 4K vs 1080p that were created from the same master, like a modern Criterion restoration.

            Without HDR the differences are negligible or imperceptible at a standard 10' viewing distance.

            I'll take it one step further: a well-mastered 1080p Blu-Ray beats 4K streaming hands down every time.

            • alex43578 1 hour ago
              10 feet is pretty far back for all but the biggest screens, and at closer distances, you certainly should be able to see a difference between 4K and 1080P.
              • tokyobreakfast 1 hour ago
                The MagSafe cord on a MacBook charger is 6'. It's not as far as you think.
      • DharmaPolice 1 hour ago
        The person was referring to gaming where most PC players are sitting closer than 3 metres from their screen.
      • yieldcrv 2 hours ago
        There are so many tricks you can do as well; resolution was never really the issue. Sharpness and fidelity aren't the same as charming and aesthetically pleasing.
    • tombert 1 hour ago
      I usually still play at 1080p on my Steam box because my TV is like nine feet away and I cannot tell a difference between 1080p and 4k for gaming, and I would rather have the frames.

      I doubt I’m unique.

    • ksynwa 1 hour ago
      AAA games have been having really bad performance issues for the last few years while not looking much better. If you wanna game in 8K you are gonna need something like a NASA supercomputer.
      • xxs 1 hour ago
        Even with a supercomputer it'd be difficult to render the frames in time with low latency.
    • charcircuit 2 hours ago
      VR headsets won't use the same panels that a TV would use. Any growth in the XR headset space won't help the TV industry.
    • bluescrn 1 hour ago
      We can’t render modern games at decent frame rates at 4k without going down the path of faking it with AI upscaling and frame generation.

      There was no hope of actual 8k gaming any time soon even before the AI bubble wrecked the PC hardware market.

      Attempting to render 33 million pixels per frame seems like utter madness, when 1080p is a mere 2 million, and Doom/Quake were great with just 64000. Let's have more frames instead?

      (Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
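
      The raw arithmetic, as a throwaway Python sketch (the 60 fps target is an illustrative assumption):

        resolutions = {
            "Doom (320x200)": 320 * 200,
            "1080p": 1920 * 1080,
            "4K": 3840 * 2160,
            "8K": 7680 * 4320,
        }
        for name, px in resolutions.items():
            # pixels per frame, and per second at 60 fps
            print(f"{name}: {px:,} px/frame, {px * 60:,} px/s")
        # 8K is 4x the pixels of 4K and 16x 1080p: ~2 billion px/s at 60 fps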

      • alkonaut 1 hour ago
        I don't see a future in which we play at 4K at top settings either without AI upscaling/interpolation. Even if it were theoretically possible to do so, the performance budget the developers have going forward will assume that frame generation and upscaling are used.

        So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games that are a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like lowering AA settings or whatever: something only done by very niche players, like in the CS community today, where some play on 4:3 screens and lower AA settings for maximum visibility, not fidelity.

        • xxs 1 hour ago
          In most cases you don't need anti-aliasing at 4K.
      • teamonkey 1 hour ago
        Yeah, not only the huge required jump in raw fill rate, but to get the most out of a 4K TV you need higher-detail models and textures, and that means you also need a huge jump in VRAM, which never materialised.
        • bluescrn 1 hour ago
          The frame buffers/render targets alone for 8K are massive.

          Basically 400MB at 12 bytes/pixel (64-bit HDR RGBA + 32-bit depth/stencil)

          vs the 64000 bytes that Doom had to fill...
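
          A quick sketch to check that figure (reading "64-bit HDR RGBA" as 8 bytes of color per pixel):

            w, h = 7680, 4320
            bytes_per_px = 8 + 4   # RGBA16F color + 32-bit depth/stencil
            total = w * h * bytes_per_px
            print(f"{total:,} bytes (~{total / 1e6:.0f} MB)")
            # -> 398,131,200 bytes, ~398 MB per render target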

    • tokyobreakfast 2 hours ago
      > While the step from 1080p/1440p to 4K is a visible difference

      It really isn't.

      What you are likely seeing is HDR, which is on most (but not all!) 4K content. HDR is a separate layer, unrelated to the resolution.

      4K versions of films are usually newly restored with modern film scanning, as opposed to the aging masters created for the DVD era that were used to churn out first-generation Blu-rays.

      The difference between a 4K UHD without HDR and a 1080p Blu-Ray that was recently remastered in 4K from the same source is basically imperceptible from any reasonable viewing distance.

      The "visible difference" is mostly better source material, and HDR.

      Of course people will convince themselves that what they are seeing justifies the cost of the upgrade, just like the $200 audiophile outlet and the $350 gold-plated videophile Ethernet cable make the audio and video really "pop".

      • scratcheee 1 hour ago
        I know the thread is about TVs, but since gaming has come up, it's worth noting that at computer viewing distances the differences between 1080p/1440p and 4K really are very visible (though in my case I have a 4K monitor for media and a 1440p monitor for gaming, since there's zero chance I can run games at 4K anyway).
      • dtech 1 hour ago
        For TV, maybe, but you're replying to a point about gaming, and on a monitor, laptop, or handheld it's definitely visible.
        • teamonkey 1 hour ago
          A lot of gaming is done on a TV in the living room
      • FeepingCreature 1 hour ago
        I can confirm that on a PC monitor, 1080p and 4K are very easy to tell apart.
        • tokyobreakfast 1 hour ago
          I missed the part where this was about gaming. Most people don't sit 10' away from their monitor, but that's standard for TV viewing.
  • klausa 1 hour ago
    Discussions about this are very tedious, because people have a hard time making the distinction between "being able to see the difference between 1080/4k/8k content" and "being able to see the difference between 1080/4k/8k panels".

    I'm sure there's plenty of content (especially streaming content at mediocre bitrates) where people would be hard-pressed to tell the difference.

    But I think that if people went back to 1080p _panels_, they'd actually notice rather quickly how much worse the text rendering is, and that the UI looks off to them.

    Moving up to 8K would definitely be a smaller step-change in clarity than 1080p->4K, and many people wouldn't feel it's worth spending extra, but I don't think it would be literally indistinguishable.

  • wosined 1 hour ago
    Pretty easy to achieve 8K when you make the TV twice the size. I wish they'd improve pixel density more instead.
  • padjo 1 hour ago
    Yep, we probably reached "CD quality" at 1080p, to be honest, i.e. a level beyond which the vast majority of people won't be able to perceive a quality difference. We definitely reached it at 4K at the size/distance of most TVs.
    • VerifiedReports 1 hour ago
      Not to mention that we don't even get GOOD 1080p at streamers' current bitrates, let alone "4K." Any talk of 8K is an absurd joke.
    • realusername 1 hour ago
      And 8K to watch what content exactly? It's already hard for the movie industry to provide a consistent 4K...
  • LeoPanthera 2 hours ago
    I had just recently been thinking of buying an 8K television to mount on the wall of my office to use as a huge monitor. Has anyone done this? Any recommended models?
  • Animats 1 hour ago
    Movie theater projectors are rarely more than 4K. Many are still 2K. IMAX is sometimes 8K. The industry just doesn't see the need for content with more resolution.
  • magicalhippo 4 days ago
    Gee, who could have foreseen that?

    I mean, my local cable TV is sending crap that's way worse than 720p YouTube videos, and most people don't care at all.

    I guess the primary benefit of an 8k display is that stuck or dead pixels are much less annoying than on a 4k panel of the same size.

    I'm fine with 4k for my living room. Give me more HDR, less chroma subsampling and less banding.

    • steinvakt2 1 hour ago
      I thought it was the other way around: that 8K is problematic because the chance of a dead pixel is so much higher, driving up the cost because of the higher ratio of ruined batches?
      • magicalhippo 41 minutes ago
        Hmm, could very well be. With DRAM I know error rates are roughly constant, so the actual per-bit rate goes down as capacity increases. Perhaps it's different for displays.
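
        For what it's worth, the yield argument is easy to model if you assume independent subpixel defects (the defect probability below is made up purely for illustration):

          p = 1e-8                   # assumed per-subpixel defect probability
          n_4k = 3840 * 2160 * 3     # subpixels on a 4K panel
          n_8k = 7680 * 4320 * 3     # an 8K panel has 4x as many
          print(f"4K defect-free yield: {(1 - p) ** n_4k:.0%}")  # ~78%
          print(f"8K defect-free yield: {(1 - p) ** n_8k:.0%}")  # ~37%, i.e. 0.78^4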
    • VerifiedReports 1 hour ago
      And, most of all, HIGHER BITRATE.
  • jl6 2 hours ago
    640K ought to be enough.
  • WithinReason 2 hours ago
    Not enough content for it