• FireWire400@lemmy.world · 228 points · 3 months ago (edited)

    It’s about time the electronics industry as a whole realises that innovation for the sake of innovation is rarely a good thing

    • Scratch@sh.itjust.works · 152 points · 3 months ago

      Look, we can’t have TVs that last 15 years anymore!

      We need to keep people buying every year or two. Otherwise line not go up! Don’t you understand that this is about protecting The Economy?!

    • UnderpantsWeevil@lemmy.world · 84 points · 3 months ago (edited)

      It’s not even innovation, per se. It’s just Big Number Go Up.

      Nobody seems to want to make a TV that makes watching TV more pleasant. They just want to turn these things into giant bespoke advertising billboards in your living room.

      Show me the TV manufacturer who includes an onboard ad blocker. That’s some fucking innovation.

      • redditmademedoit@piefed.zip · 13 points · 3 months ago

        The galaxy brain move is buying an old dumb TV for a pittance and using it to watch Jellyfin/Plex, or to stream from a browser with uBlock Origin and DNS filtering – all running on some relative’s “obsolete” smart toaster from last year that they happily gift you because “the new version’s bagel mode IS LIT – pun intended – but it needs the 128 GB of DDR7 RAM in the new model, can barely toast on the old one any more”.

      • Evilschnuff@feddit.org · 9 points · 3 months ago

        I think this just comes down to human nature. Give people (engineers, execs) a metric that looks like a good proxy for performance and they will overcommit on that metric as it is a safer bet than thinking outside the box. I think the incremental improvements in deep learning with all those benchmarks are a similar situation.

      • chocrates@piefed.world · 7 points · 3 months ago

        You can’t really find a dumb TV anymore. I might see how big of a monitor I can find when I’m ready to upgrade, but I doubt I’ll find one big enough and cheap enough.

        • UnderpantsWeevil@lemmy.world · 6 points · 3 months ago

          I hooked my computer up to the TV’s HDMI port and have used that as my primary interface.

          It’s not perfect, but it screens out 95% of the bullshit.

          • tyler@programming.dev · 3 points · 3 months ago

            That doesn’t help unless you’ve blocked your TV from network access, because TVs use ACR (Automatic Content Recognition), which literally scans whatever is displayed over your HDMI port and then sells it off to advertisers.

          • dual_sport_dork 🐧🗡️@lemmy.world · 17 points · 3 months ago

            That won’t save you anymore. Against my explicit instructions, my boss bought a smallish smart TV for use as a CCTV monitor because it was “cheap.” It nags you on power-up with a popup whining about not being able to access the internet, and if you don’t feed it your WiFi password it will display that same popup every 30 minutes or so, requiring you to dismiss it again. And again. And again. Apparently the play is to just annoy you into caving and letting it access your network.

            Instead I packed it up and returned it. Fuck that.

            • tyler@programming.dev · 5 points · 3 months ago

              If you are at a business you should have an access point or router that is capable of blocking specific devices from WAN access. But I would create a new segmented network, block that network from WAN access entirely, put it on its own VLAN, and then connect the TV to that network.
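
              On a Linux-based router that uses nftables, for instance, the blocking half of that setup can be a few lines; this is only a sketch, and the MAC address and the "eth1" WAN interface name are placeholders, not taken from any real setup:

              ```
              # /etc/nftables.d/tv-block.nft (sketch; replace the MAC and "eth1"
              # with your TV's actual MAC address and your real WAN interface name)
              table inet tvblock {
                  chain forward {
                      type filter hook forward priority 0; policy accept;
                      # Drop anything the TV tries to send out toward the internet,
                      # while leaving LAN traffic (casting, local media) untouched.
                      ether saddr aa:bb:cc:dd:ee:ff oifname "eth1" drop
                  }
              }
              ```

              Combined with the VLAN segmentation described above, the TV stays reachable from the LAN but cut off from the WAN.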

              • explodicle@sh.itjust.works · 5 points · 3 months ago

                I’d assume it nags whenever it can’t connect to the home server, and just says “network”.

                So when they go out of business any remaining units will nag forever.

                • tyler@programming.dev · 2 points · 3 months ago

                  You can use your router or access point tools to check what address the TV is trying to resolve, then set up a DNS redirect to a local device that answers with a fake response.
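
                  With dnsmasq (common on home routers and OpenWrt), that redirect can be a one-line override; the hostname below is purely hypothetical, and 192.168.1.50 stands in for whatever local device you set up to answer:

                  ```
                  # dnsmasq.conf sketch: answer the TV's phone-home lookup with a
                  # local address. "heartbeat.tv-vendor.example" is a made-up name;
                  # find the real one in your router's DNS query log.
                  address=/heartbeat.tv-vendor.example/192.168.1.50

                  # Log queries so you can see which names the TV actually resolves.
                  log-queries
                  ```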

    • Lucidlethargy@sh.itjust.works · 3 points · 3 months ago

      We traded 3D TVs, which are amazing if you watch the right stuff, for 8K…

      8K is great, but we need media in 8K to go with it.

    • unwarlikeExtortion@lemmy.ml · 1 point · 3 months ago (edited)

      Innovation is good. That being said, slapping on “AI”, “Smart”, or more pixels is the opposite of innovation. Innovation is something new, out of the box; 1080p > 4K > 8K is just logical progression.

  • [object Object]@lemmy.ca · 171 points · 3 months ago

    There’s no 8K content, and only recently have standard connectors supported 8K at high refresh rates.

    There’s barely any actual 4K content you can consume.

    • UnderpantsWeevil@lemmy.world · 57 points · 3 months ago

      There’s barely any actual 4K content you can consume.

      Honestly a little surprised the IMAX guys didn’t start churning out 4k+ content given that they’ve been in the business forever.

      But I guess “IMAX in your living room” isn’t as sexy when the screen is 60" rather than 60’

      • jqubed@lemmy.world · 43 points · 3 months ago

        You don’t even need IMAX for 4K; a normal scan of ordinary 35mm film makes nice 4K video. Films shot on the 65mm IMAX cameras would probably make good 8K content, but most of that was educational films, not what most people apparently want to watch all the time.

        The digital IMAX projections were actually a step backwards in resolution.

        • UnderpantsWeevil@lemmy.world · 20 points · 3 months ago

          Films shot on the 65mm IMAX cameras would probably make good 8K content, but most of that was educational films, not what most people apparently want to watch all the time.

          Sure. But the cameras exist. You can use them for other stuff.

          Hateful Eight was filmed in 70mm, and while it wasn’t Tarantino’s best work it certainly looked nice.

        • Anakin-Marc Zaeger@lemmy.world · 5 points · 3 months ago

          Films shot on the 65mm IMAX cameras would probably make good 8K content

          So there’s still hope that they might release The Last Buffalo in 8k 3D sometime in the future? Got it. :)

        • hcbxzz@lemmy.world · 2 points · 3 months ago

          IMAX is a mess. They can’t even figure out a consistent aspect ratio, so most of the content shot on IMAX is cropped after delivery.

      • queermunist she/her@lemmy.ml · 20 points · 3 months ago

        They don’t want IMAX in your living room, they want IMAX in the IMAX theater, where you pay a premium for their service.

      • worhui@lemmy.world · 3 points · 3 months ago

        IMAX content is 4K or less. Its edge is special projection that can look good and bright on huge screens.

        Only IMAX film prints are significantly better than anything else.

    • bdonvr@thelemmy.club · 15 points · 3 months ago (edited)

      There’s barely any actual 4K content you can consume

      I feel like that’s not true. But you’ve gotta try. If you’re streaming it, chances are it’s not really any better. 4K Blu-ray (or rips of them…) though? Yeah, it’s good. And since film actually has 8K+ resolution, old movies can be rescanned into high resolution if the original film exists.

      Supposedly Sony Pictures Core is one streaming service that can push nearly 4K Blu-ray bitrates… but you’ve gotta have really good internet. Like pulling 50-80 GB in the span of a movie runtime.
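
As a sanity check on those numbers (the two-hour runtime here is my assumption), the implied sustained bitrate is straightforward arithmetic:

```python
def avg_bitrate_mbps(gigabytes: float, runtime_minutes: float) -> float:
    """Average bitrate in Mbit/s for a download of the given size
    spread evenly over the given runtime."""
    bits = gigabytes * 1e9 * 8      # decimal GB -> bits
    seconds = runtime_minutes * 60
    return bits / seconds / 1e6     # bit/s -> Mbit/s

# 50-80 GB over a 2-hour movie is roughly 56-89 Mbit/s sustained, well above
# what typical 4K streaming tiers deliver.
print(round(avg_bitrate_mbps(50, 120)), round(avg_bitrate_mbps(80, 120)))  # → 56 89
```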

      • CmdrShepard49@sh.itjust.works · 4 points · 3 months ago

        Not true on paper but true in practice. Most people don’t buy/use Blurays (or any other physical media) anymore to the point that retailers aren’t even bothering to stock them on the shelves these days. Their days are certainly numbered and then all we’ll be left with is low quality 4k streaming.

    • mlg@lemmy.world · 6 points · 3 months ago

      There’s barely any actual 4K content you can consume.

      Ironically there actually is if you bother pirating content because that’s the only crowd that will share full 4k Dolby Vision + Dolby Atmos/DTS-X BluRay rips.

      Aside from that though, even 4K gaming is a struggle because GPU vendors went off the deep end with frame generation, which coincidentally is the same mistake lots of TV OEMs already made.

    • Kyden Fumofly@lemmy.world · 3 points · 3 months ago

      There is a lot of 4K to consume now. “Barely any” was the reality 5 years ago (and 4K has existed for more than 10). I would say 4K is slowly becoming the new FHD, but very, very slowly.

      The problem is that there is a lot of low-quality 4K, because of bandwidth, file sizes, etc.

  • AA5B@lemmy.world · 94 points · 3 months ago

    I want a dumb TV with the price and specs of a smart TV. Less is more.

    • thatKamGuy@sh.itjust.works · 31 points · 3 months ago

      What you’re looking for are commercial screens. They’re a bit more expensive for a comparable panel, as they are intended for 24/7 use- but are about as dumb as they get nowadays.

      • NocturnalMorning@lemmy.world · 21 points · 3 months ago (edited)

        A bit more expensive? I was able to get a smart TV for like 800 bucks. The equivalent dumb TV would have been a few thousand dollars, and Best Buy said they would only sell it to business accounts, which was infuriating to read.

        • Doomsider@lemmy.world · 9 points · 3 months ago

          This isn’t a peasant TV. And it doesn’t even have any tracking, I am not sure a pleb can even legally own these. Sorry, but you have to be a wealthy person who watches CP to have it in your home.

    • avg@lemmy.zip · 21 points · 3 months ago

      You would likely have to pay more since they aren’t getting to sell your information.

      • Alcoholicorn@mander.xyz · 12 points · 3 months ago

        *You would have to pay more because major companies know they can charge more. There isn’t some fixed amount of profit a company wants to make from which they work out a price; they price as high as the market will bear.

      • AA5B@lemmy.world · 4 points · 3 months ago

        Realistically, not only do I not want an 8k tv, but I might not get a tv at all, if I had to do it today. We rarely watch tv anymore. The brainrot is the same but we’re much more likely to use individual screens

    • keyez@lemmy.world · 5 points · 3 months ago

      My LG C3 not connected to internet and using an HTPC and Nvidia shield is working great so far.

      • webhead@lemmy.world · 6 points · 3 months ago

        I don’t know why more people don’t do this. Don’t connect the TV to the Internet. Ever. Do updates using a flash drive if you must and connect whatever flavor of streaming box you like. The TVs and their dumb OSes get slow and shitty anyway, so why fuck around with it? Lol.

        • BurgerBaron@piefed.social · 1 point · 3 months ago

          People spend so much time fucking around with Android slop boxes and I’m just like… how is using a full-fat Linux desktop PC under the TV with a wireless trackpad keyboard clunky? Set display scale to 200%, install Stremio, and now you can game and shitpost on your couch too as a bonus.

          • webhead@lemmy.world · 1 point · 3 months ago

            It’s mostly because all the streaming services won’t give you higher res and HDR unless you’re using one of the boxes. You don’t need the slop tho. Nvidia shield imo has been solid for over a decade now. I’m over fire TV and those cheap shitty no name boxes are even worse. I have no idea why people fuck with those. Even a fire TV is better than that lol.

    • drgeppo@lemmy.world · 3 points · 3 months ago (edited)

      I just recently found out about Thomson’s “Easy TVs” LINK

      Actual dumb TVs, with a good range of inputs at a cheap price; they seem to be aimed at hotels and such.

      The downside is that they only do 1080p at 40" or 43", but the appeal of getting one before they disappear from the market is strong (also, I don’t have any 4K media in my library, so my worry is more about future-proofing than any current necessity).

  • lemmydividebyzero@reddthat.com · 86 points · 3 months ago

    4k is enough, 60fps is enough, no smart or AI stuff is perfectly fine…

    What about reducing the energy consumption? That’s an innovation I want.

  • Pyr@lemmy.ca · 48 points · 3 months ago

    Fuck.

    Now, instead of each new generation of TVs being slightly higher in resolution, some goddamn business tech executive is going to focus on some goddamn bullshit to change or add, which is going to be absolutely fucking ridiculous or pointless and annoying, like AI television shit or TV gaming or fuck my life. Smart TVs are bad enough, but they can’t help themselves; they need to change or add shit all the fucking time to “INNOVATE!!!” and show stockholder value.

    Resolution was an easy, time-consuming thing for them to chase, but we can’t rely on it to keep them from fucking TVs up any more.

  • ohulancutash@feddit.uk · 41 points · 3 months ago

    They showed Skyfall on 70ft IMAX screens, and that film was shot at 2880 x 1200. It’s not all about the pixel count.

      • ohulancutash@feddit.uk · 5 points · 3 months ago (edited)

        DP Roger Deakins has a forum on his site where he answers these sorts of questions. The Arri Alexa Studio was essentially developed for him, as he prefers an optical VF. It has an academy gate, and Deakins favours spherical lenses rather than anamorphic so we simply take the width of the sensor and crop down to 2.40:1 aspect ratio to get the quoted dimensions. Your link quotes the capture resolution, which will have some excess compared to the finished material.

        This was put through the DI process and at a later stage IMAX DNR’d a blow-up to 15/70mm.

  • Omega_Jimes@lemmy.ca · 35 points · 3 months ago

    There’s a ton of other things I want my TV to do before more pixels.

    Actual functional software would be nice; better tracking on high-speed shots (in particular sweeping landscapes, or reticles in video games); higher frame rates and variable-frame-rate content; making the actual use of the TV faster, things like changing inputs or channels; oh man, so much more.

    Anything but more pixels.

      • uniquethrowagay@feddit.org · 13 points · 3 months ago

        Yes I do. I want an actual smart TV with a practical, open-source, TV-optimized Linux OS. It’s not that software on a TV is a bad idea in itself. It’s how it’s ruined by for-profit companies.

        • balsoft@lemmy.ml · 19 points · 3 months ago

          Nah, honestly, I think stuffing an entire computer inside a monitor and relying on it to generate/show content is a bad idea no matter what software it runs. A dumb TV + a small computing dongle requires only a tiny fraction more labor to produce than a smart TV, but it’s so much easier to upgrade in the future if you decide you need faster boot times or wanna game on the TV, etc. And if the TV breaks before the dongle does, you can also buy a new TV and keep all your settings/media without transferring anything.

          • Aceticon@lemmy.dbzer0.com · 7 points · 3 months ago (edited)

            Also to add to this, the life-cycle of a TV display is mismatched with the life-cycle of media-playing hardware, or of hardware for general computing: one needs to update the latter more often to keep up with things like new video codecs (for performance, those are actually implemented in hardware) and, more generally, to be capable of running newer software with decent performance.

            I’ve had a separate media box for my TV for over a decade, and in my experience you go through 3 or 4 media boxes for every time you change TVs, partly because of new video codecs coming out and partly because the computing hardware in those things is usually low-end, so newer software won’t run as well. I eventually settled on a generic Mini-PC with Linux and Kodi as my media box (which is pretty much the same to use in your living room as a dedicated media box, since you can get a wireless remote for it, so no need for a keyboard or mouse to use it as a media player), and it doubles as a server in the background (remotely managed via ssh), something which wouldn’t be possible at all with computing hardware integrated into the TV.

            In summary, having the computing stuff separate from the TV is cheaper and less frustrating (you don’t need to endure slow software after a few years because the hardware is part of an expensive TV that you don’t want to throw out), and it gives you far more options to do whatever you want (let’s just say that if your network-connected media box is enshittified, it’s pretty cheap to replace it, or even to go the way I went and replace it with a system you fully control).

      • Omega_Jimes@lemmy.ca · 8 points · 3 months ago

        I mean, yes and no. I like eARC, and I like being able to adjust settings other than v-hold. But I don’t want this slow crud-fest that keeps telling me when my neighbour turns on Bluetooth on their iPhone.

          • Omega_Jimes@lemmy.ca · 1 point · 3 months ago

            Yeah, all my inputs go to the TV, then I run a wire to the receiver. My PS5 and PC are plugged directly into the TV so I can get different resolutions and variable refresh rate, and the TV can control the receiver. So when I turn something on, the TV and receiver turn on and set themselves to matching settings: Dolby, stereo, whatever. It’s not huge, but it’s a nice convenience over the older optical connection.

      • Zozano@aussie.zone · 2 points · 3 months ago

        I want software on my TV.

        Steam Link specifically. I like streaming to my TV via Ethernet.

    • yeehaw@lemmy.ca · 18 points · 3 months ago

      I still probably watch 90% 1080p and 720p stuff lol. As long as the bitrate is good it still looks really good.

    • Lenggo@lemmy.world · 9 points · 3 months ago

      I have a Samsung Frame because I wanted a TV that didn’t look so much like I had one, but the software is so goddam bad. The only way to switch sources quickly is to set them as favorites, which isn’t always that straightforward if you didn’t do it right away. Regardless, you have to let the home page fully render before you can even worry about that. Even the Samsung TV app, which you would think would be perfectly optimized for the hardware since the same company makes the software, is barely functional and loads like a web page on AOL in 1998.

      • lobut@lemmy.ca · 2 points · 3 months ago

        I like my frame because it faces a set of windows and with all my other tvs … I would have to close the blinds to see the tv in the day time.

        However, the software is straight garbage. I didn’t even know about the favourite thing … every time I change source it spends like a minute or two trying to figure out if it can connect to it, for no reason.

    • Grandwolf319@sh.itjust.works · 18 points · 3 months ago

      Right? I really don’t get it.

      It’s the law of diminishing returns. When we went from 480p to 720p, it was a huge jump, something everyone noticed.

      But the jump to 1080p was much less noticeable, to the point that people just thought it’s no different from 720p.

      Now we have 4K, and most people stream, so it’s a bad bitrate ensuring it doesn’t look good all so the consumer feels like they are getting more cause 4K sounds cool.

      I’m sooo glad 8K is not becoming a thing.

      • TheKingBee@lemmy.world · 11 points · 3 months ago

        But the jump to 1080p was much less noticeable, to the point that people just thought its no different than 720p.

        What!?

        Unless it’s animated, and even then depending on the art style, I can immediately clock the difference between 1080p and 720p. It’s another huge leap.

        1080p and 4K, though, are basically the same when sitting on the couch.

    • ORbituary@lemmy.dbzer0.com · 4 points · 3 months ago

      Same. My TV is capable of it, but I stream everything off of Jellyfin to a RasPi 5 running Kodi. It just plays. As long as it’s not 480p, I’m happy.

    • SuperSpruce@lemmy.zip · 1 point · 3 months ago

      My Internet is so bad that I still often watch YouTube/Nebula at 360p or even 240p. I almost never go above 720p.

  • towerful@programming.dev · 26 points · 3 months ago

    Yeah, do 60fps, 30-bit colour… and I guess HDR?
    Do things that people can actually appreciate.
    And do them in a way that utilises the new tech. 60fps looks completely different from 24fps… Work with that, it’s a new media format. Express your talent.

    • theparadox@lemmy.world · 9 points · 3 months ago (edited)

      Sorry, the best I can do is install a camera and microphone on our next model, to spy on you and force interaction with advertisements.

      I mean video conferencing from your living room. How neat is that?

  • A_Random_Idiot@lemmy.world · 24 points · 3 months ago (edited)

    TV manufacturers salivated at the idea of ever-increasing TV resolutions, hoping desperately to turn the TV market into something like the PC market, where you have to upgrade every 5-ish years to stay on top of the technology, artificially increasing sales beyond what their already abysmal build quality earns them.

    I’m glad the plan is failing spectacularly.

    Hopefully this forces them to think more about quality and start focusing on TVs that actually last now… You know, like we used to have 30 years ago.

    • jj4211@lemmy.world · 2 points · 3 months ago

      start focusing on TVs that actually last now…

      That only makes their “people need to refresh their sets for our bottom line” problem even worse for them.

      BTW, 30 years ago TVs were expensive and still failed. There was a viable TV repair industry because it was worth spending the money to repair and easier to repair.

      Anecdotally, my Plasma and my LCDs have been more problem free than when my family had CRT TVs back in the day.

      • A_Random_Idiot@lemmy.world · 2 points · 3 months ago

        Yeah, exactly. TVs were better back then: they were more durable (the Wiimote accidents would never have sent a CRT to the dump) and actually repairable.

        And they lasted decades. Hell, I’ve seen people find CRT TVs abandoned in fields for years and bring them back with minimal effort.

        As long as the tube/neck of a CRT is intact, it will run or be repairable.

      • Malfeasant@lemmy.world · 1 point · 3 months ago

        I still have a ~30 year old tube tv that has never needed anything, it still works… But I’ve been through at least 4 HDTVs.

    • Tylerdurdon@lemmy.world · 2 points · 3 months ago

      Pfff, they’ve just turned into adware-laden boxes. Next they’ll make up some BS about requiring the device to be internet-connected so you can’t disable ads too easily.

      That’s a big part of enshittification: maximizing profit at the sacrifice of product quality. All those pro-capitalist folk want you to believe the market will correct itself. The problem is that the entire market is dominated by this mentality, and anyone doing anything different who tries to enter that market is snuffed out immediately. None of the major brands will stray from this model because they are completely and hopelessly servant to the shareholder, and all that matters to them is maximizing profits at any cost. Yay enshittification!

      • A_Random_Idiot@lemmy.world · 2 points · 3 months ago

        Yep, and all that added complexity to make a dumb TV smart just means there are more parts likely to die and make the TV not work.

        It’s bullshit that the only way to get a large dumb TV anymore is to roll the dice on a Sceptre… which, given how they procure their screens, could give you either a great TV or a shit TV.

  • arthurpizza@lemmy.world · 24 points · 3 months ago

    For the majority of people a 1080p60 with a high bitrate and 10+ bit color space will look absolutely perfect. Some can pixel peep and tell, but more people still struggle seeing when the aspect ratio is wrong on their TV.

  • LiveLM@lemmy.zip · 22 points · 3 months ago (edited)

    Even if it was, the streaming services everyone’s using crush the bitrate down so badly it’d barely look better than 4K anyway.

  • magic_smoke@lemmy.blahaj.zone · 19 points · 3 months ago

    That’s because the answer was never higher resolutions; it was legally forcing H.265 to be open. Now the solution is AV1, but video codecs shouldn’t be locked down like that.

    To act like that licensing was ever about “protecting the sciences” is a fucking joke.

    • jj4211@lemmy.world · 3 points · 3 months ago

      Umm… OK, but that’s not really related to this article…

      Everyone universally ditching H.265 in favor of AV1 doesn’t make TVs sell any better or cost any more.

      • magic_smoke@lemmy.blahaj.zone · 9 points · 3 months ago (edited)

        No, my point was that people don’t need higher-resolution TVs, they need good transcodes that don’t look like shit.

        Streaming services run at bitrates/codecs that look like dookie compared to BD rips, even on my shitty $100 Sceptre 1080p Amazon-special TV.

        Who the fuck is gonna buy an 8K OLED panel when no one’s willing to conveniently provide content that looks good on it, or even content that pushes their current TV to its fullest extent?

        It’s not like anyone can afford a GPU that renders modern games at a playable framerate at that resolution either.

        • jj4211@lemmy.world · 1 point · 3 months ago

          Ah, OK, that’s fair. I agree that codec/bitrate choices have made a lot of ostensibly “4K” content look like crap, so why have 8K when many providers/internet connections won’t even deliver the detail required to drive 4K streaming.