Compression. While compression tech HAS improved, it's been used to maintain our current quality while reducing bandwidth needs.
A 1080p Blu-ray disc will look far, far better than Netflix in 4K every time because it's not compressed. The reality is that any form of compression will cause loss in fidelity in some way, so the only way to really improve video is to increase the bandwidth of the video.
I talk to IT nerds frequently who ask things like “why do you need 16x 400GB ports of non-blocking bandwidth,” to which I have to explain that a SINGLE stream of uncompressed UHD is 12GB/s and we are trying to put 200+ streams onto their network.
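For anyone curious where that number comes from, here's the back-of-envelope version (a sketch, assuming 2160p60 over 12G-SDI: 10-bit 4:2:2 chroma and the full raster including blanking):

```python
# Back-of-envelope: where the ~12 Gb/s figure for uncompressed UHD comes from.
# Assumes 2160p60 over 12G-SDI: total raster 4400 x 2250 (active 3840 x 2160
# plus blanking), 10-bit 4:2:2 chroma = 20 bits per pixel.
total_w, total_h = 4400, 2250
fps = 60
bits_per_pixel = 20  # 10-bit 4:2:2

gbps_per_stream = total_w * total_h * fps * bits_per_pixel / 1e9
print(f"one uncompressed UHD stream: {gbps_per_stream:.2f} Gb/s")   # ~11.88

streams = 200
print(f"{streams} streams: {streams * gbps_per_stream / 1000:.2f} Tb/s")  # ~2.4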
Yep. On a Blu-ray disc, you have 25-100 GB of space to work with. The Blu-ray standard allows up to 40 Mbps for 1080p video (not counting audio). Way more for 4K.
Netflix recommends a 5 Mbps internet connection for 1080p, and 15 Mbps for 4K. Reportedly they cut their 4K streams down to 8 Mbps last year, though I haven't confirmed that. That's a fraction of what Blu-ray uses for 1080p, never mind 4K.
I have some 4K/UHD Blu-rays, and for comparison they're about 80 Mbps for video.
They use similar codecs, too, so the bitrates are fairly comparable. UHD Blu-rays use H.265, which is still a good video codec. Some streaming sites now use AV1 (at least on some supported devices), which is a bit more efficient, but nowhere near enough to close that kind of gap in bitrate.
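To put those bitrates in perspective, here's what they work out to for a two-hour movie, video only (decimal GB; a rough sketch, not exact stream sizes):

```python
# Rough file sizes for a 2-hour movie at the bitrates mentioned above.
def movie_size_gb(mbps: float, hours: float = 2.0) -> float:
    """Megabits per second -> gigabytes (decimal), for a given runtime."""
    return mbps * hours * 3600 / 8 / 1000  # Mb -> MB -> GB

for label, mbps in [("Netflix 1080p", 5), ("Netflix 4K", 15),
                    ("Blu-ray 1080p max", 40), ("UHD Blu-ray (typical)", 80)]:
    print(f"{label:>22}: {movie_size_gb(mbps):5.1f} GB")
# Netflix 1080p: 4.5 GB, Netflix 4K: 13.5 GB,
# Blu-ray 1080p max: 36 GB, UHD Blu-ray: 72 GB
```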
A 1080p Blu-ray disc will look far, far better than Netflix in 4K every time because it's not compressed.
You're not wrong about the quality difference, but video on a Blu-ray is compressed. There is no way to get raw video unless you're shooting it yourself.
any form of compression will cause loss in fidelity in some way
Lossless video compression also exists, although I don't think any consumer products use it.
You mean 12 Gbps, right?
The standard is defined as 12,000 Mbit/s and spelled out as 12G-SDI or 12G SMPTE 2110.
So I do tend to mix things up, since big B and little b mean different things in computer land.
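For anyone else who mixes them up: big B is bytes, little b is bits, a factor of 8. Trivial sketch:

```python
# Big B is bytes, little b is bits: the factor of 8 that trips everyone up.
link_gb_per_s = 12.0  # 12G-SDI link rate, in gigabits per second
print(f"{link_gb_per_s} Gb/s = {link_gb_per_s / 8} GB/s")  # 1.5 GB/s
```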
Diminishing returns. It's already hard for many people to see the difference between 1080p and 4K. The difference between 4K and 8K is almost nonexistent, at significantly higher storage cost.
Yep. There are already systems designed for watching and playing back 8K video at home, but they're largely seen as not worth the expense of implementing.
While 4K resolutions provide a very detailed image for average viewers, 16K resolutions exceed the detail the human eye can perceive at typical viewing distances. Therefore, most people may not notice significant improvements with higher resolutions like 16K.
https://9meters.com/technology/highest-resolution-the-eye-can-see
This link explains it technically, but in summary, 4K or 5K video is really all we can perceive up close. Anything beyond that is really just wasted, unless you're talking about really large screens.
We're almost at the limits of what we can see, so improvements beyond this like IMAX aren't needed for most applications, unless you're recording 360° video or projecting onto a huge cinema screen.
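The arithmetic behind that is easy to sketch. Assuming the usual 1-arcminute figure for 20/20 acuity (a simplified model) and a hypothetical 65-inch 16:9 screen:

```python
import math

# Sketch of the usual acuity argument: 20/20 vision resolves roughly
# 1 arcminute. Beyond the distance where one pixel subtends 1 arcminute,
# extra resolution is (by this model) invisible. Screen size is an assumption.
ARCMIN = math.radians(1 / 60)

def acuity_distance_m(diagonal_in: float, horizontal_px: int,
                      aspect: float = 16 / 9) -> float:
    """Distance at which one pixel subtends 1 arcminute."""
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = width_m / horizontal_px
    return pixel_pitch_m / math.tan(ARCMIN)

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f'65" {name}: indistinguishable beyond ~{acuity_distance_m(65, px):.1f} m')
```

By this model, a 65" 4K screen is already at the acuity limit beyond about 1.3 m, and 8K only pays off closer than about 0.6 m, which is why huge screens and cinemas are the main place extra resolution still matters.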
Full HD already looks pretty good.
Actually transferring all the data is a problem; my Wi-Fi sometimes struggles even with streaming Full HD, so 4K is just unusable (plus you need a more expensive screen to actually show the 4K, which I don't have either).
I'm pretty sure that a lot of devices have to compress the data going over HDMI and DisplayPort cables anyway (Display Stream Compression kicks in at high resolutions and refresh rates).
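Roughly right at the high end: uncompressed link bandwidth runs out fast, which is exactly why DSC exists. A quick sketch (active pixels only, and the link rates are approximate effective data rates, quoted from memory):

```python
# Sketch: raw (uncompressed) video bandwidth vs. roughly what the cable carries.
# Active pixels only; real links also spend bandwidth on blanking intervals.
def raw_gbps(w: int, h: int, fps: int, bits_per_px: int = 30) -> float:
    """10-bit RGB = 30 bits per pixel."""
    return w * h * fps * bits_per_px / 1e9

for name, w, h, fps in [("4K60", 3840, 2160, 60),
                        ("4K120", 3840, 2160, 120),
                        ("8K60", 7680, 4320, 60)]:
    print(f"{name}: {raw_gbps(w, h, fps):5.1f} Gb/s raw")

# HDMI 2.0 carries ~14.4 Gb/s of data and HDMI 2.1 ~42 Gb/s, so 8K60
# 10-bit RGB doesn't fit even on HDMI 2.1 without Display Stream Compression.
```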
Because the differences in detail get harder and harder to notice.
I mean, I have difficulty seeing the difference between Blu-ray and DVD; maybe in some cases I could, if I really put the effort in. But even so.
1080p and 4K? I can barely tell.
Things like going from VHS to DVD, yeah, you can tell significantly. 360p to 720p to 1080p? You can tell, less pixelation.
Now I understand that it's all about getting great quality at greater resolutions, I get that, but I really don't get the big freaking deal about 4K and 8K and all that.
Odd. I can immediately tell the difference.
I did an empirical test with friends, comparing various bitrates and resolutions of the same source (on a 100-inch projector screen with a good 4K projector). I guessed correctly 100% of the time.
on a 100-inch projector screen
That's why. You notice it because more pixels do help a lot with huge screens. I was watching some old series shot for TV recently; on my 34" monitor it doesn't look too bad. On my projector with a ~100" screen it looks terrible, with artefacts all over the place.
(My monitor is 1440p and the projector is only 1080p, but I don't think that mattered in this case, since the source video was 480p.)
I work in tech and I have a bit of a retail component to my job. This includes selling monitors.
I assure you that you're (we're*, since I can also tell the difference easily) in the extreme minority. The vast majority of people buy on color and size, not clarity.
OEMs were slow to add hardware support for AV1 because H.265 jumped the gun in an attempt to stay relevant on the back of H.264's clout lol.
1080p is enough for most people’s eyes. What do you want?
Depends on your setup. On my 100-inch projector screen you can immediately tell when it's not 4K. And low bitrates (like Netflix) become especially noticeable, with banding and loss of sharpness.
Those are compression and bandwidth issues, not resolution.
No, I'm talking about resolution too. You can tell on 100 inches. On 100 inches, a 1080p pixel is more than one millimeter across. You can definitely see that when you sit 3 meters away from the screen. 4K will give you visibly more sharpness. Provided, of course, that you don't butcher your movie with terrible bitrate compression.
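The math checks out; a quick sketch for a 16:9 screen:

```python
import math

# Checking the "more than a millimeter" claim for a 100-inch 16:9 screen.
diagonal_in, aspect = 100, 16 / 9
width_mm = diagonal_in * 25.4 * aspect / math.hypot(aspect, 1)

for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name} pixel pitch on 100\": {width_mm / px:.2f} mm")
# 1080p: ~1.15 mm, 4K: ~0.58 mm
```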
It's cyclical. Years ago, video quality improved dramatically, then stagnated for almost 40 years as prices fell, until it was economical for video quality to improve again. We likely won't see a meaningful improvement for another 15-20 years; then prices will come down, and then we'll see another improvement.
As others mentioned, it's diminishing returns, but there's still a lot of good innovation going on in the codec space. As an example, the reduction in the amount of space required for H.265 compared to H.264 is staggering. Codecs are a special form of black magic.
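For a feel of the scale, here are the commonly cited rule-of-thumb savings per codec generation (ballpark figures, not measurements, and the 8 Mbps starting point is just an illustrative assumption):

```python
# Ballpark only: commonly cited rule-of-thumb savings per codec generation
# at similar perceptual quality (real results vary a lot with content).
h264_mbps = 8.0              # hypothetical 1080p H.264 stream
h265_mbps = h264_mbps * 0.5  # H.265/HEVC: roughly half the bitrate
av1_mbps = h265_mbps * 0.7   # AV1: roughly another ~30% off

hours = 2
for name, mbps in [("H.264", h264_mbps), ("H.265", h265_mbps), ("AV1", av1_mbps)]:
    gb = mbps * hours * 3600 / 8 / 1000
    print(f"{name}: {mbps:.1f} Mb/s -> {gb:.1f} GB for a {hours}h movie")
```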
Honestly, I'm not sure I completely agree; we're pretty close to a perfect TV with new LED technology.
But maybe we're talking about a next stage here, like true 3D, or something else entirely, like smell? I'm not sure what the future will be, but TVs look pretty good to me, and I'm not sure what improvements the current 'variant' needs.
You mean on TV, or YouTube videos, or what?