Tom's Hardware of Finland

Sounds valid, I’d say.
If we consider that some re-implementation work probably had to be done for the competing VP9 (supposedly on its way out) and AV1 codecs, we’d have another reason for the less impressive gains.

I’m not sure to what extent we can say it’s “moved forward” when it’s the same codecs – it might simply be that first-generation Blu-ray had a needlessly high bitrate for the resolution and codec, given what could be achieved with better, multi-pass encoding. At the time that felt analogous to what Xvid could do vs. MPEG-2, even though in this case it was H.264 vs. H.264.

Encoders (especially fixed-function encoders like the ones baked into GPUs for realtime framebuffer capture for streaming) did improve for H.264 over time, in the sense of producing fewer artifacts in quicker, smaller encodes. But given the maturity of the streaming infrastructure that H.264 built, I expect HEVC to have less runway, and I really don’t think we’ll ever see 10-bit 4K HEVC producing acceptable results below 10-15 GB/hour.
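For scale, 10-15 GB/hour works out to roughly 22-33 Mbit/s average video bitrate. Here’s the back-of-envelope conversion (a sketch only: decimal GB, ignoring container and audio overhead):

```python
# Back-of-envelope: convert "GB per hour" into an average bitrate.
# Assumes decimal GB (1 GB = 8e9 bits) and ignores container/audio overhead.

def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    """File size per hour -> average bitrate in Mbit/s."""
    return gb_per_hour * 8e9 / 3600 / 1e6

for gb in (10, 15):
    print(f"{gb} GB/hour ≈ {gb_per_hour_to_mbps(gb):.1f} Mbit/s")
# 10 GB/hour ≈ 22.2 Mbit/s
# 15 GB/hour ≈ 33.3 Mbit/s
```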

If anything I’d have expected the physical media to have grown more than necessary again, but of course that’s not trivial given the diminishing use cases; the doubling they achieved here is obviously sufficient.

HEVC won’t be as popular until the market cycles enough that low-end hardware can run it well. Anything over about 1.2 gigs per hour chokes my 3.2 GHz Ivy Bridge laptop. And my Windows phone can’t even manage a slideshow on half that.

HEVC/H.265 is only about 30% more efficient than H.264, but 4K is four times as many pixels as 1080p. So you still need plenty of bitrate to cover all those pixels. And you need hardware support.
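To make that concrete, here’s the naive arithmetic, assuming bitrate scales linearly with pixel count and taking the 30% figure at face value (the 8 Mbps 1080p baseline is just a placeholder, not a measured number):

```python
# Naive bitrate estimate for 4K HEVC from a 1080p H.264 baseline, using the
# two assumptions above: bitrate scales linearly with pixel count, and HEVC
# needs ~30% less bitrate for the same quality. The 8 Mbps baseline is a
# placeholder, not a measured value.

H264_1080P_MBPS = 8.0
PIXEL_RATIO = (3840 * 2160) / (1920 * 1080)   # = 4.0
HEVC_SAVINGS = 0.30

naive_4k_hevc_mbps = H264_1080P_MBPS * PIXEL_RATIO * (1 - HEVC_SAVINGS)
print(f"Naive 4K HEVC estimate: {naive_4k_hevc_mbps:.1f} Mbit/s")  # 22.4
```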

As far as image quality goes, HEVC is less grainy. At lower bitrates it has more of an artificially smoothed look, whereas H.264 sorta just lets blocking and banding show. Although if you encode with the motion compensation settings cranked, pixel crawl and artifacts under motion are basically eliminated in H.264. But that can dramatically increase encode times or require proportionally better hardware to keep them down.
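As a rough illustration of what “cranked” motion settings look like in practice, here’s a minimal sketch using x264 via ffmpeg from Python. The x264 options shown are real, but the filenames and specific values are illustrative, not a tuned recommendation:

```python
# Sketch: "cranking" x264's motion estimation via ffmpeg. The x264 options
# are real, but the filenames and specific values here are illustrative.
import subprocess

cmd = [
    "ffmpeg", "-i", "input.mkv",
    "-c:v", "libx264",
    "-preset", "veryslow",   # slower presets already raise search effort
    "-crf", "18",            # constant-quality target instead of a fixed bitrate
    # push motion estimation beyond the preset defaults:
    "-x264-params", "me=umh:subme=10:ref=8:bframes=8:rc-lookahead=60",
    "-c:a", "copy",          # leave the audio untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)
```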

oh yeah I’m pretty adamant about h264+++ always being decoded in hardware, I agree there shouldn’t be a use case for doing it in software

This is something that’s always frustrated me about Google’s otherwise pretty admirable stewardship of VPx and AV1 (although those are still all but controlled by Google, compared to something like FFV1): I know the fixed-function hardware has to follow the software implementations, but they aggressively switch their consumer-facing stuff to codecs that aren’t yet decoded in hardware, and people I work with still avoid Hangouts because it’s so CPU-heavy.

So yeah, in terms of image quality, 4K streams (or bitrate-equivalent pirate rips) are only about as good as a 1080p Blu-ray. And depending on the material, the 1080p Blu-ray is still gonna be better, usually for darker or low-light material such as Arrival, because bitrate is king in those scenarios.

It’s gonna be a while before streams clearly beat Blu-ray: 1080p discs, and definitely 4K ones.

And that’s only speaking about video. Blu-ray is still far better for audio options.

I think that’s a substantial exaggeration – you can’t watch Planet Earth 2 on Netflix in full 4K HDR and say it’s comparable to the 1080p release – but I’ll have to take your word for it, as I don’t think I’ve ever watched a commercial Blu-ray disc.

That 30% improvement stat you quoted is from a sample at a particular resolution (I’m guessing one lower than 4K). Compression ratio improves with resolution, because the higher the resolution, the higher the probability of any two adjacent pixels having a similar color; so knowing “4x as many pixels” in raw terms actually doesn’t tell you much about exactly how much additional bitrate you need. The rate at which the compression ratio improves may also vary by codec. So simple rules of thumb won’t get you very far in this area.
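Here’s a toy numpy sketch of that adjacent-pixel effect: sample the same synthetic “scene” on a 1080p-like and a 4K-like grid and compare how similar neighbouring pixels are. It’s a contrived smooth image, so treat it as an illustration of the principle only:

```python
# Toy illustration: sample the same smooth synthetic "scene" on a 1080p-like
# and a 4K-like grid, then measure how different adjacent pixels are. Denser
# sampling -> smaller neighbour differences -> more redundancy to compress.
import numpy as np

def scene(x, y):
    """Smooth stand-in for real image content on [0,1]^2."""
    return np.sin(8 * np.pi * x) * np.cos(6 * np.pi * y)

def mean_neighbour_diff(width, height):
    xs = np.linspace(0, 1, width)
    ys = np.linspace(0, 1, height)
    img = scene(xs[None, :], ys[:, None])
    return np.abs(np.diff(img, axis=1)).mean()  # horizontal neighbours

print("1080p:", mean_neighbour_diff(1920, 1080))
print("4K:   ", mean_neighbour_diff(3840, 2160))  # roughly half the 1080p value
```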

Here’s a blog post about this from 2013, anticipating the possibility of 4K streaming: http://alexzambelli.com/blog/2013/01/28/h-265hevc-ratification-and-4k-video-streaming/

This 4K Amazon rip of The Expanse S2 has a DTS-HD track. I guess that could have been muxed in from the Blu-ray, but that seems like a lot of work for a pirate.

According to the internet, Netflix 4K HDR can be as high as 18 Mbps (about 16 Mbps without HDR). I dunno if that’s the rule for all of their 4K or if only flagship titles get that bitrate. And they say you need at least a 25 Mbps connection to reliably sustain it.

The best dual-layer 1080p Blu-rays sustain about 37 Mbps, and I can’t remember the last time blu-ray.com noted a movie being well under 30 Mbps, as I’m not sure any movies or shows ship on single-layer discs anymore.
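Putting those figures side by side in GB-per-hour terms (same caveats as before: decimal GB, average bitrates):

```python
# The thread's bitrate figures, converted to decimal GB per hour.
def mbps_to_gb_per_hour(mbps: float) -> float:
    return mbps * 1e6 * 3600 / 8e9

for label, mbps in [("Netflix 4K HDR", 18), ("Netflix 4K", 16), ("1080p Blu-ray", 37)]:
    print(f"{label:>14}: {mbps} Mbit/s ≈ {mbps_to_gb_per_hour(mbps):.1f} GB/hour")
# Netflix 4K HDR: 18 Mbit/s ≈ 8.1 GB/hour
#     Netflix 4K: 16 Mbit/s ≈ 7.2 GB/hour
#  1080p Blu-ray: 37 Mbit/s ≈ 16.7 GB/hour
```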

Planet Earth’s daylight scenes are probably a best-case scenario, and indeed the extra pixels probably win out, even if some of them are not the cleanest pixels, especially in wide-angle shots.

However, I’m willing to bet that anything with long night scenes or low light, such as sci-fi, would be a worse overall experience streamed in 4K vs. the 1080p Blu-ray.

but you’re comparing two different codecs at that point (H.264 vs. HEVC), which means the bitrates aren’t at all apples to apples

this is great and all but how does this help me, the man who has been sitting next to a cinema system for 3 hours ingesting a 2K scope film (so, 2048x858) that’s also 250 GB big

I want every home movie release to come on solid-state media and be encoded as JPEG2000 frames from now on, is what I’m saying
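For what it’s worth, 250 GB over roughly 3 hours implies numbers way beyond anything streaming delivers – a quick sketch, assuming 24 fps and ignoring audio and packaging overhead:

```python
# What 250 GB over a ~3 hour feature implies, assuming 24 fps JPEG2000 frames
# and ignoring audio/packaging overhead.
SIZE_GB, HOURS, FPS = 250, 3, 24

seconds = HOURS * 3600
avg_mbps = SIZE_GB * 8e9 / seconds / 1e6
mb_per_frame = SIZE_GB * 1e9 / (seconds * FPS) / 1e6

print(f"Average bitrate: {avg_mbps:.0f} Mbit/s")   # ~185 Mbit/s
print(f"Per frame:       {mb_per_frame:.2f} MB")   # ~0.96 MB per JPEG2000 frame
```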


It’s still compression, and dark stuff is still the weak point.

I could be wrong; I haven’t actually done my own comparisons at that level. I’ve only personally compared H.264 and HEVC at 1080p and various bitrates, and I haven’t seen dark stuff done well under about 20-25 Mbps.

Streaming audio is not equivalent to the Blu-rays in nearly all cases. There might be a couple out there, but basically any stream that claims HD or Master audio under the same labeling as the Blu-ray is a bait and switch. They are higher-resolution, higher-bitrate audio streams, but not nearly at the level of a physical disc. Vudu claims Dolby Atmos, and those streams do meet the technical parameters for working with certain Atmos speaker configurations, but the audio files themselves are much smaller than the Blu-rays’.

Additionally, most pirated rips are probably compressing further. I mean, they can only capture whatever is streaming (and the stream is itself compressed from a master), and most rips under about 3 gigs per hour are probably still cutting the audio down further.

It’s moving in the right direction, but it will be a long time before streaming audio nears parity. It’s much less of an issue for most people, but my eldest brother has thousands invested in a system set up for Dolby Atmos: some top-end Pioneer receiver and several Definitive Audio speakers, and he just added two more subs from Klipsch.

If it wasn’t just muxed in from a Blu-ray, I kind of doubt they made a whole second DTS-HD master just to save a couple hundred megs per file in this case. A DTS-HD track is only like a gig in a two-hour movie.
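Using the thread’s own figures, the budget math backs that up (decimal GB again):

```python
# Budget check using the thread's own figures: a ~3 GB/hour rip vs a DTS-HD
# track of "about a gig in a two hour movie".
RIP_GB_PER_HOUR = 3.0
AUDIO_GB, MOVIE_HOURS = 1.0, 2.0

rip_mbps = RIP_GB_PER_HOUR * 8e9 / 3600 / 1e6                  # ~6.7 Mbit/s
audio_mbps = AUDIO_GB * 8e9 / (MOVIE_HOURS * 3600) / 1e6       # ~1.1 Mbit/s

print(f"Whole rip:    {rip_mbps:.1f} Mbit/s")
print(f"DTS-HD track: {audio_mbps:.1f} Mbit/s "
      f"({audio_mbps / rip_mbps:.0%} of the whole budget)")    # ~17%
```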

Not usually, nowadays. With most Amazon rips now you get the full thing you’d get from streaming it. That’s why on torrent sites the choice is always between some dinky 900 MB compressed rip for phone watching and an 8 GB rip.


Are they directly saving the incoming bits or are they using capture software?

I’m assuming whichever one results in the equivalent of a Blu-ray remux for a streaming rip, because that’s the rule on private torrent sites for WEB-DL rips, and they mostly have the same Amazon and Netflix rips up that the public torrent sites have.

I still can’t decide whether Nvidia is likely to go back up to 300 W+ power envelopes for their 1180 lineup. They’ve been at 250 W for their high-end and professional cards for so long now that it almost feels like a de facto standard, and the Titan V being 250 W without any consumer architecture derived from it makes it seem unlikely they’d go bigger. On the other hand, they don’t really seem to have anything in the pipe to actually merit a new release cycle without doing that, and at best they’re shipping a refinement of the same Pascal lithography, so I’m not sure what else they can do if they want to release something that gets up to around 16 TFLOPS of single precision.


Or maybe they’ll do something completely nuts, decline to gimp fp16 on a consumer architecture for a change, and start pushing for mixed-precision game engines, since that’s the way they’re advertising some compute features now.

That would be impossible to sell feature-wise, though, so for now I’m just guessing that 350 W GPUs are coming back temporarily, until 10nm/7nm are mature enough to make GPUs on in a year or two.

Actually, never mind: the Titan V can already do ~14 TFLOPS of fp32, and that’s with all of the additional compute features on-die.
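That lines up with the usual peak-throughput back-of-envelope: two FLOPs per CUDA core per clock for a fused multiply-add, using Nvidia’s published Titan V specs:

```python
# Peak fp32 back-of-envelope: 2 FLOPs per CUDA core per clock (one FMA),
# with core count and boost clock from Nvidia's published Titan V specs.
CUDA_CORES = 5120
BOOST_CLOCK_GHZ = 1.455

peak_tflops = 2 * CUDA_CORES * BOOST_CLOCK_GHZ / 1000
print(f"Titan V peak fp32: ~{peak_tflops:.1f} TFLOPS")  # ~14.9 TFLOPS
```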

My new guess is that they’re just going to re-bin those as the 1170 and 1180 with the thermals tweaked so they can push the fp32 a little higher at the expense of everything else on the card. Both would be targeted at 250 W like the past few generations of Titan, and there won’t be a consumer 1180 Ti equivalent, because there’s nowhere to go from there this time.

:flushed:

…wow. I’ve never heard that before… wow. How do you even do that…

anyway, 1170/1180:
Idk how many fp32 register sets there are on a modern consumer graphics product, or rather what most of them actually do; I’m only speaking from having looked deeper into the SIMD extensions they did for the PPC/Cell family, AltiVec and later VSX, and that was about eight to ten years ago.

Still, considering that Nvidia has a strong footing in the automotive market, would it be worth it for a mainstream producer of mass-market products to make a quick dime or two in the PC consumer market with a product that was introduced for a totally different market at the same time?