Youtube Upload and Streaming Quality Notes

I’ll work on this this weekend if there’s interest in a cross-platform GUI ffmpeg-backed upscaler. Also I’ll be more motivated if someone with more front-end experience can answer my dumbass questions lol.

1 Like

I worked on both an electron project and an ffmpeg project today so feel free to get at me

1 Like
Pissed off rant, no one's fault except my own.

Ok… just because I am a pissant I really need to say this. 1440p compared to 720p is double in both measurements (height and width), meaning it has four times the area. So 1080p has roughly twice the pixels of 720p, 1440p has roughly twice the pixels of 1080p, and exactly four times the pixels of 720p.

Yeah, I know I really sound like one of those annoying people who go around the internets correcting everyone on every single detail, but it is simply because I have had to remind people of this many times in relation to pixel art. People who ask for pixel art so often forget that doubling the measurements means a person has to fill and click on 4x as many pixels across the picture… and it pisses me off every single time. Imagine if I was actually good at math.

Now that I got that off my chest, I did some comparisons with your tests @disestablished.
Basically I reduced the quality of both the 1080p and 1440p videos to 720p (in the YT menu), and took some screenshots (images are saved as PNG, so no re-re-compression).

RESULTS =D:
720p Video

1080p

1440p

As anyone can see even without zooming the picture, probably even from a fucking mobile held vertically, 1440p looks waaaaaaaaaay better. Why?

Well, I actually figured this out (well, found it around the interwebs) even before @disestablished replied to me on the Sekiro topic.
It’s because of this:

Weird, weird stats and numbers. BUT pay attention to the codecs.
Oh yeah… different codecs for 1440p uploads.
After some investigation I also figured out that you actually get higher bit rates with this codec. It’s not based on resolution alone (obviously 1080p gets a higher bit rate than 720p, otherwise it would look horrid even with the poorer codec), but on the codec you “get”: VP09 will always get a higher bit rate than avc1 at any given resolution.
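
If anyone wants to check this without opening “Stats for nerds” on every video, the yt-dlp Python package can list every format Youtube serves for a given video, with codec and bitrate. A minimal sketch (the URL is a placeholder, and it assumes yt-dlp is installed):

```python
# list the video formats YouTube serves, with codec and average bitrate
from yt_dlp import YoutubeDL

URL = "https://www.youtube.com/watch?v=VIDEO_ID"  # placeholder

with YoutubeDL({"quiet": True}) as ydl:
    info = ydl.extract_info(URL, download=False)

for f in info["formats"]:
    if f.get("vcodec") not in (None, "none"):  # skip audio-only formats
        print(f"{f.get('height')}p  {f['vcodec']}  ~{f.get('tbr')} kbps")
```

On videos that got the better treatment you should see vp09 entries alongside (or instead of) avc1 at the same resolutions.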

But now, here is a video from a youtuber I’ve been following for a few years, who coincidentally is now releasing a playthrough of DS2: SotFS (the drawing at the start of the video is actually mine =D; he’s a bit cheesy, but so very not annoying compared to other “let’s players”):

If you skip to minute 22:00 you also have that corridor passage, and it looks about the same as @disestablished’s 1440p version (slightly less sharp).
Also, if you check the “Stats for nerds” console you can see that he gets the VP09 codec, even though his video is only 720p (he always uploads 720p).

Why is that?

Well, now here comes the information that will flame people.
IF you have a channel with good views (it doesn’t take that many, though; his is not a huge channel), then any video 720p and above will automatically get the VP09 codec.

For the rest of us mortals, we have to upscale to at least 1440p to receive that same codec. That is the ONLY way for us to force Youtube to give us the better codec, and the higher bit rate that comes with it. The normal everyday Joe who only wants to upload a few videos, the very base of the democratization of Youtube and what made this platform as powerful as it is today, has to either accept a lower standard of quality or put in extra work on their side.
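
For reference, the upscale itself needs nothing fancy. A sketch of the idea with plain ffmpeg, driven from Python (the filenames and the 40 Mbps figure are placeholders, and Lanczos is just my suggested resampler):

```python
# upscale a 720p capture to 1440p before uploading, so Youtube assigns
# the higher-resolution (VP09) encoding ladder to it
import subprocess

subprocess.run([
    "ffmpeg", "-i", "gameplay_720p.mp4",
    "-vf", "scale=2560:1440:flags=lanczos",  # exact 2x upscale, Lanczos kernel
    "-c:v", "libx264", "-b:v", "40M",        # generous H.264 bitrate for the upload
    "-c:a", "copy",                          # leave the audio untouched
    "gameplay_1440p.mp4",
], check=True)
```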

I kinda see where this comes from, Youtube preparing for the future of TV with higher resolutions, but I still believe it betrays everything Youtube represented… though honestly, that happened a long time ago already.

Last year you could fool Youtube into making your 720p video use the VP09 codec by going to Enhancements in your Video Manager and making some change, like trimming a bit of the video or changing the saturation. The video would re-process and come out with the VP09 codec.

Which takes me to another subject: those video downloaders.
In fact I read someplace that Youtube actually stores the original video file (probably just for a few weeks), and creates the other resolutions by resampling and recompressing it.
So it might be that if you have 720p videos no older than however long Youtube keeps your original file, and you start chasing the big viewing numbers, your videos will also get VP09 copies.


So just as a conclusion, @disestablished said all that is necessary: you have to upscale to 1440p to access the better-quality encoding that Youtube provides (unless your channel already has a moderate number of views).

I just wanted to lay out all my findings from yesterday, after spending a few hours trying to figure out what was going on; I thought it relevant to the topic.

Also I have a few more questions for @Felix, @disestablished, and everyone around who knows all these video shenanigans better than I do. If you scale a video with any of the methods presented here, will the video be re-compressed? Does that mean you compress an already-compressed video, and then, if you edit it in an editor, compress it a third time when rendering? I kinda remember my video/sound classes saying that is terribly bad; do any of these new scalers actually avoid that? Also, if the video surface is bigger, will the bit rate have to increase drastically? Otherwise you would just get a bigger video with worse quality, because the bit rate would still be matched to the data stream of a much smaller surface?

Two more things. First, I petition for this topic to be moved to another subforum. I think this is a very relevant topic, and people can keep coming back, posting tests of their Youtube videos and asking for opinions.

Second, I will do another post after this one, with some tests I did with my Sekiro recording.
I would then like to ask for a few opinions on which versions look better, and maybe some advice on how to get even better results. Especially if someone actually knows a magical scaler that doesn’t re-compress the video and does the super resolution thing as well, allowing me to get better editing material than my only-very-slightly-lossy 720p Sekiro recordings.

Just to top it off, guys… a big manly kiss to you all, because this thread, and the fact that you guys are actually interested in talking about this subject, is already a huge help for my big problem of the last week: how to archive this playthrough in a way that will let me watch it with pleasure in 5, maybe even 10 years.

6 Likes

I was very careful about the wording. 720p is 1280x720. 2K/1440p is 2560x1440.

1280 x 2 = 2560

720 x 2 = 1440
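
And in total pixels:

1280 x 720 = 921,600

2560 x 1440 = 3,686,400 = 4 x 921,600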

So it’s twice the vertical and twice the horizontal. But since we are talking about screen area, it’s four times the pixels. : )

1 Like

Yeah xD. I noticed it… but the rant came out anyway, for the reasons mentioned.

1 Like

for the record, vp9 isn’t a strictly better codec; it can just achieve slightly better results at lower bitrates because it’s more efficient.

In fact, there are some cases where you might prefer to get an h264 stream instead of vpx (if you’re on a platform that can decode h264 in hardware but not newer codecs), and you have to use a browser extension that lies to YouTube about your ability to decode vp9 even in software if you want to achieve this: https://chrome.google.com/webstore/detail/h264ify/aleakchihdccplidncghkekgioiakgal?hl=en

the reason YouTube does this is presumably to save bandwidth, which they need gargantuan amounts of and are obviously interested in optimizing as far as possible. but to some extent I think this actually offloads an unreasonable net inefficiency onto the client side, because the difference between decoding a vpx video in software and an h264 video in hardware, especially at 1440p and above, can be upwards of 50 W of CPU power. also, unlike h264 and h265, vpx codecs (and av1) are developed primarily by Google themselves and don’t require any licensing fees, so they have a vested interest in using them as much as possible.

so it’s not just a conspiracy to downgrade your videos.

4 Likes

they will however always transcode everything on YouTube to h264 at 1080p and below in case they have to serve it on mobile, which is the one place where they will always default to h264 on the assumption that you can definitely decode that in hardware but may not have the fixed-function decoding or sufficient CPU power to make up the difference beyond that; you just might not be served the h264 version of those videos on desktop if they can favour a vp9 derivative instead.

conversely, I don’t think they generally do h264 at all at 1440p and above, because bandwidth matters that much more at that point and those resolutions are safely regarded as premium. even though most modern GPUs can decode h264 up to 4K (and in fact that’s slightly more widely supported in hardware than newer codecs are), almost no one streams h264 at that resolution because the bitrate would be so high, preferring h265/hevc (apple / Netflix / blurays) or vp9 (google) instead.

4 Likes

I run a fairly popular YT channel for work with a ton of videos, and I double-checked a sampling of early and newer stuff.

My old NTSC stuff that was uploaded as h264/mov files uses whatever this VP codec is, whereas the newer stuff, uploaded as h264/mp4, is using h264.

My HD stuff was all uploaded as h264/mp4 and is also using h264.

1 Like

If you are going to edit the video, you should do it in an editor which supports upscaling during the rendering/compression process. That way you only compress once, before uploading to youtube, and the render will have all of your edits. Or you could upscale first and store it in a lossless format, then make your edits and compress the final render. That takes longer, but it makes sense if you are going to add overlays and whatnot.

I’ve seen a lot of youtubers talk about Sony Vegas for upscaling. I’m sure most of the other pro software can do it.

Also, if the video surface is bigger, will the bit rate have to increase drastically? Otherwise you would just get a bigger video with worse quality, because the bit rate would still be matched to the data stream of a much smaller surface?

Yes, generally speaking, more resolution demands more bitrate. However, H.264 is pretty efficient. I dunno if youtube will accept an H.265 upload (they probably will); that’s even more efficient. I purposefully used a really high bitrate of 40 Mbps, to avoid getting too deep into the bitrate discussion. (See: “what bitrate should I use for my video?” And the answer is… complicated. There are some general bitrate milestones, but at the same time encoder settings matter, and it kinda ends up being trial and error and personal preference to settle on a bitrate. It’s a balance. Or you can just crank it.)

With my video, the 720p version looks completely indistinguishable from the source file @ 40 Mbps, with a 135 MB file size. The source recording is 218 Mbps, 711 MB. So we have compressed to roughly 1/5 of the bitrate/size and it still looks exactly the same.
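
(If anyone wants to verify these numbers on their own files, ffprobe reports the codec and bitrate directly. A quick sketch; the filename is a placeholder:)

```python
# read the video stream's codec and bitrate with ffprobe
import subprocess

out = subprocess.run([
    "ffprobe", "-v", "error", "-select_streams", "v:0",
    "-show_entries", "stream=codec_name,bit_rate",
    "-of", "default=noprint_wrappers=1",
    "capture_720p.mp4",  # placeholder filename
], capture_output=True, text=True, check=True)
print(out.stdout)  # e.g. codec_name=h264 / bit_rate=40000000
```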

The 1440p upscale @ 40 Mbps seems to be free of blocking and artifacting. I’m not gonna say it looks lossless, but it’s pretty damn close. (I also haven’t viewed it on anything other than my laptop’s 720p screen. Problems may reveal themselves on a proper high-res screen.) Also, the upscaling process gave a natural boost to sharpness, so overall detail looks higher. But sharpness boosts are not without some negatives.

Here are some example screenshots from the 720p source file and from the 1440p upscale, downscaled back to 720p (because that’s the limit of my laptop screen). In these first two screenshots, you can see that the textures look better, kind of like a higher texture setting in the graphics options. However, we are mostly looking at large objects and not many high-contrast edges.

1440p upscale


720p source video

1440p upscale


720p source video

And in this last comparison, we have lots of skinny objects and high-contrast edges. Look at the tent cables, tree branches, and some of the cliff edges. You can see the added noise from the sharpness boost. But… all of the textures look better. (You can add this effect in real time with something like Reshade. It’s one of the most popular things to do, as it makes the textures look nice. And if you keep it reasonable, the noise isn’t noticeable unless you stop to really take a look, like we are right now. But there are ways to deal with that, and real-time injectable shader effects are a whole separate topic.)
1440p upscale

720p source video

Second, I will do another post after this one, with some tests I did with my Sekiro recording.
I would then like to ask for a few opinions on which versions look better, and maybe some advice on how to get even better results. Especially if someone actually knows a magical scaler that doesn’t re-compress the video and does the super resolution thing as well, allowing me to get better editing material than my only-very-slightly-lossy 720p Sekiro recordings.

If you want to upscale before editing, you should use a lossless format. Do your edits, and then compress the final product for upload to youtube. There are probably some industry-standard lossless formats, and I’m sure some of the pro software has proprietary lossless options as well.
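
If you’d rather not hunt for a format, x264 has a genuinely lossless mode that most editors should be able to read back. A sketch, assuming ffmpeg and placeholder filenames (warning: the intermediate file will be huge):

```python
# upscale once into a lossless intermediate, edit that, and only compress
# the final render for upload
import subprocess

subprocess.run([
    "ffmpeg", "-i", "capture_720p.mp4",
    "-vf", "scale=2560:1440:flags=lanczos",
    "-c:v", "libx264", "-qp", "0",   # QP 0 = mathematically lossless H.264
    "-c:a", "copy",
    "intermediate_1440p_lossless.mkv",
], check=True)
```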

Oh yeah, here are the 1440p youtube versions for 2 of those screens (again, the screenshots are 720p because of my monitor). The missing screen doesn’t skip to the same frame on youtube, so I didn’t include it.

1 Like

Do those videos using the H.264 codec look relatively bad?

This is so dumb!

If you set my 1440p version to 720p, they look similar. But if you watch it at full 1440p, it looks way better. And that’s because, as we have identified today, youtube gives more bitrate to higher resolutions. But my video is an upscale, so the 1440p on youtube looks similar to the original 720p source file.

And it seems to trickle down, because my video set to 720p still looks better than your friend’s 720p.

Your friend’s content could look better on youtube, if he upscaled it before uploading it to youtube.

1 Like

For this post I’ll write the relevant stats on top of each video. They are all exported in QuickTime (.mov) for the sake of having PCM sound instead of re-compressed sound; the original sound is already compressed, compressed again at render, and then Youtube might re-compress it again, so I’ll avoid what I can. Also, Mbps means megabits per second.
Also, my recordings are 720p at about 66 Mbps (any higher and I would get lag).

More exact settings of my recordings
  • Recording Software: OBS Classic
  • Format: MP4
  • Audio Tracks: 2 (game & mic)
  • Encoder: NVIDIA NVENC H.264 (STUPIDLY fast, couldn’t have done 60fps any other way)
  • Rate Control: CQP
  • CQ Level: 10
  • Preset: Max Performance
  • Profile: high
  • Look Ahead: off
  • Psycho Visual Tuning: off
  • GPU: 0
  • Max B-frames: 2
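
For anyone without OBS who wants to try roughly the same encode, here is my guess at the equivalent ffmpeg h264_nvenc invocation. The mapping of OBS’s “Max Performance” preset and “Psycho Visual Tuning” onto ffmpeg options is an assumption on my part, and it needs an NVIDIA GPU plus an ffmpeg build with NVENC support:

```python
# approximate the OBS NVENC settings above with ffmpeg's h264_nvenc encoder
import subprocess

subprocess.run([
    "ffmpeg", "-i", "raw_capture.mp4",   # placeholder input
    "-c:v", "h264_nvenc",
    "-rc", "constqp", "-qp", "10",       # Rate Control: CQP, CQ Level: 10
    "-preset", "p1",                     # fastest preset (~ "Max Performance", my guess)
    "-profile:v", "high",                # Profile: high
    "-rc-lookahead", "0",                # Look Ahead: off
    "-spatial-aq", "0",                  # ~ Psycho Visual Tuning: off (my guess)
    "-bf", "2",                          # Max B-frames: 2
    "-c:a", "copy",
    "reencoded.mp4",
], check=True)
```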

Finally, I always upload the entire fight video, because it is a section I know very well: it has high contrast, a lot of movement, and effects and whatnot in pixels all around the frame. I think it is a good test, at least for myself. But you guys probably only need the first 8 seconds to see the differences. I just don’t mind watching this over and over.


720p, H.264, 30 Mbps.

I was _not_ satisfied with this. I went way overboard with the bit rate, because I saw in a video that you’d better upload the best quality possible since Youtube will re-compress anyway, so 3x the recommended bit rate (9.5 Mbps according to YT) seemed about right. The problem is that I also uploaded the same video at 10 Mbps, and the difference was barely there, if it was there at all and not only in my mind.

So after hearing all about the 1440p trick I started doing some renders and experiments. One of the videos I watched explaining the 1440p trick and the whys stated that it is better to do some sharpening on your video so things don’t become too blurry (as @disestablished just wrote about right now).

I am using DaVinci Resolve, cause it is free and I am kinda used to Adobe After Effects. However, the free version does not have an actual Sharpen filter (that one is in the paid Studio version), but it is possible to create a sharpening effect with layer combinations. I saw it in a video; the results are kinda fine (you guys tell me), and I can probably still move a few sliders to get better colors and results.


Anyways, the tests. I advise pushing the player to 1440p even when watching on a 1080p screen; it does look better:

1440p, H.264, 30 Mbps, NO Sharp Filter

1440p, H.264, 30 Mbps, Sharp Filter

Personally, I think the Sharp Filter is actually working pretty well here.
Without the filter it just becomes a bit too blurry. Also, setting the player to 1080p, even though my screen is 1080p, actually reduces the quality considerably.

Still, I noticed quite a few artifacts here and there, even with the VP09 codec and 1440p resolution.
So I decided to do a couple more of these tests, exporting at 40 Mbps, to check if it came out better, and so:


1440p, H.264, 40 Mbps, NO Sharp Filter

1440p, H.264, 40 Mbps, Sharp Filter

At first I thought I saw a lot of difference between the 30 Mbps and the 40 Mbps versions… but now that my eyes are getting used to all this testing, they don’t look that different, if different at all.

Still, having watched the entire video, I would risk stating that the 40 Mbps versions do look better.
Again, in my personal opinion, I would bet on the filtered version rather than the unfiltered one.


For the purpose of discussing whether I should use the filter or not, I tried my best to match the same frame across 3 different versions, and screenshotted them:

Original 720p Video, scaled to fullscreen 1080p.

Youtube upload @ 40 Mbps, NO filter.

Youtube upload @ 40 Mbps, Filtered.
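
(In case anyone wants to reproduce this with local copies of each version: ffmpeg can export the same timestamp from each file as a PNG, so no extra compression sneaks in. A sketch; the timestamp and filenames are placeholders:)

```python
# grab the same frame from each version as a lossless PNG
import subprocess

VERSIONS = ["original_720p.mp4", "yt_40mbps_nofilter.mp4", "yt_40mbps_filtered.mp4"]

for name in VERSIONS:
    subprocess.run([
        "ffmpeg", "-ss", "00:00:08.000",  # seek to the frame to compare
        "-i", name,
        "-frames:v", "1",                 # export exactly one frame
        name.replace(".mp4", ".png"),
    ], check=True)
```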

What would you guys say??
What should I do here? Filter or no filter?
I’ll probably still go with 40 Mbps, because I don’t mind the render times, nor the upload times (amazingly fast Polish internets).


I absolutely agree, but I am actually not a friend… more of a follower. He just accepts title cards from whoever wants to make them; I did that one for DS2 and another in pixel art for Conker’s Pocket Tales. He started doing 720p videos a long time ago. I think some closer followers have already asked him to start uploading at least at 1080p, but he just keeps working at 720p.

I don’t believe he has had any kind of formal video education, so he probably assumes the long render times for 1080p or 1440p are something weird (lol… remember rendering 1080p, with cuts and filters, 10 years ago xD?).


[edit]
BTW, I am rendering and uploading another couple of tests at 50 Mbps, just for the kicks of it.
Kinda curious about the results, I’ll post them later.
[/edit]

1 Like

I probably made it sound like that, but I didn’t mean to.
I kinda imagined those reasons, though not in that much detail, so thanks for that. Again, I believe we are building a good and strong discussion about the Youtube platform here.

But what I did mean is that these bit rate policies, even if very rational, do imho betray what YT represented and achieved for today’s society.
More than a conspiracy, I would say these kinds of money-grabbing and money-saving policies have been a constant source of complaints about YT for the past… dunno… 3, maybe 5 years?

Most “normal” users, viewers, and smaller Youtubers did notice “the change” and have a big array of complaints. Conspiracy is a much stronger word than I intended; it’s more likely just another consequence, in plain view, of a change of mentality in the people behind the platform. Forced or deliberate on their part? I have no fucking idea.

Ok, so A: in your first comparison with the 30 Mbps videos, you posted the same video twice. I’m certain it is the one with the sharp filter.

B: I think the 30 Mbps video with the sharp filter is the best-looking one of all the examples.

However, I think you should try a different sharp filter and also a lower setting. The particular one you used looks kinda “dry” with Sekiro. A filter with a different algorithm might work better.

And I think the slight extra quality of the 40 Mbps version actually shows the sharpening more, so it looks a little worse. So be careful about adding too much.

I haven’t played Sekiro. But it looks like it has a motion blur effect during movement, maybe some depth of field stuff, and maybe a bloom effect as well. All of those are going to soften the image. I mean, the original 720p source image looks like a game which is meant to look “soft”. So be careful about fighting that too hard with a sharp filter. You might try turning off the motion blur if you want sharper captures during fast combat movement.

Additionally, I think you should use as much bitrate as you can stand. Dark scenes like this are really challenging for H.264 and will basically always result in banding and artifacts, save for really high bitrates. It looks like there is some banding in your 720p source image, and that’s just the nature of the beast. If you can use NVENC to encode in H.265, I would try that on a dark scene and see if it is better.
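
If you want to try that comparison outside of OBS, ffmpeg exposes the same hardware encoder as hevc_nvenc. A sketch (filenames and the bitrate are placeholders; it needs an NVIDIA GPU and an NVENC-enabled ffmpeg build):

```python
# re-encode a dark test scene with NVENC H.265 at the same bitrate as the
# H.264 version, to compare banding/artifacts like-for-like
import subprocess

subprocess.run([
    "ffmpeg", "-i", "dark_scene.mp4",
    "-c:v", "hevc_nvenc",
    "-rc", "vbr", "-b:v", "40M",   # match the H.264 test bitrate
    "-c:a", "copy",
    "dark_scene_hevc.mp4",
], check=True)
```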

I’ve never noticed an issue.

Thanks for the heads up, changed the link.

And I kinda see your point. It is true, Sekiro does have a general “smoothness” about it that I absolutely destroy with a strong filter like that. I’ll play with the settings a little and try to get some better results.
I have to admit that no filter is indeed more faithful to the original capture, but I do feel it lacks at least a little bit of sharpness.

Still, using the same filter I rendered the 50 Mbps versions and uploaded them; here we go:

1440p, H.264, 50 Mbps, NO Filter

1440p, H.264, 50 Mbps, Sharp Filter


For good measure I’ll also add the previous screenshots together with 2 newer ones I just created:

Original 720p Video, scaled to fullscreen 1080p.

Youtube upload @ 40 Mbps, NO filter.

Youtube upload @ 50 Mbps, NO filter.

Youtube upload @ 40 Mbps, Filtered.

Youtube upload @ 50 Mbps, Filtered.


Now the quality increase, imo, is there. It is minimal, but it is still there.
I purposely linked the soft and sharp versions together, so it would be easier to notice the differences between 40 and 50 Mbps.
You can see a very subtle increase in the soft versions, but it is even more evident between the sharp versions.

Putting aside how good or bad the sharp filter looks, it does make the increase in quality easier to notice, because at 50 Mbps things look even sharper.
I won’t go beyond 50 Mbps; I think it is rather acceptable. But anyway, I would be interested in your opinion: should I step up all the way to 60 Mbps?

This is kinda weird, because in the original video files on my PC, I don’t see that much of a difference between the two versions.

That’s because they were compressed from the original source. Youtube’s re-compression can only be as good as what you give it, so I would feed youtube the highest bitrate you can stand, as far as render and upload times go.

In terms of the sharpness filter, I would play around with different filters. For example, in Reshade you can choose between something like 10 different algorithms for sharpness filters, and they all look a little different. So if you really care, you gotta play around and find the best one to suit each game.

Yeah, but my problem here is… well… I don’t really have a sharpen filter =D.
Like I previously said, I am basically playing around with layers to get a “sharpen” effect.

One layer has its saturation removed, and you sharpen that one, because our eyes are much more sensitive to sharp differences in light.
Another layer has just the color, no light information, and you actually blur it to balance out the sharpening of the light.
Then you mix them with a simple addition of layers.
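
For what it’s worth, ffmpeg’s unsharp filter can approximate this whole layer dance in one pass: a positive amount sharpens, a negative amount blurs, and luma and chroma get separate controls. A sketch; the strengths and filenames are placeholders to tune by eye:

```python
# sharpen the luma while slightly blurring the chroma, mimicking the
# sharpened-luminance + blurred-color layer trick
import subprocess

subprocess.run([
    "ffmpeg", "-i", "in.mp4",
    "-vf", ("unsharp=luma_msize_x=5:luma_msize_y=5:luma_amount=1.0:"
            "chroma_msize_x=5:chroma_msize_y=5:chroma_amount=-0.3"),
    "-c:v", "libx264", "-b:v", "40M",
    "-c:a", "copy",
    "out.mp4",
], check=True)
```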

I do admit I might have overdone the sharpness a tiny bit, and I definitely didn’t blur the color enough to counterbalance it. But I’ll keep working on my settings, and I bet I’ll eventually find something I am more comfortable with (dunno about other people, but hey… them’s the breaks).

Also, I admit I sharpened this much because it makes it easier to see how much dirt this re-compression introduces.
I decided to be even more thorough and upload a couple of 60 Mbps videos, just for personal purposes. I won’t make another post about them, because I think we have already covered most of the important points of the topic, with nice examples, screenshots, and whatnot.

If I find 60 Mbps worth it (it’s just more render and upload time, and I’m not in a hurry), I’ll then work on the filter settings and try to find a very light setting that sharpens just a little bit, to fight against all the blurring.

But to be honest, DaVinci Resolve is already set to sharpen videos that get scaled; its scaler is set to Sharper (the other choices are Smoother, Bicubic, and Bilinear). Sharper works pretty OK, as you can see in the unfiltered videos, but… still a bit too blurry for my personal taste. Just a tiny bit.
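
ffmpeg’s scaler has a similar menu of algorithms, if anyone wants to compare them side by side. A sketch that renders a short test clip per kernel (filenames are placeholders):

```python
# render the first 8 seconds with several resampling kernels for comparison
import subprocess

for algo in ("bilinear", "bicubic", "lanczos", "spline"):
    subprocess.run([
        "ffmpeg", "-i", "in_720p.mp4", "-t", "8",    # just the first 8 seconds
        "-vf", f"scale=2560:1440:flags={algo}",
        "-c:v", "libx264", "-b:v", "40M", "-an",     # drop audio for the test
        f"test_{algo}.mp4",
    ], check=True)
```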

Yeah, the sharpness boost you see in my videos is actually built into the upscaler. I mean, it’s automatic, with no settings. There are a bunch of other sharpness filters you can add pre-upscale or post-upscale, but it would be too much to add anything extra.

I just realized that my videos don’t have sound!!!

1 Like