Does anyone still want that cross-platform GUI drag-and-drop upscaler? I got the backend and a rudimentary front-end working. I’ll clean it up, package it, and put it on GitHub if there’s demand. @Toptube @Felix @Decinoge
I don’t need it but here are my own notes on packaging electron apps, just in case:
Add electron-squirrel-startup to your dependencies and require it at the start of your main process file.
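From memory, the standard snippet from their readme is roughly this (main.js assumed as the entry point):
// electron-squirrel-startup returns true when Windows Squirrel launched the app to handle install/update events, so quit early
const { app } = require('electron');
if (require('electron-squirrel-startup')) app.quit();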
then, to build:
electron-packager . --icon=resources/icon.ico (Windows)
electron-packager . --icon=resources/icon.icns (Mac)
Finally, to package for install:
electron-installer-windows --src radiamagent-win32-x64/ --dest install/ --config config.json (Windows)
hdiutil create tmp.dmg -ov -volname "RadiamAgent" -fs HFS+ -srcfolder radiamagent-darwin-x64/ && hdiutil convert tmp.dmg -format UDZO -o RadiamAgent.dmg && rm tmp.dmg (Mac)
ignore the references to radiam, that’s our project
Yes, it would be fun for me to play with. And the upscaler I’ve been using is very finicky. Thank you!
Still interested in this. Have a few tests I’d like to do.
I did a couple of clips for Goldeneye Source
720p 30 frames per second
Same video, upscaled to 4K before uploading. So Youtube thinks it’s a 4K/30 video.
This clip is 720/60. The game was actually running at 30fps (I capped the framerate at 30), but I set the recording software to capture at 60fps.
And same video, upscaled to 4K/60 before uploading. So, Youtube thinks it’s a 4K/60 video. But the game was actually running at 720p/30.
Remember, the point here is not to make a 720p video look like a 4K video. The point is to keep our video looking like the source file. In this case, a 720p source. Youtube does not give enough bitrate to a 720p video upload. So we trick youtube by upscaling to 2K or 4K, which assigns more bitrate when youtube processes your video. The net result is that the video on youtube will look more like the 720p source file. 60fps also gets more bitrate vs. 30fps. If your source file is 30fps, you can frame double before uploading to gain that extra bitrate. Or you can try to record 60fps to begin with.
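To be concrete, here’s the sort of one-liner I mean, sketched with ffmpeg (filenames made up; adjust the bitrate to taste; plain lanczos scaling isn’t the fancy super-resolution upscale, but it demonstrates the workflow):
# upscale a 720p/30 source to 4K and frame double to 60fps before uploading
ffmpeg -i source_720p30.mp4 -vf "scale=3840:2160:flags=lanczos,fps=60" -c:v libx264 -profile:v high -preset slow -b:v 80000k -c:a copy upload_4k60.mp4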
If you haven’t read this whole thread: 2K/60 on youtube looks pretty darn good, but still has some noticeable compression issues. 4K/30 and 4K/60 help things further. There are of course still going to be some issues, but they will be further minimized. It’s the price you pay for compression. Also notable: with a 4K upload, if you select the 2K or 1080p quality setting while watching the video, it’s going to look better than a video which you uploaded at 2K or 1080p. The extra tier of bitrate for 4K has a trickle-down effect on the other, lower quality settings. Here is an example:
4K/60 upscaled upload, with 720p/60 quality setting selected
4K/60 upscaled upload, with 4K/60 quality setting:
Note that if you frame double a 30fps video to 60fps, you need to also double the bitrate for that new video you are outputting. So let’s say you captured at 1080p/30. Then you make some edits in a program and set the final product to upscale to 2K resolution with 40,000kbps bitrate, and that’s the file you upload to youtube. If you wanted to do that same video with doubled frames for 60fps, you should output it at 80,000kbps to achieve the same relative quality before uploading to youtube. Even though you simply doubled the frames rather than having 60 unique frames, you still have twice the amount of frames to compress. So, you need more bitrate to keep it looking good.
It’s the same with capturing video. If you are going to capture at 60fps, you need to double the bitrate of what looks acceptable to you at 30fps. Otherwise, you will run into a lot more compression issues.
Overall, after all of my comparisons: I recommend upscaling to at least 2K/60. With the extra bitrate assigned by Youtube, you gain a huge amount of clarity compared to 720p and 1080p. 2K/60 minimizes a lot of mush and artifacts. Especially for brighter scenes, it can look very clean and similar enough to a 4K/60 upload, looking overall similar to the source file.
If it’s mostly darker scenes, I recommend spending the extra processing time to upscale to 4K/60. It goes a couple of extra steps to help keep detail in dark colors and darkly lit scenes, especially when there is a high amount of motion.
Oh, so this program I’ve been using to upscale with the super resolution method does not require payment. I thought it did. But it’s actually been free to register for about a year now. Haha. You just put in your email and they instantly give you a key on the same screen. I haven’t even received an email from them at all.
Long thread to review; posting for the moment to say I’m about to start looking at the best export of 3-4 hours of 1440x1080 60fps from Final Cut, then a size-reducing conversion that will maintain the highest-grade audio along with video…
Right now an H.264 .mp4 of 25 minutes is about 3 GB, while a .mov from FCPX with the same qualities is 2.4 GB.
I’ll look at the direct bounce settings some more, but really feeling like I’m just gonna have to render and export huge files, then compress down as best as possible. Gonna comb through here and maybe find some nods toward the best compromise when something hits youtube/vimeo, then facebook, etc.
Size reducing as in disk space? Or resolution?
And is the goal to keep a high-quality archive? Or is this going straight to youtube, after which the original doesn’t matter?
Do you know what the bitrate is, etc?
*You can grab MediaInfo, which will let you view video file stats with a right-click of the file.
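*If you’d rather do it from a terminal, ffprobe (which ships with ffmpeg) reads out the same stats; input.mp4 here is just a placeholder:
# print duration, overall bitrate, codec, resolution, and framerate
ffprobe -v error -show_entries format=duration,bit_rate:stream=codec_name,width,height,avg_frame_rate -of default=noprint_wrappers=1 input.mp4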
Also, what are final cut’s options like for H.264? Does it give you options to use your GPU? Does it give you options for “very fast” “slower” etc? Does it give you options to select the H.264 profile?
I’m gonna need to export again; I cleared the larger tests out yesterday. There are finer preferences and presets I can create, but at the moment the main send options come to:
I’m assuming “Computer” inherently gives the best encode, but at a larger size compared to the other publishing options. Mastering Video and Audio comes out at about 175% the size of the best publishing ones.
Apple’s VideoToolbox API (the equivalent of DXVA on Windows for providing direct access to the GPU encoder) is kind of bad. You can use https://www.voukoder.org/ to get direct access to ffmpeg’s GPU encoders if you’re using Premiere on Windows. But in general, most commercial NLEs don’t have great support for GPU encoding, because CPU encoders can generally produce slightly higher quality per bitrate (nvenc has only very recently gotten close to parity with x264 and x265, and it still doesn’t autotune as well, so you need to know what you’re doing) unless you need realtime encoding/decoding with minimal CPU impact, which is not a common use case for video production.
I’m guessing with loose math here, but 3 GB for 25 minutes works out to roughly 16,000kbps (3 GB is about 24 million kilobits, spread over 1,500 seconds).
Which is still on the low side for a 1440x1080/60 file, especially one that’s going to get re-compressed again downstream.
If you can deal with the render times, I would export your video from final cut in a common lossless format, and then compress it with something which exposes more options for you.
Handbrake works well and allows you to use your GPU for faster compression.
However, this thread is largely about how we have to trick youtube into giving us better quality. Youtube re-compresses everything you upload, and it does it based on resolution and framerate. Higher resolutions and higher framerates get more bitrate. And anything lower than 2K resolution gets butt for bitrate.
Basically, I suggest upscaling your video to at least 2K/60, if not 4K/60 (even if it’s a 30fps source, frame double it to 60), before uploading it to youtube, and using a fairly high bitrate. Because we want youtube to compress from as high a quality original as possible. Some people with amazing internet connections just upload lossless files…
I’ve been using a program called Video Enhancer 2, which does very good quality upscaling. It’s a little bit finicky, but I’ve got it figured out. It’s totally free: even after the trial period, you just give your email address and they hand you a free code.
However, it runs on your CPU, so it will take a long time. I’ve been using that @ 80,000kbps and the quality is really great.
Hmm, good food for thought, really.
That’s what I should be doing then: shift all the tuning to ffmpeg or handbrake, which I’m plenty used to.
I expect to be a bit aggravated with my first uploads of this project to streaming sites, so the core stuff here has always been interesting to me. Yeah, gonna go full lossless then boil down to a mid, high, and high+ for comparison.
If you are interested in the upscaling, just try plugging your lossless file straight into video enhancer and skip handbrake or whatever. Because you’ll be compressing to H.264 as you upscale, with video enhancer 2.
Or if final cut will let you upscale on export, that would be cool: upscale to lossless, then compress afterward.
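Something like this two-step pipeline is what I mean, sketched with ffmpeg (x264 has a true lossless mode via -qp 0; filenames are made up):
# step 1: upscale the export to 4K losslessly
ffmpeg -i master.mov -vf "scale=3840:2160:flags=lanczos" -c:v libx264 -qp 0 -preset fast -c:a copy upscaled_lossless.mkv
# step 2: compress the lossless intermediate down for upload
ffmpeg -i upscaled_lossless.mkv -c:v libx264 -profile:v high -preset slow -b:v 80000k -c:a aac -b:a 320k upload_4k60.mp4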
Scroll up to see my most recent comparison post, with Goldeneye Source.
Thoughts and frustrations. And then some notes on quality comparisons between different hardware for encoding.
I’ve been messing around a whole lot recently with OBS, for streaming. With a particular interest in “low” bitrate streaming, under 4,000kbps.
My experiences with OBS have been strange and frustrating (not blaming them for it; simply noting my experiences). The biggest issue was something I was only able to quantify after hours of experimentation and vague comparison to articles and videos of streaming knowledge from others.
I have been using a Radeon HD7870 for the past 7 years. Even though that GPU family is the basis for two of the current consoles, and both consoles can stream, something has happened on the PC side. It seems the drivers are antiquated in certain respects.
In short, on my system, I just could not reproduce the general quality and speed which I’ve been seeing in comparison articles and videos, as well as lurking forum posts and even actual streams. It didn’t matter if I was using the CPU for encoding, Quicksync, or the encoder chip in my videocard. All of them seemed to be underperforming. I could not do a 720p 60fps stream/record in any game with Quicksync, CPU, or GPU, unless I used some of the absolute lowest quality settings. Which looked awful. Even downscaling was itself a performance hit, offsetting some of the speed benefit I should have been getting from encoding a lower resolution.
All of my testing suggested that OBS was having a relatively difficult time intercepting the frames from the videocard, in order to encode them, add overlays, downscale, show the preview window, etc. Simply disabling the preview window gave me a notable boost in performance. Once upon a time, when Dark Souls 1 and 2 were new games on PC, I could use DSFix and GeDoSaTo to downscale with basically zero performance cost from the scaling itself.
It seems that either the drivers for these old cards are antiquated, or OBS isn’t coded well for the way they present their frames. Or a combo of the two.
I basically confirmed my thoughts, because I just installed an RTX 2060 yesterday and my CPU performance in OBS is WAY better. Like twice as good. Streaming at 720p/60 is pretty much no issue at realistic quality settings. Even 1080p is now viable. I’m now aligned with the performance suggested by all those videos and articles which I have looked at. And you can run Soulcalibur 6 at full speed on a single core. So, it’s nothing to do with the games I’m using overtaxing my hardware. Dark Souls 2 runs at 720p/40 on my laptop with 2013 Intel Graphics. So, my desktop barely leaves idle temps with that game. But I couldn’t do a 60fps stream until I swapped my videocard for something modern.
Now, about quality. I’m interested in “low” bitrates for streaming, under 4,000kbps. I’ve been doing most of my testing around 3,000kbps. These low bitrates really expose the compression capability of each different piece of hardware. In OBS, you can encode your stream on your CPU cores, via Intel Quicksync, via Nvidia GPU, or via AMD GPU. They all look different, have different performance considerations, have different settings for quality, and can vary (sometimes a lot) between different games.
Before anything else, whatever encoder you use, you should be using “High” profile. Unless you are very concerned about your viewers using older smartphones/tablets/TV apps which only support “Main” profile, or even only “Baseline” for really old stuff. “High” profile enables some options which help with compression/quality per bit, which I have found to be noticeable even over “Main”. Not as large a concern at higher bitrates. But at lower bitrates, it seems to be a thing.
x264/CPU encoding: In terms of varying between different games, x264 is the most agnostic/has the least variance. Generally speaking, even using realistic settings for a quad core, it’s the best compression per bit and the most consistent quality at lower bitrates. x264 is super mature and has a ton of tweaks for quality which maximize compression per bit, and it especially benefits slower-paced games. It also has a bunch of settings which you can use to tune for different games (darkly lit/dark colors vs. brighter, etc). However, the basic presets and “tune” options are very good, and I would recommend just using those unless you have a specific game which you are really trying to cater towards, or need to gain a few extra frames of performance to keep a stable 60fps on your stream. As someone who has spent hours and hours laboring over this stuff: even at 6,000kbps, there’s only so much visual quality you can gain. After a point, it’s more about speed.
In particular, I have found the “SSIM” and “Animation” tunes to be very good for visual quality in games. They should be good for most things. However, feel free to ping me in this thread for suggestions and tweaks.
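For a concrete reference point outside OBS, here’s roughly what that setup translates to as an ffmpeg command (the filenames and rtmp URL are placeholders I made up; OBS handles all of this internally):
# ~3,000kbps game stream: high profile, animation tune, veryfast preset
ffmpeg -re -i gameplay.mkv -c:v libx264 -preset veryfast -tune animation -profile:v high -b:v 3000k -maxrate 3000k -bufsize 6000k -g 120 -c:a aac -b:a 160k -f flv rtmp://live.example.com/app/streamkey
The -g 120 keeps a 2-second keyframe interval at 60fps, which is what most streaming services ask for.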
The potential problem with encoding x264 on your CPU is that it can be a large load on your CPU. So, trying to play a game and encode at the same time can be a tough balancing act. If the game is particularly demanding, you may not have enough performance to share between the game and the stream.
The faster-paced a game is, the more difficult it is to compress quickly. More motion to track per frame demands more hardware performance. And of course, the higher compression quality you try to use, the more power you need from the CPU. Which can affect even games which don’t require much from the CPU. This is why many streamers have a separate PC to handle the encoding, or are using CPUs with more than 4 cores.
And if x264 has a visual weakness, it’s with really fast motion or visual effects which affect the entire screen (so, lots of motion). Stuff like full screen ripple effects, underwater distortion, etc, exposes some visual weakness in x264. It can struggle to keep these visuals looking smooth and defined, often negatively affecting the clarity of whatever is behind these effects in the frame.
AMD’s GPU hardware encoders: AMD has a few issues with their hardware encoding. First off, they keep changing their encoder hardware. Sometimes it gets a little better, sometimes it gets worse. It’s a strange thing, that. As such, OBS doesn’t cater well to AMD encoders. I think in general, but also because AMD presents a constantly moving target with their hardware.
I have no idea what the quality is like on many of their cards. On my 7870, at the aforementioned lower bitrates, there is a lot of granular macroblocking. However, with really fast motion, it tends to focus more of the bitrate on the center of the screen, where your character is. Therefore, even amongst all of the blocking, there can be a pleasing solidity to your character under fast motion. Also, it seems to do well with those full screen effects such as ripples, which I said x264/CPU can struggle with. And it’s good with alpha effects. But it’s not good at all for slower-paced games. The encoder seems to have a lot of trouble locking down fine details which should be visible in a slower-paced game. There are actually a lot of settings exposed for my 7870 in OBS. However, it seems OBS struggles to access its framebuffer to actually do the encoding at acceptable speeds.
AMD just released the new 5700 videocards. They supposedly have a new and improved encoder chip on them. However, preliminary tests show awful visual quality. I suspect this may be due to streaming software not yet taking specific advantage of the new chip. I think they are probably just using the old AMD code paths to access the new chip. It remains to be seen how this will end up. AMD hasn’t been too vocal about it, either. They only briefly mentioned the new encoder chip at E3, for example.
Quicksync is on most Intel CPUs. It’s a separate portion of the chip from the main CPU cores. Using it does take some resources from your CPU cores (it uses system RAM as VRAM, and CPU time is needed to swap that data), but not as much CPU as using the x264 CPU encoder. Visually, it does not offer as good compression per bit as the x264 CPU encoder. The best thing about Quicksync is that it’s pretty fast. If you are having performance issues encoding your stream on the CPU, Quicksync is worth a try. And unlike the x264 CPU encoder, its visual quality can vary noticeably from game to game, or even between different in-game lighting conditions. Due to this, it can be worth testing on the game you are trying to stream, to see if it is a good match. Especially if you are doing higher bitrate streaming (6,000kbps, etc).
With Quicksync in general, there can be a fair amount of macroblocking on “flat”, less detailed areas of the screen. Sometimes with a strange, unstable “combing” effect. But higher details fare pretty well, especially under high motion and alpha effects. OBS exposes a fair amount of options for Quicksync. However, there are also a fair amount of options which they have yet to make available, especially for stuff which is Haswell or newer. Which, in my experience with Quicksync in handbrake, might be useful. Time will tell.
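If you want to test Quicksync quality outside of OBS, ffmpeg’s h264_qsv encoder is one way to generate comparison clips. A minimal sketch, assuming your ffmpeg build includes QSV support (filenames are mine):
# Quicksync comparison encode at the same ~3,000kbps target
ffmpeg -i gameplay.mkv -c:v h264_qsv -preset veryslow -profile:v high -b:v 3000k -maxrate 3000k -bufsize 6000k -c:a aac -b:a 160k qsv_test.mp4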
NVENC encoder via Nvidia Graphics cards.
I just got my first Nvidia card, since like 2007. I’ve only had it for a couple of days. But already, testing the NVENC encoder has been pretty interesting.
First thing to note: from what I have read and seen, Nvidia cards before the Geforce 10** series (1060, 1070, 1080) have noticeably lower visual quality per bit for encoding. And the Geforce 10** and 20** cards (1650, 1660, 2060, 2070, 2080) have better compression quality and are very similar in quality to each other. It’s not a large jump from the 10** series to the 20** series. The newer 20** series do have a small bump in overall sharpness of details. Otherwise, they are very similar in encoding quality.
Using NVENC has been interesting, because it exhibits a much wider range of behavior from game to game. Final Fantasy 15 (I used the benchmark demo) seems to be a best case scenario, where I got as good results as you could ever expect. The compression quality NVENC achieves on that game is remarkable.
However, Obduction is just ok, looking only a little better than Quicksync. Dark Souls 2 is similar. Motion on your character is a bit better with NVENC. But there’s lots of blocking in the skies and “flat”, less detailed areas of the screen. Easily fixed with x264 via CPU.
Soulcalibur 6 is interesting as well. NVENC does a really great job of keeping a solid visual feel to the characters and the high amount of screen motion for that game. Better than x264 CPU, in many respects. Characters appear more “solid” and with more “depth” while in motion. And full screen visual effects and fades fare much better.
Background details are pretty high. However, the skies and clouds do show a fair amount of blocking. And if you look at your character frame by frame, the encoder actually often culls chunks of your character. Like, segments of their arms or weapons can be completely missing in still frames. Which I have not noticed with the other encoders. But you don’t notice when actually playing. It’s interesting, and says a few things about how to allocate screen details for actually perceived visual quality.
OBS and Nvidia seem to be working well together, as they have recently added a new codepath for NVENC and a couple of new quality features. As such, NVENC is very fast, and even cranking all of the settings available, I’m not sure you could dip under a 60fps stream at 1080p or lower. This partnership should also mean using x264 on your CPU will be as fast as possible, because OBS is optimized to intercept Nvidia framebuffers to encode the frames, add overlays, etc.
NVENC is interesting competition for CPU encoding. It’s too inconsistent from game to game to say it outright matches x264 CPU. But in many scenarios, it can be about as good while demanding far less from your system. Trading some visual quality for a large amount of performance. And in specific scenarios, it can actually be favorable in visual quality. I have no idea what sort of emphasis Nvidia has for R&D on GPU encoding. But there’s so much shading power on these things, it seems like they could take this a lot further if they really wanted. We’ll see!
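Same idea if you want to poke at NVENC outside of OBS: ffmpeg’s h264_nvenc encoder, assuming a build with nvenc enabled (filenames are mine again):
# NVENC comparison encode at the same ~3,000kbps target
ffmpeg -i gameplay.mkv -c:v h264_nvenc -preset slow -profile:v high -b:v 3000k -maxrate 3000k -bufsize 6000k -c:a aac -b:a 160k nvenc_test.mp4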
Overall, I think OBS could do a lot more to guide users on choosing settings. Some of the options have decent explanations. Some do not. And some options don’t really seem to do much. Trying to find out about these things is a really painful process. I hope they find some time to invest in improving documentation and presentation within the software itself. For example, Handbrake has a fairly extensive compendium on the options they have available. Their issue is that the language is often very technical. It can be tough to follow, even as far as how to activate the settings.
I miiiiiiight post some comparisons. We’ll see.
also, NVENC generally doesn’t use the shader hardware; it’s a separate fixed-function block
Indeed. However, the new codepath in OBS does leverage the CUDA cores a bit, for the RTX 20** cards. And I think it’s mostly about squeezing out extra quality, not speed.
I wasn’t totally clear on whether Nvidia’s drivers were able to let CUDA compute kernels compete with 3D rendering without interrupting each other and really screwing up performance, so it’s nice to hear they’re thinking about that. It bodes well for games being able to run CUDA threads too.
I’m guessing the chip in the 20** series has a sort of doorway to the CUDA cores which the older cards do not. They said they are trying to do as much work with each frame on the GPU as possible, so there’s less swapping back to the CPU and system memory.