Does anyone still want that cross-platform GUI drag and drop upscaler? I got the backend and a rudimentary front-end working. I’ll clean it up, package it, and put it on Github if there’s demand. @Toptube @Felix @Decinoge
I don’t need it but here’s my own notes on packaging electron apps just in case:
Add electron-squirrel-startup to your dependencies and put its startup check at the beginning of your main file.
then, to build:
electron-packager . --icon=resources/icon.ico (Windows)
electron-packager . --icon=resources/icon.icns (Mac)
Finally, to package for install:
electron-installer-windows --src radiamagent-win32-x64/ --dest install/ --config config.json (Windows)
hdiutil create tmp.dmg -ov -volname "RadiamAgent" -fs HFS+ -srcfolder radiamagent-darwin-x64/ && hdiutil convert tmp.dmg -format UDZO -o RadiamAgent.dmg && rm tmp.dmg (Mac)
ignore the references to radiam, that’s our project
Yes, it would be fun for me to play with. And the upscaler I’ve been using is very finicky. Thank you!
Still interested in this. Have a few tests I’d like to do.
I did a couple of clips for Goldeneye Source
720p 30 frames per second
Same video, upscaled to 4K before uploading. So YouTube thinks it’s a 4K/30 video.
This clip is 720/60. The game was actually running at 30fps (I capped the framerate at 30), but I set the recording software to capture at 60fps.
And the same video, upscaled to 4K/60 before uploading. So, YouTube thinks it’s a 4K/60 video. But the game was running at 720p/30.
Remember, the point here is not to make a 720p video look like a 4K video. The point is to keep our video looking like the source file. In this case, a 720p source. YouTube does not give enough bitrate to a 720p upload, so we trick it by upscaling to 2K or 4K. This assigns more bitrate when YouTube processes your video. The net result is that the video on YouTube will look more like the 720p source file. 60fps also gets more bitrate vs. 30fps, so if your source file is 30fps, you can frame double before uploading to gain that extra bitrate. Or you can try to record at 60fps to begin with.
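For anyone who wants to script the upscale trick rather than use a GUI tool, here’s a sketch using ffmpeg (assuming ffmpeg with libx264 on your PATH; the filenames and the 80,000kbps figure are just example values from this thread):

```python
def upscale_cmd(src, dst, width=3840, height=2160, fps=60, kbps=80000):
    """Return an ffmpeg argv list: lanczos upscale + frame doubling + H.264."""
    return [
        "ffmpeg", "-i", src,
        # lanczos is a sharp scaler; fps=60 duplicates frames of a 30fps source
        "-vf", f"scale={width}:{height}:flags=lanczos,fps={fps}",
        "-c:v", "libx264", "-b:v", f"{kbps}k",
        "-c:a", "copy",  # leave the audio untouched
        dst,
    ]

print(" ".join(upscale_cmd("clip_720p30.mp4", "clip_4k60.mp4")))
```

Pass the list to subprocess.run() to actually encode; building it as a list avoids shell-quoting headaches.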
If you haven’t read this whole thread: 2K/60 on YouTube looks pretty darn good, but still has some noticeable compression issues. 4K/30 and 4K/60 help things further. There are of course still going to be some issues, but they will be further minimized. It’s the price you pay for compression. Also notable: with a 4K upload, if you select the 2K or 1080p quality setting while watching the video, it’s going to look better than a video which you uploaded at 2K or 1080p. The extra tier of bitrate for 4K has a trickle-down effect on the other, lower quality settings. Here is an example:
4K/60 upscaled upload, with 720p/60 quality setting selected
4K/60 upscaled upload, with 4K/60 quality setting:
Note that if you frame double a 30fps video to 60fps, you also need to double the bitrate for the new video you are outputting. Let’s say you captured at 1080p/30, then made some edits in a program and set the final product to upscale to 2K resolution at 40,000kbps, and that’s the file you upload to YouTube. If you wanted to do that same video with doubled frames for 60fps, you should output it at 80,000kbps to achieve the same relative quality before uploading. Even though you simply doubled the frames rather than having 60 unique frames, you still have twice the number of frames to compress. So, you need more bitrate to keep it looking good.
It’s the same with capturing video. If you are going to capture at 60fps, you need to double the bitrate of what looks acceptable to you at 30fps. Otherwise, you will run into a lot more compression issues.
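The rule of thumb above, as a tiny helper (function name is mine, purely illustrative): bitrate scales linearly with framerate.

```python
def bitrate_for_fps(base_kbps, base_fps, target_fps):
    """Scale bitrate proportionally with framerate, per the rule above."""
    return int(base_kbps * target_fps / base_fps)

# the 2K example above: 40,000kbps at 30fps -> 80,000kbps after frame doubling
print(bitrate_for_fps(40000, 30, 60))  # 80000
```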
Overall, after all of my comparisons, I recommend upscaling to at least 2K/60. With the extra bitrate assigned by YouTube, you gain a huge amount of clarity compared to 720p and 1080p. 2K/60 minimizes a lot of mush and artifacts. Especially for brighter scenes, it can look very clean and is similar enough to a 4K/60 upload, looking overall similar to the source file.
If it’s mostly darker scenes, I recommend spending the extra processing time to upscale to 4K/60. It goes a couple of extra steps to help keep detail in dark colors and darkly lit scenes, especially when there is a high amount of motion.
Oh, so this program I’ve been using to upscale with the super resolution method does not require payment. I thought it did, but it has let you register for free since a year ago. Haha. You just put in your email and they instantly give you a key on the same screen. I haven’t even received an email from them at all.
Long thread to review; posting for the moment to say I’m about to start looking at the best export of 3-4 hours of 1440x1080 60fps from Final Cut, then a size-reducing conversion that will maintain the highest grade audio along with video…
Right now an H.264 .mp4 of 25 minutes is about 3 GB, while a .mov from FCPX with the same qualities is 2.4 GB.
I’ll look at the direct bounce settings some more, but really feeling like I’m just gonna have to render and export huge files, then compress down as best as possible. Gonna comb through here and maybe find some nods toward the best compromise when something hits YouTube/Vimeo, then Facebook, etc.
Size reducing as in disk space? or resolution?
And is the goal to keep a high-quality archive? Or is this going straight to YouTube, after which the original doesn’t matter?
Do you know what the bitrate is, etc?
*You can grab MediaInfo, which will let you view video file stats with a right click on the file
Also, what are final cut’s options like for H.264? Does it give you options to use your GPU? Does it give you options for “very fast” “slower” etc? Does it give you options to select the H.264 profile?
I’m gonna need to export again; I cleared the larger tests out yesterday. There are finer preferences and presets I can create, but atm the main send options come to:
The “Computer” option presumably does the best encode inherently, but at a larger size compared to the other publishing options. Mastering Video and Audio comes out around 175% the size of the best publishing ones.
Apple’s VideoToolbox API (the equivalent of DXVA on Windows for providing direct access to the GPU encoder) is kind of bad. You can use https://www.voukoder.org/ to get direct access to ffmpeg’s GPU encoders if you’re using Premiere on Windows. But in general, most commercial NLEs don’t have great support for GPU encoding, because CPU encoders can generally produce slightly higher quality per bitrate (NVENC has only very recently gotten close to parity with x264 and x265, and it still doesn’t autotune as well, so you need to know what you’re doing), unless you need realtime encoding/decoding with minimal CPU impact, which is not a common use case for video production.
I’m guessing with loose math here, but I think 3GB for 25 minutes is about 16,000kbps bitrate.
Which is on the low side for a master file at that resolution and framerate. It’s more of a final delivery bitrate than something you’d want to re-compress from.
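Here’s that math spelled out, if you want to check your own files (decimal GB assumed, i.e. 1 GB = 8,000,000 kilobits):

```python
def avg_kbps(size_gb, minutes):
    """Average bitrate in kbps from file size and duration (decimal units)."""
    return size_gb * 8_000_000 / (minutes * 60)

# the 3GB / 25 minute file discussed above
print(round(avg_kbps(3, 25)))  # 16000
```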
If you can deal with the render times, I would export your video from final cut with a common lossless format. And then compress it with something which exposes more options for you.
Handbrake works well and allows you to use your GPU for faster compression.
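If you’d rather script the lossless intermediate with ffmpeg instead (a sketch; filenames hypothetical, and note x264’s qp 0 mode is mathematically lossless):

```python
def lossless_cmd(src, dst):
    """ffmpeg argv for a lossless x264 intermediate you can feed to Handbrake."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-preset", "veryslow",  # slower preset = smaller file, same lossless quality
        "-qp", "0",             # qp 0 = mathematically lossless H.264
        "-c:a", "copy",
        dst,
    ]

print(" ".join(lossless_cmd("fcpx_export.mov", "intermediate.mkv")))
```

Then point Handbrake (or the upscaler) at the intermediate and tune the final compression there.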
However, this thread is largely about how we have to trick YouTube into giving us better quality. YouTube re-compresses everything you upload, and it does it based on resolution and framerate. Higher resolutions and higher framerates get more bitrate. And anything lower than 2K resolution gets butt for bitrate.
Basically, I suggest upscaling your video to at least 2K/60, if not 4K/60 (even if it’s a 30fps source, frame double it to 60), before uploading it to YouTube. And use a fairly high bitrate, because we want YouTube to compress from as high a quality original as possible. Some people with amazing internet connections just upload lossless files…
I’ve been using a program called Video Enhancer 2, which does very good quality upscaling. It’s a little bit finicky, but I’ve got it figured out. It’s totally free; even after the trial period, you just give your email address and they hand you a free code.
However, it runs on your CPU, so it will take a long time. I’ve been using that @ 80,000kbps and the quality is really great.
Hmm, good food for thought, really.
That’s what I should be doing then: shift all the tuning to ffmpeg or Handbrake, which I’m plenty used to.
I expect to be a bit aggravated with my first uploads of this project to streaming sites, so the core stuff here has always been interesting to me. Yeah, gonna go full lossless then boil down to a mid, high, and high+ for comparison.
If you are interested in the upscaling, just try plugging your lossless file straight into Video Enhancer and skip Handbrake or whatever. Because you’ll be compressing to H.264 as you upscale, with Video Enhancer 2.
Or, if Final Cut will let you upscale on export, that would be cool. Upscale to lossless, then compress afterward.
Scroll up to see my most recent comparison post, with GoldenEye Source.
Thoughts and frustrations. And then some notes on quality comparisons between different hardware for encoding.
I’ve been messing around a whole lot recently with OBS, for streaming. With a particular interest in “low” bitrate streaming, under 4,000kbps.
My experiences with OBS have been strange and frustrating (not blaming them for it; simply noting my experiences). The biggest issue was something I was only able to quantify after hours of experimentation and vague comparison to articles and videos of streaming knowledge from others.
I have been using a Radeon HD7870 for the past 7 years. Even though that GPU family is the basis for two of the current consoles and both consoles can stream: something has happened on the PC side. It seems the drivers are antiquated, in certain respects.
In short, on my system, I just could not reproduce the general quality and speed which I’ve been seeing in comparison articles and videos, as well as lurking forum posts and even actual streams. It didn’t matter if I was using the CPU for encoding, Quicksync, or the encoder chip in my videocard. All of them seemed to be underperforming. I could not do a 720p 60fps stream/record in any game with Quicksync, CPU, or GPU, unless I used some of the absolute lowest quality settings. Which looked awful. Even downscaling was itself a performance hit, offsetting some of the speed benefit I should have been getting from encoding a lower resolution.
All of my testing suggested that OBS was having a relatively difficult time intercepting the frames from the videocard in order to encode them, add overlays, downscale, show the preview window, etc. Simply disabling the preview window gave me a notable boost in performance. Once upon a time, when Dark Souls 1 and 2 were new games on PC, I could use DSFix and GeDoSaTo to downscale with basically zero performance hit from the scaling itself.
It seems that either the drivers for these old cards are antiquated, or OBS isn’t coded well for the way they present their frames. Or a combo of the two.
I basically confirmed my thoughts, because I just installed an RTX 2060 yesterday and my CPU performance in OBS is WAY better. Like twice as good. Streaming at 720p/60 is pretty much no issue at realistic quality settings. Even 1080p is now viable. I’m now aligned with the performance suggested by all those videos and articles I have looked at. And you can run Soulcalibur 6 at full speed on a single core, so it’s nothing to do with the games I’m using overtaxing my hardware. Dark Souls 2 runs at 720p/40 on my laptop with 2013 Intel graphics, so my desktop barely leaves idle temps with that game. But I couldn’t do a 60fps stream until I swapped my videocard for something modern.
Now, about quality. I’m interested in “low” bitrates for streaming. Under 4000kbps. I’ve been doing most of my testing around 3000kbps. These low bitrates really expose the compression capability of each different piece of hardware. In OBS, you can encode your stream on your CPU cores, via Intel Quicksync, via Nvidia GPU, and via AMD GPU. They all look different, have different performance considerations, have different settings for quality, and can vary (sometimes a lot) between different games.
Before anything else, whatever encoder you use, you should be using the “High” profile. Unless you are very concerned about your viewers using older smartphones/tablets/TV apps which only support the “Main” profile, or even only “Baseline” for really old stuff. “High” profile enables some options which help with compression/quality per bit, which I have found to be noticeable, even over “Main”. Not as large a concern at higher bitrates. But at lower bitrates, it seems to be a thing.
X.264/CPU encoding: In terms of varying between different games, X.264 is the most agnostic/has the least variance. Generally speaking, even using realistic settings for a quad core, it’s the best compression per bit and the most consistent quality at lower bitrates. However, the image quality tends to be on the soft side, unless you start stacking some special quality options. But X.264 tends to show less macroblocking at any given bitrate, especially if you use the “faster” CPU preset or better. So, what it may lack in sharpness, it balances with a more stable image.
X.264 is super mature and has a ton of tweaks for quality, which can further maximize compression per bit and detail. It also has a bunch of settings which you can use to tune for different games (darkly lit/dark colors vs. brighter, etc). However, the basic presets and “tune” options are very good, and I would recommend just using those unless you have a specific game which you are really trying to cater towards, or a very specific quality issue to overcome. There are also some things you can turn off to gain a lot of extra performance, to keep a stable 60fps on your stream. As someone who has spent hours and hours laboring over this stuff: even at 6,000kbps, there’s only so much visual quality you can gain. After a point, it’s more about speed.
In particular, I have found the “SSIM” and “Animation” tunes to be very good for visual quality in games. Or having no tune selected at all. These should be good for most things. However, feel free to ping me in this thread for suggestions and tweaks.
The potential problem with encoding X.264 on your CPU is that it can be a large load on your CPU. So, trying to play a game and encode at the same time can be a tough balancing act. If the game is particularly demanding, you may not have enough performance to share between the game and the stream. And as you add options to X.264, it gets slower.
The faster paced a game is, the more difficult it is to compress quickly. More motion to track per frame, demands more hardware performance. And of course the higher compression quality you try to use, the more power you need from the CPU. Which can affect even games which don’t require much from the CPU. This is why many streamers have a separate PC to handle the encoding, or are using CPUs with more than 4 cores. So that they don’t have to compromise on quality options.
And if X.264 has a visual weakness, it’s with really fast motion or visual effects which affect the entire screen (so, lots of motion). Stuff like full-screen ripple effects, underwater distortion, etc, exposes some visual weakness in X.264. It can struggle to keep these visuals looking smooth and defined, often negatively affecting the clarity of whatever is behind these effects in the frame.
AMD’s GPU hardware encoders: AMD has a few issues with their hardware encoding. First off, they keep changing their encoder hardware. Sometimes it gets a little better, sometimes it gets worse. It’s a strange thing, that. As such, OBS doesn’t cater well to AMD encoders. I think in general, but also because AMD presents a constantly moving target with their hardware.
I have no idea what the quality is like on many of their cards. On my 7870, at the aforementioned lower bitrates, there is a lot of granular macroblocking. However, with really fast motion, it tends to focus more of the bitrate on the center of the screen, where your character is. Therefore, even amongst all of the blocking, there can be a pleasing solidity to your character under fast motion. Also, it seems to do well with those full-screen effects such as ripples, which I said X.264/CPU can struggle with. And it’s good with alpha effects. But it’s not good at all for slower-paced games. The encoder seems to have a lot of trouble locking down fine details which should be visible in a slower-paced game. There are actually a lot of settings exposed for my 7870 in OBS. However, it seems OBS struggles to access its framebuffer to actually do the encoding at acceptable speeds.
AMD just released the new 5700 videocards. They supposedly have a new and improved encoder chip on them. However, preliminary tests show awful visual quality. I suspect this may be due to streaming software not yet taking specific advantage of the new chip; I think they are probably just using the old AMD code paths to access it. It remains to be seen how this will end up. AMD hasn’t been too vocal about it, either. They only briefly mentioned the new encoder chip at E3, for example. *One of AMD’s VPs has heard about this from youtubers/streamers and put a team on improving the situation.
Intel Quicksync (revised opinion)
Quicksync is on most Intel CPUs. It’s a separate portion of the CPU, apart from the main CPU cores. Using it does consume some resources from your CPU cores (it uses system RAM as VRAM, and CPU time is needed to swap that data), but not as much as using the X.264 CPU encoder. On a quad core, it’s like half the CPU resources. Quicksync is quite fast. You should be able to do 60fps captures at least at 1080p.
Its visual quality can vary noticeably from game to game, or even between different in-game lighting conditions. Due to this, it can be worth testing on the game you are trying to stream, to see if it is a good match. Whereas X.264 is generally pretty consistent across different visuals. Something really nice about Quicksync is that the overall image quality retains a lot of detail and sharpness. Generally more than NVENC. And you’d have to really tweak X.264 and use a preset better than “faster” to achieve such sharpness and overall detail. It also compresses colors more accurately than NVENC.
The big problem with Quicksync is that it doesn’t handle really fast motion well. And motion in general can cause a strange combing effect on certain parts of the picture, usually flat areas with less detail and more brightness. The sharpness of its image also makes blocking more apparent than, say, X.264. Quicksync generally needs relatively more bitrate to handle faster motion without losing details in the motion, and to minimize the combing effect. However, once you hit 6,000kbps, the negatives are minimized pretty well. And the overall sharpness, color, and detail is really nice. I came away pretty surprised after my tests today. And may find myself using Quicksync more often.
I recommend Quicksync for slower-paced games, or games where you really want to show off the details. I think even something like Bioshock or Halo would be slow enough that the negatives wouldn’t push you toward a different encoder (although the combing effect can be distracting if your particular game shows it more). It should be excellent for adventure games, TCG card games, etc. I would recommend trying it for slow-paced games, rather than messing with X.264. The detail and sharpness is just so easily attained. You have to work a little harder with X.264.
NVENC encoder via Nvidia Graphics cards.
I just got my first Nvidia card, since like 2007. I’ve only had it for a couple of days. But already, testing the NVENC encoder has been pretty interesting.
First thing to note: from what I have read and seen, Nvidia cards before the Geforce 10** series (1060, 1070, 1080) have noticeably lower visual quality per bit for encoding. The Geforce 10** and 20** cards (1650, 1660, 2060, 2070, 2080) have better compression quality and are very similar in quality to each other. It’s not a large jump from the 10** series to the 20** series. The newer 20** series does have a small bump in overall sharpness of details. Otherwise, they are very similar in encoding quality.
Using NVENC has been interesting, because it exhibits a much wider range of behavior from game to game. Final Fantasy 15 (I used the benchmark demo) seems to be a best-case scenario, where I got results as good as you could ever expect. The compression quality NVENC achieves on that game is remarkable.
However, Obduction is just OK. It loses to X.264 in compression per bit; there can be some muddier parts of the screen with distant objects. Quicksync again has more detail and better colors, but the combing still shows up. NVENC’s weakness with colors really showed up in Obduction, as there are many areas in the game with little color variety to distract you from the inaccurate presentation.
Dark Souls 2 fared similarly, overall. Motion on your character is a bit better with NVENC. But there’s lots of blocking in the skies and “flat”, less detailed areas of the screen. Easily fixed with X.264 via CPU.
Soulcalibur 6 is interesting as well. NVENC does a really great job of keeping a solid visual feel to the characters and the high amount of screen motion in that game. Better than X.264 CPU, in many respects. Characters appear more “solid” and with more “depth” while in motion. And full-screen visual effects and fades fare much better. Overall detail is lower, but the trade for solidity and clarity in motion is preferable for something like Soulcalibur. And likely for a faster-paced FPS as well. I think it probably has algorithms which detect motion and prioritize bits to those parts of the screen.
Comparing frame by frame during high motion with NVENC: SC6 characters have a lot more frames where they appear high quality vs. Quicksync and even X.264. Those two encoders tend to pixelate as soon as motion gets really fast. You have to pump up the options to overcome it with X.264; Quicksync’s only answer is more bitrate. As soon as you stop moving a lot, Quicksync puts more detail into the stages. But for this game, NVENC is the superior choice. Even at 6,000kbps, I think NVENC has the edge, due to the importance of motion for this game.
However, in Soulcalibur 6, the skies and clouds do show a fair amount of blocking with NVENC. And if you look at your character frame by frame, the encoder sometimes culls chunks of your character. Like segments of their arms or weapons can be completely missing in still frames. Which I have not noticed with the other encoders. But you don’t notice while actually playing. It’s interesting, and says a few things about how to allocate screen detail for actually perceived visual quality.
I tested Dirt 3 on a snow course, compared to Quicksync and X.264. Quicksync had the highest general detail, especially on the track surface and on the cars themselves. But the snow blasting behind the cars had that combing effect and some blockiness. X.264 had a softness to it, but had the least blocking. NVENC actually had more overall detail than X.264, due to its positive abilities with high motion. Trackside objects had more detail. Snow blasting behind cars looked natural and clear. Cars looked solid.
OBS and Nvidia seem to be working well together, as they have recently added a new codepath for NVENC and a couple of new quality features. As such, NVENC is very fast; even cranking all of the settings available, I’m not sure you could dip under a 60fps stream at 1080p or lower. This partnership should also mean using X.264 on your CPU will be as fast as possible, because OBS is optimized to intercept Nvidia framebuffers to encode the frames, add overlays, etc.
NVENC is interesting competition for CPU encoding. It’s too inconsistent from game to game to say it outright matches X.264 CPU. But in many scenarios, it can be about as good, while demanding far fewer resources from your system. Trading some visual quality for a large amount of performance. And in specific scenarios (fast motion), it can actually be favorable in visual quality. I have no idea what sort of emphasis Nvidia puts on R&D for GPU encoding. But there’s so much shading power on these things, it seems like they could take this a lot further if they really wanted. We’ll see!
Overall, I think OBS could do a lot more to guide users in choosing settings. Some of the options have decent explanations. Some do not. And some options don’t really seem to do much. Trying to find out about these things is a really painful process. I hope they find some time to invest in improving documentation and presentation within the software itself. For example, Handbrake has a fairly extensive compendium on the options they have available. Their issue is that the language is often very technical, and can be tough to follow, even as far as how to activate the settings.
I miiiiiiight post some comparisons. We’ll see.
also, NVENC generally doesn’t use the shader hardware; it’s a separate fixed-function block
Indeed. However, the new codepath in OBS does leverage the CUDA cores a bit, for the RTX 20** cards. And I think it’s mostly about squeezing out extra quality, not speed.
I wasn’t totally clear on whether Nvidia’s drivers were able to let CUDA compute kernels compete with 3D rendering without interrupting each other and really screwing up performance, so it’s nice to hear they’re thinking about that. It bodes well for games being able to run CUDA threads too.
I’m guessing the chip in the 20** series has a sort of doorway to the CUDA cores, which the older cards do not. They said they are trying to do as much work with each frame, on the GPU as possible. So there’s less swapping back to the CPU and system memory.
I played around with Quicksync today. I hadn’t done that yet with the new videocard. As I mentioned previously, OBS had trouble intercepting the frames from my old videocard. X.264 and Quicksync both look better now. Yeah, Quicksync really surprised me today. I updated my thoughts on it in the giant post above.
Suggested Encoder settings in OBS, for streaming.
Select the “New” NVENC encoder.
Rate Control: CBR (it’s what Twitch wants. VBR and its variations can cause dropped frames, audio desync, etc)
Choose whatever bitrate you want. I think non-partnered streamers are capped somewhere around 6,000kbps.
Keyframe interval: 2 (it’s what Twitch wants)
Preset “Max Quality”
Ignore both “Look ahead” and “Psycho Visual Tuning”. These are cool options, in theory. But Nvidia’s current implementations need more work. In my testing you experience overall lower quality with either option turned on.
“GPU” Leave this at 0 unless you have more than one graphics card in your system.
Max B frames: You may be familiar with “b frames” in other encoders. But it doesn’t necessarily work the same, here. Set this to “2” for games with a lot of fast motion. Especially 3D games where the entire screen can move and rotate. For slower games with occasional fast motion, you can try 3, to get more overall detail. But lose some solidity and detail during motion. For really slow games (I’m thinking like Shenmue, walking simulators, etc) or practically static, set b frames to 4. (I still need to try these with sidescrollers or an isometric game such as Zelda.)
Try these settings as a local recording before streaming. If the resulting video file has a lot of pausing in it, it means the settings you are using to run the game are nearly maxing out your GPU, and it’s not leaving enough left over for NVENC. You can do a few things to fix this:
- lower some game settings
- Change the quality preset from “Max Quality” to “Quality”. “Max Quality” uses your shader cores (CUDA) in addition to the NVENC encoder chip; “Quality” uses less shader power, or maybe even none. If you still have pausing, try “Low Latency Quality”.
- If you are totally plagued by pausing and hitching with NVENC and you don’t want to drop your game settings further, try the NVENC encoder which is not labeled as “NEW”. It’s the old code path, which does not use shader power at all. It’s still pretty good quality.
Quicksync (these settings are based on Intel’s Kaby Lake or newer. Some of these options may be missing on older CPUs. If they are, feel free to ask about it here; I’d be interested to hear about it.)
Also, these settings may change later. I heard about some possibly hidden options for Quicksync, in OBS.
Target Usage “Quality”
Keyframes 2 (it’s what Twitch wants)
Async Depth 7 (or whatever your highest available number is) (This is basically a setting which splits up the encoding across the Intel shader cores in an asynchronous fashion, allowing higher quality. Has a lot of effect on overall quality)
Choose your bitrate
Rate Control: Choose “LA”, if you have it. If you don’t have it, select CBR. LA implies CBR, but also activates a “lookahead” feature. And you can choose how many frames ahead. Set it for 11. This basically means the encoder will do 2-pass CBR, 11 frames ahead of real time. Slightly improves overall quality. But has a particular benefit to high motion. However, setting it much higher, can make the combing effect worse during motion. And can cause some other odd behavior to certain visuals. Higher settings are supposedly only meant for really slow paced or static material.
If you are having trouble keeping a smooth recording at 60fps, try changing to the “balanced” preset. And/or try lowering Async Depth until it gets better.
I think “LA” is important to getting the most out of Quicksync in OBS. So, I would leave it on, if you can.
X.264 - X.264 settings are a can of worms. You can mix and match them too much for your own good. I’ve spent a ton of time looking at them; I don’t pretend to know it all. These are my suggestions:
Rate Control “CBR” (it’s what Twitch wants)
choose your bitrate
check the box for “Use custom buffer size” and set the buffer to be the same size as your bitrate.
keyframe interval 2 (it’s what Twitch wants)
CPU usage preset: If you have a quad core, you probably shouldn’t set this higher than “Faster”, unless you are streaming under 60fps. I would start with “Faster”, because it noticeably cleans up macroblocking, compared to all of the presets underneath it. However, if your game is heavy on CPU usage, you might be forced to go to a lower preset, with a quad core.
Tune: Personally, I think no tune selected, “Animation”, or “SSIM” are all good choices to get you started with video games. “Animation” is going to be the most resource-intensive, but it should look the best for most content. “SSIM” is a small tweak over not selecting any tune; it changes how the bitrate is allocated in each frame, in a way which I think is visually favorable. However, it seems to suffer more under motion. I think it may tweak more options than what I have seen documented. No tune selected uses the default X.264 options, which is a healthy smattering of options and will generally look fine with most anything.
Also, if you are playing a game with a noise filter, or a game with a lot of fine details such as fog, rain, or similar effects, you can try the “grain” tune, to maybe preserve these subtle details. YMMV with that. I think it’s a little heavy-handed in how it tweaks the settings. But it would take way too much text to explain that and suggest other, custom settings. If you ever have particular trouble with such visuals, of course you can ask about it : )
If you have some extra CPU resources (less than a 60fps stream and/or more than 4 cores), you can also try the “fast” and “medium” presets. Anything higher than that is not recommended for streaming, and is sort of antiquated in terms of what it changes under the hood. At least, in the context of a real-time stream.
Regarding the spot for “X.264 options”:
It’s the spot to put in custom commands to tweak things. Each option is separated by a colon. If you put in one option by itself, no colon.
Here are a couple of options which may be redundant with “High” profile selected, but if you have good performance with the High profile and at least the “faster” preset, go ahead and put these in. Or you can try adding them to lower presets, for a small boost in quality:
8x8dct=1:cabac=1:weightp=2 (the first two are options for small gains in compression efficiency. The last option helps keep lighting fades and changes from macroblocking too much).
dct-decimate=0 (Setting dct-decimate to zero is another thing which can help with subtle details like grain, rain, fog, etc.)
me=umh (you can add this to noticeably improve solidity during motion. Combines especially well with “faster” preset or higher. umh noticeably cleans up blocking and noise, during high motion.)
aq-mode=2:aq-strength=1.0 (This changes how the bitrate is allocated in each frame, in a way which is visually positive, in my opinion. Similar to the SSIM tune, distilled into a couple of settings and without the negatives. However, aq-mode=2 is slower than 1. You can set 1 and still use aq-strength for very positive results, with better speed than aq-mode 2.)
trellis=2 (or 1) (This is something else you can try to keep subtle details like grain, fog, rain, etc. Pretty resource-intensive for a single option, when set to 2.)
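Since all of these go into the same colon-separated box, here’s a tiny illustrative helper (my own naming, just demonstrating the syntax rule above):

```python
def x264_opts(**opts):
    """Join key=value pairs with colons, the format the x264 options box expects."""
    return ":".join(f"{k}={v}" for k, v in opts.items())

# the quality tweaks suggested above (dict form, because names like 8x8dct
# and aq-mode aren't valid Python identifiers)
print(x264_opts(**{"8x8dct": 1, "cabac": 1, "weightp": 2}))  # 8x8dct=1:cabac=1:weightp=2
print(x264_opts(me="umh"))  # a single option: no colon
```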
Everything below this line is a set of suggestions for gaining some speed with X.264 via your CPU.
You can set “profile” to Main for a possible small performance boost, but also a small visual hit. I don’t recommend the Baseline profile. But it’s there if you need the performance.
Try the “zerolatency” tune. It usually increases macroblocking and can introduce some tearing. But it can be a pretty dramatic boost and will help you keep your stream framerate high. You can mix the zerolatency tune with any of the other options for… interesting results!
or you can try these custom options:
subme=0 or subme=3. These will greatly increase your X.264 speed, at the cost of overall sharpness and fine details. 3 still looks pretty good on detail, but may be a bit noisy. 0 will soften the image, but it’s fast, and it often doesn’t look as bad as you might think, if you know what these options really mean. You could even try 5, for a smaller boost with a smaller quality loss.
you can also try
me=dia as a sort of last effort. Will make motion look worse.
And finally, analyse=none or analyse=some. “analyse” dictates how X.264 makes choices. The options are none, some, most, all. “some” will give you a speed boost on the “faster” or higher presets, while still partially leveraging any special options. “none” will skip many things, for a large speed boost. But a large hit to visual quality.
So, a sensible, fairly large speed boost would be:
set profile to Main and then add these three options
and/or try the “zerolatency” tune
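Joined up for the x264 options box, that three-option combo comes to:

```python
# the three speed options above, joined with colons for the x264 options box
speed_opts = ":".join(["subme=3", "me=dia", "analyse=some"])
print(speed_opts)  # subme=3:me=dia:analyse=some
```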