Tom's Hardware of Finland


The MTX/mobage/“content” quantity outrage is so tantalizing because they’re close to a moral understanding of why capitalism saddles creators with these bad incentives/compromises, but they blame it on the company or a particular person they found on LinkedIn or mobygames instead of the market. They’re even turning on Steam/Gaben but can’t see why. It’s like the Qanon followers who don’t see the One True Conspiracy Theory (capital gonna capital) in front of their noses because it’s too banal next to a fantastic pushpin-and-string collage.

This time last year I was at a memorial service seated with an off-the-shelf liberal Reddit guy and it baffled me how mad he was about Battlefront II, a game he neither played nor wanted to play at any point. I said I’d played a bit of it and thought it was fine and frankly dazzling next to the old Pandemic Battlefronts that I played as a kid. The conversation just looped back around and around to his being entitled to be Darth Vader. It’s the curse of all Star Wars games. (He was also mad that the service contained ecumenical religious readings because the deceased was a boomer atheist and I’m like, dude, if you’re an atheist and materialist, can you just let the mourning have this?)

It hurts to watch my friend play Assassin’s Creed Odyssey and think about the blood, sweat, and tears compromised by an unholy union of The Witcher and Dynasty Warriors to pad out the grind. I hate that to be an indie dev means accepting personal/familial precarity and we can’t have a proper diversity of voices without commensurate commercial audiences.



oh I tried DLSS (Nvidia’s AI image-upscaling antialiasing) out. And I hate it.

Apparently it needs to be trained for every input->output resolution, so FFXV only supports 1440p->2160p. Which suits my use case, but…

Performance is very similar to rendering at 1800p, while I think 1800p looks better. 1440p w/DLSS aiming for 2160p has issues with fine lines and really doesn’t look like a higher-resolution image. Now, above 1440p everything looks pretty good and sharp anyway, but I don’t think this is the best bang-for-the-buck image upgrade.
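For a rough sense of why that comparison is unflattering, here’s the back-of-envelope pixel math (illustrative only; real frame cost isn’t purely pixel-bound):

```python
# Back-of-envelope render cost by pixel count (frame time isn't purely
# pixel-bound, so treat these ratios as rough at best).
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in sorted(pixels.items()):
    print(f"{name}: {count / 1e6:.2f} MP ({count / pixels['2160p']:.0%} of 2160p)")
```

So 1440p is ~44% of the 2160p pixel load and 1800p is ~69%; if DLSS at 1440p only performs like native 1800p, the net win over just rendering 1800p is small.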

Worse, the way it works is by running an image upscaler on sampled textures; that is to say, the upscaled textures have thick contrasty elements that aren’t actually true to the art, in a manner meant to suggest the crispness of a higher-quality texture sample. But it still looks like a TV with the sharpness cranked up, to my eyes.

Worse worse, it creates the ‘wormy’, ‘stringy’ artifacts that are the telltale of neural-net image processing (seen in Google’s DeepDream image toy), the acid-like repeated edges. And my eyes can’t not see it (I can’t believe I haven’t seen people call this out) and it icks me out. I much prefer a little blurriness to a wormy image (unless I’m playing SQUIRM: The Game of the Movie).


man this was not a good way to try to add FP16 compute features to the render pipeline I guess, better luck next time

I’m generally agnostic on the “everything is postprocessing now” argument put forward for high-end phone cameras, but those definitely don’t create this effect


Media impressions seem generally favorable, but I can’t tell if opinion will break against it in a few years the way it did against FXAA once people got concerned about image blur, or if these artifacts just bother me a lot more than they do other people. But it definitely doesn’t look comparable to a 4K image the way it’s presented; it’s in the world of images upscaled from a lower source resolution.


What is the good AA now


anything other than FXAA, MSAA, or SSAA is usually comparable in terms of quality and rendering hit (which is to say, worth enabling) in my mind, but I don’t really keep them straight beyond that; I just know those three for their specific shortcomings.

it annoys me when modern game engines only ship with MSAA support (looking at you, forza) because, fine, I know it’ll look bad without at least 4x (plus whatever that Nvidia driver setting does with frame blending, MFAA? it’s free), but it’s a fairly tangible performance hit. And I’m not hugely anti-FXAA either; it’s just usually the worst available option.

MLAA/SMAA are the goods iirc


the SSAA used here for FF9 honestly looks pretty nice
don’t think this is applicable to modern stuff tho? i’m not even sure if they are referring to the same “SSAA”! cause i’m dumb!

oh hey i just purchased a bunch of new PC components in the last week or so. highlights include

16GB 3200 DDR4
Samsung 860 EVO 500GB SSD

so i’m pretty excited.

just not sure which card i’m gonna get. might just get a (cheap, hold-me-over) 580 for now and see what happens with nvidia (i prefer green when feasible)


oh yeah SSAA is great for anything that you’d traditionally want to upscale like 3D emulators, it’s just that the performance cost is incredibly high for anything that natively renders higher than that, by definition


Yuck, just naming them “wormy” makes me dislike this type of artifact more than I otherwise would


porting libiconv to System 6/7 is actually extremely cool


Temporal AA (TAA) is current best practice, but we’re seeing some low-quality implementations on consoles; the telltale artifact is ‘ghosting’ afterimages as things move in front of complex surfaces, especially water and shiny materials. Red Dead Redemption 2 in particular really suffers from this, although RDR 2 is notable for implementing many complex effects normally seen only at PC settings.


SSAA isn’t really antialiasing, it’s rendering at a high res and downscaling for your display resolution. So it’s exactly as expensive as ‘render at resolution x’. I ran a PS4 off a 720p plasma TV for years and was getting this – it looked higher resolution than PS3 games running at native 720p. I’d say SSAA gets you at least half the image quality of having a display that can also output at that resolution.
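To make that concrete, here’s a toy sketch of the idea (hypothetical code, nothing like a real renderer’s structure): sample a hard edge at N×N sub-positions per pixel and box-filter down. That’s all SSAA is, and it’s why the cost scales with the square of the factor.

```python
# Toy SSAA: evaluate a "scene" (a hard diagonal edge) at N*N sub-samples
# per pixel and average them down. Render big, box-filter down -- which
# is why the cost scales with N*N.

def scene(x, y):
    # hard-edged diagonal: 1.0 above the line y = x, else 0.0
    return 1.0 if y > x else 0.0

def render(width, height, ssaa=1):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(ssaa):
                for sx in range(ssaa):
                    # sample at the center of each sub-pixel
                    x = px + (sx + 0.5) / ssaa
                    y = py + (sy + 0.5) / ssaa
                    total += scene(x, y)
            row.append(total / (ssaa * ssaa))  # box-filter downscale
        image.append(row)
    return image

aliased = render(8, 8, ssaa=1)  # hard 0/1 staircase on the edge
smooth  = render(8, 8, ssaa=4)  # 16x the samples; edge pixels get gradients
```

The edge pixels in `smooth` come out as in-between values instead of a hard 0/1 stairstep, at 16x the sampling cost.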

MSAA is very clean and high quality (it makes extra samples of edges, basically) but it doesn’t work with modern deferred rendering engines; Forza has very static lighting (it runs a forward renderer because it doesn’t want lots of point lights but does want lots of transparencies) so it’s still an option, and a good one. Postprocess/shader-based antialiasing became a thing mostly because deferred rendering engines cannot use MSAA.
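For contrast, a toy sketch of the MSAA idea under the same hypothetical setup: coverage gets tested at several sub-pixel positions, but the expensive shading runs only once per pixel. That’s where the savings over SSAA come from (and why MSAA only helps geometry edges, not aliasing inside shaders).

```python
# Toy MSAA: coverage ("is this sample inside the shape?") is tested at
# four sub-pixel offsets, but the (pretend-)expensive shade() runs at
# most once per pixel. Illustrative only, not a real rasterizer.

SHADE_CALLS = 0

def shade(px, py):
    global SHADE_CALLS
    SHADE_CALLS += 1
    return 1.0  # stand-in for an expensive pixel shader

def inside(x, y):
    return y > x  # same hard diagonal edge as the SSAA sketch

def msaa_pixel(px, py, offsets):
    covered = sum(inside(px + ox, py + oy) for ox, oy in offsets)
    if covered == 0:
        return 0.0                         # fully outside: never shaded
    color = shade(px, py)                  # one shade call per pixel
    return color * covered / len(offsets)  # blend by coverage fraction

offsets_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
image = [[msaa_pixel(px, py, offsets_4x) for px in range(8)] for py in range(8)]
```

Edge pixels still get fractional coverage values, but shading cost stays at one call per covered pixel instead of four.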


came very close to throwing financing on some 9900k/aorus mobo/32gb ram guts today and then figured i actually don’t feel like rebuilding a computer during my last holiday of the year


Used monitor arms to put my two 16:9 monitors on top of each other just above my desk for a giant square-ish display. :+1:



Has anyone “shucked” an external hard drive for internal use? Best Buy has 8TB and 10TB 3.5" externals for $130 and $180 respectively. $16.25/TB is the best price I’ve ever seen.
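Sanity-checking the arithmetic, with the prices straight from those listings:

```python
# $/TB for the Best Buy externals mentioned above.
prices = {8: 130, 10: 180}  # capacity in TB -> price in USD
per_tb = {tb: price / tb for tb, price in prices.items()}
```

The 8TB at $16.25/TB actually beats the 10TB at $18/TB, so the smaller drive is the better deal per terabyte.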

I found this guide on how to fix the 3.3v pin issue that comes up on some PSUs.


I’ve done it at various times over the past couple decades, but I have noticed that externals are way cheaper than internals right now to an unusual degree; I suspect that’s because they have way lower reliability ratings and slower spindle speeds (presumably because those are two things that have been improving lately on higher-capacity drives for NAS branding and so on).

if it is in fact just a retail positioning issue, I guess I’ll buy one, I think I have a few molex-to-sata adapters already in the closet


Some people on reddit and slickdeals say they used to be straight-up Red-brand NAS drives, and then WD changed the label and they perform similarly.

Honestly all of my Important Shit fits in my university-provided Box account and this would be for games, ripped discs, and fansubs none of which are going to be tragic to lose.

They are definitely 5400 rpm but like you (I think?) pointed out re: battletech most games are just assuming you have an SSD for their data management so I copy things to and fro when I’m actively playing them.

Link to the drive in question for those interested:


yeah, I can’t be bothered to copy everything to an SSD to be played; I like to leave the games that clearly need it on the hard drive, so losing 7200 rpm is a dealbreaker for me

plus now that I’ve found out that consumer 12TB drives can actually do linear reads >200MB/s that’s already solidified in my mind as a feature I want
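Rough numbers on what that buys you (the 100 GB install size and the 120 MB/s older-drive figure are my assumptions, just to scale the feature):

```python
# How long a big linear copy takes at a given sustained read speed.
# Assumed sizes/speeds, purely for scale.
def transfer_minutes(size_gb, mb_per_s):
    return size_gb * 1000 / mb_per_s / 60

fast = transfer_minutes(100, 200)  # newer 12TB-class drive
slow = transfer_minutes(100, 120)  # assumed older/slower drive
```

Roughly 8 minutes vs 14 to shuttle a 100 GB install over to an SSD, which matters if you’re copying games to and fro like described above.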

but the last hard drive I bought was 3TB for $100 and that was almost six years ago and it looks like we’re not there yet on affordable upgrades, so