Hardcore Will Never Die But Hardware Will

You might see if there was an extended holiday return window when you bought it, assuming you’d be willing to ride out the inevitable stock issues

Amazon has my 4070 Ti Super as returnable until Jan 31.

I’m not compelled at the moment, though. I’ve got 16GB of VRAM and a lower TDP than the 5070 Ti.

We’ll see what benchmarks show and what prices/availability are actually like when they hit shops.

I’m less interested in the expected performance uplift and more that they’re bringing back two slot cards

2 Likes

I’m still rocking an 8GB 3060 and a 60hz 4K/32" IPS monitor I bought two years ago, with the mindset that these are stopgaps, and first on the upgrade list when midrange stuff with a good cost/performance ratio comes out. Everyone was saying that video card is definitely not good enough, but I haven’t really noticed a problem even when I do play a bombastic 3d game like Armored Core 6.

I’m thinking I’ll wait another year or two before buying a 5070 Ti along with an RGB OLED monitor

1 Like

yeah, because of the steam deck and consoles, almost every modern game has to scale from like 2 to 80 shader teraflops now, and only Alan Wake 2 and Indiana Jones even have a hard requirement for RT/Mesh Shader hardware, which is like, a 2060+ (otherwise a 1070 from 8 years ago could still run everything).

120Hz is definitely nice, but if you look at the way that I, a 4090 owner, am getting there, it’s like… turn all the optional raytracing stuff on, forcing new releases to only be able to hit a 1440p60 render target even with all 80 of those shader teraflops, and then use the 4xxx upscaling features to turn that back into 2160p120. it’s clearly not strictly necessary, which is a good thing

3 Likes

If it wasn’t for the stock issues I’d consider returning my 4080 and getting a 5080 for the same price

also, Armored Core 6 came out on the PS4, so it hardly counts as having high requirements. but obviously, if that’s what counts as a big 3D game to you, that’s a good thing, because it means you do not need to upgrade

also, Japanese games have historically targeted like, the lowest-end current console (and PC equivalent) wherever possible, whereas other high end games tend to have slightly higher requirements on PC than on consoles (though again the steam deck is changing that)

4 Likes

It looks like the 5000 lineup is generally a 20%-30% performance uplift over 4000 at the same tier, but they achieved this by cranking up the power consumption.

Marketing is focusing on the AI frame interpolation stuff because the baseline performance uplift is underwhelming.

A game needs to already run well before generating frames is even beneficial, and it needs to be manually implemented by the developer.

I’m predicting frame interpolation will be marginally useful in a handful of titles for boosting 60 FPS to 120, mostly in games that don’t rely on reaction times. In two years it will be replaced by DLSS 5, which will require the 6000 generation.

4 Likes

I feel like Nvidia is playing a rhetorical game with the DLSS value proposition

One of these is extremely useful, and the other is not so useful. And it just so happens that only the latter requires an absurdly expensive GPU

8 Likes

happy that the only games i like to play nowadays are remakes of thirty-five-year-old jrpgs, and unreal engine 5 slop turns me off

7 Likes

with the exception of the 5090, which is really positioned at a different tier than the 4090 was (it’s a 512-bit-bus card instead of a 384-bit-bus card, and priced accordingly relative to the 4080), this is more like 5% tbh. the shader performance per watt is exactly the same
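for anyone who wants to sanity-check that kind of claim, the arithmetic is simple — here’s a back-of-envelope sketch, where the 25%/20% inputs are made-up placeholders, not measured benchmark numbers:

```python
# back-of-envelope perf-per-watt check; the inputs below are
# illustrative placeholders, not real benchmark figures
def perf_per_watt_uplift(perf_gain: float, power_gain: float) -> float:
    """Relative change in perf/W given relative perf and power changes."""
    return (1 + perf_gain) / (1 + power_gain) - 1

# a card that's 25% faster while drawing 20% more power is only ~4%
# more efficient -- most of the "uplift" is just extra watts
print(f"{perf_per_watt_uplift(0.25, 0.20):+.1%}")
```

i.e. if the headline performance gain and the power increase are close to each other, the architectural efficiency gain is near zero.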

it already pretty much does this (Indy goes from something like 50 to something like 90 on my 4xxx series); the new stuff is promising more like a 2.5-3x framegen boost rather than a 1.5-2x boost, which makes a lot less sense in terms of the implications for input latency or higher refresh rates, afaict so far
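the latency point can be made concrete with a little arithmetic — a sketch under the simplifying assumption that input latency is bounded by the rendered frame time (real pipelines add more on top):

```python
def presented_fps(rendered_fps: float, gen_factor: int) -> float:
    """Frames shown per second when each rendered frame is followed by
    (gen_factor - 1) generated frames."""
    return rendered_fps * gen_factor

def rendered_frame_time_ms(rendered_fps: float) -> float:
    """Input latency is bounded below by the *rendered* frame time,
    no matter how many generated frames are inserted between renders."""
    return 1000.0 / rendered_fps

# 2x vs 4x framegen on the same 50 fps base render rate: the fps
# counter doubles again, the responsiveness does not improve at all
for factor in (2, 4):
    print(f"{factor}x: {presented_fps(50, factor):.0f} fps presented, "
          f"~{rendered_frame_time_ms(50):.0f} ms per rendered frame")
```

so going from 2x to 4x generation only inflates the presented number; the game still samples your input at the same 50 fps underneath.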

1 Like

I was extrapolating from the 5090 and being generous because I can’t find numbers on the other cards. But yes, this looks even more dire than I was thinking.

And yeah, extremely niche use case for boosting AAA singleplayer games anywhere above 120fps.

genuinely unsure if you mean the lighting or the subpixel array

The latter, that’s why it’s on my “wait another year or two” list

2 Likes

Why not get a QD-OLED with no W subpixel? I can’t notice any text-rendering difference from regular RGB on my new 32" 4K, and the color luminance is much better than on my WOLED TV

1 Like

I’ll consider that. I’d have to go to a store with one on display to figure out whether “can’t notice any text rendering difference” is also true of me. It really has to be no discernible difference or I’m not getting one. I read text on my computer screen a lot more than I play videogames

1 Like

:cherry_blossom: Vengeance :cherry_blossom:

[image: CES2025_Image6_CCLDRAM_copy]

11 Likes

I bought another ThinkPad T440p because it was cheap and because I damaged the screen of my daily driver the last time I switched drives for Windows, so I might as well just have a dedicated Windows T440p for taking tests (until October at least)

2 Likes

weak ram


3 Likes

imagine adding that RAM to your SONIC mobo …

:kissing_closed_eyes: