ohhh NVMe 5 is pretty cool I gotta say
that plus infinite ram (now with the correct optimized CL30 3000 UCLK timings) is really making this feel pretty lightning quick
Uuuhhh don't make me second guess my decision to—
TOO LATE
j/k, what i planned to do this year is put a 4TB NVMe into my VEGA machine for dumping games onto it and remove the still-external 2.5" drive i've been using as a stop-gap solution. Everything else is optional stuff, like a replacement for the V64 that marches on for AGES … AGEV i mean
the one thing with NVMe 5 drives is that they seem to run incredibly hot compared to NVMe 4, to the point where they pretty much need active cooling, and half of the released ones have negative reviews from people going "wtf, why does this disk need active cooling." my motherboard (the only x670e ITX board that was even on the market 2 years ago, and for all I know there hasn't been a newer one released for Zen 5) was actually prepared for this: there's a tiny VRM fan adjacent to the NVMe 5 slot (which is stacked in a giant heatsink under the NVMe 4 slot). once I figured out how to fit the drive in there, it works great, but I think these are going to be extremely niche products.
uh-oh.
is that by chance the
because that pre-fitted stacked NVMe tower heatsink is the only reason i would even begin to consider putting two drives inside that stack.
yup
damn
OK, taking bets now on what will happen first: upgrading to an NVMe 5 drive or replacing the VEGA 64 with a FFF replacement
Joke's on me that i will probably know sooner (because it is EASIER TO FIND OUT) whether these
fit on my UX than finding a GPU that fits into that case and isn't a downgrade, or a mere upgrade by a puny 5% or so.
Yes, i consider rims hardware for the sake of it …
update from using the NVMe 5 drive for gaming: I need to adjust the fan curve. and I'm very glad I went with NVMe 4 for my C: drive, because the NVMe 5 should ideally only be used when the GPU fan is louder than the VRM fan. Ok, fair enough.
I just preordered the base model Galaxy S25.
Tech reviewers are performatively bored with it, but from my POV it's like 50% more RAM, (benchmarks TBD but hopefully) much faster CPU, satellite texting support, Wifi 7, and the launch price has crept back down to $750 (for a while there, Galaxy flagships were launching at over $1000). What's not to like?
I had been seriously considering getting one of those Garmin satellite text widgets in case I get stranded hiking in a cellphone signal dead zone. I decided to wait to see if more phones started to get the feature this year instead, and that seems to have been a good call.
(Rest in peace, another of Garmin's lines of business)
Cool, looks like blackwell is kinda whatever, same-to-worse perf/watt than the current cards. Sounds like supply is going to be pretty bad, the joke going around CES was "every 50xx in existence is in vegas right now."
yup, this is the first GPU generation in history in which shader performance hasn't improved at all, which already appeared to be the case 2 years ago when they had no roadmap
5090 form factor is impressive but thatās about it, I would still rather have the overall power/memory envelope of a 4090 (which I conveniently do), and it seems like no one else is investing in those kinds of cooler innovations yet
they aren't gonna have a meaningful shrink past TSMC 4nm ready for more like 4 years than 2, either, so that's kind of a wash too. the 4000 series was really good, never mind the lineup
best case scenario is that AMD non-shader performance catches up
I've been cursed with finding out someone made a CUDA-based JPEG2000 encoder, and now I'm running the numbers in my head: 200 bucks plus the cost of a GPU versus the time I would save encoding packages if I'm getting 80-150+ FPS instead of 16-28 FPS, a thing entirely dependent on people not actually following the directions of whatever film festival they're sending films into
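the back-of-envelope version of that math, with made-up placeholder numbers (runtime, GPU price, and hourly rate are all assumptions, not real quotes):

```python
# Rough break-even math for the CUDA JPEG2000 encoder idea.
# Assumptions (placeholders): a 100-minute film at 24 fps,
# CPU encode at 20 fps vs GPU encode at 100 fps,
# $200 for the software plus an assumed $500 used GPU.

def encode_hours(runtime_min: float, film_fps: float, encode_fps: float) -> float:
    """Wall-clock hours to encode a film of the given runtime."""
    frames = runtime_min * 60 * film_fps
    return frames / encode_fps / 3600

cpu = encode_hours(100, 24, 20)    # 2.0 h per package on CPU
gpu = encode_hours(100, 24, 100)   # 0.4 h per package on GPU
saved_per_film = cpu - gpu         # 1.6 h saved per film

outlay = 200 + 500   # software + assumed used-GPU price
hourly_rate = 50     # assumed value of an hour of your time
films_to_break_even = outlay / (saved_per_film * hourly_rate)
print(f"{saved_per_film:.1f} h saved per film; "
      f"break even after {films_to_break_even:.0f} films")
```

obviously the real answer swings wildly with how many non-compliant packages actually show up per festival season.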
this right here is why itās hard to recommend AMD. you do find a use for this stuff one way or another
oh god no, I'm not dragging films back and forth between work and home, you don't mix business with pleasure
instead I can scoop up the glut of Pascal server and professional cards and have a specialized system just for that
Yeah I thought I didn't care about CUDA but then realized my favorite OCR utility for Japanese videogames uses it (not sure how slow it would be in CPU mode, but it gives me near-instant dictionary lookups from screenshots as is).
I also in theory don't much care about AI, but this utility is also a transformer model I use every day. And yeah, I have to admit it's way more reliable than traditional OCR algorithms. People love to complain about AI hallucinations, but you should see what sort of gibberish non-AI-based OCR used to hallucinate sometimes
had to revert Windows 11 24H2 which it pushed to me this week because it was adding really weird frame pacing issues to all accelerated video on my iGPU, no idea how that hadn't been caught in testing so far
luckily the revert actually worked super gracefully and in 10 minutes flat which is better than I expected
I imagine the experience of trying to rewrite any low-level systems in Windows is very unfulfilling because of stuff like this. Microsoft keeps acting like they don't want to be the company that supports e.g. mixed GPU driver combinations and therefore shouldn't have to regression test them before pushing updates, but here's the thing, fellas: that's actually the only reason I don't just use Macs for everything. you need to maintain your 30 years of bullshit edge cases or you are useless to me
i finally upgraded to 11 and all I got for it was the phone UX and washed out HDR in cyberpunk. thanks, obama
you're looking at it the wrong way
now you can have washed out HDR in any DX11 or 12 app
Not that I strongly disagree with anything in this article, but I'm still mad that they published a 1000-word article about lightbulb specs without even mentioning CRI.
There was a perfect opportunity to segue into that in the paragraph about good light helping you chop onions safely (higher CRI is what makes colors of objects distinguishable from each other, not color temperature)