Tom's Hardware of Finland


F’real. This is the only one I owned and it had cracks all over it after like a year and a half of not even vigorous usage. Then Apple wanted me to pay for an OS upgrade to be able to use something and I checked out of that ecosystem.


idk if it’s possible to directly link to the listing but EVGA put this thing up on its b stock listing for $250:

EVGA GeForce GTX 1070 GAMING, 08G-P4-6571-RX, 8GB GDDR5, iCX - 9 Thermal Sensors & LED G/P/M

  • 1506MHz GPU Clock
  • 8192MB GDDR5 Memory
  • 8008MHz Memory Clock
  • 256.3GB/s Memory Bandwidth
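For what it's worth, the bandwidth figure checks out from the other numbers: effective memory rate times bus width in bytes gives bytes per second. A quick sanity check (the 256-bit bus width is my assumption from the 1070's reference spec, it's not in the listing):

```python
# Sanity-check the listed memory bandwidth from clock and bus width.
# The 256-bit bus width is assumed (GTX 1070 reference spec), not from the listing.
def mem_bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Effective memory rate (MT/s) * bus width (bytes) -> GB/s."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

print(mem_bandwidth_gbs(8008, 256))  # ~256.3 GB/s, matching the listing
```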

This… seems… good? But I can’t tell if there’s a catch somewhere in these specs; bc of learned helplessness I still don’t understand how to parse video hardware


titanium powerbook was the best mac innit


Yes, a 1070 will handily do 1440p or 1080p at 60hz and can push higher framerates if you compromise on graphics quality. In March I got two 1070s used locally (can’t remember if it was FB or Offerup) for $200 each, but they were the slightly inferior blower designs. $250 seems alright, and you’re better off with that than any of the 1660 nonsense.


a 1070 for $250 or less is still a great buy right now imo


I checked “deals” on 2060s and they’re $330+ at best so yeah, 1070. You get an extra 2GB of VRAM, too.

How’s that Ryzen working out for ya, @toups?


I’m extremely happy with it and all my other new parts. Thank you again for helping me out :)))



I’m worried too. At this rate, AMD might not be able to find and implement enough security vulnerabilities to overcome Intel’s HUGE architectural design lead.


tbh I’ve always thought HT was overrated anyway; as usual the big story is going to be about what VPS providers have to do in order to cope with this

also I’m sure these flaws are present in AMD too which they will find out when their marketshare deservedly rockets upward in the second half of this year


Well, so far AMD has had hardly any vulnerability issues.

Also, hyperthreading is actually pretty useful. Now that we have some fairly widespread threading, the usefulness has become apparent.

Even in fairly low-thread situations. I’ve seen so many people claim smoother experiences in Windows with SMT turned on, I can’t discount the validity. Also, there are several games nowadays which have microstutter for many people on 4, 6, and 8 core CPUs, unless SMT is turned on. Again, with so many instances of people talking about it, I can’t deny it. Similarly, very high-framerate gamers (100fps+) often report SMT helping them ensure higher minimums.

Content creation also sees boosts.

and while we don’t have any tests, I have to imagine a high-utilization server would benefit a bunch. Even though there are already a couple dozen cores or whatever, doubling the threads is probably a huge boon in those cases.


it’s definitely bad then


ok so there are two of these that are nearly identical in specs and price. one of them uses ACX cooling, which apparently has overheating issues that can be easily fixed with a BIOS update (apparently), and has a “clock boost” feature. the other uses the newer standard, iCX, but no clock boost.

do either of these things matter very much?


I’ve never comparison shopped for a GPU across vendors because I’ve always gotten either the smallest one or the reference design from Nvidia

I tend to write it off as woo; GPUs will naturally clock up much higher than many people seem to think is acceptable and it’ll perform +/-5% regardless


this is typically an “anal retentive synthetic benchmarker” case imo, just let the scheduler do its thing

90% of these workloads should be GPU accelerated instead if they need that many threads


I know this here thread is Nvidia turf but Newegg has been selling refurb reference Vega 64s for 300 bucks for the past few weeks and that thing can be tuned to ~1080 performance


i beg to differ!
wait a sec, i’ve just found out that the Athlon 200 really doesn’t support ECC ram and now i have to buy another mobo+CPU. great, i better delete this post before hitting reply


feeling real good about shelling out the extra 30 bucks for the 2200G over the 200 chips at the store

it’s going to be even better once The Plan goes into effect. eventually.


butbutbutbut it’s all about shaving off that pesky 30 watts!

… for a change, i’m not going to link my VEGA64 screenshot with that 248W number, i’ve been proudly linking it too often already, heh.


I need to actually go back and mess with overclocking the 2200G. I had it sitting comfortably at 3.9 GHz on the stock cooler with a decent voltage, but poking through the board’s settings I found out that ASRock just defines the overclock as the default P-state, and I could, in theory, make a set of custom P-states that keeps the default AMD curve but with a max of 3.9-4.0 GHz

but that’s work
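For reference, a custom Zen P-state is specified as an FID/DID pair, and as I understand it from ZenStates-style tools the core clock decodes as FID / DID × 200 MHz. A minimal sketch (the 0x9C/0x08 example values are mine, picked because they land on the 3.9 GHz mentioned above, not taken from the post):

```python
# Decode a Zen P-state's core clock from its FID/DID pair.
# Formula per ZenStates-style tools: (FID / DID) * 200 MHz.
def zen_pstate_mhz(fid: int, did: int) -> float:
    """Core clock in MHz for a given frequency ID and divisor ID."""
    return fid / did * 200

# Hypothetical example values: FID 0x9C (156) with DID 0x08
print(zen_pstate_mhz(0x9C, 0x08))  # 3900.0 MHz, i.e. the 3.9 GHz above
```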


That last 100MHz on Ryzen is a big power suck anyway.

What about overclocking the GPU side? I’ve seen some pretty great results on those.