Tom's Hardware of Finland

my take on GPU overclocking is that there’s an R9 290 in the case and the system is set up to use switchable graphics (iGPU for the desktop, video players, Retroarch and select emulators, and light games; dGPU for everything else), so there’s no point

as a bonus, the BIOS had an option to straight up not send the dGPU power when it’s not in use, so I’m at maximum power savings and maximum laziness
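for the app-by-app split itself, here’s a minimal sketch of how I understand Windows 10/11 stores per-app graphics preferences (the same thing the Settings > Graphics page writes). The registry path and the exe paths are assumptions on my part, so treat it as illustrative rather than gospel:

```python
# Hedged sketch: pin individual apps to the iGPU or dGPU by writing the per-app
# graphics preference used by Settings > Graphics on Windows 10 (1803+)/11.
# Assumption: preferences live under HKCU\Software\Microsoft\DirectX\UserGpuPreferences;
# the exe paths below are made up for the example.
import winreg

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
POWER_SAVING = "GpuPreference=1;"      # usually the iGPU
HIGH_PERFORMANCE = "GpuPreference=2;"  # usually the dGPU

def set_gpu_preference(exe_path: str, preference: str) -> None:
    """Record a per-app GPU preference for one executable."""
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, preference)

set_gpu_preference(r"C:\RetroArch\retroarch.exe", POWER_SAVING)   # light stuff on the iGPU
set_gpu_preference(r"C:\Games\some_game.exe", HIGH_PERFORMANCE)   # heavy stuff on the dGPU
```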

What’s the power/thermal difference like in coaxing a desktop not to use the dedicated GPU for light tasks, when the GPU is near-idling? I know it’s important on laptops, but on desktops I figure it’s still reasonably efficient at low-power states, so I leave the GPU to do its thing all the time.

Note: the lower temps while OC’ing are because they changed the fans to run at 100% during full load for their overclock.

For an Intel CPU, the Intel graphics have a TDP of 6 watts (correction: 15 watts max) and can increase CPU temp by as much as 10 degrees at full utilization. But a more typical temp rise is 5-7 degrees.

So you are looking at savings of about 60 watts (at idle) and sparing your computer’s internal ambient air from dealing with the extra air coming off your GPU at 20-30 degrees Celsius. That’s idle temps. Presumably a “light” task which actually uses the dedicated GPU would heat up more and use more power, but not as much as gaming temps.
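If you want to put a yearly figure on it, the arithmetic is simple enough to sketch. Every number below is an assumption for illustration (the rough figures from this post, plus a guessed electricity price and usage pattern), not a measurement:

```python
# Back-of-the-envelope savings from letting the iGPU handle idle/desktop duty.
# All numbers are assumptions for illustration, not measurements.
DGPU_IDLE_W = 65       # assumed idle draw of the dedicated card
IGPU_EXTRA_W = 5       # assumed extra package power with the iGPU driving the desktop
HOURS_PER_DAY = 8      # assumed time the machine sits at/near idle
PRICE_PER_KWH = 0.15   # assumed electricity price, $/kWh

saved_watts = DGPU_IDLE_W - IGPU_EXTRA_W
kwh_per_year = saved_watts * HOURS_PER_DAY * 365 / 1000
print(f"~{saved_watts} W saved -> ~{kwh_per_year:.0f} kWh/yr "
      f"(~${kwh_per_year * PRICE_PER_KWH:.2f}/yr)")
```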

In the case of hardware encoding video, the dedicated GPU is going to get it done faster than the graphics chip on your CPU, even though it will use a lot more power while it’s encoding and exhaust more heat. New 8-core CPUs might even beat the CPU’s graphics chip in encoding time. But again, more power and heat.

But for, say, hardware decoding 4K video, your Intel graphics are going to be a much more efficient choice for the same user experience (you get to watch a 4K video without bogging down your whole computer). Similarly with an AMD APU.
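If you want to compare the two decode paths yourself, a rough sketch through ffmpeg is below. It assumes your ffmpeg build was compiled with QuickSync (qsv) and CUDA/NVDEC support, and the clip name is made up:

```python
# Hedged sketch: decode the same clip with the iGPU (QuickSync) and the dGPU
# (NVDEC via CUDA) and discard the frames, just to compare power/utilization.
# Assumes an ffmpeg build with qsv and cuda hwaccels; "clip_4k.mkv" is a placeholder.
import subprocess

def decode_only(hwaccel: str, src: str = "clip_4k.mkv") -> None:
    """Hardware-decode a clip and throw the output away."""
    subprocess.run(
        ["ffmpeg", "-hwaccel", hwaccel, "-i", src, "-f", "null", "-"],
        check=True,
    )

decode_only("qsv")   # Intel iGPU path
decode_only("cuda")  # NVIDIA dGPU path
```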

However, I’m unsure how common the feature is to completely deactivate and power down your dedicated GPU and profile it to only wake up for certain tasks. That sounds like an expensive motherboard feature, but I don’t know. My newest motherboard is 3 years old and was a mid-range model (it doesn’t have that feature). It could be common on the newest stuff.
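As an aside, on a Linux box you can at least see whether the kernel has runtime-suspended a card, via the standard PCI runtime power management files in sysfs. A minimal sketch, assuming a placeholder PCI address (find the real one with lspci):

```python
# Hedged sketch: read the standard Linux PCI runtime-PM state for the dGPU.
# The PCI address is a placeholder; look yours up with `lspci`.
from pathlib import Path

DGPU_PCI = "0000:01:00.0"  # placeholder PCI address of the dedicated GPU
power_dir = Path("/sys/bus/pci/devices") / DGPU_PCI / "power"

status = (power_dir / "runtime_status").read_text().strip()  # "active" or "suspended"
control = (power_dir / "control").read_text().strip()        # "auto" lets it suspend, "on" keeps it awake
print(f"dGPU runtime PM: status={status}, control={control}")

# To let the kernel suspend it when idle (needs root):
#   echo auto > /sys/bus/pci/devices/0000:01:00.0/power/control
```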

Yeah, I’ve read enough GPU reviews that I shouldn’t have glossed over that order-of-magnitude difference in power consumption at ‘idle’. That’s pretty crazy, and I’m surprised there’s not a lot of ground to be made just rendering the desktop with these cards.

I don’t think a fixed-function dGPU video decoder is going to be any less efficient than an iGPU video decoder, and it’s definitely more efficient than doing it on the CPU…

Apple approved the Steam Link app! All it took was losing at the Supreme Court lol

Intel’s integrated graphics chip is 6 watts max TDP. A dedicated card puts out more heat than that at idle. In terms of heat and electricity, it’s more economical to decode/watch video with your Intel graphics.

I don’t think that’s true across the board (an HD 4000 could use up to 20w; it may just be that because we’ve been stuck on the HD 630 or whatever for so long, people think they have a static measurement), and importantly I don’t think Intel can make fixed-function hardware that’s any more efficient than anyone else’s. Unless we’re assuming a scenario where a dedicated GPU can be shut off entirely (which I don’t think is even possible in Optimus-type notebooks, let alone desktops), I think comparing idle power is functionally a wash, and comparing PureVideo vs QuickSync I’d expect Nvidia to have the edge in utilization if anything.

Indeed I was wrong about the TDP. HD 630 is 15 watts max.

A GTX 1060 idles at twice that, before it even starts any work. The HD 630 is the more economical way to watch your 4K video or whatever.

my Titan X Pascal idles at like 10w, so I don’t think the 1060 could possibly be more than that

Intel graphics are no doubt still better than that but at that point we’re down to numbers that really only matter for notebook battery, which is a reason to not want a dedicated GPU in a notebook (which I agree with!) but not a reason to avoid using a GPU for video decoding if you have one.
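For anyone curious, the idle-wattage numbers are easy to spot-check on an NVIDIA card, since nvidia-smi can report the board’s current power draw. A tiny wrapper (parsing assumptions mine):

```python
# Quick check of idle board power on NVIDIA cards via nvidia-smi's CSV output.
# Assumes nvidia-smi is on PATH and the driver exposes power.draw.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,power.draw", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    name, watts = line.rsplit(",", 1)
    print(f"{name.strip()}: {watts.strip()} W right now")
```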

I guarantee the other has a boost clock, as it’s part of the design. I mean, the reference Nvidia design has a 1683 MHz boost clock. They just don’t have it listed because they are dumb. And the boost tech isn’t a hard-set number; it’s dynamic. With good cooling, they are known to average over 100 MHz faster than 1683.

I imagine since the ACX here is a refurb, they probably added the mods. However, I would pay the $10 premium for the ICX. It should run relatively cooler and quieter.

I guess the ACX either had bad VRMs or a poor cooling design for the VRMs. The “fix” was to add different thermal pads and then change the BIOS so that the fan curve ramps up faster, sooner. So, louder. The ICX is an improved design.

That ACX is an “FTW” model, which might be binned chips (specially selected chips) with high overclocking potential. But Nvidia cards generally overclock pretty well anyway.

Man I am stepping on my own toes today.

I stated the idle temp as the idle wattage. What a dummy.

this is a great discussion and all but I did it pretty much because I could

we do the things we must because we can

also newer decode engine

To get back to this topic: regardless of our opinions about SMT usefulness on a home desktop, this is a huge deal for the server world. All those extra threads provide real performance, especially since those CPUs are equipped with the extra cache to properly address the extra threads.

But it still is at least a huge marketing failure for home CPU users, as SMT/Hyperthreading is one of the key features you pay for with i7 and i9 CPUs. The key differences between an i5 and an i7 are more cache and SMT/Hyperthreading. This is gonna piss people off, at home and in the server business. Any situation where hyperthreading is forcibly turned off sorta cuts your CPU down to an i5. That’s hundreds of dollars switched off.
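On the Linux side you can see exactly where a box stands; the kernel has exposed SMT state under /sys since 4.19, so a tiny sketch like this tells you whether your i7 is currently acting like an i5:

```python
# Hedged sketch: read the kernel's SMT status files (Linux 4.19+).
from pathlib import Path

smt = Path("/sys/devices/system/cpu/smt")
active = (smt / "active").read_text().strip()    # "1" if sibling threads are online
control = (smt / "control").read_text().strip()  # "on", "off", "forceoff", or "notsupported"
print(f"SMT active={active}, control={control}")

# Turning it off at runtime (root): echo off > /sys/devices/system/cpu/smt/control
```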

a) does this affect an 8700k?

b) googling “can i waive class action claims by opening a box in 2019”

It looks like everything from Sandy Bridge to the newest 8th and 9th gen is affected.

8th and 9th gen do have some mitigations already in place, built into the CPU hardware. But it looks like they will still require microcode and software updates to be made more acceptably secure. And some software vendors are recommending disabling Hyperthreading (Ubuntu) or even forcing it off (Chrome OS, OpenBSD). macOS now has an option to disable HT without rebooting to the BIOS.
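If you’re on Linux and want to see how your own machine reports these (MDS/ZombieLoad and friends), the kernel publishes a status file per vulnerability, and reading them is harmless:

```python
# Hedged sketch: print the kernel's per-vulnerability mitigation status (Linux).
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    # e.g. "mds: Mitigation: Clear CPU buffers; SMT vulnerable"
    print(f"{entry.name}: {entry.read_text().strip()}")
```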

Here is a reasonably understandable article on it, which confirms that AMD is not vulnerable (or at least has not yet been demonstrated to be vulnerable). Same with ARM-based chips (cellphones/tablets):

** At AMD we develop our products and services with security in mind. Based on our analysis and discussions with the researchers, we believe our products are not susceptible to ‘Fallout’, ‘RIDL’ or ‘ZombieLoad Attack’ because of the hardware protection checks in our architecture. We have not been able to demonstrate these exploits on AMD products and are unaware of others having done so.

For more information, see our new whitepaper, titled “Speculation Behavior in AMD Micro-Architectures.”

I run my 3570K with meltdown mitigations off (since it’s one generation before the x86 instruction enhancement that makes them less bad) without a care in the world knowing that I have ublock to catch any malicious javascript and any other attack would be way too hard to target, fwiw

Ok, it looks like it’s 10-17w idle for modern desktop GPUs, which was my basic assumption, which is to say, close enough that I’m not going to notbov it and just let it ride

don’t you verb me

other perks of the dumb setup I did:

-I’m lazy and can just plug into the motherboard’s video ports
-when I get a new GPU for the main computer, my 980ti goes down and gets Freesync through the Vega hookups
-the 2200G will get shoved into its own system eventually, where it will be a dumb Linux/emulator box
-the iGPU can do 4k/HDR (though I hear HDR on Windows is a bad time)
-my R9 290 is literally a ticking timebomb that can crash the system at any moment unless I downclock the memory and use it as little as possible, and not powering it gets around a lot of things, and also, do I really need the thing on to run a shader or two in RA?
-I like playing with the 2200G (fuck man, running RPCS3 with Vulkan on it gives me a good time)

ultimately everything is in service of tinkering
