Tom's Hardware of Finland

I will say this for AMD: I still think it’s awesomely hilarious that they called their product THREADRIPPER

also look at this fucking router

3 Likes

are they ever going to make an ITX board that ditches the SATA ports? I just need two U.2 ports, and any M.2 slots they can fit in after that would be nice too

1 Like

Actually, they aren’t all sharing silicon fab tech. That’s important, trade-secret stuff; lots of R&D money goes into refining fab processes. And even where something is available to anyone, AMD doesn’t have as much money as Intel or Nvidia. Silicon fab techniques have also hit walls. It’s why we are only now getting Intel chips that can clock to 5GHz on air, even though they’ve been using the same basic architecture for a while. Hell, Kaby Lake and Coffee Lake offer zero performance gain per clock over Skylake, but they clock higher at the same power and ultimately top out higher when overclocking. That’s better silicon. Six-core Coffee Lake clocks higher than four-core Kaby Lake, even though it’s exactly the same core architecture. That’s better silicon too. Good silicon fab is where you get performance per watt nowadays. R&D can come up with whatever architectures it wants, but the silicon is the limiting factor. We’re to the point that die shrinks aren’t even helping that much.

AMD used to have their own fabs, but they had to sell them off for some quick money. Some people suspect that losing their fabs may have played into why AMD fell so far behind on CPUs; it costs more to do R&D through a third-party fab. Ryzen is a good architecture, but their silicon still doesn’t clock high. Hell, you have to bump up the power a lot just to break 4GHz on a Ryzen. (And Ryzen IS still a bit behind in performance per clock, so they have a dual problem to figure out.)

Similarly with Vega: it performs comparably to a GTX 1070/1080 at clocks similar to a 1070’s, so the performance per clock is pretty close (it’s actually situated better than Ryzen is). The architectures are comparable in performance. But to clock that high, to nearly match Nvidia’s clocks, AMD has to juice up the power a lot. That’s a silicon issue. In fact, the Vega 64 has a power-saver mode which drops off 86 watts (according to Techreport, a very reputable testing site) and only loses about 1% of its performance. They are squeezing every last clock from their silicon to achieve relevant clocks at the high end.
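A quick back-of-the-envelope on what that 86 W / ~1% trade-off means for efficiency. The ~295 W stock board power here is my assumption for illustration; only the 86 W savings and 1% loss come from the figures above:

```python
# Back-of-the-envelope perf/W for Vega 64's power-saver mode.
# The ~295 W stock board power is an assumed figure for illustration;
# the 86 W savings and ~1% performance loss are the numbers quoted above.
stock_power = 295.0                 # watts, assumed stock board power
saver_power = stock_power - 86.0    # 209 W in power-saver mode
stock_perf = 1.00                   # normalized performance at stock
saver_perf = 0.99                   # ~1% slower, per the Techreport figure

stock_ppw = stock_perf / stock_power
saver_ppw = saver_perf / saver_power
improvement = saver_ppw / stock_ppw - 1.0

print(f"perf/W gain in power-saver mode: {improvement:.0%}")  # → about 40%
```

With those assumed numbers, giving up ~1% of performance buys roughly 40% better performance per watt, which is the point: the last slice of clock is wildly expensive.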

Thermals are everything in laptops, consoles, and certain small-form-factor scenarios. But when we’re talking about high-end discrete graphics cards built for all-out performance, thermals aren’t really an issue. Don’t get me wrong, Nvidia has an amazing product, and indeed, if they wanted, they have the thermal room to do something even more powerful. But they won’t, so what does it matter? And that’s a problem with Nvidia: they don’t move the market, the market moves them. Intel was doing the same thing. Then Ryzen came out and is pretty good, and all of a sudden we have affordable six-core Intel i5s nearly two years sooner than their previous roadmaps planned.

And it’s actually looking like Vega’s performance per watt is a lot better at lower power; it’s at high-end clocks where it has to toss power efficiency out the window. That Intel/AMD Vega hybrid is said to outpace a GTX 1060 Max-Q. The 1060 Max-Q is 60-70W on its own; the Intel processor + AMD Vega package is 100W total. That’s pretty interesting. The Ryzen 2700U APU isn’t out yet, but leaked performance puts it just shy of Nvidia’s MX150. The Ryzen APU is a CPU + GPU targeted at 15W (25W ceiling), whereas the MX150 is a GPU on its own at 25W. So it appears AMD’s silicon is good at low watts but hits a wall when reaching for the high end, and requires a bunch of power just to crest that wall.

I’m not 100% sure what your little images mean, but [H]ardocp is pretty forthright about insider babble, particularly with AMD (because they really want to love AMD, but AMD has peddled a lot of PR bullshit the past few years). So, that’s where I got that info. I have no doubt that with new (hopefully better) management and all the money they stand to make, they can afford R&D to bridge these gaps.

I do want to see how much the “better at keeping pace at lower clocks” is borne out. I’m willing to humour that.

as for thermals, I think you’re still engaging in some wishful thinking. Like you said in rejecting the comparison between Intel vs. AMD and Nvidia vs. AMD, at least AMD can still scale up their GPUs, but that’s a truism of how GPU architectures work relative to CPUs; they’re in effect infinitely parallelizable, for however many GPU cores you can stuff into the wattage. Thermals are everything (and the thermals of laptop x86 cores at this point should be trivial, 1-2W each at most, which is why 100W vs. 60W still isn’t very good).

anyway in 2018 you can buy this notebook that has an Intel, an AMD, and an Nvidia GPU, just like it’s 2009 and you’re trying to drive six monitors at once so you’re stuffing in as many secondhand GPUs as you can for shits:

(I am not too pleased with the PC industry at the moment, but maybe by the end of 2019 I’ll get my 13" quad-core Macbook with 32GB of LPDDR4 and my teraflop iGPU the way I thought I would in 2016)

I think that’s just confusing wording:

Inside are 8th-generation Intel CPUs, and each model comes with Radeon RX Vega M GL graphics with 4GB of high bandwidth memory. The XPS 15 laptop has an optional Nvidia GPU (Intel HD Graphics 630 comes standard), so the new convertible is a better option if you want discrete graphics in a default configuration.

They’re distinguishing the new convertible form factor with the Vega graphics from the previous non-convertible, maybe-maybe-not-refreshed straight laptop, which had the Nvidia card as an option.

all this talk about thermals is reminding me of when i owned a G5

1 Like

Every router looks like a Lego starship crossed with a dead horseshoe crab nowadays

4 Likes

but if there was a riot I’d loot Best Buy and grab one maybe

have fun paying like, 4 grand for such a fat monstrosity

Well, the Vega APUs seem to be delivering. The Vega chip in that Intel hybrid is supposed to be better than the GTX 1060 Max-Q, for less wattage. Official reviews aren’t out yet, but even if it’s merely the same performance, that would be pretty cool.

Thermals and power usage are everything in laptops. On the desktop side, you have a ton more thermal headroom, thanks to larger heatsinks, fans, etc. Power usage, while definitely useful for reviews and marketing, isn’t really an issue for hardcore users buying $500 GPUs. I mean, it could be an issue for certain individuals, but my overall impression from the [H]ardocp forums is that power usage is not a big deal, because everyone is overclocking anyway. So as long as you can keep the GPU around 80C or less, it doesn’t really matter how much power it takes. But there is still a wall, and obviously AMD still can’t get a chip that performs like a 1080 Ti at 80C or less. They have a veritable wall in their silicon. As mentioned, the last 86W of power in the Vega 64 nets only about 1% additional performance. I see that a lot with overclocking.
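That “last few percent costs a fortune” pattern falls out of the classic CMOS dynamic-power relation, P ~ C * V^2 * f: higher clocks need higher voltage, and power scales with the square of voltage. A toy sketch, where every constant is invented for illustration (these are not real Vega numbers):

```python
# Toy model of why the last bit of clock costs disproportionate power.
# Classic CMOS dynamic-power relation: P ~ C * V^2 * f. All constants
# below are invented for illustration; they are not real Vega figures.
def dynamic_power(freq_ghz, base_freq=1.4, base_volt=1.0, volt_per_ghz=0.4):
    """Assume voltage must rise roughly linearly with clocks past base_freq."""
    volts = base_volt + volt_per_ghz * max(0.0, freq_ghz - base_freq)
    return volts ** 2 * freq_ghz  # arbitrary units; capacitance folded in

p_low = dynamic_power(1.4)   # modest clocks, base voltage
p_high = dynamic_power(1.7)  # chasing ~21% more clock needs extra voltage

print(f"{p_high / p_low:.2f}x the power for {1.7 / 1.4 - 1:.0%} more clock")
```

With these made-up constants, about 21% more clock costs about 52% more power; dial up the voltage slope and the curve gets even uglier, which is the shape of the wall being described.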

Eh, I don’t think it’s realistic to expect decent x86 cores at 1-2W. Current architectures are way too complicated and transistor-dense for 1-2W. Even Intel’s integrated graphics is another 4-6W. You’d probably have to underclock to something like 1GHz to get a 5W dual-core Intel mobile chip.

that’s literally what mobile Kaby Lake cores draw (and even that is like 2x Apple’s ARM cores which are nearly as powerful)! GPUs just are that much more powerful than CPUs now, for the same reasons of scalability.

One model of Kaby Lake is listed at 4.5W at 1.3GHz, so pretty much what I said. We aren’t at a point yet where silicon is refined enough to let a fairly fat dual core sip power that low, and frankly, we might never get there. 14nm took a while to reach, and it didn’t bring big improvements on the low end. And Kaby and Coffee have even been accused of being mobile-centric refinements. The real win for Coffee was adding cores and more frequency to the mid- and high-end desktop chips, at the same relative power. And we finally have realistic quad-core mobile chips.

I’m sure those are great. But until we see them running Windows or OS X, I’m not sure we can get relatively useful performance comparisons. iOS is undoubtedly more lightweight than OS X or Windows 10.

I would say that something like this could be amazing in a gaming console. But then I remember that Sony was trying to do cool stuff and everyone hated it, so now we are back to x86 in consoles. And with the current shape of the industry, I doubt we will be going back any time soon, not unless huge strides are made, enough to outweigh the ease and compatibility of keeping newer consoles on x86.

1 Like

holy shit it is so great that the power died at CES

4 Likes

https://www.youtube.com/watch?v=foLkyLkGUUo remember this shit

how can I poop in a smart toilet if there’s no power

Nothing will ever be sillier than the detachable Microsoft laptop with one GPU behind the screen and another GPU behind the keyboard though.

Convertibles sound good on paper but always turn out to be a bad joke in practice. Portable devices can’t really be more than one thing at a time without being crap.

I am still happy with my Haswell Venue 11 Pro!! But there was a very specific confluence of things that led me to get it: the price fell like a stone after 6mo on the market, because no one was buying Win8 devices then and refurbished models were super cheap; I didn’t want the exact same Macbook Air as my wife that year when we were both due a refresh; Dell’s build quality was abnormally high all of a sudden; it has microUSB charging, a generic standard that predates the still-ongoing USB-C mess; it has a removable back flap allowing easy access to the wifi/wwan/SSD/battery, which was taken off the Broadwell successor for no reason; it has a standard USB port despite being nominally a tablet; and its detachable keyboard has a Surface Book-style sturdy hinge rather than a kickstand, with only a second battery inside it, no other parts. It did lots of things that are still (feebly) news to PC outlets when a new product does them! And Arch/Gnome/the Linux kernel have had good support for the touchscreen and everything else on it pretty much since it released!

I will be ready to replace it as soon as Apple and Intel get their act together though.

2 Likes

particularly as it’s now like 2/5 as fast as my phone lol

2 Likes