GlobalFoundries abandoned development of their 7nm process, and AMD announced it’s moving to TSMC:
I’ve never read a really satisfying study of the economic effects when once-healthy market competition in some domain becomes impossible due to sheer technological complexity, or of what an appropriate response to that is, but I bet there are other historical examples
because only three companies having the ability to produce a huge portion of the world’s consumer goods (and, in effect, essential infrastructure) is obviously extremely bad but it’s not the kind of borderline market failure you can fix by just nationalizing something
I’d be interested to see it too.
We’ve seen interesting things play out in display tech as smartphone screens have shot past laptop and especially desktop displays, which stagnated much earlier than the chips inside them did.
An example is AT&T’s vacuum-tube-based phone network, which gradually subsumed all the regional competitors that were unable to manage long-distance service. The appropriate response was better technology that, in some sense, reduced the complexity of the system, plus breaking them up. If digital technology had not come along, I don’t see much alternative to either a continuation of the heavy regulation and price controls – or nationalization.
that still probably could have been nationalized, though, which isn’t to say that having the political will to break them up wasn’t more or less just as good at the time. chips are still a good rather than pure infrastructure, and a good used by the entire world with no replacement and no inevitable technical successor.
CERN buying their own 7nm fab would probably be the closest comparable solution (and is the kind of thing the Mitterrand government would actually have done, infrastructure-wise). That would at least make it equivalent to Airbus, which competes with Boeing, the worst kind of single-source, enormous-contract, obviously-should’ve-been-nationalized-ages-ago company – but even that is a) not ideal and b) obviously not going to happen in this case
China’s getting very serious about fielding their own entrant in the space, and we can expect theirs to be heavily subsidized. That’s at least five years away, though
true, the Chinese government would actually be willing to scale up to the point that they can compete with the established corporate actors. but it’s not likely that any other state or non-state entity could, which is still a grim outlook for the next decade+
the 15w ones apparently also have Meltdown fixes in hardware, so that’s another small point in their favour
Even if I still think we’re another refresh off of the good stuff
It finally happened: Apple got a much more significant boost to GPU than to CPU performance out of the newest iPhone chipset, despite a fairly significant advance in lithography (it’s not clear exactly how significant, because they claimed their 2017 models were on TSMC 10nm, which doesn’t seem to exist outside of that one product, so we don’t know whether that was a heavily optimized 16/14/12nm node or a very early 10/7nm one). So we might be seeing the beginnings of an ARM plateau. Funnily enough, if you take clockspeed and IPC into account, it’s actually very comparable to where x86 started to plateau, except with ~70% lower power consumption across the board.
So right now it’s not clear a) whether there will ever be any reason to make an actively cooled ARM chipset for, e.g., a higher-end MacBook, or b) whether we aren’t seeing a semi-universal soft limit on single-threaded performance. The good news is that Qualcomm has only ever been a generation or two behind, and at this rate will probably catch up the way AMD did.
Even Android Police is admitting there’s basically no reason to get a high-end Android now: https://www.androidpolice.com/2018/09/12/apples-cheap-iphone-xr-just-ate-googles-lunch/
At the same time I’m still not really tempted to get one. I like my Galaxy S8 even though I know what I paid for it is a bad deal in terms of hardware specs. I think the Android software is still way ahead in terms of app/system integration: obviously including the classic case of notifications, but also stuff like Chrome Custom Tabs are more featureful than Safari View Controller, and Google Maps PIP navigation mode works really well on Android O. And going from Qi/USB-C charging to Lightning would be a pretty painful downgrade for me.
I think I’m going to hold out for 5G handsets, assuming I can get a LineageOS build for this thing without random features broken
goog killed off inbox quietly during the iphone keynote so i assume the staff from that will be working on a new messaging app
curious, what kind of psu do you need for a top end gpu nowadays? my 1060 is happy with 450 watts but i assume i’d have to swap that if i wanted a beefier card in the future?
I’ve got a 1080 on 550w as well as an 8700k at a slight voltage increase and OC. No issues.
I learned that you want a single 12v rail at some point but I don’t remember why
I have a Titan X on 450w with a mildly overclocked 3570k, also fine
as long as you can provide most/all of that 450w as 12v and your CPU is <100w (mine peaks at ~60w, because that’s the point beyond which you start needing silly voltage leaps for every 100mhz), you should be OK
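side note on why those voltage leaps get silly fast: dynamic CPU power scales roughly with voltage squared times frequency, so a small clock bump that needs a big voltage bump costs disproportionate watts. a toy sketch of that scaling – the clocks and voltages below are made-up illustrative numbers, not measurements:

```python
# Dynamic CPU power scales roughly as P ~ C * V^2 * f, so the ratio of
# power between two operating points cancels out the constant C.
# All numbers here are hypothetical examples, not measured values.

def relative_power(v1: float, f1: float, v2: float, f2: float) -> float:
    """Approximate power at (v2, f2) relative to power at (v1, f1)."""
    return (v2 / v1) ** 2 * (f2 / f1)

# e.g. going from 4.0GHz @ 1.20v to 4.1GHz @ 1.30v:
# only ~2.5% more clock, but roughly 20% more power
print(round(relative_power(1.20, 4.0, 1.30, 4.1), 2))  # → 1.2
```

which is why there’s usually a clock past which each extra 100mhz stops being worth it.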
that’s cool, i should be fine then
not that i’m planning on replacing the 1060 any time soon honestly, still on a 1080p display
yeah, people who have half a foot in the computer-building world really blanch at the idea of not getting a gigawatt PSU for any “high-end” hardware, but that perception is largely driven by a combination of a) ridiculous enthusiast-level stuff from 5-10 years ago (when you’d have 150w+ of CPU and 500w of GPU) and b) people considering only total wattage and not 12v amps (plus the overhead needed on cheap PSUs – but the difference between a cheap PSU and a good one is like $20).
a good rule is “don’t buy a PSU unless at least 90% of its rated wattage is available on 12v, and don’t plan to use more than 90% of that 12v capacity across all the components that have published power-consumption figures.” beyond that you’re fine.
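that rule can be sketched as a quick check – the PSU and component wattages below are hypothetical examples, not real part numbers or measurements:

```python
# Rough PSU sizing check based on the 90%/90% rule above.
# The wattages used in the example call are made up for illustration.

def psu_headroom_ok(psu_rated_w: float, psu_12v_w: float,
                    component_watts: list[float]) -> bool:
    """Return True if the 12v rail covers the parts with sane margins."""
    # Rule 1: at least 90% of the rated wattage should be deliverable on 12v.
    if psu_12v_w < 0.9 * psu_rated_w:
        return False
    # Rule 2: planned draw should stay under 90% of the 12v capacity.
    return sum(component_watts) <= 0.9 * psu_12v_w

# e.g. a 550w unit with 540w on 12v, feeding a ~180w GPU and a ~95w CPU:
print(psu_headroom_ok(550, 540, [180, 95]))  # → True (275w vs 486w budget)
```

same check says no if most of the wattage lives on the 3.3v/5v rails, which is exactly the cheap-PSU trap mentioned above.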
rip iphone se T_T