or did you mean to type 4k or 2560x1440? because afaik there’s nothing out yet that can accept a >1080p signal and do a >60hz display, because it requires a newer displayport standard than any GPUs are currently shipping with to handle all that bandwidth (and also because the hardware basically doesn’t exist to drive games at such high resolutions and framerates unless you’re relying on gsync to even stuff out anyway and at that point you arguably don’t need 120hz).
I guess I wouldn’t mind having an environment that could switch between a 4k 60hz desktop and a 1080p pixel-doubled 120hz game but then again I’m still using a 1680x1050 + a 1280x1024 display side by side because the price gap between “decent monitor” ($100, 1080p) and “really cool monitor” is still really wide.
also because I’m already pushing the limits of “hardware that’s functionally better than linux can make use of”
There exist 27 in. 2k (1080p or 1440p) 120-144hz monitors, but none in 16:10 because fuck you, productivity is for 60hz. The most I usually see for monitors in not-16:9 aspect ratios is 75 hz. Funnily enough, LG seems to be having a firesale on their current 21:9 displays to make room for newer models, but again, those top out at 75 hz.
Also, current GPUs can drive 4k60hz over DP 1.2 (DP 1.2 has been shipping on cards for a few years now; my GTX 760 is from 2014 and has it) and HDMI 2.0 (assuming they have the relevant connections, of course). So, in theory, you could buy one of those cheap and probably sketchy Seiki 4k sets and use that at 4k60 for desktop and 1080p120 for games, assuming you had an HDMI 2.0-capable card (which I think is literally just newer Nvidia cards).
oh, I didn’t think of plugging in to the same monitor with both DP and HDMI depending, that’s kind of silly but I guess it would work.
calling 2560x1440 or 2560x1600 “2k” is kind of silly. is that a real thing? logically 1080p is a lot closer to being “half” of 4k.
I got my PS4 from one of those fell off the truck amazon warehouse things while the canadian dollar was in free fall the week before bloodborne came out, still no idea how I managed that, I just thought to myself as I was coming out of the shower one day “maybe I’ll check”
All I know is Newegg has 2560x1440 monitors under the nebulous descriptor of 2k, even though 2k doesn’t even refer to 1080p to begin with (cinema 2k is 2048x1080) so why not use it.
I think I wasn’t quite clear; DP 1.2 and up can drive 4k60 by itself and so can HDMI 2.0 (1.4 tops out at 4k30 IIRC). DP 1.3 is going to be even crazier and will outpace both reasonably available tech and consumer demand and spending (4k at 120hz, 5k (don’t ask me) at 60hz, 8k at 30hz or 4k at 60hz with HDR support).
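For a rough sanity check on those mode/link pairings, here's a back-of-envelope bandwidth sketch. The link rates are the effective post-8b/10b figures (DP 1.2 HBR2 ≈ 17.28 Gbps, DP 1.3 HBR3 ≈ 25.92 Gbps), and the ~7% blanking overhead is an approximation for reduced-blanking timings, so treat the output as ballpark numbers, not spec-exact ones:

```python
# Back-of-envelope: uncompressed video bandwidth vs. DisplayPort link capacity.
# Link rates are effective (after 8b/10b encoding), not raw line rates.
DP12_GBPS = 17.28   # DP 1.2 / HBR2, 4 lanes
DP13_GBPS = 25.92   # DP 1.3 / HBR3, 4 lanes
BLANKING = 1.07     # rough overhead for reduced-blanking (CVT-R2-ish) timings

def gbps(width, height, hz, bpp=24):
    """Approximate link bandwidth a mode needs, in Gbps (8-bit-per-channel RGB)."""
    return width * height * hz * bpp * BLANKING / 1e9

modes = {
    "1080p120": (1920, 1080, 120),
    "4k60":     (3840, 2160, 60),
    "4k120":    (3840, 2160, 120),
}

for name, mode in modes.items():
    need = gbps(*mode)
    print(f"{name}: {need:.1f} Gbps needed "
          f"(DP1.2 {'ok' if need <= DP12_GBPS else 'no'}, "
          f"DP1.3 {'ok' if need <= DP13_GBPS else 'no'})")
```

Running it shows why the thread shakes out the way it does: 1080p120 and 4k60 each fit comfortably inside DP 1.2 (just not both tricks on one cable at once as a single 4k120 stream), while 4k120 only squeezes in once you have DP 1.3's link rate.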
oh I see. so there’s currently existing GPU technology (that I apparently own) that can handle the “4k 60hz desktop / 1080p 120hz gaming” concept, there just aren’t any monitors that support both of those things that aren’t in huge TV form factors, and there probably won’t be any in wide production until DP 1.3 is available to support both 4k and 120hz simultaneously, at which point it’s a question of whether those monitors will have “legacy” support for DP 1.2 to support those two other display modes, as well as whether their scalers will just do straight up pixel doubling of 1080p to 4k (since I’ve heard some ostensibly high-end monitors manage to make a smeary mess of even that basic task), and finally whether windows and/or linux will be able to gracefully switch between the two when games are launched.
oh and I guess they have to have gsync too
so this hypothetical product will probably be available at a price I’m willing to pay for it in what, like, three years?
God I hope G-sync dies, and I say this as someone who’s spent a bunch of time on the green team and thinks it’s probably better anyway. Proprietary tech does favors for no one, especially when AMD’s solution is open and part of the VESA standard anyway (never mind that both G-sync and Freesync are absurdly expensive, with Freesync being slightly less absurdly expensive).
Anyway, what I think is the current monitor tech hotness is backlight strobing to reduce motion blur, mostly because it’s only available on monitors with sane resolutions
yeah I think the final death knell for my vaguely-being-endeared to AMD from the 9600 days (other than the state of their linux drivers) was when Nvidia started to make reasonably-sized, power efficient cards after years of WHAT’S THIS BIG HONKIN THING, and because AMD has had a hard time actually developing more efficient architectures they just keep piling it on lately
Honestly, it wasn’t too loud until I saw core clocks were dropping at max utilization/high temps, implying some thermal throttling, at which point I was like “Okay, let’s mess around with a custom fan profile” which had the immediate effect of shooting the fan to 100% and achieving leafblower status. Good thing I game with headphones so I can’t hear any of that nonsense. Well, most of it.
Hmm… I’ve got a 7 year old desktop with a C2D E8500 and a 9600GT. I’m not sure whether to upgrade or to just buy a console and a new laptop for convenience and indies. It does seem redundant to own two AAA game-playing boxes when the only AAA games of interest are two things by From Software. This isn’t really a question; I’m just hoping to commiserate on first world problems.
Get a PS4 and a macbook for sure. That’s very close to the specs of my own last-gen desktop, and though I do have a more powerful machine now, virtually all of the games I play on it (as opposed to the PS4) would have run on the old machine, and I needed it for work anyhow.
that PC OEMs are suddenly making decent hardware is nice enough but I think the tech media are going out of their way to appear excited relative to their desire to actually run and deal with Windows
it’s nice if you’re going to run Linux, but you should really just buy a Mac. particularly as all of this ostensibly-exciting PC hardware means that most of these machines are no longer cheaper (or more expandable) than a mid-end Macbook.
I’m pretty chuffed that Microsoft + PC OEMs chasing Apple hardware margins and pushing uncomfortable OS-as-a-service design (which is totally understandable, as the race to the bottom sucks) means the average guy finally no longer (erroneously) looks at a mac as “locked down” or having an “apple tax” or what have you (although everyone’s first impulse being to call windows 10 a “keylogger” implies there’s still a lot of unlearning to go). Sure, maybe you never saw the point of having a *nix scripting environment, but now you can’t upgrade anything else’s ram either because form factor, so you can finally give up the misdirected anti-elitism.
Windows is still perfectly fine for an old desktop that runs steam games (although wine is getting a lot better for legacy dx9 stuff and you shouldn’t underestimate it!) but if you’re already buying a new laptop then…
If you’re not averse to mucking about with BIOS and innards, you can try getting a C2 Quad, overclocking that, and pairing it with a GPU in the $100-150 range. That should be enough to do 1080p @ 60 fps on most games. Or, you know, buy a PS4. But that’s less interesting. Also, DDR2 memory has gotten/is getting expensive on the aftermarket, so good luck getting 8 gigs of that stuff.
I have considered grabbing a 760 GTX to see if I could muddle through the current generation, but I’ve got compelling enough reasons for a full upgrade. No sense of urgency though.
760 is a quality GPU, more so now that you should be able to find one on the cheap now that the 9xx series is in full force (I believe it’s roughly on par with the 950/960). It’s actually what the 290 I got replaced (I am greedy and want every single one of my 120 hz shot directly into my eyes).
I’d be inclined to recommend a 750 ti instead (the 2gb version) purely because of the form factors it enables and the value – it’s seriously impressive for a $100 half-height card, even after a year and a half on the market. but if you’re already lukewarm may as well wait until the new batch of stuff this year.
hmm, that 750ti sounds like it’d be a good card for my 6 year old PhenomII 965 X4 BE-rig … now if only i didn’t already have a R9 380X OC in sight, that is.
Yeah I know, hilariously op for my machine, let alone my needs. But a man can dream, right…