I honestly can't fathom the loathing and disdain for new video technology like 4K, VR, and (in the past) even HD.
You play VIDEO games, why wouldn't you want technology in VIDEO to keep advancing and keep going further and further? "WELL I DON'T LIKE/UNDERSTAND/SEE THE POINT OF IT SO STOP TALKING ABOUT IT" is just about the most bizarre thing I've read on forums in my lifetime.
I notice you didn't mention 3D, which many of us called total bullshit that would never catch on with consumers, and we were right.
4K is not that kind of bullshit, 4K is a viable technology...it's also completely pointless for most applications and has very serious bandwidth and processing power concerns that are not handled yet. 4K will be awesome in five years, but not right now. The REST of the technology stack is not as ready for 4K as TV makers are to push it.
I somewhat agree with your first sentence. 1080p is fine for me. I think I'd be overwhelmed with 60-120 FPS at 4K but that's just me. I'd welcome it if it were affordable to me, I guess, and I understand completely why others would love it.
For your second part yes, it's like the first thing enthusiasts were doing when they came out. Building new SLI Titan rigs and running 120fps or greater on 4k (actually UHD mostly) monitors. /r/pcmasterrace/ was going bonkers a few months ago with it. Pretty sure Nvidia has some 4k g-sync monitors on the way too.
When you need a pair of thousand dollar video cards (and I would assume very high end OC'ed i7's) to push the resolution, it kind of proves my point. The entire technology stack is just not ready for 4k yet. It's certainly coming, and I think it'll be awesome, but I'm not rushing out to buy a 4k anything until a single 400 dollar video card can push it.
4K at least doesn't have the encumbrance of "shit you wear on your face" that 3D had to face. But unless it drops in price to the point where it's just a fact of life when you go buy a new TV, I can't really see it hitting widespread adoption until the next big leap in consumer television size. Like, 15 years ago most people had 20" to 27" sets, and only the rich fuckers had 50+ sets. Now, everybody has somewhere between 30" and 55". Sure, the people with hawk eyes can see the difference in the sets now, but for us non-Übermensch 4K just doesn't offer enough practical benefits unless we're rocking 80" sets.
And to follow that train of thought, won't only a certain percentage of consumers have the space in their houses for screens large enough to make 4K a noticeable payoff?
Regarding 4k, I see no reason (other than the time it'd take to draw!) why we couldn't have 4k Dragon's Crown 2, or 4k Muramasa Rebirth.
4K GT5 was already a thing:
http://www.eurogamer.net/articles/2012-10-18-off-screen-video-of-sonys-gran-turismo-at-4k-resolution-demo
This runs on four networked PS3s (presumably each quarter of the screen is rendered by a single PS3, given the comments in the Eurogamer article). I'm no expert on comparing specs but I'm pretty sure that could be made to run on a single PS4.
The PS4 has been said to have no support for 4K games at present; I'm betting there'll be a firmware update at some point bumping the HDMI port to HDMI 2.0 for 4k gaming output.
Sure, highly detailed, effect laden games couldn't reach the lofty heights of 4k but games like Flower, Journey... simplistic-ish games with gorgeous design, 2D indie games, that kind of thing. I don't see why, given the spec of the PS4, that couldn't be achieved.
Similarly spec'd PCs are certainly capable.
Unrelated, but I didn't really get into Tomb Raider; despite completing it, it never clicked with me. After reading through what's been added, though, I'm certainly considering picking it up when I get a PS4.
::quick edit:: Enabling 4k support wouldn't just mean 4k, we could also have 2k resolutions used in certain games.
I will have a very very hard time believing the Xbox One or PS4 will be capable of running games with good graphics at 3840x2160. If that was the case, you'd expect EVERY PS4 and XOne game that runs at 1080p to run at 60 FPS with perfect performance and without ever having a hint of slowdown. I just don't see it.
This generation's hardware is designed to run games at 1080p. 4k is a HUGE leap over that.
Yep. Getting an acceptable 4K experience on PC right now requires a high-end GPU that costs more than either of these consoles.
Sony claims that some of their stuff is software upgradeable to HDMI 2.0, but I'll believe it when I see it. And HDMI 1.4 already supports 4K, just at 30fps...I think I'd rather have 1080p60 for low-graphics games than 4K30, though.
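The 1.4-vs-2.0 distinction is really just link bandwidth. A rough sketch of the arithmetic; the 10.2 and 18 Gbps link rates and the 8b/10b coding overhead are the published HDMI figures, while the blanking-interval factor is an approximation:

```python
# Why HDMI 1.4 tops out at 4K30 while 4K60 needs HDMI 2.0.
# Link rates (10.2 / 18 Gbps) and 8b/10b overhead come from the HDMI
# specs; the blanking factor is a rough assumption, not an exact figure.

def video_gbps(width, height, fps, bits_per_pixel=24, blanking=1.2):
    """Approximate link bandwidth needed, including a rough blanking factor."""
    return width * height * fps * bits_per_pixel * blanking / 1e9

hdmi_14_effective = 10.2 * 8 / 10  # 8b/10b coding: 10.2 Gbps carries ~8.2 Gbps of video
hdmi_20_effective = 18.0 * 8 / 10  # ~14.4 Gbps of actual video data

print(f"4K30 needs ~{video_gbps(3840, 2160, 30):.1f} Gbps "
      f"(HDMI 1.4 carries ~{hdmi_14_effective:.1f})")
print(f"4K60 needs ~{video_gbps(3840, 2160, 60):.1f} Gbps "
      f"(HDMI 2.0 carries ~{hdmi_20_effective:.1f})")
```

So 4K30 squeaks under HDMI 1.4's effective rate, and 4K60 only fits once you have HDMI 2.0's bigger pipe.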
However, assuming you had the best processing power money could buy (appropriate TV/computers here) and studios willing to create content that took advantage of 4k, your ISP's bandwidth will put the kibosh on streaming that game/movie in a heartbeat. It's not in Comcast/TimeWarner/Cox/Verizon/Etc.'s best interest to support that sort of data usage, since it would require a better infrastructure than we have now.
For consumer delivery, I think it's overkill a vast majority of the time. (Good) 1080p looks great on even ridiculously large screens. I said this in the PS4 thread, but there are a lot of factors besides resolution that go into graphical fidelity and picture quality. Contrast, color, black level, bit rate... These all play an important role in the quality of an image and seem to get ignored in favor of "moar pixels!"
Let's not forget: Avatar, shown on IMAX screens, was shot & produced in 1080p, and I don't think people complained about the resolution on that movie (even though there were plenty of other things about the movie to complain about!)
Here's a couple of blog posts from an indie movie/commercial producer (and one of the founders of The Orphanage, a major FX house), that address these issues pretty well, I think:
CES 2014: TVs You Don't Need
4k In the Home
In the first one he talks about his recently completed home theater: 132" screen, 1080p from a high quality Sharp projector. Seating distance is 12.5 feet. He says people rave about the sharpness of the image, even though it's "only" 1080p.
When you consider the bandwidth costs (both in terms of US infrastructure and computation power needed), I think 4k is overkill in the home.
5 years from now may be a different story in what it takes for good quality 4k, but for now I'll take good 1080p over rushed/highly compressed/high-infrastructure-cost 4k.
It may be overkill, but I'm all about overkill. The more pixels we can pack on to a screen, the better.
That doesn't mean I'm ignorant of the fact that resolution isn't the only factor in an improved image, though. As jimb213 says far better than I could, so much more goes into making a good image.
I'm a fan of plasma sets for the awesome black levels (years back I'd have killed for a Kuro plasma; nowadays I'm pretty content with my Panasonic GT30). I love the way they handle motion too, and the slightly more natural colours.
But the better TV sets get at reproducing the image, the more the cracks are going to show. Like (sort of) playing an old NES game on an LCD vs playing it on an old CRT.
Then again I'm always for bigger and better, even with diminishing returns.
I figure it's mostly CES happening this week and all the major manufacturers pushing 4k TVs since 3D is dead in the water and they need some way to get people to keep buying new TVs.
Yup. We have a 5 year old 32" in my bedroom and a 7 year old 55" in the living room, both Sonys with beautiful IQ at 1080p. There's been no reason to upgrade, but reasonably priced 4k sets might do it in a few years.
You don't need a ridiculously large screen or overly expensive home setup to get a benefit out of 4k definition. Larger, yeah, but still in feasible range. The big reason there's some controversy on 4k right now is that people want to jump on the publicity bandwagon. Give it a few years and we'll be sitting behind 4k monitors/TVs and none of us will even remember this.
I agree, but my point is that at the current cost of the bandwidth, storage, and processing power required for 4k, it's not very feasible and too expensive for the only slightly perceptible improvement in resolution. As the cost comes down on good 4k TV sets, as more content becomes available, and infrastructure/bandwidth caps improve, I'll change my tune, although for quite a while I'll still say a good 1080p TV beats a comparably-priced 4k TV.
Whether or not the numbers on those TV resolution/distance charts are accurate, the human eye can only resolve so much detail at a given distance (you'd actually be surprised at how low a resolution you can get away with on a roadside billboard that's way up in the air), and 1080p is a pretty good sweet spot for the average sitting distance/screen size, at least in the US.
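The sitting-distance point can be ballparked with the usual one-arcminute visual acuity rule of thumb. A quick sketch (the 55" size and 16:9 aspect are just example numbers, not anything from those charts):

```python
import math

# Rough "retina distance": a viewer with 20/20 vision resolves about one
# arcminute of detail, so beyond the distance where one pixel subtends one
# arcminute, extra resolution is invisible. The screen size here is an
# illustrative example, not a claim about any particular chart.

ARCMINUTE_RAD = math.radians(1 / 60)

def retina_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance beyond which individual pixels can no longer be resolved."""
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect**2)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / ARCMINUTE_RAD / 12  # inches -> feet

print(f'55" 1080p: pixels invisible beyond ~{retina_distance_ft(55, 1920):.1f} ft')
print(f'55" 4K:    pixels invisible beyond ~{retina_distance_ft(55, 3840):.1f} ft')
```

On a 55" set, 1080p pixels already vanish at roughly seven feet, which is closer than most couches sit; 4K only pays off at about half that distance.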
You can run 4k games on a PS4, but the visuals are going to have to be downgraded rather excessively (and I mean really seriously, I'm talking PS2 era visuals here). You underestimate the GPU power required to throw 4k around at 60fps.
1) 2 hours of Netflix 4k streaming, 5 nights a week, will be about 300-350 gigs a month of bandwidth (assuming a 15-20Mbit rate as stated by Netflix at CES). No way we are there yet but for a very small portion of people. The internet shit itself at the idea of an all-digital delivery system because a single AAA title (which could last you weeks or months) is 25-45 gigs.
2) 50-60" 4K screens only offer up the benefit of getting closer to the screen before the warts show up. You really are not enjoying the extra detail unless you plan on getting computer-monitor close to your television.
3) There is not enough storage on a Blu-ray to house a high quality 4K rip, which means we need a new standard, one that has oddly enough not been offered up at CES yet. And the general consensus from theater buffs is that a high quality Blu-ray is a better picture than a 4K internet stream.
There really isn't anything to be done about option 2, except to eventually have a redesign of the living room space again so that televisions can be even bigger than they are... But 1 and 3 are "soft" hurdles that can more easily be overcome... in time. If you buy a 4K television now, be ready to deal with a lot of upscaled content and sub-par streams for the first few years. Also be ready for your TV to possibly not be ready for the real 4K launch, as the content providers demand a better DRM / HDCP method that your set wasn't built for. (See: all the pre-HDCP 720p HDTVs out there.)
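The streaming figure above roughly checks out. A quick sketch with Netflix's quoted 15-20 Mbit/s rate (the 4.33 weeks-per-month factor is an approximation):

```python
# Monthly data usage for 4K streaming: 2 hours a night, 5 nights a week,
# at the 15-20 Mbit/s rate Netflix quoted at CES.

def monthly_gb(mbit_per_s, hours_per_night=2, nights_per_week=5, weeks=4.33):
    """Approximate GB consumed per month at a given streaming bitrate."""
    seconds = hours_per_night * 3600 * nights_per_week * weeks
    return mbit_per_s / 8 * seconds / 1000  # Mbit/s -> MB/s -> GB

print(f"~{monthly_gb(15):.0f}-{monthly_gb(20):.0f} GB per month")
```

That works out to roughly 290-390 GB a month, squarely in the range the post quotes, and well past most US bandwidth caps of the time.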
4K is coming, and in a few years it's going to be awesome, but that time is not now. The displays themselves have to come into the realm of affordable by normal humans, the computing power needed to run them needs to come into that realm too, and the network infrastructure most of us use to connect to the interwebs needs to jump forward to handle the ridiculous bandwidth that 4K at a reasonable bitrate will require.
But, as many are pointing out, the streaming of 4k is absolutely not in the cards. ISPs are loath to do even minor upgrades, let alone overhauls (at least in the US).
IMHO the best chance is that Google gets serious with their Google Fiber and starts rolling it out everywhere, forcing traditional ISPs to get their shit together.
And that's really just the last mile protocol; it doesn't take into account the back end infrastructure required to support your customers having 10x the bandwidth they do now.
4K is a solution looking for a problem; it's the industry's answer to the failure of 3D - consumers haven't embraced 3D and probably never will so let's throw something else at them and hope they bite. The problem with television isn't an issue of technology, it's a content issue. Most of what we get is crap. It's been the problem forever. Are the masses clamouring for high resolution? No, but the industry desperately has to figure out a way to sell new units; and they need us to pay again for content we already own. I'm not against technological advances by any means, but 4K doesn't strike me as the industry looking to solve any problem that consumers care about.
Sony's big showcase for 4k? After Earth, a movie that is currently rocking an amazing 11% on Rotten Tomatoes and flopped hard at the box office. I'd rather watch Wrath of Khan in SD on a 27" CRT. Will Big Bang Theory on 4k vs 1080p be any more or less entertaining?
As to gaming, how much farther do we need to go in terms of graphic fidelity? Take a look at the top sellers on Steam and tell me that the graphics really matter to most of them. #4 looks like it escaped 1991. I know there's a vocal group who love to wax on lovingly about maxed out graphics, but do you really notice a lot of those bells and whistles while you're flipping grenades at an enemy sniper while you stare at a brick wall that you're cowering behind? It's the same problem all over again for me; Crysis 2 is a technical marvel - but it plays like crap. It's a simplistic corridor shooter. I'd rather go back to Far Cry which still looks good and allows for a more immersive play style due to the open nature of the playfield.
4K will be a thing when it's affordable, and no sooner.
Nobody is arguing it doesn't look great... but you seem to have conveniently left out the type of computer you need to run any modern game at 4K. It's fun to go into /r/pcmasterrace and see all those AC4 and Skyrim screenshots/videos running at 4k, and then you realize they're running on four and five thousand dollar rigs, with more money sunk into video cards than my entire computer (which pushes current gen 1200p at max settings flawlessly). Until a single 400 dollar video card can push 4K reliably, it's a pie in the sky resolution that is effectively meaningless to most people.
If rich early-adopters want to jump on board, great for them. They'll help pay the R&D to get it to the rest of us in a few years.
Actually, the future is 8K; 4K is just a stepping stone.
Remember, modern rendering is almost entirely pixel bound. We long ago left behind the notion of being vertex or geometry bound (there are still limits of course, but we reach our pixel boundaries long before our geometry ones except in extreme cases). Nearly everything sparkly in modern graphics is done using post processing and pixel level effects. It's why going up a step in resolution can absolutely kill your graphical performance if your card does not have the fill rate and shader units to handle it.
TL;DR: 4K as a standard is not even codified yet, and even when it is, our video hardware at the mainstream level is several years from having the raw processing power needed to handle that many pixels.
e: Had my math backwards, sorry. To be precise, 4K UHD (3840x2160) is 4x the pixels of 1080p and 2.25x the pixels of 1440p. Corrected.
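For concrete numbers (which ratio you get depends entirely on what you treat as the "current high res"; the resolutions below are the standard definitions):

```python
# Pixel counts behind the fill-rate argument: each step up in resolution
# multiplies the number of pixels every post-processing pass has to touch.

RES = {
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),
    "4K UHD": (3840, 2160),
    "4K DCI": (4096, 2160),
}

base = RES["1080p"][0] * RES["1080p"][1]
for name, (w, h) in RES.items():
    px = w * h
    print(f"{name:7s} {px / 1e6:5.2f} MP  ({px / base:.2f}x 1080p)")
```

4K UHD is a flat 4x the pixels of 1080p, so a GPU that is pixel-bound at 1080p has four times the fill-rate and shader work to do at 4K.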
Nu uh. The future is 64k! 8k is just a canoe.
The point is how far away from that level of tech we are. It's not six months, or the end of this year, or even the middle of next year. It's years away, possibly up to five or six for the mainstream. The point of this thread is "how far off is 4K as a gaming technology", the answer is "a long way".
No one ever said 4K wouldn't happen, or wasn't going to look good... but if you go buy a 4K monitor right now, today, expect to be very disappointed when a) the standard shifts several times over the next 24-48 months, and b) you need the supercomputer at Oak Ridge to run anything useful on it. I mean, it's not even a codified standard yet. I'm not sure how much of a clear "this is not technology for today" marker you need. There are currently seven competing resolutions all marked as "4k".
The PS5 and Xbox Two or whatever the fuck will have to worry about this. The PS4 and XB1 will not.
It'll be at least 5-10 years before we start seeing that in VR tech. However, the improvement from 1080p or 1440p to 4k will be enormous for VR. Comparable to the leap from SD to 1080p with regular TVs.
I'm of the opinion that 4k resolution won't mean much in the long run for normal TVs, though. But VR benefits greatly from it.
In terms of bottlenecks, it's not the speed or processing power of these GPUs; a top of the line GPU today can push 4k pixels without much trouble. The bottleneck is VRAM: the memory needed for 4k rendering, between the framebuffers, render targets, and textures sized for that output, is measured in gigabytes, not megabytes. Remember when people scoffed at the Nvidia Titan's cost, or marveled at the PS4's 8 gb GDDR5? Those become necessary; 6 gb of VRAM is just about where you need to be for 4k to become honestly feasible. In the PC realm, that means SLI/crossfire configurations don't do much to support 4k resolution, because the addressable memory doesn't stack: two 3gb 7970s can only address 3 gb of VRAM despite having 6 gb on hand. The only video card at the moment viable for 4k processing is the Titan, which comes in a 6gb configuration.
The price of ram usually plummets, though. In the PC space, 4k gaming isn't that far off. With regards to the PS4, it has enough memory but its GPU likely isn't going to be strong enough to push 4k, save for some more primitive indie games. The xbone can render 4k video, but 4k gaming will likely never be a thing on the xbone.
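For a rough sense of where those VRAM numbers come from (the per-buffer arithmetic is exact; the five-target deferred G-buffer and 4x MSAA are illustrative assumptions, not any specific engine):

```python
# Back-of-the-envelope VRAM budget for rendering at 4K UHD.
# The buffer sizes are exact arithmetic; the render-target count and MSAA
# level are assumptions for a hypothetical deferred renderer.

W, H = 3840, 2160
BYTES_PER_PIXEL = 4  # 32-bit RGBA

framebuffer = W * H * BYTES_PER_PIXEL  # one 4K color buffer
double_buffered = 2 * framebuffer      # front + back buffer
gbuffer = 5 * framebuffer * 4          # 5 render targets at 4x MSAA

mb = lambda b: b / 2**20
print(f"single 4K framebuffer:      {mb(framebuffer):.0f} MB")
print(f"double-buffered:            {mb(double_buffered):.0f} MB")
print(f"deferred G-buffer, 4x MSAA: {mb(gbuffer):.0f} MB")
```

The bare framebuffers are only tens of megabytes; it's the stack of full-resolution render targets plus textures authored for a 4K output that pushes the working set into the gigabytes, which is where the 6 gb figure comes from.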
I can get behind this, if only because the closer you sit to a screen the more the pixels themselves become apparent.
It's why cellphone displays, starting with Apple and then taken to amazing lengths by others, became so much nicer after we crossed 300dpi.
A screen sitting maybe an inch from your eye needs to be really, really dense.
edit: I also think we are a half decade or more from product capable of making something like that, let alone two of them, in a device that people could hope to afford.
VR uses a single screen split in half. You don't need 2 4k screens, just one.
Goes to show how much I follow this.
So it's a screen roughly the size of a phablet that needs twice the pixels of the iPad Mini Retina.
That really isn't impossible, or as far off as it would seem. Huh.
So that level of screen, plus 4-5 years of PC hardware getting cheaper, and the VR revolution could be very very real (looking... it may have already happened by then)
The upside to doing VR like that is cheaper hardware. The downside is that the effective horizontal resolution for each eye is halved, which is why 4k will be more important in VR than it will be for normal television. At 1080p, what you're actually getting is a 960x1080 image per eye, instead of a single 1920x1080 image. So like the current dev kit? Runs effectively at 720p? It's actually like 640x720 per eye. They already have 4k VR headset prototypes, it's just that they aren't feasible to mass produce. Yet.
EDIT: As for the physical size, that doesn't matter so much because the lenses warp the image. So, originally, before they used lenses, they needed, like, 7" screens. The current prototypes are using 5.5" screens and they want to get it down to about 5". So ya, about phablet sized.
and for the record, assuming a REAL 4k resolution... remember this stuff isn't standardized yet, so let's assume 4096 x 2160, a presumed 4k screen would net each eye a resolution of 2048 x 2160. Which would be enormous for VR.
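The split-screen halving works out like this (resolutions taken from the posts above):

```python
# Per-eye resolution on a single split-screen VR panel: the horizontal
# resolution is halved, which is why panel resolution matters so much
# more for VR than it does for a TV across the room.

def per_eye(width, height):
    """Effective resolution each eye sees on a split panel."""
    return width // 2, height

for name, (w, h) in [("720p dev kit", (1280, 720)),
                     ("1080p panel", (1920, 1080)),
                     ("4K DCI panel", (4096, 2160))]:
    ew, eh = per_eye(w, h)
    print(f"{name}: {ew}x{eh} per eye")
```

So the 720p dev kit really does give each eye only 640x720, a 1080p panel gives 960x1080, and a 4096-wide panel gets each eye back up to 2048x2160.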
darkhorizons.com/news/30365/blu-ray-ready-waiting-to-go-4k
I'd love it if the industry went with an open standard codec family like Ogg instead of a proprietary one like, say, some mpeg-derived standard