Do you think that graphics in video games have slowed down a bit?
Maybe it's because I'm not playing the latest games as they come out, but it seems to me that video cards last much longer now. I remember a time when it seemed like you had to buy a new video card every year just to be able to play upcoming games. I've been using a 9800 GTX+ for quite some time with no need to upgrade.
To me it seems that once Crysis came out and everyone could play that game, graphics improvement kind of came to a stop.
I think it's mainly because developers are pushing the hardware of the 360 and PS3 to its limits, and because, for whatever reason (be it piracy, lower sales in general, or not wanting to make a game exclusive to a non-console platform), they don't want to invest resources into pushing the graphical ceiling. Also, for exclusives, better graphics = a smaller user base that can play them = less money in total.
PC games will always look better, though, even if it's just because of higher resolutions, and it's nice that stuff like ME2 has a better interface for the M+KB. Plus we always have Valve, and for what it's worth there have been some excellent PC games in the past few years whose graphics aren't exactly bleeding edge.
There haven't been any major advances in real time rendering for a few years now, and that's what pushed video cards for most of the last ten years: higher poly counts, shader effects, lighting, fun stuff like that. So yes, it's slowed down. It's why stuff like PhysX and 3D is getting pushed on us; it's the hardware companies trying to keep their sales up.
I think it's a good thing: it lets developers learn the tech better and squeeze more performance out of it, and it saves us gamers a bunch of cash.
My expectation for the "next level" is real time ray tracing. People have been researching it for a while and proofs of concept are out there; it's just a question of making it marketable.
And when you've got systems like the Wii succeeding, and games with a style like World of Warcraft taking off in a big way, you begin to realize that polycounts don't matter so much. As was said above, piracy can destroy sales for big-name games and invalidate the work required to create them. Crysis was a disaster in that respect: it was a curiosity, nobody wanted to buy it for fear their computer couldn't run it, so everyone pirated it.
At some point developers were going to hit a wall where they'd have to decide whether spending entire years on 3D graphics alone was really worth it.
I think they've found a balance, and GPUs have had a chance to catch up.
That doesn't mean they couldn't break a quad SLI box if they wanted to. It just means that there's probably no profit to be had in it.
Yeah, but that quad SLI just gets you higher FPS, not better graphics overall.
What's your point? You could theoretically rewrite Crysis to run on a Voodoo 3 but only get 1 frame per minute.
My point was that it's not that graphics *can't* get better. They aren't getting better because there's likely not sufficient demand for them.
> Crysis was a disaster in that respect: it was a curiosity, nobody wanted to buy it for fear their computer couldn't run it, so everyone pirated it.
Crysis did alright in the end, didn't it?
Yeah, if you consider selling about 2-3 million copies "alright." That game has so much FUD spread around it, it's strange.
- It sold like ass
- It requires a $10,000 PC to run
- It's just a pretty tech demo with generic gameplay
- etc.
Telling, though, is that it and Warhead are still on the top end of the graphics spectrum despite being about 3 years old. I had a GPU from that time that's still more than capable of maxing out today's games, and the only reason I replaced it was because its VRAM got fried.
And, from what I've heard, Crysis's problem isn't so much a GPU bottleneck as a disk throughput bottleneck. I've heard it runs better with SSDs.
And apparently CryTek says Crysis didn't sell as well as they were expecting, but EA says it did better than they expected. As a whole, I'd just say CryTek set their expectations way too high.
Anyone hear of Project Offset? I kind of thought that was going to be the next big graphics thing based on some of their videos, but after several years the thing has, by all outward appearances, gone nowhere.
If it's a disk bottleneck, then it's because the GPUs don't have enough dedicated RAM to hold the textures.
Which, by proxy, sort of means it is a GPU bottleneck. Just not the usual kind.
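As a rough illustration of why "not enough VRAM" shows up as a disk problem, here's a hypothetical sketch (the names and numbers are invented for illustration; this is not CryEngine's actual streaming code). Once the set of textures a frame touches exceeds the VRAM budget, an LRU cache thrashes: every frame forces evictions and re-reads from disk, so disk throughput, not shader power, sets the frame time.

```python
from collections import OrderedDict

VRAM_BUDGET_MB = 512  # assumed mid-range card of the era (256-768 MB was typical)

class TextureCache:
    """LRU cache standing in for VRAM-resident textures (illustrative only)."""

    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size in MB
        self.disk_reads_mb = 0

    def bind(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # already in VRAM: free
            return
        while self.used + size_mb > self.budget:
            _, evicted_mb = self.resident.popitem(last=False)  # evict LRU texture
            self.used -= evicted_mb
        self.resident[tex_id] = size_mb  # miss: stream the texture from disk
        self.used += size_mb
        self.disk_reads_mb += size_mb

# A frame whose textures need 768 MB against a 512 MB budget thrashes forever.
cache = TextureCache(VRAM_BUDGET_MB)
working_set = [("tex%d" % i, 64) for i in range(12)]  # 12 x 64 MB = 768 MB
for _ in range(60):  # sixty frames touching the same textures in order
    for tex_id, size_mb in working_set:
        cache.bind(tex_id, size_mb)
print(cache.disk_reads_mb, "MB streamed from disk over 60 frames")
```

Shrink the working set below the budget (or raise the budget, i.e. buy a card with more VRAM) and disk_reads_mb collapses to the initial load, which matches the observation that both more VRAM and a faster disk smooth Crysis out.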
> Anyone hear of Project Offset?
I think this is the one that got bought by Intel, with the hope of really showing off Larrabee. But Intel has pretty much quietly canned its own foray into the graphics card market, so...
Looks like they put out a new trailer. The game looks amazing.
http://www.youtube.com/watch?v=JOPbyl7r8gk
I almost have to wonder if there's an 'artist cap'. I mean, you can always scan photographs and shapes to get a photorealistic image, but if you want to do something that has no real-world version, or you can't get your hands on it, or you're trying to do something stylish with it, you're relying on artist skill to make it. Models, textures, animations, etc. When it comes right down to it, just how realistic can you make a guy with snakes for legs and a bug face?
Oil paintings that look like photos. If an artist can think it, he can make it.
NSFW: http://www.paulrobertspaintings.co.uk/
edit: These are amazing
> Telling, though, is that it and Warhead are still on the top end of the graphics spectrum despite being about 3 years old.
Wasn't Quake 3 the graphics benchmark for several years as well? We got past that, and now it's Crysis at the top of the heap. Let's face it: 3 years on, even the fastest machines out there have problems getting that thing to run at 60fps with all the goodies turned on, regardless of what single video card you have. With Q3, within a couple of years you could turn everything on, crank the resolution up, and get 60+ fps.
> My expectation for the "next level" is real time ray tracing.
Personally I don't think we'll see games using GPU raytracing for at least five years, possibly a lot longer, if ever. Raytracing is just ridiculously computationally expensive.
EDIT: I remember a real time ray tracing demo on the Amiga; it was essentially a rotating cog or something. It ran like molasses and was no doubt written in assembly to get its one or two frames a second. Thing is, people have been experimenting with this stuff for a long, long time. But you run into computational problems: adding even a few more light sources or objects increases the calculation time massively, since the renderer has to calculate every surface the rays bounce off, and so on.
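To make that scaling concrete, here's a minimal naive ray tracer core (a sketch of the general technique, not code from any real demo or engine). Note the two "test against every object" steps: every primary ray is intersected with every object, and every hit fires one shadow ray per light, which again tests every object, so the work grows with pixels x objects x lights before you even add reflections.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest forward hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed normalized, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def trace(origin, direction, spheres, lights):
    """One pixel's worth of work: find the hit, then shade it with shadow rays."""
    hits = [(hit_sphere(origin, direction, c, r), (c, r)) for c, r in spheres]
    hits = [(t, s) for t, s in hits if t is not None]  # cost ~ number of objects
    if not hits:
        return 0.0  # ray escaped to the background
    t, _ = min(hits)
    point = [o + t * d for o, d in zip(origin, direction)]
    brightness = 0.0
    for light in lights:  # one shadow ray per light...
        to_light = [l - p for l, p in zip(light, point)]
        norm = math.sqrt(sum(x * x for x in to_light))
        to_light = [x / norm for x in to_light]
        start = [p + 1e-4 * x for p, x in zip(point, to_light)]  # avoid self-hit
        # ...and each shadow ray is again tested against every object
        if not any(hit_sphere(start, to_light, c, r) for c, r in spheres):
            brightness += 1.0 / len(lights)
    return brightness

spheres = [((0.0, 0.0, -5.0), 1.0), ((2.0, 0.0, -6.0), 1.0)]
lights = [(5.0, 5.0, 0.0), (-5.0, 5.0, 0.0)]
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), spheres, lights))
```

A 1280x720 frame repeats trace() about 920,000 times, and reflections or refractions multiply the ray count again at each bounce, which is how you end up at frames per minute instead of frames per second.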
Uhh, you can have ambient light. There's really nothing special about the lighting model of poly-based rendering that is inherently impossible to do with raytracing.
The advantage of raytracing is that render speed does not change with scene complexity. It takes just as long to draw a smooth sphere as it does a mountain. Lighting is what kills it, especially if you're going to do anything approximating sunlight filling up a room. Looks spectacular, takes ages to render.
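To put hedged numbers on both claims (my own back-of-the-envelope, not from any of the demos mentioned): for a naive tracer with no spatial acceleration structure, the total intersection-test count scales roughly as

```latex
% W x H = resolution, N = objects, L = lights,
% b = secondary rays spawned per hit, d = bounce depth
T \;\approx\; W \cdot H \cdot \sum_{k=0}^{d} b^{k} \left( N + L \cdot N \right)
```

The first N is each ray's hit test against every object; the L·N term is one shadow ray per light, again against every object. An acceleration structure such as a BVH cuts each N down to roughly log N, which is what makes "render speed does not change with scene complexity" approximately true in practice, and it's the remaining L factor (plus the b^k growth for bounced light) that makes lighting the expensive part.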
There's a reason they soak movie sets with light: natural, realistic lighting is bad for viewing.
The last real time ray tracing demo I saw was some guys doing it with Quake, using a handful of normal desktop PCs as a render farm. It was a few years ago, and it looked kind of neat. The accurate shadows looked really cool. Reflections were perfect as well.
Tried to find it on youtube, came up empty. But I did stumble on this:
http://www.youtube.com/watch?v=yvCVD3xtvTg
Which is pretty neat, and I figure a decent example of the state of the tech (i.e. nowhere near ready for prime time).
I myself am glad of the slowdown in graphics quality. I've actually been, for the most part, increasingly disappointed by modern graphics. Most people tend to disagree with me, but my impression is that without a bleeding edge SLI or Crossfire computer running a quad core with 8GB of RAM, or whatever the newest best-of-the-best hardware is, modern games look like crap. I hear nothing but praise for the graphics of Oblivion, Bioshock, Crysis, etc. The problem with these games is the absolutely horrid scaling of their graphics quality.
Let's compare for a second; I'll use Bioshock for my example. At maximum settings, the game does look quite good. But why is everything so damn shiny? I look down at my hands, and even under an intense spotlight they don't glisten like a freshly Windexed mirror. I look at these games, and despite the claims of them being so realistic, everything looks... plastic. Maybe my problem is that I don't have a brand new high end gaming PC to push to its limits, but in reality, who does? Modern games have been losing the fun factor (IMO, I don't expect everyone to agree) in favor of trying to showcase the amazing advances and graphical one-ups over the last in the series.
In my personal opinion, one of the best looking games out there right now is HL2: Ep2. The Source engine is incredible. It has never required a top end computer to run at its peak settings, and over the years the engine has managed to scale so that it still looks amazing and, in my opinion, feels much more realistic than most of the over-glossed modern engines. Blizzard has also managed to capture this effect. Game designers need to stop focusing on pushing hardware to its limits and start focusing on creating a style or theme that scales well from the bottom up, while still allowing those extra little goodies for those who want to really stress the engine and stretch their e-peen to its limits.
I guess that's my problem: I don't want to play a tech demo. I want to play a game. When the gameplay itself loses priority to trying to be the best of the best on the upper echelon of PC hardware, the game loses its point of interest for me.
Eh, really, no PC developer has tried to push the graphics barrier since Crytek several years back. Unreal Engine 3, which is powering damn near every game nowadays (incl. Bioshock), can run maxed out on the lowest-end entry level gaming PC. No one's really tried to out-tech-demo anyone else in years, at least where the PC platform is concerned. (You can argue for artistic merits, which I cannot dispute.)
On the other side, though, you have fanboys spooging on how awesome Killzone 2/Uncharted 2/God of War 3 look. So, there you go.
IRT raytracing: Nvidia has demoed a raytracing engine running on Fermi (using CUDA instead of the rasterizer, obviously). Interesting proof-of-concept, if nothing else.
http://www.youtube.com/watch?v=PbHRsca3vkk
My own crackpot theory is that Crysis sort of put the cart before the horse. It was kind of irresponsibly ahead of its time: when it first came out, only a tiny proportion of even "enthusiast" gamers (not "enthusiast PC builders", who are insane people) had hardware capable of running the game anywhere near the level at which it was marketed. In the few years since, computer hardware and other developers have "caught up" and started to surpass CryEngine. Crytek just kind of threw all the graphical tricks they could manage into the pot to create a really technically impressive game with kind of soulless art direction.
And no, to the person who suggested that game assets have gotten as "good as the artists can possibly make them": we are not even close. Take a look at the StarCraft 2 cinematic trailers to get a good sense of the gap between artistic capability and technology right now. Hell, even look at some of the high-resolution meshes that 3D artists make for normal maps for game assets.
> Wasn't Quake 3 the graphics benchmark for several years as well?
Quake 3, while a standard benchmark of its time, was never the best looking game on the market.
A list of the "prettiest" games on PC (the list is subjective; these were the games I would, at one time, have claimed to be the most graphically impressive game on PC):
1996 Quake
1997 Quake 2
1998 Unreal
1999 Unreal Tournament
2001 Max Payne
2002 UT2003
2003 Unreal 2
2004 Far Cry
2005 Doom 3
2007 Crysis
It does feel like the race towards shiny has slowed down.
I'm guessing the game makers are suffering from diminishing returns when it comes to making PC games even more impressive/realistic.
If Crytek remade Crysis, and doubled the man hours spent creating art assets, would people notice the difference? And how many would have the machines to play it?
Yes.
Look at what the majority of people usually buy to play games on.
Laptops and consoles.
These are cheap items (compared to a top-of-the-line gaming PC).
Why would you want to spend more money and cater to perhaps 10% of the population when you can spend less money and cater to 90% of the market?
Nowadays, with the strength of consoles and what they can do, it is far more practical to spend large amounts of time on art direction rather than on an efficient engine that can push a bazillion polys per second.
Good art direction can hide a myriad of mistakes; polygons can too, but they have declining returns, and it takes artists more time to generate all of those polygons in the first place.
Mainly, things have changed so that the main limiter of graphics isn't the hardware but the art budget. And the problem is that Moore's law does not apply to artists.
Secondly, even making profitable games for the HD consoles is pretty hard. A million-seller is far from certain to make a profit these days. The fact that the PC gaming userbase (at least the part that would drive graphical improvements) is smaller than that means there isn't room for a lot of games that push the envelope.
Finally, the HD consoles cost their makers a lot of money this gen (especially the PS3), so it isn't likely that they will be willing to go with either high prices or large losses for the next gen. So I don't expect any kind of breathtaking leap in graphics next gen.
Oh, and gaming PCs have never been cheaper than now so it's hard to argue that the high price of gaming PCs is what is causing any slowdown.
I think the second poster on this nailed the issue.
Most games now are coming out on either PS3/Xbox 360/PC or just PS3/360. I can't offhand think of a game that's only coming out on the PC. Thus, anything that is released has to be able to run on those two consoles. They are powerful, but they are still limited by fixed hardware, and so they create a sort of ceiling: graphics won't advance beyond the capabilities of the consoles.
Now, I might be misremembering, but back in the days the OP is thinking of, when video cards advanced in huge leaps every few months, the PS2 had just been released and the Gamecube's impending release was being drooled over (possibly still under the name Nintendo Dolphin). PCs were still significantly ahead in the graphical department, so any graphically ambitious game would be developed solely for the PC and was limited only by PC hardware. It wasn't until the PS3 and 360 came out and were able to match existing PC games for graphical capability that graphics really stabilised around those two consoles.
I still think an average gaming machine pushed to its limits can have better looking games than the PS3.
> And, from what I've heard, Crysis's problem isn't so much a GPU bottleneck as a disk throughput bottleneck.
Upgrading literally any part of the machine will produce measurable gains in Crysis. It is an all-out stress test for gaming PCs, and after three years it's still more technologically advanced than virtually any other game, and more than either the PS3 or 360 can handle -- mainly because both machines have jack-shit for RAM.
In a few years, when the next batch of consoles come out, at least one of which will be built largely from commodity gaming PC parts, we'll see another leap like we got in 2005-2007.
> I still think an average gaming machine pushed to its limits can have better looking games than the PS3.
That's exactly what I'm saying. The limitation on graphics is coming from the PS3 and 360; there is a limit to how high you can go with graphical quality on a fixed set of hardware, and in today's market it is utterly insane to produce a normal game for only the PC.
It's actually perfectly fine to produce a PC game, just don't spend two years and 20 million dollars doing it. A good way to avoid this happening is to be conservative and use proven, documented, and well supported technology.
I think that graphics are at a much needed plateau. The Wii has last-gen specs, but some of the art direction is astounding in games like Metroid Prime 3 and Mario Galaxy. There are points in Prime 3 where I'll just linger and stare at the scenery they've built into some of the environments. Mario Galaxy serves its purpose, and I don't see how 10x more polygons could help it look better. Games like Bioshock and Uncharted look great on the PS3. It's the art direction that sets these games apart, and I'm fine with the current generation lasting 3-4 more years just so developers can make do with what they have.
Eh, I don't think art direction is going to improve as a result of some sort of graphical standstill. Black & White 2 is still one of the most beautiful games out there, and it preceded Crysis by 2 years. And it could be maxed out just fine on a 6800.
Well, that's really more physics than graphics. You can do that with current graphics cards; you could even do it on the 360 and PS3, although there would probably be less hair on the consoles due to processing power. The PS3 might manage it, though.
Yeah, the problem is, if you listen, they say something about 25 FPS. That's on Fermi. I honestly wouldn't be surprised if that were on SLI'd Fermi cards, since I don't think they specifically mentioned the hardware in the video, and the other tech demo they did for Fermi (the raytracing one) involved tri-SLI. As it stands, it's more of a pretty tech demo than something with any potential application at the moment.