Nvidia is releasing a 3-card SLI, and from my understanding the 3rd card is a physics card.
Jesus, that would be at least $1200, wouldn't it?
And now you know why PC gaming is dying!
Hmm, $1200 for a top of the line PC component... Or that much for a good console and pretty tv?
It's not dying, and I am tired of people who are too fucking poor to spend $700 on a fucking computer. PC gaming drives all the latest advancements in consoles, so be fucking thankful that you have ATI in your Wii, and I think the Xbox 360 has Nvidia. By the way, my video card cost 99 bucks and plays Call of Duty 4 with high settings; it's a 7600 GT.
The big blunder with PhysX is that they made their drivers too good. They emulate all the functions of the card on a normal CPU, so if the game supports offloading of physics to a separate core, you'll get the same (or even better) performance. This might be different if every game was just arbitrarily doing liquid/gas simulations for everything, but by the time developers start wanting to use that level of detail we'll have integrated physics on our GPUs.
Edit: The only thing that annoys me more than the very idea of spending a bunch of money on a fairly useless component is the fact that developers feel/are compelled to arbitrarily restrict settings if they want to support PhysX. City of Heroes did this when they first added support, not allowing people to run on higher physics settings w/o the hardware. When they unlocked the settings for everyone, I discovered that they ran just fine using the CPU.
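For anyone wondering what "offloading physics to a separate core" actually looks like, here's a minimal sketch: the physics tick runs on its own thread while the render loop goes on without it. PhysicsWorld and its step() are made-up stand-ins for illustration, not any real engine's API.

```cpp
// Minimal sketch: run the physics tick on its own thread so the render
// loop never waits on it. PhysicsWorld and step() are made-up stand-ins,
// not any real engine's API.
#include <atomic>
#include <chrono>
#include <thread>

struct PhysicsWorld {
    void step(double dt) { /* integrate bodies, resolve contacts, ... */ }
};

int main() {
    PhysicsWorld world;
    std::atomic<bool> running{true};

    // Physics ticks at a fixed 60 Hz on a spare core.
    std::thread physics([&] {
        const auto tick = std::chrono::microseconds(16667);
        while (running.load()) {
            world.step(1.0 / 60.0);
            std::this_thread::sleep_for(tick);
        }
    });

    // The render/game loop would run here, reading the latest physics
    // state; we just idle for a second to stand in for it.
    std::this_thread::sleep_for(std::chrono::seconds(1));

    running.store(false);
    physics.join();
}
```

Which is the whole point: on any dual-core machine there's already a spare "physics processor" sitting there.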
Actually, it's the other way around; console research tends to carry over into computer GPUs more than vice versa. The Gamecube's graphics chip was what led ATi to make the 9700 Pro (you know, that card that somehow kicked everything else's ass for the next five years), the X360's GPU design went on to feature prominently in the Radeon HD2900, etc. But hey, you don't even know what company makes the X360's GPU, so whatever.
edit: and saying "PC gaming isn't dying, you're just too damn poor!" is at least as retarded as saying that PC gaming is dying.
I wasn't impressed. The explosions didn't look particularly realistic at all. Sure, they are modeling how each piece moves, but it doesn't matter when most of the boards shoot straight upwards and out and then break into smaller chunks simultaneously. It just looks wrong. When the guy used the grenade launcher on the top of the tower, the entire tower, including the base, shattered at the same time. If I am going to spend cash on a dedicated physics processor, it had better have a good engine with it; those might as well have been scripted, or done with CPU + Havok.
Have you seen some scenes from HL2:Ep2? Like the bridge falling down or a strider blowing up a house. Sure, the act of them asploding is scripted, but it's all physics from there. Looks damn awesome.
It's already happening. That's why the physics cards are making absolutely no noise whatsoever. NVidia and ATI have already put out industry demonstrations of graphics cards handling complex physics simulations such as water particles.
Also, the multi-core nature of modern CPU development pretty much ensures that the only thing guaranteed to have its own core in the future of gaming is graphics, simply because CPUs are unsuitable for it. (Honestly, when's the last time you played a game in software rendering mode? Can any game even do it anymore?)
Surprised no-one has really crushed this part of the story. But the company that was demonstrating this offloading of physics candy onto the GPU was Havok.
Guess who bought Havok recently. Intel. Guess who wants physics to be carried out on multicore processors rather than graphics cards?
Like I said, none of this shit is going anywhere until we get an independent physics library that everyone can use.
Come on, Microsoft. This is all you. DirectX 11, now with DirectPhysics? Do it. Do it now. We'd have dedicated physics cards for the high end and physics/GPU combos for the midrange, with CPU physics if you turn the options way down.
Give it a few years, I guess.
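For what it's worth, a "DirectPhysics" layer would presumably look something like this: games code against one interface, and the runtime picks whatever backend the machine has. Entirely hypothetical sketch; no such API exists, and all the names here are made up.

```cpp
// Purely hypothetical sketch of what a vendor-neutral "DirectPhysics"
// could look like: games code against one interface and the runtime
// picks a backend. None of these names are a real API.
#include <cstdio>
#include <memory>

struct IPhysicsBackend {
    virtual ~IPhysicsBackend() = default;
    virtual const char* name() const = 0;
    virtual void simulate(double dt) = 0;
};

struct CpuBackend : IPhysicsBackend {
    const char* name() const override { return "CPU fallback"; }
    void simulate(double dt) override { /* plain multithreaded solver */ }
};

struct AcceleratorBackend : IPhysicsBackend {
    const char* name() const override { return "dedicated card / GPU"; }
    void simulate(double dt) override { /* dispatch to the hardware */ }
};

// The runtime would probe the machine and hand back the best backend;
// here the capability check is a stub that always says "no card".
std::unique_ptr<IPhysicsBackend> createBestBackend() {
    bool hasAccelerator = false;  // stand-in for a real hardware query
    if (hasAccelerator) return std::make_unique<AcceleratorBackend>();
    return std::make_unique<CpuBackend>();
}

int main() {
    auto physics = createBestBackend();
    std::printf("physics backend: %s\n", physics->name());
    physics->simulate(1.0 / 60.0);
}
```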
I haven't finished Ep2 yet. But what I have seen was obviously the work of physics optimizations/dual-core utilization.
There was one jaw-dropping physics moment during the driving section.
And really, if anyone is going to be able to make a standardized physics engine, it's Havok, not PhysX, since they actually, you know, have companies using their engine.
Right, scripted + Havok, not PhysX. I haven't seen any PhysX demo that couldn't have been done with Havok.
Hell, all of the physics in HL2 in general looked awesome.
Even on the Xbox.
The original Xbox.
This is why we don't need physics cards yet.
The physics in HL2 are shit. They're just a few solid bodies with collision and gravity. It's like saying the graphics on the Wii are good: it's OK at what it does, but there is so much more potential out there.
$700 is going to buy you a computer capable of nothing (once you include everything). I mean $200 on a monitor, $100 on keyboard and mouse, $100 on a case and power supply, then you can get onto the insides...
And my point was that it's really really expensive to buy a top of the line machine.
And $100 on a case (for a $700 PC O_o)? Only if you're going with the very top of the line with all those ridiculous lights and crap (or water cooling, but again, for a $700 PC?). I got my Sonata II for like $50 with no sales, and that's more than I really needed to spend on it.
I don't disagree with you, but $100 on a keyboard and mouse? Try $20, if that.
A decent mouse is going to cost more than that. An MX518 will run around $40 by itself.
Decent is subjective; for gaming I consider my $10 wired laser mouse more than adequate, and I'm using a 10-year-old keyboard I got for free.
Anyway, the physics in the GRAW 2 trailer linked earlier are pretty cool. However, they are nothing that can't be done with scripting and existing hardware. I don't really see the point of a physics card with multi-core CPUs becoming the norm and multicore support for games just starting. World of Warcraft's most recent content patch added preliminary support for multicore processors. As game support for multicore grows, I think design studios will realize there is no need for a separate physics card and will just offload physics work to a separate core.
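And "offload physics work to a separate core" isn't exotic: spread the per-body work across however many cores the machine reports and you're most of the way there. A toy sketch, with a made-up Body type standing in for real engine state:

```cpp
// Toy sketch of "offload physics to the cores you have": split the
// per-body work across hardware threads. Body is a made-up stand-in
// for real engine state.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Body { float z, vz; };

void integrateRange(std::vector<Body>& bodies, std::size_t begin,
                    std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].vz += -9.81f * dt;        // gravity
        bodies[i].z  += bodies[i].vz * dt;  // move
    }
}

void parallelStep(std::vector<Body>& bodies, float dt) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (bodies.size() + n - 1) / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrateRange, std::ref(bodies),
                             begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<Body> bodies(100000, Body{100.0f, 0.0f});
    for (int frame = 0; frame < 60; ++frame)
        parallelStep(bodies, 1.0f / 60.0f);
}
```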
Wow wtf.. My keyboard is 10 years old and I used the same mouse for 5 years. Oh, and my computer cost about $700 shipped and it plays Call of Duty 4 on max settings...
You know buying a Wii is going to run you $800, because you have to buy a TV to play it.... Oh, and you can get a decent 19-inch screen for about $150...
Oh, a good case only costs about 40 bucks, and I got my Thermaltake PSU for $19.99 on sale.
http://www.newegg.com/Product/Product.asp?Item=N82E16811147006
http://www.newegg.com/Product/Product.asp?Item=N82E16819103068
http://www.newegg.com/Product/Product.asp?Item=N82E16814130062
http://www.newegg.com/Product/Product.asp?Item=N82E16820145527
http://www.newegg.com/Product/Product.asp?Item=N82E16827106070
http://www.newegg.com/Product/Product.asp?Item=N82E16835100007
Total: $556, and I could have gone with a cheaper mobo that did not have SLI. So I think that leaves room for your "$200" screen.
http://www.newegg.com/Product/Product.aspx?Item=N82E16824254009
$119.99 for a 19-inch LCD...
Hey, man, I've got a 7600GT, too, and it's a nice card for cheap, but let's not make some shit up about it doing max settings on anything recent, at all.
The "olol pc gaming is fine; you're just poor!" argument is retarded because why the fuck do you think that PC games don't sell all that well? I mean, shit, people won't even buy a $500 PS3.
All I know is that I got a 360 when my PC went obsolete, and everything I was looking forward to is getting ported to it, except Starcraft 2, which will run on my obsolete computer because Blizzard isn't as retarded as most of the other PC game studios.
But then, this is for a different thread. Back to how fucking pointless PhysX is, which I think everyone here can agree on.
I really do run Call of Duty 4 on max settings.
Thing to remember about any PC cost debate is that PCs have a ton of other uses. If you're a college student these days you have to get a computer, no way around it. At that point the only extra cost for gaming is the video card.
As a huge cheapass gamer who mostly looks at high-end components with a "when I win the lotto" look, I wouldn't bother with a PhysX card even if I had the million dollars.
I'm sorry, but this is exceptionally untrue. You need a PC for your basic college work? OK, a good ol' last-gen Celeron, 1 GB of RAM (if that), XP, Office 03, and a couple other apps depending on your needs (like Visual Studio, or whatever) will be more than sufficient. Throwing in a graphics card won't cut it.
You can always talk yourself and everyone else into believing that you need a quad-core, 4 GB, 64-bit machine with at least 1 TB of storage and a giant flat-screen monitor to make it through college, but in the end you're only BSing yourself. Truth be told, you really only need to be able to use Office if you can tolerate using college labs to do your work, and unless you're in a graphics-oriented program you don't need anything too excessive.
As for the PhysX card itself, I've looked at it from time to time, and it's always seemed a luxury I didn't need. However, it should not technically be responsible for your game slowing down when you use it; if it has its own processor dedicated to physics, there shouldn't be any problem. I think the problem would occur with poor drivers and poor communication between your system, your GPU, and your PhysX card; in fact, lag between the components due to poor hardware or drivers would cause a good deal of slowdown. I am also much more likely to believe it is simply hardware-related, as your GPU is not likely built to take advantage of a PhysX card, and your GPU manufacturer probably won't want you wasting your time with one if they can claim to do it better.
But the difference is that a single core can handle this, and it's a sensible and cost-effective solution.
I mean, the PhysX card is used just for physics; that spare core will be used when not playing games.
I wouldn't say that it's absolutely necessary, but upgrading to a C2D from an ancient AMD Athlon definitely had a noticeable effect in MATLAB, especially when multitasking, i.e. web browsing, chatting, and listening to music at the same time. And it's almost painful to have to wait for the same calculations on the lab's computers after having had them done near instantly.
And re: your comment on PhysX, I'm not entirely sure what you're trying to say, but I think you're suggesting that the GPU drivers make PhysX slower? Wouldn't that be a software problem and not hardware-related? Either way, graphics are done using a standardized API, which is why both ATi and nVidia cards work for all the same games, while there is no standard physics engine, which is why only a half-dozen games actually use PhysX. I really don't get your last sentence: since physics is calculated on the CPU, it doesn't really matter what the GPU does, since it doesn't touch physics anyway. Also, yes, the extra PhysX effects create extra geometry to render, so even if all the physics calculations are done, there's still a lot more crap the GPU has to render, which is why it's slower.
Well, I think the point is, a single core from a general purpose CPU can't handle complex physics interactions. It's just not designed for it.
But it totally can, and in fact does now; just look at every game with a physics engine. Also, as an added bonus, you don't need to worry about a card tied to a specific engine. I mean, really. Remember when everyone moved away from Karma and over to Havok? Imagine if you had a Karma-specific card. That's what Ageia will be in a few years.
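To put something concrete behind "it totally can": the core of a rigid-body engine is integration like the toy step below, which a single core chews through for thousands of bodies per frame. Real engines add collision detection and constraint solving on top, but this shows the shape of the work:

```cpp
// Toy rigid-body tick: semi-implicit Euler over a flat array of bodies.
// A single core handles thousands of these per frame with ease; real
// engines add collision detection and constraint solving on top.
#include <cstdio>
#include <vector>

struct Body {
    float x, y, z;     // position (m)
    float vx, vy, vz;  // velocity (m/s)
};

void step(std::vector<Body>& bodies, float dt) {
    const float g = -9.81f;  // gravity, m/s^2
    for (Body& b : bodies) {
        b.vz += g * dt;    // velocity first (semi-implicit Euler)
        b.x += b.vx * dt;  // then position from the new velocity
        b.y += b.vy * dt;
        b.z += b.vz * dt;
        if (b.z < 0.0f) {  // crude ground plane: damped bounce
            b.z = 0.0f;
            b.vz = -0.5f * b.vz;
        }
    }
}

int main() {
    std::vector<Body> bodies(5000, Body{0, 0, 10, 1, 0, 0});
    for (int frame = 0; frame < 60; ++frame)  // one second at 60 Hz
        step(bodies, 1.0f / 60.0f);
    std::printf("body 0 after 1 s: z = %.2f m\n", bodies[0].z);
}
```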
It's amazing how this thread turned into a "PC gaming is dying because it costs too much" thread.
e: It's become the thread where some douchenozzle always comes in and says "zomg! I have to spend leik $4,000 to get a good gaming pc!!"
Personally, I won't be spending nearly as much on my PC as I will be on consoles in the future. Let's not kid ourselves, PC gaming does cost more than console gaming, but it's not as much of a price difference as people make it out to be.
One thing about buying midrange parts (e.g. the 7600GT): sure, it can play games right now, but what about in a year, when you have to start turning down the settings?
Look at the 6600GT. When it came out it was a pretty good part: decent price, played games pretty well at the time. Look now; the 6600GT cannot handle newer games very well at all.
I bought a 7800GT when it came out 2 years ago, and UT3 is the first game I've had to turn down the settings on to run properly (I don't have Crysis yet). That's after over 2 years. I fully expect it to last at least another year before I want to consider changing it out.
My point is that you can spend $150 on a midrange part, but it won't last you as long as the high-end part.
$150 every 2 years, or $300 every 4. It's all the same thing.
I just want more realistic physics with ragdolls, and not a corpse losing all muscle control the second I shoot a guy in the foot. Blend in some animations, damnit!
I saw some tech demos that did exactly this. Anyone remember what that engine was called?
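The usual trick (whatever those demos were) is to blend the authored animation pose with the simulated ragdoll pose per joint, ramping the physics weight up after a hit. A rough sketch with hypothetical types, not any particular engine's API:

```cpp
// Sketch of animation/ragdoll blending: each joint's final pose is a
// weighted mix of the keyframed pose and the simulated pose, with the
// physics weight ramped up after a hit. Types are hypothetical; a real
// engine would blend rotations with quaternions, not just positions.
#include <cstddef>
#include <vector>

struct JointPose { float pos[3]; };

JointPose lerp(const JointPose& a, const JointPose& b, float t) {
    JointPose out;
    for (int i = 0; i < 3; ++i)
        out.pos[i] = a.pos[i] + (b.pos[i] - a.pos[i]) * t;
    return out;
}

// physicsWeight: 0 = fully animated, 1 = full ragdoll. A shot to the
// foot might ramp it toward 0.3 so the character staggers and recovers
// instead of collapsing outright.
std::vector<JointPose> blendPose(const std::vector<JointPose>& animated,
                                 const std::vector<JointPose>& ragdoll,
                                 float physicsWeight) {
    std::vector<JointPose> out(animated.size());
    for (std::size_t i = 0; i < animated.size(); ++i)
        out[i] = lerp(animated[i], ragdoll[i], physicsWeight);
    return out;
}

int main() {
    std::vector<JointPose> anim(20, JointPose{{0.0f, 1.0f, 0.0f}});
    std::vector<JointPose> rag(20, JointPose{{0.0f, 0.5f, 0.2f}});
    auto blended = blendPose(anim, rag, 0.3f);  // light stagger
    return blended.size() == 20 ? 0 : 1;
}
```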
Sounds like the Meqon demos I downloaded years ago. I'd link you to it now but they got bought by Ageia, and I think their physics thing is actually powering the PhysX cards now.
That sounds like Source engine ragdolls, Echo. Also, UT3 has some awesome headshot ragdolls: the head explodes and the body stands there for a second, then ragdolls over onto itself. The rest of the ragdolling is pretty crazy, though.
I see a lot of games with extremely shitty physics (not that Ageia seems to do much better, but their hearts are at least in the right place; games are only going to play better and look better if they integrate better physics. Can you imagine how cool an FPS fight would be if levels played out like the Lobby Scene?). It's like saying back in 1996 that games look fine in software and you don't understand why people want this 3D card thing. Can you imagine being stuck with a PowerVR card when everyone else was using Glide? Or: what the hell's with this audio thing, my PC speaker is fine. Can you imagine being stuck with AdLib when everyone else was using Sound Blaster?
Wasn't that the stuff that LucasArts and that football game (Backbreaker) are using? The Euphoria engine.
That might have been it. I remember an Indy-style demo with a guy on a rope bridge that got rocked back and forth, and some martial arts where a guy got knocked off a roof, tumbled a bit and then got back on his feet.
I can accept the possibility that physics effects can be taxing enough to require dedicated hardware. However, despite all their work, nobody has yet created any game with those kinds of effects. What people are doing is implementing fancier versions of stuff we've already seen, then arbitrarily restricting higher settings for folks without cards because of some licensing deal with Ageia.
Good examples of this kind of bullshit are the Cell demo, which could be tweaked to run without a card (and did so with very minor impact on performance), and CoH/V, where most of us knew from the beginning that the limited effects didn't need a card.
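The sane alternative is easy enough to sketch: gate the detail level on what the machine can actually do, with the CPU as a fallback, instead of on whether a card is present. All names below are hypothetical:

```cpp
// Sketch of the sane way to gate physics settings: clamp by measured
// capability, with the CPU as fallback, instead of refusing to enable
// High without a card. All names here are hypothetical.
#include <cstdio>

enum class PhysicsDetail { Low, Medium, High };

struct Caps {
    bool hasAccelerator;  // dedicated physics card present?
    int  spareCores;      // cores left over after game/render threads
};

PhysicsDetail pickDetail(const Caps& caps, PhysicsDetail requested) {
    // A card or a spare core can both run High; only a starved single
    // core box gets clamped, and only down one notch.
    if (caps.hasAccelerator || caps.spareCores >= 1) return requested;
    return requested == PhysicsDetail::High ? PhysicsDetail::Medium
                                            : requested;
}

int main() {
    Caps dualCoreNoCard{false, 1};  // e.g. a dual-core box, no PhysX card
    PhysicsDetail granted = pickDetail(dualCoreNoCard, PhysicsDetail::High);
    std::printf("granted detail level: %d\n", static_cast<int>(granted));
}
```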
It's like saying back in 1996 that games look fine in software and you don't understand why people want this 3D card thing. Can you imagine being stuck with a PowerVR card when everyone else was using Glide?
And that's exactly what happened, and hardware-accelerated 3D didn't take off until OpenGL and Direct3D let every (new) card play any game. Meanwhile, 3DFX rode their proprietary Glide shit into not-being-a-company-anymore.
Which is exactly what I was saying about the lack of a unified physics API this whole time. hi5
I think that was kinda my point, and why it ought to happen again. Because until it's realised that people actually care about physics, they just won't bother. Same thing happened with sound cards as well, before everyone settled on an industry standard.