PhysX - How good is it?

  • Spoit *twitch twitch* Registered User regular
    edited November 2007
    stigweard wrote: »
    And which industry standard would that be? Miles 2d, dsound, hrtf, eax, etc... Industry leader != industry standard.

    Yeah, and while Microsoft is trying for a standard with OpenAL, the main effect that happened was everyone with Creative cards getting screwed over, since EAX doesn't work in Vista.

  • stigweard Registered User regular
    edited November 2007
    OpenAL is / was an open spec, and it was not created by Microsoft. It is a fully cross-platform 3D sound API, and I'd rather they used it than EAX, even though it is also being heavily developed by Creative. I'd put on my conspiracy hat and tell you Creative pushed MS to drop EAX so that they could sell new OpenAL hardware accelerators (X-Fi and up), but I have no solid proof.

  • waterlogged Registered User regular
    edited November 2007
    I recently got UT3 and Crysis for my birthday; amazing games. But I've found they sometimes slow down when everything goes boom, or if I collapse a building or something. I have a GeForce 8800 GTX, a 2.4 GHz AMD dual core, and 2 gigs of RAM. I'm wondering if the solution to this is to buy a PhysX card, and with Christmas coming up I have an opportunity to get one.

    I'm just wondering if anyone has one and how good they actually are. Any advice would be great. :)

    Crysis does not support PhysX. The only solution is more CPU or GPU horsepower.

  • stigweard Registered User regular
    edited November 2007
    Odd, Crysis forces the install of Ageia drivers. I wonder why it does that when it doesn't use them?

  • Daedalus Registered User regular
    edited November 2007
    stigweard wrote: »
    Odd, Crysis forces the install of Ageia drivers. I wonder why it does that when it doesn't use them?

    Ageia's engine is also (when you don't have the super-de-dooper accelerator card) a regular software physics engine. Hell, there's a Wii game that uses Ageia.

  • stigweard Registered User regular
    edited November 2007
    Right, but if the drivers are there, and the engine uses that physics model, wouldn't it follow that the card would add benefit? It would seem strange for Ageia to license the engine without sticking that 'added benefit' in there somewhere.

  • Daedalus Registered User regular
    edited November 2007
    stigweard wrote: »
    Right, but if the drivers are there, and the engine uses that physics model, wouldn't it follow that the card would add benefit? It would seem strange for Ageia to license the engine without sticking that 'added benefit' in there somewhere.

    Could be CryEngine2 nonsense interfering, or the poster was wrong, or I dunno. I don't have Crysis; I'm pretty confident the game would skull-fuck my computer, and even if it didn't, I don't buy games that are just tech demos for the company's new game engine; I learned my lesson with Doom 3, thank you very much.

  • Spoit *twitch twitch* Registered User regular
    edited November 2007
    stigweard wrote: »
    OpenAL is / was an open spec, and it was not created by Microsoft. It is a fully cross-platform 3D sound API, and I'd rather they used it than EAX, even though it is also being heavily developed by Creative. I'd put on my conspiracy hat and tell you Creative pushed MS to drop EAX so that they could sell new OpenAL hardware accelerators (X-Fi and up), but I have no solid proof.

    One might think so, but with the glacial pace at which Creative releases new products, it's unlikely. Their main product line is still the Audigys, which came out like half a decade ago, and even the X-Fis came out around BF2, with the only added SKU being that gimped software-only one last year. I place even odds on them not debuting a new card until at least '09.

    EDIT: Crysis uses Ageia? Searching Google, the first result for Crysis and Ageia is an interview on crysis-online:
    11) Does Crysis support any specialty hardware such as Ageia PhysX?

    Yes; however, Crysis will not support the Ageia PhysX card, due to the fact that Crytek have built their own proprietary physics engine, which is not only more advanced than Ageia's but performs very well on ordinary CPUs (especially multi-core platforms). Crysis will support various technologies out of the box, including 64-bit operating systems, multi-core processors (as mentioned above), DX9 through to DX10, and many gaming devices such as gamepads. On top of that, the Sandbox editor that comes with Crysis supports the use of a webcam (or any other video capturing device or camera) to animate character expressions.
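    The "performs very well on multi-core platforms" bit makes sense: within one physics step, integrating many independent rigid bodies is embarrassingly parallel, so the work splits cleanly across cores. A minimal sketch of the idea (nothing to do with Crytek's actual engine; a native engine would use real per-core threads, and Python's GIL makes this purely illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def integrate(body, dt=1.0 / 60.0, gravity=-9.81):
    """Advance one (position, velocity) body by a single Euler step."""
    pos, vel = body
    vel += gravity * dt
    return (pos + vel * dt, vel)

def step_world(bodies, workers=2):
    # Bodies are independent within a single step, so the list can
    # simply be split across workers and integrated in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(integrate, bodies))

# 1000 crates dropped from 100 m, all starting at rest
world = [(100.0, 0.0)] * 1000
world = step_world(world)  # one 60 Hz tick for the whole world
```

    Collision resolution between bodies is the part that doesn't parallelize this neatly, which is exactly where the per-engine cleverness goes.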

  • Rook Registered User regular
    edited November 2007
    stigweard wrote: »
    And which industry standard would that be? Miles 2d, dsound, hrtf, eax, etc... Industry leader != industry standard.

    Semantics aside, I was thinking of DirectSound, since that's the one they all support.

  • FreddyD Registered User regular
    edited November 2007
    I remember when people found out that CellFactor would cap the framerate if it detected you didn't have a PhysX card, since there were little to no performance benefits from it. That was epic.

  • Mblackwell Registered User regular
    edited November 2007
    Rook wrote: »
    stigweard wrote: »
    And which industry standard would that be? Miles 2d, dsound, hrtf, eax, etc... Industry leader != industry standard.

    Semantics aside, I was thinking of DirectSound, since that's the one they all support.

    Wait, really? Because Linux and MacOS support that too... right?

    Oh... yeah...


  • Rook Registered User regular
    edited November 2007
    Mblackwell wrote: »
    Rook wrote: »
    stigweard wrote: »
    And which industry standard would that be? Miles 2d, dsound, hrtf, eax, etc... Industry leader != industry standard.

    Semantics aside, I was thinking of DirectSound, since that's the one they all support.

    Wait really? Because Linux and MacOS support that too... right?

    Oh... yeah...

    Yeah... um, I'm sure there are a couple of game developers that care about them. But generally speaking, when we're talking about adopting a standard and stuffing it into DirectX, we're not talking about Linux and Macs.

  • stigweard Registered User regular
    edited November 2007
    I recognize I'm being nitpicky, but DirectX isn't an industry standard; it is a Microsoft standard and an industry leader.

  • Spoit *twitch twitch* Registered User regular
    edited November 2007
    It's a standard that's used by most of the industry; just because OpenGL also has a fair number of companies using it doesn't mean that DX isn't a standard. Really, being a standard doesn't mean there can't be any competitors.

  • stigweard Registered User regular
    edited November 2007
    I guess I am being too pedantic.

    HTML 3.2 is a standard: you can build any browser you want so long as its rendering complies with the standard. IEEE 802.11 is a standard: you can design any hardware you want so long as it interoperates per the standard. DirectX is a Microsoft standard: a collection of APIs for developing games that run on Windows. You can, however, make competing games with the same functionality on other platforms without using any part of DirectX. See the difference?

  • Plutonium Registered User regular
    edited November 2007
    Edge: You’ve more or less already placed your cards on the table about this, but what do you make of discrete physics cards, like Ageia’s PhysX?

    GN: I think that's a horrible idea. At the same time that the distinction between the GPU and CPU is going away, the PPU guys want to come in and define a new set of abstractions, where we have memory and data that's really far away from the CPU and GPU... How do I tell when something breaks, or gets pushed by a monster? All these decisions I have on my CPU have to sit around until they are resolved on the PPU and GPU, and you end up with a physics decelerator. This is the reason you want a homogeneous architecture.

  • waterlogged Registered User regular
    edited December 2007
    That's a nice quote by GN, and there's another thing he didn't point out that's rather hardware-geek level; coming from experience using a PPU, I can sum up what goes on. Well, he did say it, just not really explain it.

    A PPU lowers frame rate for a lot of reasons. The one commonly blamed is "it puts more crap on the screen, thus your GPU gets stressed"; sadly, that is one of the smaller problems with it.

    When they released the PPU they chose PCI because most mobos' PCI-E x1 slots weren't truly compliant with the spec. This is one of the same reasons Creative and other sound card makers have dodged PCI-E like the plague, though that has since changed.

    The kicker about PCI devices vs. PCI-E, though, is that PCI-E is point-to-point, whereas PCI devices all take turns over the same shared bus.

    Now, a lot of the integrated crap on your mobo inherently sits on the PCI bus. Toss in an add-on sound card and an add-on RAID card (sounds stupid, but those with the cash to throw at SLI plus PhysX are not using integrated RAID) and you now have a lot of devices pushing a lot of data, taking turns over the same bus. This really screws things up and tanks frame rate, since you have to wait for the PhysX card's turn on the bus.
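    The turn-taking above can be sketched with a toy back-of-the-envelope model. This is not a real bus simulation; the device mix and per-frame traffic numbers are made up, though ~133 MB/s for classic 32-bit/33 MHz PCI and ~250 MB/s per direction for a PCIe 1.0 x1 link are the nominal figures:

```python
def shared_bus_time(transfers_mb, bus_bandwidth_mb_s):
    """All devices take turns on one bus, so total time is the sum of
    every device's transfer at the shared bandwidth."""
    return sum(transfers_mb) / bus_bandwidth_mb_s

def point_to_point_time(transfers_mb, link_bandwidth_mb_s):
    """Each device has its own link, so transfers overlap and total
    time is just the single slowest transfer."""
    return max(transfers_mb) / link_bandwidth_mb_s

# Hypothetical per-frame traffic (MB) from a PhysX card, a sound card,
# a RAID controller, and onboard devices, all sharing the PCI bus.
transfers = [8.0, 2.0, 4.0, 1.0]

pci = shared_bus_time(transfers, 133.0)       # classic PCI: ~133 MB/s, shared
pcie = point_to_point_time(transfers, 250.0)  # PCIe 1.0 x1: ~250 MB/s per device

print(f"shared PCI: {pci * 1000:.1f} ms of bus time per frame")
print(f"PCIe x1:    {pcie * 1000:.1f} ms of bus time per frame")
```

    The exact numbers don't matter; the point is that on the shared bus every device's traffic adds to everyone else's wait, while point-to-point links let transfers overlap.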

    If the card were PCI-E (and they do have one in the works) it would help a metric ton.

    Having used a PhysX card, I can say you can remove some of the frame rate drops by disabling mobo items in the BIOS, removing your sound card, and other idiocy that shouldn't be needed. In basic terms, you take all the other cars off the highway and put that sucker in the HOV lane.

    This is really why a PhysX card in PhysX games can tank frame rate in a Core 2 / 8800 SLI system that was crushing games before.

    A gamer I know who works for Nvidia gave me a more detailed rundown that frankly I don't completely grasp, but the gist of it was the same. This is also why Nvidia/ATI pushed for that third point-to-point PCI-E slot to use a GPU for physics: even though the hardware isn't made exactly for it, you get it off the PCI bus and onto a direct link.

    I don't know anybody who has had a chance to play with a PCI-E PhysX card, so I don't know how good the gains would be, but it's an obvious issue they need to deal with.

    It's a damn shame they haven't released it, because all it would take is a simple bridge chip (much as Nvidia made the 6800 natively AGP and bridged it to PCI-E, and the other way around with the 6600). Maybe they don't have the money (we all know PhysX cards aren't selling), maybe they have it and want to clear out back stock. But as it stands, the entire foundation of this card is crap.

    I ended up getting a PCI-E RAID controller because even that item (a lot better made, proven tech, costs a hell of a lot more than a PhysX card) didn't like to play with my sound card over the PCI bus; hell, a lot of sound cards state they don't play well on a shared bus.
