Nvidia buys Ageia

GrimReaper Registered User regular
edited February 2008 in Games and Technology
According to DailyTech.

A discrete physics card was always a bad idea except for those flush with cash; integrating the tech into current GPUs is the more feasible approach, and I suspect that's what Nvidia wants to do here.

Buy Ageia, integrate the tech into Nvidia graphics cards. Everybody wins, except maybe DAAMIT.

PSN | Steam
---
I've got a spare copy of Portal, if anyone wants it message me.
GrimReaper on

Posts

  • Einhander __BANNED USERS regular
    edited February 2008
    Ageia's physics cards on their own were pretty much a load of bullshit on par with the Killer NIC, weren't they?

    Having this tech integrated into next gen GPUs would only help, but I wonder what the price vs. performance increases will be?

    Einhander on
  • Viscountalpha The pen is mightier than the sword http://youtu.be/G_sBOsh-vyI Registered User regular
    edited February 2008
    Einhander wrote: »
    Ageia's physics cards on their own were pretty much a load of bullshit on par with the Killer NIC, weren't they?

    Having this tech integrated into next gen GPUs would only help, but I wonder what the price vs. performance increases will be?

    I think there was a point to them, but I never really saw the value. It doesn't matter now; both physics card developers are swallowed up by big companies. Intel bought Ageia's competitor, whose name I can't remember off the top of my head. PhysX cards never seemed worth the cash.

    Rumor is that Intel will have some 8-core beast or even a 16-core beast, with 4 cores to act like a GPU and 3 cores to act like a physics processor. Nvidia must be afraid of this.

    Viscountalpha on
  • Malkor Registered User regular
    edited February 2008
    Einhander wrote: »
    Ageia's physics cards on their own were pretty much a load of bullshit on par with the Killer NIC, weren't they?

    Having this tech integrated into next gen GPUs would only help, but I wonder what the price vs. performance increases will be?

    I think there was a point to them, but I never really saw the value. It doesn't matter now; both physics card developers are swallowed up by big companies. Intel bought Ageia's competitor, whose name I can't remember off the top of my head. PhysX cards never seemed worth the cash.

    Rumor is that Intel will have some 8-core beast or even a 16-core beast, with 4 cores to act like a GPU and 3 cores to act like a physics processor. Nvidia must be afraid of this.

    Not if all 4 cores are Intel integrated graphics.

    Malkor on
  • FyreWulff You Registered User, ClubPA regular
    edited February 2008
    Einhander wrote: »
    Ageia's physics cards on their own were pretty much a load of bullshit on par with the Killer NIC, weren't they?

    Having this tech integrated into next gen GPUs would only help, but I wonder what the price vs. performance increases will be?

    I think there was a point to them, but I never really saw the value. It doesn't matter now; both physics card developers are swallowed up by big companies. Intel bought Ageia's competitor, whose name I can't remember off the top of my head. PhysX cards never seemed worth the cash.

    Rumor is that Intel will have some 8-core beast or even a 16-core beast, with 4 cores to act like a GPU and 3 cores to act like a physics processor. Nvidia must be afraid of this.

    Intel bought Havok.

    I think in a few years we're going to have some sort of computer world civil war..

    FyreWulff on
  • Khavall British Columbia Registered User regular
    edited February 2008
    FyreWulff wrote: »
    Einhander wrote: »
    Ageia's physics cards on their own were pretty much a load of bullshit on par with the Killer NIC, weren't they?

    Having this tech integrated into next gen GPUs would only help, but I wonder what the price vs. performance increases will be?

    I think there was a point to them, but I never really saw the value. It doesn't matter now; both physics card developers are swallowed up by big companies. Intel bought Ageia's competitor, whose name I can't remember off the top of my head. PhysX cards never seemed worth the cash.

    Rumor is that Intel will have some 8-core beast or even a 16-core beast, with 4 cores to act like a GPU and 3 cores to act like a physics processor. Nvidia must be afraid of this.

    Intel bought Havok.

    I think in a few years we're going to have some sort of computer world civil war..

    Unlike now, when there are no companies competing.


    Anyways, I saw this and I thought "GPU-PPU single card" and then I thought "Hey what if it's still priced somewhat like a GPU" and then I thought "Jesus fuck that would rule"


    I mean, I'm thinking the 9800XT2 or whatever is just two 8800 Ultras on one card, right? So what if they threw a PhysX card's guts into an 8800GT for maybe $50 more, since they can share some materials and the board, and really all that would be needed on top of the vanilla card is the chip itself? If they could do that, you'd have faster communication, similar to how a dual-core chip beats two separate processors, right?

    Am I the only one who thinks that would be pretty cool?

    Khavall on
  • LittleBoots Registered User regular
    edited February 2008
    Khavall wrote: »
    FyreWulff wrote: »
    Einhander wrote: »
    Ageia's physics cards on their own were pretty much a load of bullshit on par with the Killer NIC, weren't they?

    Having this tech integrated into next gen GPUs would only help, but I wonder what the price vs. performance increases will be?

    I think there was a point to them, but I never really saw the value. It doesn't matter now; both physics card developers are swallowed up by big companies. Intel bought Ageia's competitor, whose name I can't remember off the top of my head. PhysX cards never seemed worth the cash.

    Rumor is that Intel will have some 8-core beast or even a 16-core beast, with 4 cores to act like a GPU and 3 cores to act like a physics processor. Nvidia must be afraid of this.

    Intel bought Havok.

    I think in a few years we're going to have some sort of computer world civil war..

    Unlike now, when there are no companies competing.


    Anyways, I saw this and I thought "GPU-PPU single card" and then I thought "Hey what if it's still priced somewhat like a GPU" and then I thought "Jesus fuck that would rule"


    I mean, I'm thinking the 9800XT2 or whatever is just two 8800 Ultras on one card, right? So what if they threw a PhysX card's guts into an 8800GT for maybe $50 more, since they can share some materials and the board, and really all that would be needed on top of the vanilla card is the chip itself? If they could do that, you'd have faster communication, similar to how a dual-core chip beats two separate processors, right?

    Am I the only one who thinks that would be pretty cool?

    <sarcasm>
    No, I hate this idea. I think we should just go back to having CPU software rendering and no physics.</sarcasm>

    LittleBoots on

    Tofu wrote: Here be Littleboots, destroyer of threads and master of drunkposting.
  • Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️ Registered User regular
    edited February 2008
    Nvidia was already experimenting with physics 'shaders', so this is a perfect fit.

    Zilla360 on
  • GungHo Registered User regular
    edited February 2008
    Maybe this was the only way that their technology would ever be supported by anyone.

    Actually, it's a great business idea: create a decent technology and pray some big company comes along and swallows you up so you can escape with a golden parachute. And they won't even make you do infomercials to hawk it like that OxiClean guy who doesn't know how to say no to Grecian 5.

    GungHo on
  • LewieP Registered User regular
    edited February 2008
    GungHo wrote: »
    Maybe this was the only way that their technology would ever be supported by anyone.

    LewieP on
  • lowlylowlycook Registered User regular
    edited February 2008
    So the problem with having separate cards for this is that parallel programming is hard, and anything that can be put on a separate card is a prime candidate for farming off to a different core without too much difficulty. Now that multi-core CPUs are the norm, it will be very hard to deliver value for money with cards that do just physics or AI or whatever else you care to name.

    Eventually even the GPU might become unnecessary if we're talking about hundreds of cores in the future.
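The hand-off lowlylowlycook describes is easy to sketch. The following is an illustrative example, not anything from Ageia or Nvidia, and every name in it is invented: the per-tick physics update is shipped to a worker so the main loop stays free, the way a spare CPU core could absorb the job a dedicated card used to do.

```python
# Hedged sketch: hand the per-tick physics update to a worker, the way a
# spare CPU core can absorb work that once justified a dedicated card.
# All names here are invented for illustration.
from concurrent.futures import ThreadPoolExecutor

def physics_step(bodies, dt):
    """Advance point masses (x, y, vx, vy) one tick under gravity."""
    g = -9.81
    return [(x + vx * dt, y + vy * dt, vx, vy + g * dt)
            for (x, y, vx, vy) in bodies]

def simulate(bodies, dt, steps):
    # A real engine would pin this to its own core (or use a process);
    # a single worker thread is enough to show the hand-off pattern.
    with ThreadPoolExecutor(max_workers=1) as worker:
        for _ in range(steps):
            future = worker.submit(physics_step, bodies, dt)
            # ...the main loop could render the previous frame here...
            bodies = future.result()
    return bodies
```

The point of the pattern is that `physics_step` touches only its own inputs, so nothing stops it from running wherever a free core happens to be.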

    lowlylowlycook on
    (Please do not gift. My game bank is already full.)
  • dispatch.o Registered User regular
    edited February 2008
    lowlylowlycook wrote: »
    So the problem with having separate cards for this is that parallel programming is hard, and anything that can be put on a separate card is a prime candidate for farming off to a different core without too much difficulty. Now that multi-core CPUs are the norm, it will be very hard to deliver value for money with cards that do just physics or AI or whatever else you care to name.

    Eventually even the GPU might become unnecessary if we're talking about hundreds of cores in the future.

    I remember hearing that the AMD/ATI solution to this was going to be an instruction set that would let an AMD (ATI) card calculate the physics side of things. The example I remember was someone going out and buying a top-of-the-line card and, instead of throwing the old one out, just putting it in the secondary video card slot.

    At the time they were claiming it would be more useful than SLI and way less gay.

    Am I hallucinating?

    dispatch.o on
  • The_Scarab Registered User regular
    edited February 2008
    Is this going to be like Actiblizzion where they rename to like Nvigeia?

    The_Scarab on
  • Khavall British Columbia Registered User regular
    edited February 2008
    Nvgdeiaia

    Khavall on
  • The_Scarab Registered User regular
    edited February 2008
    Agedia

    The_Scarab on
  • LewieP Registered User regular
    edited February 2008
    Nvidiag3iadfxPortalPlayer

    LewieP on
  • Smudge Registered User regular
    edited February 2008
    Has anyone thought that maybe this move isn't to put this tech on the GPU, but maybe as part of the motherboard chipset?

    Would it be possible for a physics processor to be part of the north or south bridge? I don't know much about how these chips work with the CPU, so ???

    But Nvidia does a lot more than just GPUs these days.

    Smudge on
  • Darklyre Registered User regular
    edited February 2008
    LewieP wrote: »
    iNvidiag3iadfxPortalPlayer

    Fixed.

    Darklyre on
  • Khavall British Columbia Registered User regular
    edited February 2008
    Smudge wrote: »
    Has anyone thought that maybe this move isn't to put this tech on the GPU, but maybe as part of the motherboard chipset?

    Would it be possible for a physics processor to be part of the north or south bridge? I don't know much about how these chips work with the CPU, so ???

    But Nvidia does a lot more than just GPUs these days.

    I'm fairly sure the PPU works really closely with the GPU, though, and they're essentially doing very similar work. So I think combining the two makes more sense than motherboard integration.

    Khavall on
  • veritas1 Registered User regular
    edited February 2008
    About time; I hated installing that stupid Ageia driver even though I didn't have the card. I imagine getting swallowed up is exactly what Ageagae wanted anyway; they must have known no one was going to drop $X on a card that isn't even necessary.

    veritas1 on
  • Malkor Registered User regular
    edited February 2008
    Enough people bought it to keep them solvent, I'm guessing. Also, I bet whatever they had in the works was impressive enough for NVIDIA to make the buy.

    Malkor on
  • Kagera Imitating the worst people. Since 2004 Registered User regular
    edited February 2008
    Point is, they just made bank whether their business was viable solo or not.

    Kagera on
    My neck, my back, my FUPA and my crack.
  • Darklyre Registered User regular
    edited February 2008
    The real test for the card is pretty simple.

    Does it speed up Crysis? If it can add even 5-10 frames for the people running quad-core, SLI 8800GTX monsters, then the technology will eventually be worth it.

    Darklyre on
  • cloudeagle Registered User regular
    edited February 2008
    So does this effectively mean the end of pointless stand-alone physics cards?

    Gawd, I hope so. From what I had seen the physics applications didn't really do much for the few games that even supported it. With this tech firmly integrated into graphics cards, maybe now we can actually see it supported and put to good use.

    cloudeagle on
    Switch: 3947-4890-9293
  • Joril Belgium Registered User regular
    edited February 2008
    this just in: Physics card prices plummet, because of physics!


    In other news, graphics card prices soar, because of physics!


    I'm just kidding, unless the market follows suit.
    Anyway, does this make future games faster, or is it just another feature? I haven't played any physics-accelerated games except maybe Wii Sports (assuming it uses it).

    Joril on
  • cloudeagle Registered User regular
    edited February 2008
    Right now it's purely a PC thing, the consoles don't use it.

    From what I can tell the games that use it look ever-so-slightly better... not nearly better enough to justify the cost of the card. And I've heard that some games take a framerate hit with it on.

    Not to mention there was that oh-so hilarious physics tech demo from one of the card makers that supposedly showed off how swank your games would look now that you had the card. (It looked okay without physics, but much better with the physics.) Some folks managed to make a minuscule tweak to the program to trick the demo into thinking the card was installed -- and presto, the demo acted like it had the physics card, even without a physics card installed!

    cloudeagle on
    Switch: 3947-4890-9293
  • Khavall British Columbia Registered User regular
    edited February 2008
    cloudeagle wrote: »
    Right now it's purely a PC thing, the consoles don't use it.

    From what I can tell the games that use it look ever-so-slightly better... not nearly better enough to justify the cost of the card. And I've heard that some games take a framerate hit with it on.

    Not to mention there was that oh-so hilarious physics tech demo from one of the card makers that supposedly showed off how swank your games would look now that you had the card. (It looked okay without physics, but much better with the physics.) Some folks managed to make a minuscule tweak to the program to trick the demo into thinking the card was installed -- and presto, the demo acted like it had the physics card, even without a physics card installed!

    Right, but I think the problem with the framerate is that objects get sent to the PPU and then sent to the GPU. If it's just CPU -> GPU/PPU, then we're talking much faster communication and, ergo, I'd assume better framerates.

    Khavall on
  • cloudeagle Registered User regular
    edited February 2008
    Khavall wrote: »
    cloudeagle wrote: »
    Right now it's purely a PC thing, the consoles don't use it.

    From what I can tell the games that use it look ever-so-slightly better... not nearly better enough to justify the cost of the card. And I've heard that some games take a framerate hit with it on.

    Not to mention there was that oh-so hilarious physics tech demo from one of the card makers that supposedly showed off how swank your games would look now that you had the card. (It looked okay without physics, but much better with the physics.) Some folks managed to make a minuscule tweak to the program to trick the demo into thinking the card was installed -- and presto, the demo acted like it had the physics card, even without a physics card installed!

    Right, but I think the problem with the framerate is that objects get sent to the PPU and then sent to the GPU. If it's just CPU -> GPU/PPU, then we're talking much faster communication and, ergo, I'd assume better framerates.

    Like I said, integrating physics chip stuff onto the GPU will eventually make the technology worthwhile, and I'm glad things are going this direction. But trying to justify an entire separate card for physics was just plain silly from cost and technical perspectives.

    cloudeagle on
    Switch: 3947-4890-9293
  • Joril Belgium Registered User regular
    edited February 2008
    cloudeagle wrote: »
    Right now it's purely a PC thing, the consoles don't use it.

    Hold it!

    from: http://www.gamespot.com/news/6185534.html
    gamespot wrote: »
    Ageia debuted the world's first dedicated physics processor in 2006. The chip is designed to handle real-time physics calculations that allow for more impressive and diverse visual effects. In addition to PC titles, Ageia's PhysX software is employed in the PS3, Xbox 360, and Wii.

    Just the software?

    there's more:
    from: http://crave.cnet.com/8301-1_105-9864532-1.html?part=rss&tag=feed&subj=Crave
    crave wrote: »
    The PhysX chip can be found in all three of the modern gaming consoles--Playstation 3, Xbox 360, and the Wii--as well as in add-in cards for PC gaming. Developers have to write their games with the processor in mind to unlock the performance, and over 140 titles are available for consoles and PCs that support the PhysX technology.

    Take that!
    So, I guess AMD/ATI will have to invent their own from now on.

    Joril on
  • FyreWulff You Registered User, ClubPA regular
    edited February 2008
    Either that or Nvidia will license it to them, like Intel licenses SSE and other things to AMD etc.

    FyreWulff on
  • Monaro Registered User regular
    edited February 2008
    I dunno. I'd imagine AMD and Intel have much more riding on their agreements, for the sake of standardization, than AMD/Nvidia would for game physics.

    So what is going on with AMD's supposed all-in-one CPU-GPU-motherboard idea anyway?

    Monaro on
  • Rook Registered User regular
    edited February 2008
    Joril wrote: »
    there's more:
    from: http://crave.cnet.com/8301-1_105-9864532-1.html?part=rss&tag=feed&subj=Crave
    crave wrote: »
    The PhysX chip can be found in all three of the modern gaming consoles--Playstation 3, Xbox 360, and the Wii--as well as in add-in cards for PC gaming. Developers have to write their games with the processor in mind to unlock the performance, and over 140 titles are available for consoles and PCs that support the PhysX technology.

    Take that!
    So, I guess AMD/ATI will have to invent their own from now on.

    I think that article was just written by some idiot who doesn't understand that Ageia PhysX is a software middleware solution, rather than there being a secret physics processor chip in every single home console.

    Rook on
  • Viscountalpha The pen is mightier than the sword http://youtu.be/G_sBOsh-vyI Registered User regular
    edited February 2008
    Monaro wrote: »
    I dunno. I'd imagine AMD and Intel have much more riding on their agreements for the sake of standardization than AMD/nvidia would for game physics.

    So what is going on with AMD's supposed all-in-one CPU-GPU-motherboard idea anyway?

    AMD's Fusion? I thought it wasn't scheduled to show up until 2009, in which case Intel is going to haul ass on getting Larrabee finished well before then.

    Viscountalpha on
  • TechBoy Registered User regular
    edited February 2008
    Nvidia's CUDA says hello.

    Apparently graphics chips are designed to do a ton of floating-point calculations, which makes them much better at physics simulation than even the beastiest hojillion-core CPU Intel can churn out.

    From a website using GPUs for simulating molecular dynamics:
    The high degree of parallelism and floating point arithmetic capability of GPUs can attain performance levels twenty times that of a single CPU core

    I have a feeling a physics card isn't too different architecturally from a GPU. Nvidia probably wanted the experience the Ageia folks have in hardware-based physics simulation, since it seems like CUDA could open up a whole new market for Nvidia cards.
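TechBoy's point about floating-point parallelism is easy to see in miniature: a physics tick applies the same arithmetic independently to every particle. Below is a hedged sketch with NumPy standing in for the data-parallel hardware; the function and its layout are invented for illustration, not taken from Ageia or CUDA.

```python
# The same explicit-Euler update applied to N particles at once -- the
# data-parallel shape that GPUs (and CUDA kernels) are built for.
# NumPy stands in for the hardware here; names are invented.
import numpy as np

def step_particles(pos, vel, dt):
    """One tick for all particles: pos and vel are (N, 2) arrays."""
    g = np.array([0.0, -9.81])
    vel = vel + g * dt       # one vectorized op instead of N scalar loops
    pos = pos + vel * dt
    return pos, vel
```

Because each row is updated independently, the same code maps onto one-thread-per-particle hardware with no coordination between threads, which is where the "twenty times a single CPU core" figures come from.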

    TechBoy on