
You got GPU in my CPU!

MKR Registered User regular
http://arstechnica.com/gaming/news/2010/08/microsoft-beats-intel-amd-to-market-with-cpugpu-combo-chip.ars

Short version:
At Hot Chips today, Microsoft's Xbox team unveiled details of the system-on-a-chip (SoC) that powers the newer, slimmer Xbox 360 250GB model. Produced on the IBM/GlobalFoundries 45nm process, it's fair to say that the new SoC is the first mass-market, desktop-class processor to combine a CPU, GPU, memory, and I/O logic onto a single piece of silicon. The goal of the consolidation was, of course, to lower the cost of making the console by reducing the number of different chips needed for the system, shrinking the motherboard, and reducing the number of expensive fans and heatsinks.

So do you think this is going to be a thing, or do you think it will be limited to special applications like this?


Posts

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited August 2010
    For now, it's a limited application thing, I think. Intel killing off Larrabee was really the first sign that completely integrated GPU/CPU systems were not coming to the mainstream desktop for a while.

    Right now discrete graphics are still very much "the thing" for mainstream desktops. Obviously consoles are a completely different ball of wax, as both the PS3 and 360 use some form of SoC (if you want to call Cell an SoC; it's kind of the same concept, I guess).

  • Ayulin Registered User regular
    edited August 2010
    I think just the fact that it involves a GPU would limit it to special applications. The other place I could see this being really useful is in laptops, and there it'd only really matter for gaming models or portable workstations. GPU computing (to my very limited understanding, anyway) doesn't seem to have a whole lot of uses at this point (mainly video encoding or stuff like Folding@Home), and those seem to be tasks you'd usually use desktops for.

    That being said, I'm all for more powerful gaming laptops with longer battery life, less heat, and lower cost, if any of that can be made possible by this.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited August 2010
    GPU computing doesn't have a lot of use in mainstream applications, perhaps, but it's gaining a foothold incredibly quickly in science and research, because you can build a supercomputer's worth of FLOPS for 1/100th the price. Several big physics research labs have recently announced large GPU farms.

    The real issue, and why this continues to be such a hard nut to crack, is that GPUs and general-register CPUs are really nothing alike, at all, besides both being made of silicon and using transistors.
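
    For anyone curious what that difference looks like from the programming side, here's a minimal CUDA sketch (CUDA is just one of the GPGPU toolkits those labs might be using; the array size and constants are made up for illustration). Every element gets its own thread, and the hardware keeps thousands of them in flight at once:

    #include <cstdio>
    #include <cuda_runtime.h>

    // One thread per array element; the GPU keeps thousands of these
    // in flight at once, which is where the cheap FLOPS come from.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;  // 1M elements (arbitrary)
        size_t bytes = n * sizeof(float);
        float *hx = new float[n], *hy = new float[n];
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc(&dx, bytes);
        cudaMalloc(&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);  // 4096 blocks x 256 threads
        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

        printf("y[0] = %f\n", hy[0]);  // expect 5.0
        cudaFree(dx); cudaFree(dy);
        delete[] hx; delete[] hy;
        return 0;
    }

    A CPU would walk that loop one (or a few) elements at a time; the GPU's whole design is to do nothing but this, massively wide.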

  • Pirusu Pierce Registered User regular
    edited August 2010
    AMD has also released more information on its Bulldozer line, which will include their "APUs" (basically a CPU and GPU on a single die).

  • stigweard Registered User regular
    edited August 2010
    It will become the norm (if it isn't already) on mobile hardware, and likely on general-purpose computers (cheap laptops, POS systems, etc.). It's hard to say if it will take off in the desktop/workstation market, and I don't really see a use for it in the server market.

  • Jubal77 Registered User regular
    edited August 2010
    While a cool idea, the system itself has a self-induced lag to it, so it is not faster than prior consoles. It's pretty obvious why they did this, but in the end all the nifty new hardware is limited to acting like the legacy systems.

  • Dehumanized Registered User regular
    edited August 2010
    I think there's a future for this. There will always be demand for smaller, more compact devices, and consolidating the CPU and GPU lets them cool both at once; by integrating the north and south bridges, as well as I/O and memory, they can also get even quicker communication between the formerly separate parts. It'll lead to greater speed with less cooling and cheaper components.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited August 2010
    You wouldn't want your GPU on your northbridge. GPUs tend to access memory directly across a bus, without a true memory controller getting in the way. It's incredibly fast and lets them build big, huge, wide pipes to pass those gobs and gobs of textures through.

    Again, GPUs and CPUs don't work even remotely the same way, which is one of the primary issues in making this work well.
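
    A rough way to see what those pipes buy you is this hedged CUDA timing sketch (assumes a CUDA-capable card; the 256 MB figure is arbitrary). It compares the path from main memory over the chipset and PCIe against a copy that never leaves the card:

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Time one copy and report effective bandwidth in GB/s.
    static float copy_gbps(void *dst, const void *src, size_t bytes, cudaMemcpyKind kind) {
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaEventRecord(start);
        cudaMemcpy(dst, src, bytes, kind);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        return bytes / (ms * 1e6f);  // bytes per ms -> GB/s
    }

    int main() {
        const size_t bytes = 256 << 20;  // 256 MB of traffic (arbitrary)
        float *host = (float *)malloc(bytes);
        float *d_a, *d_b;
        cudaMalloc(&d_a, bytes);
        cudaMalloc(&d_b, bytes);

        // Path 1: main memory -> GPU, through the chipset and PCIe.
        printf("host-to-device:   ~%.1f GB/s\n", copy_gbps(d_a, host, bytes, cudaMemcpyHostToDevice));
        // Path 2: stays entirely on the card's own wide memory bus.
        // (A device-to-device copy both reads and writes, so actual bus traffic is 2x this.)
        printf("device-to-device: ~%.1f GB/s\n", copy_gbps(d_b, d_a, bytes, cudaMemcpyDeviceToDevice));

        cudaFree(d_a); cudaFree(d_b); free(host);
        return 0;
    }

    On current hardware I'd expect the on-card number to come out an order of magnitude higher, which is exactly why you don't want a memory controller sitting in the GPU's way.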

  • MKR Registered User regular
    edited August 2010
    I think there is an average-consumer application for it. Nearly every desktop and laptop motherboard has integrated graphics. Could this turn the usual barely functional integrated GPUs into something just above passable?

  • Barrakketh Registered User regular
    edited August 2010
    Jubal77 wrote: »
    While a cool idea, the system itself has a self-induced lag to it.
    Source?
    It's pretty obvious why they did this, but in the end all the nifty new hardware is limited to acting like the legacy systems.
    Not really. If anything it should have a faster connection to the CPU and main memory, and aside from providing better integrated graphics, it opens up the option of having the CPU's shaders (or whatever the equivalent ends up being; I imagine it'll have a similar programming model) used to accelerate rendering alongside a normal graphics card. Sort of like SLI/Crossfire, but without the need for a second video card. It might even be able to function as simply extra shaders available to the card, instead of working as a second GPU in such a system.

    I imagine operations that need to work on tons of memory (large textures) will be taken care of by the normal graphics cards that gamers will own, but it's a neat way to get some more cost-effective rendering power for games that won't see much improvement from more/faster cores.

  • Jubal77 Registered User regular
    edited August 2010
    Barrakketh wrote: »
    Jubal77 wrote: »
    While a cool idea, the system itself has a self-induced lag to it.
    Source?

    Ars

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited August 2010
    Wouldn't it be more accurate to say that IBM beat Intel and AMD to market? I'm pretty sure the Xbox's chip is an IBM die.

  • Barrakketh Registered User regular
    edited August 2010
    GnomeTank wrote: »
    Wouldn't it be more accurate to say that IBM beat Intel and AMD to market? I'm pretty sure the Xbox's chip is an IBM die.

    The CPU is IBM; the GPU is AMD/ATI. GlobalFoundries is what was created when AMD divested itself of its fabrication plants.

    Jubal77 wrote: »
    Barrakketh wrote: »
    Jubal77 wrote: »
    While a cool idea, the system itself has a self-induced lag to it.
    Source?
    Ars
    Then you missed the point. It's not supposed to be faster; it's supposed to consume less power and generate less heat. They basically made it perform exactly like the other 360s, which is the whole idea behind a console.

  • Jubal77 Registered User regular
    edited August 2010
    Barrakketh wrote: »
    Then you missed the point. It's not supposed to be faster; it's supposed to consume less power and generate less heat. They basically made it perform exactly like the other 360s, which is the whole idea behind a console.

    Nah, I didn't miss the point. There is a concrete reason why they would do it: it is a console. They can have different hardware, but not different performance. I would imagine they're saving that extra power for Kinect, since they have the "Kinect ready" statement on the boxes and such.

  • Barrakketh Registered User regular
    edited August 2010
    Jubal77 wrote: »
    I would imagine they're saving that extra power for Kinect, since they have the "Kinect ready" statement on the boxes and such.
    Nope. There's a special connector for Kinect on the new 360. If you're not using one of the new 360s, it'll require an external power source via a special cable and wall wart. That's all it means.

  • proyebat GARY WAS HERE ASH IS A LOSER Registered User regular
    edited August 2010
    Jubal77 wrote: »
    Nah, I didn't miss the point. There is a concrete reason why they would do it: it is a console. They can have different hardware, but not different performance. I would imagine they're saving that extra power for Kinect, since they have the "Kinect ready" statement on the boxes and such.

    If Microsoft did dish out faster 360s, Microsoft couldn't tell game designers to target the new hardware; it would be unfair to previous console owners. The only benefit would be better framerates.

    I can imagine a universe where Microsoft would make the faster console, tell developers to build their games on that new tech, and tell all previous console owners to buy a new console or else new games play like shit. Actually, that very much sounds like Microsoft.
    Jubal77 wrote: »
    There is a concrete reason why they would do it: it is a console.
    right.....
    It's MICROSOFT

  • Dehumanized Registered User regular
    edited August 2010
    They call it the PC upgrade cycle; Microsoft usually isn't the one driving it, though!


    alternate response: xbox -> xbox 360 rite

  • krapst78 Registered User regular
    edited August 2010
    Judging by the image in the Ars post, it's a bit of a stretch to call it a true SoC, because it looks like the I/O logic is still handled by a separate southbridge that is not part of the actual chip. Intel already put a GPU (a crappy 3D GPU) on the processor package with their Clarkdale/Arrandale chips and their recent Pineview chips. If we're talking about actually integrating the GPU into the same die, then Intel is already moving in that direction with their upcoming Sandy Bridge-based chips, which consolidate I/O functionality into the new Cougar Point PCH.

    Like GnomeTank said above, it's kind of limited application-wise, because doing this for a game console and doing it for a mainstream desktop are pretty different problems.

    As far as mobile computing goes, we're already seeing mobile SoCs with fairly impressive GPU capabilities like Nvidia's Tegra platform and the A4 chip in Apple's offerings.

  • Dark Shroud Registered User regular
    edited August 2010
    Yes, Snapdragon (Qualcomm's ARM-based SoC line, with graphics tech acquired from ATI/AMD) just reached 1.5GHz, and Intel just released a dual-core Atom CPU. Even VIA has some good mobile/small CPUs.

    It's great for portable devices and lower-end desktops. I look forward to seeing what can be done with Mini-ITX systems a year from now.

  • GungHo Registered User regular
    edited August 2010
    GnomeTank wrote: »
    GPU computing doesn't have a lot of use in mainstream applications, perhaps, but it's gaining a foothold incredibly quickly in science and research, because you can build a supercomputer's worth of FLOPS for 1/100th the price. Several big physics research labs have recently announced large GPU farms.
    Yeah, it also has a lot of application in the energy industry for finite element analysis. Because GPUs natively handle pretty complex vertex calculations, you can use something like CUDA to turn the GPU into a supercharged FPU. Future on-die CPU/GPU combos should make those applications go even faster, because you won't have interim chips or the speed of light (no shit) slowing you down when the GPU and CPU need to talk. Jobs that used to take a week or two to process are being completed within a few hours, and once the GPUs get on the die, they think it may push that down to less than an hour. And when you need to get answers to someone who is trying to use your information to make a decision worth millions of dollars, time really is money.
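
    For flavor, here's a toy CUDA sketch of the kind of kernel those codes hand to the card (the Jacobi sweep is my stand-in for the real solvers; the grid size and iteration count are made up). It's a handful of floating-point ops repeated at every point of a big grid, which is exactly the shape of work the GPU eats for breakfast:

    #include <cstdio>
    #include <cuda_runtime.h>

    // One Jacobi sweep over an n x n grid: each interior point becomes
    // the average of its four neighbours. The same pattern shows up in
    // the sparse solves at the heart of finite element codes.
    __global__ void jacobi(int n, const float *in, float *out) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x > 0 && x < n - 1 && y > 0 && y < n - 1)
            out[y * n + x] = 0.25f * (in[y * n + x - 1] + in[y * n + x + 1] +
                                      in[(y - 1) * n + x] + in[(y + 1) * n + x]);
    }

    int main() {
        const int n = 1024, iters = 100;  // both arbitrary
        size_t bytes = (size_t)n * n * sizeof(float);
        float *a, *b;
        cudaMalloc(&a, bytes);
        cudaMalloc(&b, bytes);
        cudaMemset(a, 0, bytes);
        cudaMemset(b, 0, bytes);

        dim3 block(16, 16), grid(n / 16, n / 16);
        for (int i = 0; i < iters; ++i) {
            jacobi<<<grid, block>>>(n, a, b);  // ~a million points per sweep
            float *t = a; a = b; b = t;        // ping-pong buffers
        }
        cudaDeviceSynchronize();
        printf("done: %d sweeps over a %dx%d grid\n", iters, n, n);

        cudaFree(a); cudaFree(b);
        return 0;
    }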

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited August 2010
    I think what you will eventually see is actually the death of the GPU, but we are several years, at a minimum, from that. CPUs like the Cell in the PS3 are eventually going to become the norm: hundreds of cores, hyper-parallel, with the ability to change the "focus" of a processing unit at run time (so, for instance, when playing a game, 64 of your 128 cores go into "FPU, fast memory mode" to run purely floating-point calculations without the overhead of general-register processing).

    The only reason we require GPUs today is that general-register CPUs are not very good at hyper-parallel floating-point operations. Things like shaders are really just a band-aid to let those hyper-parallel stream processors be programmed by the end user.
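
    To illustrate the band-aid, here's a toy CUDA kernel standing in for a pixel shader (the gradient function is invented for the example): one small, pure function evaluated independently per element of a stream, with no communication between threads. That restriction is the whole trick that keeps the hardware simple and wide:

    #include <cstdio>
    #include <cuda_runtime.h>

    // A pixel-shader-like kernel: one small pure function evaluated
    // independently per pixel, with no communication between threads.
    // This is the stream-processing model that shaders expose.
    __global__ void shade(int w, int h, float *out) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x < w && y < h) {
            float u = (float)x / w, v = (float)y / h;  // "texture coordinates"
            out[y * w + x] = 0.5f * u + 0.5f * v;      // a toy gradient "shader"
        }
    }

    int main() {
        const int w = 1920, h = 1080;
        float *img;
        cudaMalloc(&img, (size_t)w * h * sizeof(float));

        dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
        shade<<<grid, block>>>(w, h, img);  // ~2 million pixels, all in parallel
        cudaDeviceSynchronize();

        float corner;
        cudaMemcpy(&corner, img + (size_t)(h - 1) * w + (w - 1), sizeof(float),
                   cudaMemcpyDeviceToHost);
        printf("bottom-right pixel: %f\n", corner);  // near 1.0

        cudaFree(img);
        return 0;
    }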
