ATI is stating that a SINGLE R600 high-end configuration will require 300 watts of power (+/- 9%), and a DUAL R600 "CrossFire" high-end configuration will require, as you might guess, 600 watts of power (+/- 9%). Compare that to a single GeForce 8800 GTX, which pulls 150 to 180 watts. Add a CPU to that mix and you overtake the peak ratings of most power supplies on retail shelves today.
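Quick back-of-envelope on that claim (the GPU numbers are ATI's; the CPU and everything-else figures below are my own rough guesses for a high-end box, so take this as a sketch, not gospel):

# Rough total draw for a dual R600 "CrossFire" system (Python sketch)
gpu_watts  = 2 * 300      # ATI's quoted figure, two R600s
cpu_watts  = 120          # assumed high-end CPU
rest_watts = 80           # assumed drives, board, memory, fans
print(gpu_watts + cpu_watts + rest_watts)   # ~800 W, past the peak rating of most retail PSUs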
Needs its own external power plug, like the last line of Voodoos did, indeed.
But.. seriously.. 300 watts? Jesus. Every time something like this happens, the 'other company' always comes around a couple of months later with an equally-powerful card that runs on half the power consumption.
I could barely fit my 8800GTX into my machine. There's no way in hell I'd be able to fit an R600, unless I buy a full-sized tower. And those are heavy and bulky, especially compared to my 13 pound, awesome looking Cooler Master Mystique. The power consumption is a huge turn off, too.
I wonder if the R600 is going to be really noisy, too. I went from a 7900 to an 8800, and I thought something was wrong because there was no noise coming from the PC.
Definitely not... as AMD is considering selling ATi right now.
See, ATi was making shit-tons of money before AMD bought them because they had a huge segment of the mobile market. AMD scooped them up, and now their mobility chipsets aren't going onto Intel motherboards anymore, and suddenly having ATi in their stable isn't really making AMD any money.
Why do they even bother going through the MoBo power supply? You'd think that at 300 Watts, you'd start thinking about putting a power jack next to the video out plug to hook up to an external transformer.
EDIT: and here's another thought: If I'm not mistaken, a US standard residential branch circuit can deliver 1800 Watts (120V RMS x 15 amps). Put two of these guys in on a Crossfire setup, and you've already used up 1/3rd of the capacity of a circuit that may be supplying power to two or three rooms in a house. Assuming that the rest of your system is similarly high-end, you could be coming quite close to popping the circuit breaker with your computer. Eek.
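For anyone checking the math, here's the same arithmetic spelled out (the 120 V / 15 A branch circuit is the standard US residential assumption from the post above):

# Crossfire R600 draw vs. a standard US 15 A branch circuit
circuit_watts  = 120 * 15      # 1800 W total on the circuit
crossfire_gpus = 2 * 300       # ATI's quoted figure for two R600s
print(crossfire_gpus / circuit_watts)   # 0.33 -> a third of the circuit from the GPUs alone
# Continuous loads are generally supposed to stay under ~80% of that (~1440 W),
# so a whole high-end system plus monitor really can creep toward the breaker.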
I read in some article that they're releasing a non-OEM (i.e. retail) version right after this one that will be for consumers and will be about the same size as the 1950XT. So this first version is apparently for OEM manufacturers or something.
The XTX version comes in retail and OEM versions, and it's the OEM one (pictured) that really gets outlandish, with its 12.4-inch length and 270W power consumption. The retail XTX cuts that down to 9.5 inches and 240W, while the weaker XT matches those specs.
Definitely not... as AMD is considering selling ATi right now.
See, ATi was making shit-tons of money before AMD bought them because they had a huge segment of the mobile market. AMD scooped them up, and now their mobility chipsets aren't going onto Intel motherboards anymore, and suddenly having ATi in their stable isn't really making AMD any money.
What the hell are you talking about? AMD knew when they acquired ATi that they would no longer be producing chips for use on Intel boards. The point of purchasing ATi was so that they could have AMD brand chipsets and integrated graphics ala Intel (appealing to business customers), rather than relying on Nvidia and VIA, and so they could get access to GPU technology and make a dual or multicore CPU/GPU all-in-one chip, which appeals to all kinds of markets.
It was mentioned in the HardOCP article and just about every other article since the first R600 card was unveiled that the retail card would be smaller, but the fact remains that ATi is producing a fucking HUGE video card. Just because only system integrators will be able to buy them doesn't mean anything. There are a fuckton of gaming PCs sold and a second hand market. Six to nine months from now when ATi refreshes the lineup, you'll see plenty of these behemoths on ebay.
Ok, we know it isn't more than a foot long. What about power consumption? I've heard stories of a system with an 8800GTX running happily on a 350W PSU, and it looks like the R600 needs its own 350W PSU.
Needs its own external power plug, like the last line of Voodoos did, indeed
No, it doesn't. It needs two PCI-e power plugs, but that's no different from Nvidia's 8800.
Now, take this with a huge grain of salt.
But I read on The Inquirer that it'd be using a proprietary plug for one of the power adapters, as well as a PCIe adapter. And if you used two PCIe adapters with a converter, you wouldn't have access to the overclocking options.
Of course, standard disclaimer: it's The Inquirer, huge grains of salt, yadda yadda.
(I ended up giving in and buying an 8800GTS, I'd love to be an ATI fanboy, but seriously, drop the fucking ball much?)
But to address what you're saying: there's (on the XTX version, possibly not the XT, and definitely not the lower-end versions)
an 8-pin and a 6-pin PCI-e power jack.
For reference, the 8800 has two 6-pin PCI-e jacks.
8-pin is not proprietary, just a newer standard that hasn't really been seen before now.
You can use two 6-pin plugs if you want (leaving two holes empty on the 8-pin jack), and yeah, you might lose the ATI-controlled overclocking ability (you could probably still overclock it manually).
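If anyone wants the connector math behind that: assuming the usual PCI-e power limits (75 W from the slot, 75 W per 6-pin plug, 150 W per 8-pin plug), the ceilings work out roughly like this:

# In-spec power ceilings for each connector arrangement (rough sketch)
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150   # watts

r600_xtx = SLOT + SIX_PIN + EIGHT_PIN    # 8-pin + 6-pin -> 300 W
gf8800   = SLOT + 2 * SIX_PIN            # two 6-pin     -> 225 W

print(r600_xtx, gf8800)   # 300 225
# Feeding the R600 with two 6-pin leads (leaving the 8-pin half-empty) would cap it
# around 225 W too, which is presumably why the extra overclocking headroom goes away.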
Well, I did hear about the PCI-E power connector going to 8-pin instead of its current 6-pin design, but I hadn't heard that the R600 was doing that. Might as well get that jump out of the way, too; the rest of it's taking bloody well long enough. Are they going dual HDMI, too? lol
Personally, I don't give a shit if AMD's top card consumes 1.21 jiggawatts and requires a separate 4U rackmount. The top card a company makes is solely for bragging rights over the other company anyway. I care about whether AMD's $200-ish card will be better than nVidia's upcoming $200-ish card.
Is this the card that competes with Nvidia's 8800 line?
300W is ridiculous. I was pretty much waiting for the midline 8800 card priced at $200 before I upgraded, but I at least wanted to see what ATI had up their sleeve. This was not so much up their sleeve as under their entire jacket.
HAHAHA OH WOW
I'M A TWITTER SHITTER
Oh wait.
If by "slot" you mean "anywhere within 5 feet of your desk" then the answer is "yes."
Nvidia to buy AMD in five years.
sorry
Is this DX10?
EDIT: and here's another thought: If I'm not mistaken, a US standard residential branch circuit can deliver 1800 Watts (120V RMS x 15 amps). Put two of these guys in on a Crossfire setup, and you've already used up 1/3rd of the capacity of a circuit that may be supplying power to two or three rooms in a house. Assuming that the rest of your system is similarly high-end, you could be coming quite close to popping the circuit breaker with your computer. Eek.
That's just fucking ridiculous. Now I'm really curious to see the stats for the rest of the line as well.
Let me see if I can find the article...
http://www.engadget.com/2007/02/11/shots-surface-of-atis-r600-and-boy-is-she-a-big-one/
What the hell are you talking about? AMD knew when they acquired ATi that they would no longer be producing chips for use on Intel boards. The point of purchasing ATi was so that they could have AMD brand chipsets and integrated graphics ala Intel (appealing to business customers), rather than relying on Nvidia and VIA, and so they could get access to GPU technology and make a dual or multicore CPU/GPU all-in-one chip, which appeals to all kinds of markets.
http://www.tgdaily.com/2006/10/25/amd_announces_fusion_processor/
http://www.internetnews.com/ent-news/article.php/3640251
No, it doesn't. It needs two PCI-e power plugs, but that's no different from Nvidia's 8800.
Also the OEM card is that long, not the retail one.
OP, change your OP to reflect the proper information
EDIT: what the guy above me said
Goddamn you, now I have to replace the keyboard I just sprayed coffee all over
So the OP and title are even more off base.
That's been said, and the reverse, so many times in the last decade it's not even funny.
What about the GeForce 8 line? How much power will the 8600s pull, I wonder?
But surprisingly accurate at times.
Yeah, but the reason is that they simply shovel out so much stuff that, by pure statistical chance, some of it has to be true. It's not demonstrably the result of any special insight.
I think when you make so many random guesses so often, you're bound to be right just according to probability rules
edit: lol, beat'd
Yeah, try pushing a 24" (or 30" in my case, 77% more pixels) display with decent settings and an acceptable framerate with your $200-ish card
You're bragging.