NVIDIA today launches its newest mainstream part, the GeForce 8800 GT
NVIDIA today is set to launch its newest midrange graphics card, the GeForce 8800 GT, previously known by its codename G92. NVIDIA guidance states that the new card will carry a retail price in the $199 to $249 range.
Development of the G92 processor revolved around reducing the thermal output and power draw of the GeForce 8800 GTX (G80). G80 was manufactured on TSMC's 90nm process node, while G92 is manufactured on TSMC's 65nm node. The shrink allows a single 8800 GT to operate at a 105 Watt draw, almost 80 Watts less than the 8800 GTX under heavy load.
Top-to-bottom, the GeForce 8800 GT fits snugly between NVIDIA's GeForce 8800 GTS 640 MB, which NVIDIA sets at a retail price of $349, and NVIDIA's GeForce 8600 GTS, which is sold for $149.
The GeForce 8800 GT sports a 100 MHz speed bump over the 8800 GTS, and comes factory clocked at 600 MHz. The 600 MHz clock speed of the 8800 GT is actually 25 MHz higher than the 8800 GTX's default GPU clock, which is set at 575 MHz. The 8800 GT's clock speed also comes within striking distance of the GeForce 8800 Ultra's 612 MHz GPU clock speed.
The NVIDIA GeForce 8800 GT features 112 stream processors, 16 fewer than the 128 stream processors found on the ultra high-end 8800 GTX and 16 more than the 96 stream processors found on NVIDIA's 8800 GTS.
The stream processors of the 8800 GT come clocked at 1500 MHz, the same speed as the stream processors of the GeForce 8800 Ultra. Comparatively, the GeForce 8800 GTX comes with its stream processors clocked at 1350 MHz while the 8800 GTS' stream processors are clocked at 1200 MHz.
The GeForce 8800 GT features up to 1024MB of GDDR3 memory on a 256-bit memory interface. According to NVIDIA, total memory bandwidth rings in at 57.6 GB/s, and the texture fill rate is 33.6 billion texels per second. Naturally, the two preceding figures are theoretical peaks; real-world values will differ.
The GDDR3 memory of the GeForce 8800 GT comes clocked at 900 MHz -- equal to the memory frequency of the GeForce 8800 GTX. However, the 8800 GT falls short of the 8800 Ultra's memory speed, which is 1080 MHz (2160 MHz effective).
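NVIDIA's quoted figures check out against the published clocks. A back-of-the-envelope sketch (assuming the standard GDDR3 double-data-rate formula, and the 56 texture units commonly reported for G92, which is not a figure from this article):

```python
# Sanity-check of NVIDIA's quoted 8800 GT figures from the published clocks.
# The DDR factor is standard for GDDR3; the 56 texture units are an
# assumption based on common G92 reports, not a number from this article.

mem_clock_mhz = 900          # GDDR3 base clock
bus_width_bits = 256

effective_rate = mem_clock_mhz * 2                      # DDR: 1800 MT/s
bandwidth_gbps = effective_rate * (bus_width_bits / 8) / 1000
print(f"Memory bandwidth: {bandwidth_gbps:.1f} GB/s")   # 57.6 GB/s

core_clock_mhz = 600
texture_units = 56
fill_rate = core_clock_mhz * texture_units / 1000       # billion texels/s
print(f"Texture fill rate: {fill_rate:.1f} GT/s")       # 33.6
```

Both results match NVIDIA's quoted 57.6 GB/s and 33.6 billion/second figures.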
NVIDIA guidance states that the GeForce 8800 GT supports the new PCIe 2.0 bus standard. The PCI-Express Special Interest Group claims that the new bus standard yields improvements in bandwidth.
High-definition video fans will be glad to hear that the GeForce 8800 GT comes integrated with support for NVIDIA's 2nd generation PureVideo HD engine, which allows H.264 video decoding to be offloaded from the processor onto the video card. HDCP support is also present on all reference designs.
NVIDIA guidance promises a hard launch for its GeForce 8800 GT cards; however, so far only Gigabyte, Palit and Zotac have 8800 GT-based offerings. Newegg independently confirmed with DailyTech that the card will be available online after the 6 AM embargo lift.
Posts
It's a bit worse than the 8800gtx but better than the 8800gts:
What does it mean for the 8600 cards? They're fucking useless. Sucks for 8600 owners.
So there is still a place for the old mid to high-end cards.
Let's play Mario Kart or something...
So yeah, the 8800gt fails at higher resolutions.
And no, there's no reason for the 8800GTS 320mb and 640mb to exist.
first of all, tom's hardware is a piece of shit.
2nd of all, all you demonstrated was that it can't keep up at high resolutions with high AA and AF.
it's ONLY when you introduce those latter two things - whose performance is directly related to memory size - that it is no longer able to tail the GTX.
http://www.vr-zone.com/articles/Nvidia_Geforce_8800GT_Review/5369-1.html
Hi, thanks for making me feel even better.
i only bought it a month ago...
Although I'm waiting to see how the RV670 pans out before I make any hasty purchase decisions...
Also, it came with a copy of Quake Wars (I had to register with EVGA, which sent it to me in the mail in a paper envelope sleeve). Do I have to return the disk as well, or do I keep it because that was done through EVGA rather than newegg?
Edit: Now that I rechecked Newegg, apparently my card is non-refundable. I suppose the step-up program is an option, and if I pretend I bought quake wars, I really only lost 50 bucks.
such a kick in the balls.
Not a video card, but I did RMA a dead motherboard a few weeks earlier and it was pretty painless. Definitely return it if you can. I would but the deadline on my 8800GTS 320mb passed a few days ago.
Haha you too huh? As soon as my computer arrives from back home, methinks I'll be replacing my horribly antiquated x800 with this bad boy. The price is certainly right (although I'm sure up here in Canada we'll still be paying $300+ for it, despite the exchange rate) and if the only downside is a slight faltering at higher resolutions, I'm sold.
Frankly, I couldn't care less what ATI has to offer. Even if their new card blew this one out of the water, I've lost so much faith in ATI after this last graphics card round. The crappy drivers, the poor performance, the mediocre prices...
So glad I didn't drop the cash on a GTS320mb earlier in the year.
I have a X1600.
The single slot cooler on this is a tad off. They were having, and still do have, heat problems on this card. This means you're not going to see stock cards with OCs as good. And an 8800gts 640mb, when clocked to the same specs, still hands this card its ass. You can get those clocks on most stock-cooled 8800gts 640mb cards. Of course if you ditch the stock cooler on this it will be a monster.
As stated by others it falls apart at higher resolutions. As scary as it sounds we've passed the point where 512mb is "enough" for higher resolutions. Clock rates can be changed, vRAM can't.
The HD video part is nice, but... that was only needed in the lower end cards since they lacked the punch for it in other areas.
This has killed the 8600 series, and killed the 8800GTS 320mb, but the higher end parts still pack a punch at higher resolutions, and aren't worthless. No need to panic if you already bought them.
This is more of an "end of life for geforce 8" series buy. Given that geforce 9 with dx 10.1 and pci-e 2.0 cards are going to drop soon, this is your chance to grab a decent part cheap, rather than getting something truly next gen.
How is this affected by dx10.1 that MS is throwing around now ?
speaking as a person in the "Oof, my nuts. I just bought a 8800GTS640 x single digit weeks ago" crowd, also as a "I finally understand cpu and memory overclocking, but how the hell do I video card" crowd, how do I OC a video card?
I may be completely wrong here so please correct me if I am, but is this a process you do using an application in $OPERATING_SYSTEM? Or is this involving using some secret code to unlock extra stuff in my BIOS?
For nvidia cards RivaTuner will let you OC both core clock and mem clock, though only core will net you real gains.
the stock 8800gts is 500mhz, the gt 600mhz.
Now, most OC'd GTS cards that you buy are around 575mhz, I know people that have OC'd them a lot higher.
BIOS unlocking with video cards is another matter. You can possibly gain extra shaders, or say convert a geforce to a quadro, but IIRC they laser lock that on a good portion of cards so it's a "try at your own risk" option.
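The clock numbers a couple of lines up are easy to put in relative terms; note that real-world performance rarely scales linearly with core clock, so treat these as upper bounds:

```python
# Relative core-clock gains for the clocks mentioned in the thread.
# These percentages are upper bounds on performance gains; actual
# frame-rate scaling with core clock is less than linear.

stock_gts = 500   # MHz, stock 8800 GTS
oc_gts = 575      # MHz, typical factory-OC'd GTS
stock_gt = 600    # MHz, stock 8800 GT

gts_oc_gain = (oc_gts / stock_gts - 1) * 100
gt_over_gts = (stock_gt / stock_gts - 1) * 100
print(f"OC'd GTS over stock GTS: {gts_oc_gain:.0f}%")   # 15%
print(f"Stock GT over stock GTS: {gt_over_gts:.0f}%")   # 20%
```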
http://www.tweaktown.com/reviews/1210/16/page_16_temperature_and_sound_tests/index.html
As loud as the 2900XT. (Kinda loud)
pcie 2.0... will we need a new mobo to make this card work?
PSN: super_emu
Xbox360 Gamertag: Emuchop
Nope, PCIe 2 is forward+backward compatible.
e.g. you can run PCIe 1 cards in PCIe 2 slots, and PCIe 2 cards in PCIe 1 slots
http://www.anandtech.com/video/showdoc.aspx?i=3140&p=4
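The compatibility rule boils down to the link training to the highest generation both sides support. A minimal sketch (per-lane rates are the spec's usable figures after 8b/10b encoding):

```python
# A PCIe link negotiates down to the highest generation supported by BOTH
# the card and the slot. Per-lane usable rates (after 8b/10b encoding)
# are taken from the PCIe 1.x and 2.0 specs.

USABLE_GBPS_PER_LANE = {1: 0.25, 2: 0.5}   # GB/s per lane, per direction

def link_bandwidth(card_gen, slot_gen, lanes=16):
    gen = min(card_gen, slot_gen)           # link trains to the lower gen
    return USABLE_GBPS_PER_LANE[gen] * lanes

print(link_bandwidth(card_gen=2, slot_gen=1))  # PCIe 2.0 card, 1.x slot -> 4.0 GB/s
print(link_bandwidth(card_gen=2, slot_gen=2))  # both 2.0 -> 8.0 GB/s
```

Which is why a PCIe 2.0 card in a 1.x slot simply runs at 1.x bandwidth rather than failing.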
Nope. But from my understanding, if you DO have a pcie 2.0 board, you don't have to give the card additional power.
On that note, are there any pcie 2.0 motherboards? Been looking, haven't seen any.
All the reviews I've read have been done with the card in pcie1 boards. So I guess as far as we know, how it performs in pcie1 is just how it performs.
X38 chipsets are PCIe 2.0
The same.
What's the status on SLI + X38? I'm on a "holdover" Asus P5N-32 SLI PLUS board (feature-compatible with 680i) because it was the cheapest board with SLI & either DTS Connect or DolbyDigital Live (DTS Connect in this case). I haven't even used it yet and it looks like there is a good X38 option. Should I go ahead and activate Vista Ultimate x64 with this board or hold off for an SLI-compatible X38 with either DD/DTS real-time encoding? I mean, I was torn between the better performing P35 and SLI before, so this satisfies both OCD angles.
Oh, and it's the PureHD in a high-end part that has me sold on these cards.
Will there be DDR2 enthusiast boards based on X38? I mean, I only just got my 4GB, and I ain't goin' DDR3 anytime soon.
oh, I guess the better question would be how it performs on pcie 2. hm..mm
I dunno, wouldn't SLI 8800gt out perform a single GTX? they would both have the same entry cost...
In any case, I might look into this card if I don't see anything that catches my eye in the 9000 series.
Edit2: They suggest putting on an aftermarket video card cooler/fan to solve the noise problem. Are these easy to install, and do they void the warranty?
This depends on the game.
In SLI vRAM is not doubled, since the frame buffer is loaded into both cards. Hence 8800gts 640mb SLI is faster than a single 8800gtx at higher resolutions, but 8800gts 320mb SLI is slower than a single 8800gtx at higher resolutions.
Hence the inherent problem in cards with 512mb or lower vRAM and SLI. Given that some current games can use over that, and SLI doesn't double the vRAM, it's silly to SLI some cards because you do not remove their inherent bottleneck.
In theory, yes it is faster until you reach the point where 512mb is not enough. But since the point of SLI is absurdly high resolutions in the latest and greatest games you might reach that quicker than you think.
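The vRAM point above reduces to a simple rule: each card holds a full copy of the frame data, so effective memory is the per-card amount, not the sum. A toy sketch (the 600 MB game requirement is illustrative, not a measurement):

```python
# Toy model of the SLI vRAM bottleneck discussed above: textures and
# frame buffer are duplicated on each card, so effective vRAM equals the
# per-card amount. The 600 MB example requirement is illustrative only.

def effective_vram_mb(per_card_mb, num_cards):
    return per_card_mb  # duplicated across cards, never pooled

def vram_bound(game_needs_mb, per_card_mb, num_cards=2):
    """True if the game's working set overflows each card's vRAM."""
    return game_needs_mb > effective_vram_mb(per_card_mb, num_cards)

# A hypothetical game needing ~600 MB at high resolution with AA:
print(vram_bound(600, 320))  # True  -- 320mb cards choke even in SLI
print(vram_bound(600, 640))  # False -- 640mb cards are fine
```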
x38 will not have SLI. The only intel + SLI platform uses xeon class (ie pentium EE chips) costing over 1k each and is a dual CPU socket board.
There are DDR2 x38 boards out though.
780i comes out soon, but the real selling point on that is triple SLI, and that only works on GTX/Ultra cards
interesting..
my knowledge on sli is very limited as I've never done it before..
but for gamers like me who are locked in at 1680x1050 lcd screens, hopefully it would be a pretty good solution for us to go with dual GT over a single GTX...
The GTS owners are the ones who probably feel the hurt the most. Buying a rather expensive next-to-best card you might expect to get trumped in speed in a year or so, but you don't expect to get hit by both the speed and price train.
That said, something like this was needed. As one of many people with a nice last-gen card (X1950XT) there was absolutely no reason for me to upgrade because I don't care about DX10. The performance gains were minimal except at huge resolutions, limiting the market to only the super hardcore. Enthusiasts like myself somewhat below that level were never going to bite at $300-600 when we weren't getting much of a performance upgrade aside from DX10, which isn't even a valid selling point at this time.
So while I'm not surprised they released a card faster than the 8800GTS, I am surprised they aren't just pushing the GTS down to $200 and slotting the GT into the GTS's old price slot. That doesn't make a lot of sense to me.