Ok, so I'm currently using an XFX GeForce 9800 GTX (from when it first came out), and I've been wanting to switch to ATI. I used to be a big ATI guy, but I'm not particularly biased anymore, as they both seem to have pros and cons these days. I will say I have enjoyed the PhysX features in the few games that support it.
I am happy with my current card; it still runs pretty much everything at the highest settings other than a few select things. The thing is, the rest of my PC is upgraded, and now my weakest link is literally my video card.
I am not looking for top-of-the-line $700 stuff here; I'm looking in the sub-$300 range if possible, the most bang for my buck, so to speak.
I'd prefer to go ATI this time around, as they seem to be the better value from what I have seen. I've got a quad-core AMD, 6 GB of DDR2 (two matched pairs), a 650 W PSU, and I'm running Win7. I've been looking at some of the DX11 cards, but honestly they don't seem all that appealing at their current prices. So far I've been looking at
this card and this card.
But I would really like some suggestions. I don't want a debate of ATI vs. Nvidia or anything like that, and feel free to mention both companies' cards if the deal is there and you think it's worth it.
edit: Here's a link to the AnandTech article where they compare the 4890 to the GTX 275
Otherwise, he should wait until the 5870s are cheaper. It's not really a good time to upgrade a GPU that's already decent.
Well, it feels like my 9800 GTX is heading into the realm of obsolescence. I got it when they were first released, which was a while back now. I mean, I guess it does everything I need it to do, but the DX11 cards are just too much money for their power. The higher-end 4000-series ATIs look really nice for their price, on the other hand. I haven't even seen a game that uses DX11 yet.
It "feels" like it's going obsolete? Dirt2 is DX11 btw, and other dx11 games were announced.
I really think it's a bad idea to upgrade from a good GPU to a not-that-much-better GPU that's not up to date. That's my definition of a bad upgrade.
But, hey, it's your money to burn, suit yourself.
If you can muster the patience to hold off until the spring, you can get a 5870 for a reasonable price and have a card that's going to last you several years.
Hmm, maybe you guys are right. I was thinking DX11 would be like DX10, which didn't really even get used in most games, and when it did, the difference was marginal at best.
Well, DX10 was not that big, but DX11 is supposed to be, the way DX9 was.
And besides, I'd rather have the latest DX anyway; LOTRO looks awesome with it on.
I mean, from my own perspective, games aren't really likely to push DX11 this generation, largely because the console market is still on DX9 and will remain that way until the next generation, which won't be for a while yet. Any additions using DX11 would be pretty superficial, along the lines of the DX10 stuff added to games: it gets applied to the surface but doesn't make much of a visual difference.
What changes is DX11 actually bringing, and how significant are they? Is it going to streamline performance? Dynamic pipelines, that sort of thing?
DirectX 11 is a much closer API to DirectX 9 than 10 was. I think of DirectX 10/10.1 the same way I do DirectX 8/8.1: very few games actually targeted those versions; they either targeted DirectX 7 or, later, DirectX 9. DirectX 8 and 8.1 were almost experimental releases, a way to get some really cutting-edge stuff out there. That's what DX10/10.1 were, and DX11 takes all of that and wraps it up in a much more streamlined API. With the built-in "11on9" support as well (the ability to run the DX11 API on DX9-class cards via downlevel feature levels), it's almost certain that DX11 will catch on as the next major Windows graphics API.
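For anyone curious what that "11on9" path actually looks like in code, here's a minimal C++ sketch (my own illustration, not from any shipping engine): you hand D3D11CreateDevice a list of feature levels and the runtime picks the best one the installed GPU supports, all the way down to DX9-class hardware. The function name and fallback list are just for illustration.

```cpp
// Minimal sketch: create a D3D11 device that falls back to DX10/DX9-class
// hardware via feature levels. Requires the DirectX SDK; link d3d11.lib.
#include <windows.h>
#include <d3d11.h>

ID3D11Device* CreateDeviceWithFallback()
{
    // Ask for the highest feature level first; the runtime returns the best
    // one the GPU supports, down to 9_1 (the "11on9" path).
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,
        D3D_FEATURE_LEVEL_9_2,
        D3D_FEATURE_LEVEL_9_1,
    };
    const UINT numLevels = sizeof(requested) / sizeof(requested[0]);

    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    got;

    HRESULT hr = D3D11CreateDevice(
        NULL,                       // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        NULL,                       // no software rasterizer module
        0,                          // no creation flags
        requested, numLevels,
        D3D11_SDK_VERSION,
        &device,
        &got,                       // which level we actually got
        &context);

    if (FAILED(hr))
        return NULL;

    context->Release();             // caller only needs the device here
    return device;
}
```

Same API either way; on a 9-class card you just lose the 10/11-only features.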
Also, the assertion that PC games won't use DX11 because consoles are DX9 is somewhat misleading. First off, the Xbox 360 is not actually DirectX 9; the API is very, very similar, but you can't just take a stock DX9 rendering engine and make it work on the 360 without tweaks. Second, the PS3 is not even remotely close to any DX API. Developers are already used to maintaining two or three rendering engines for their game.
Where did you get the idea I am using a 7900? I'm using a 9800 GTX.
I've got a single-slot 8800 GT, and I'm drooling at the possibility of getting an HD 5850.
Either that or more memory, as I've still got a 2 GB kit of Corsair memory. I want to get a 4 GB kit, but I'm still stuck on 32-bit Vista.
This doesn't help the fact that I want to buy a car as well, but that's something for another discussion.
Yeah, for me, I've already got a good quad-core, 6 GB of decent RAM, and Win7 Ultimate 64. The only things I want now are another HDD and a new video card... maybe it's just impatience driving me.
We already have this; it's called OpenCL. What does DirectCompute bring to the table?
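For reference, they really are close cousins: a DirectCompute kernel is written in HLSL and dispatched much like an OpenCL kernel, just through the D3D11 device instead of an OpenCL context. Here's a hedged little C++ sketch (the RunKernel helper and the doubling kernel are my own hypotheticals; it assumes a device/context created as above, a UAV already bound to a buffer, and omits error handling):

```cpp
// Minimal DirectCompute sketch: compile an HLSL compute shader at runtime
// and dispatch it. Link d3dcompiler.lib in addition to d3d11.lib.
#include <windows.h>
#include <d3d11.h>
#include <d3dcompiler.h>
#include <string.h>

// HLSL kernel: doubles each element of a buffer (functionally the same as
// a trivial OpenCL kernel).
static const char* kKernel =
    "RWStructuredBuffer<float> data : register(u0);   \n"
    "[numthreads(64, 1, 1)]                           \n"
    "void main(uint3 id : SV_DispatchThreadID)        \n"
    "{                                                \n"
    "    data[id.x] *= 2.0f;                          \n"
    "}                                                \n";

void RunKernel(ID3D11Device* device, ID3D11DeviceContext* ctx,
               ID3D11UnorderedAccessView* uav, UINT elementCount)
{
    // Compile the HLSL source for the cs_5_0 (DX11 hardware) profile.
    ID3DBlob* bytecode = NULL;
    D3DCompile(kKernel, strlen(kKernel), NULL, NULL, NULL,
               "main", "cs_5_0", 0, 0, &bytecode, NULL);

    ID3D11ComputeShader* shader = NULL;
    device->CreateComputeShader(bytecode->GetBufferPointer(),
                                bytecode->GetBufferSize(), NULL, &shader);

    // Bind the output buffer and launch one thread per element
    // (64 threads per group, so round the group count up).
    ctx->CSSetShader(shader, NULL, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, NULL);
    ctx->Dispatch((elementCount + 63) / 64, 1, 1);

    shader->Release();
    bytecode->Release();
}
```

The practical difference isn't the programming model so much as the plumbing: DirectCompute shares the D3D11 device, so compute output feeds straight into rendering without an interop layer, at the cost of being Windows-only.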