So, I figured out why my memory wouldn't overclock. This card was built with Micron Memory instead of the better Samsung chips. Now it makes perfect sense. I can live with it.
mojojoeo | A block off the park, living the dream. | Registered User, regular
I have a used 780ti gigabyte windforce edition for sale but no idea how to price it?
You guys know what I should ask for it?
Chief Wiggum: "Ladies, please. All our founding fathers, astronauts, and World Series heroes have been either drunk or on cocaine."
Ordered my parts! Went with the EVGA 1080 instead of the Palit. There probably would have been no difference, but I know EVGA's return policy is rock solid, so the peace of mind is good. Pics incoming tomorrow!
edit: in my foolish excitement I forgot to get cables. I've used DVI for the past few years, is it worth switching to displayport?
Generally yes. Though if you're at 1080p there isn't much of a practical difference.
Go to ebay and search for the card. On the left hand side, check the box that says "Completed Auctions." Any price in green was a successful auction (i.e. someone actually bid).
I'm not sure how you're planning to sell it, but that's a fairly easy way to get a sense of the secondary market.
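If you want to put a number on it once you've pulled a few completed-auction prices, a quick sketch like this works (the sale prices below are made up for illustration, not real 780 Ti data):

```python
# Rough asking-price estimate from a handful of sold-listing prices.
# The numbers here are hypothetical placeholders.
import statistics

sold_prices = [185.0, 210.0, 195.0, 230.0, 200.0]  # pretend eBay sales, in $

median = statistics.median(sold_prices)  # robust center of the market
spread = statistics.stdev(sold_prices)   # how much sales vary

print(f"ask around ${median:.0f} (+/- ${spread:.0f})")
```

Median beats mean here because one weird $50 or $400 auction won't drag your estimate around.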
1440p, will grab a displayport cable today. Thanks!
So modern monitor stuff confuses the hell out of me. Despite a not insignificant amount of effort to figure out what all this alphabet soup means, I'm still at a loss. Like I read the stuff on Tom's Hardware about G-Sync, and it seems like folks in their blind test liked it. But does that make it worth the $150-200 premium it seems to add to monitors? Is FreeSync a viable alternative? What does WQHD mean (and does it matter)?
Hypothetically, if I wanted a 27" monitor w/ 144Hz that's going to look nice for gaming, is this Acer XF270HU 27-inch WQHD Widescreen LCD Monitor a solid choice? It just had a 12% price drop on Amazon. (Oh, I guess that one's recertified. Well is it a good monitor new?)
WQHD is just another word for 1440p, most standard resolutions have some kind of acronym associated with them, even if most folks never use them.
G-Sync only works with Nvidia cards, whereas FreeSync is an open standard. In reality that just means only AMD cards work with it. Kinda sucks that you end up locked to a manufacturer if you go the synced-refresh-rate route, but it is what it is.
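As for the acronym soup: a quick decoder ring for the common resolution names, plus the math on why 1440p needs more GPU than 1080p (the name-to-resolution mappings are the usual marketing conventions):

```python
# Common resolution acronyms and their pixel dimensions.
RESOLUTIONS = {
    "FHD": (1920, 1080),   # "Full HD", i.e. 1080p
    "QHD": (2560, 1440),   # also sold as WQHD, i.e. 1440p
    "UHD": (3840, 2160),   # consumer "4K"
}

def pixel_count(name):
    w, h = RESOLUTIONS[name]
    return w * h

# 1440p pushes ~78% more pixels than 1080p, which is why the GPU matters more.
print(pixel_count("QHD") / pixel_count("FHD"))  # ~1.78
```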
Correct me if I'm wrong, but this type of stuff also happened when PhysX was first released. I believe the processor (or, at least, the ability) is baked into both nVidia and AMD, now; whereas when it was first released, it was nVidia only.
So is FreeSync not a good option if I have an Nvidia card?
Yeah, Nvidia doesn't support it on their cards.
It really is a mess. G-Sync is proprietary to Nvidia, so AMD doesn't have access to it; AMD uses FreeSync instead, which is open, but Nvidia doesn't support that because they have their own proprietary standard.
And it even makes monitor shopping less fun. You usually find one or the other supported in a monitor, which means if you buy a G-Sync monitor you're kind of locked into Nvidia cards if you want to take advantage of it, and vice versa for FreeSync/AMD.
I think the difference here is that PhysX was software related, whereas the G/Freesync is hardware; built into the screens themselves. Thus I imagine it will be a lot harder for AMD to overcome the closed proprietary system that Nvidia have built.
@lolo, a Freesync monitor will still work as a normal monitor with an Nvidia card, but you won't be able to use the Freesync.
PhysX is still an Nvidia thing, but AMD cards can still use it; the processing, I think, is offloaded to the CPU. The last game I played with PhysX was Borderlands 2, and enabling it on an AMD card came with a heavy, noticeable performance loss.
Okay, let me see if I've gotten this right: If I decide to take the plunge on a monitor with adaptive refresh (which lets refresh rates be controlled by the GPU rather than be throttled by the monitor and which G-sync and Freesync are both flavors of), it should be one with G-sync if I have an Nvidia card.
This is a correct statement. To be more specific, the GeForce 970, which I believe is your stated card, supports Gsync as well. So a 970 paired with a Gsync monitor will provide an adaptive refresh rate.
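A toy model of what adaptive refresh actually buys you, if it helps: with fixed-rate vsync, a frame that misses the refresh window waits for the next one, while G-Sync/FreeSync let the monitor refresh the moment the frame is ready. (This is a deliberately simplified sketch; real pipelines have more buffering.)

```python
# Toy comparison of fixed vsync vs. adaptive refresh on a 60 Hz panel.
import math

REFRESH_HZ = 60
PERIOD_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

def displayed_at(render_ms, adaptive):
    if adaptive:
        return render_ms  # panel refreshes as soon as the frame is done
    # fixed vsync: wait for the next refresh boundary
    return math.ceil(render_ms / PERIOD_MS) * PERIOD_MS

# A 20 ms frame (i.e. the GPU is running at 50 fps):
print(displayed_at(20, adaptive=False))  # ~33.3 ms -> judder
print(displayed_at(20, adaptive=True))   # 20 ms -> smooth
```

That gap between 20 ms and 33.3 ms is the stutter you see whenever the GPU can't quite hold the panel's fixed rate.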
Correct. And then be prepared to only ever buy nVidia cards again as long as you own that monitor.
Frankly, unless they shit the bed, I'm okay with sticking with Nvidia GPUs.
Yeah, there would have to be some sort of serious ass-fuckery by Nvidia to get me to change over to AMD at this point. Their cards have been just that little bit better for so long now, just like with Intel vs. AMD processors. At least AMD managed to get the supply contracts for both major gaming consoles, so hopefully they won't be going broke any time soon!
Are Kabylake CPUs going to be worth it over Skylake?
On the desktop side, you'll probably see the standard 10%-ish improvement in performance. Most of the innovation right now is still power efficiency on mobile parts. If someone has to buy right now, I have no problem saying buy Skylake. If it can wait... eh, then maybe wait and see.
It really is sad, and I do harp on this a bunch in the thread, but I'm more excited about the chipsets than the processors now. I really want to see more/faster USB 3.1 adoption in the chipset along with more USB-C support, but we're probably at least a year or so away from things like that.
I debated waiting a few months on Kabylake, but nothing stood out enough for my uses to warrant waiting. Optane sounds good on paper, but much like NVMe SSDs, it's not going to be noticeable for gaming and day-to-day use (at least for me). Once prices on NVMe M.2 SSDs drop to $200-ish for 500GB I will consider upgrading from my 850 EVO. I also don't have much use for USB 3.1 Gen 2 since my only USB 3 devices are a flash drive and a hard drive dock.
That_Guy | I don't wanna be that guy | Registered User, regular
edited September 2016
2018's going to be a landmark year for personal computing. I am getting excited just thinking about it. We're going to see the culmination of a decade of processor development and component interconnection. TDP is about all we can improve from there without a radical shift in the materials we're using for transistors. This isn't the end for silicon, but desktop computers won't be getting any faster for a while.
So IPS vs. TN. Sounds like IPS gets you better colors and viewing angles generally, but TN is cheaper and has lower response times on the whole. There seems to be a lot of IPS love, but viewing angles aren't a concern for me (it's my desktop monitor; I sit in front of it), and something like the Dell S2716DG checks all the other boxes aside from IPS: 27", 144Hz, and G-Sync (and a nice understated design that doesn't look like a pterodactyl. Looking at you, Acer Predator!). And the Dell is $200-ish cheaper than something like the ASUS ROG SWIFT PG279Q, which is all that and also IPS.
Does anyone happen to have experience with the Dell S2716DG? I thought maybe I'd go look at one at Best Buy, but they all seem to be out of stock until mid October round here.
The other thing is that "response time" listed by companies is a mostly useless spec, it's just the time the monitor can change from grey to a different grey or from black to white. And companies will cherry pick whatever test gives them the lowest number. Once you get below about 8ms, any ghosting effects are unnoticeable.
What really matters is input lag, or the time you go from a mouse/keyboard input to it showing on the monitor, and that spec they never list. You need to go hunt it down at somewhere like http://www.displaylag.com/
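To make the distinction concrete: the latency you actually feel is a whole chain of delays, of which the panel's quoted "response time" is only one small piece. The component numbers below are made-up ballpark figures for illustration, not measurements of any particular monitor:

```python
# Back-of-envelope input-lag chain. All numbers are illustrative guesses.
input_chain_ms = {
    "usb polling (125 Hz mouse)": 8.0,
    "game/render (60 fps)":       16.7,
    "display processing":         10.0,
    "pixel response":             4.0,
}

total = sum(input_chain_ms.values())
print(f"total input-to-photon lag: ~{total:.1f} ms")

# Frame time at common refresh rates, for comparison with response time:
for hz in (60, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
```

Notice the panel's 4 ms "response time" is a rounding error next to the display-processing lag, which is exactly the spec manufacturers don't publish.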
I doubt that. People have been saying the same thing for years and nobody's been right yet.
There's a physical limit to how small you can manufacture silicon-based transistors before quantum tunneling starts becoming a problem. 10nm is that limit. You can push it to 7nm with some more exotic materials, but those aren't currently ready for mass production. I fully expect us to push the limits of what's possible per watt, but transistors can't be packed any more densely.
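The scaling math behind all the node talk, for what it's worth: if feature sizes shrink linearly, transistor density goes up with the square. Real process nodes stopped tracking their marketing names years ago, so treat this as a cartoon, not a datasheet:

```python
# Idealized density scaling between process nodes: density ~ 1 / feature_size^2.
def density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(density_gain(14, 10))  # ~1.96x going 14nm -> 10nm, in theory
print(density_gain(10, 7))   # ~2.04x going 10nm -> 7nm, in theory
```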
You guys know what I should ask for it?
Sold eBay listings show somewhere around ~$200 +/- $30.
Now to get back to pricewatching. I really need to see all. the. frames. for Solitairica, Warlock of Firetop Mountain, and Magic Online!
It will probably be the same ol' 10-15% performance increase, maybe running a little cooler.
Don't upgrade if you're on Skylake already, but if you were going to build a new computer soon and can wait the 3 months, I'd do so.
But if you don't want to wait, it's not that much of an upgrade to worry about.
I might just snag a corsair because fuck it at this point.
Was really hoping for something with USB-C/3 and thunderbolt.
Cannonlake though is going to be on 10nm, so that should be more interesting.
The 1TB Samsung Pro M.2 PCI-E is dropping in October.
Probably a $700+ hard drive.
Happy fucking birthday to me.
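Quick $/GB check on the prices being thrown around upthread (the $700 figure is speculation, not a confirmed list price):

```python
# Price-per-gigabyte comparison. Prices are guesses from the thread, not quotes.
drives = {
    "1TB Samsung Pro M.2 (rumored)":        (700, 1000),  # ($, GB)
    "500GB NVMe at the wished-for price":   (200, 500),
}

for name, (price, gb) in drives.items():
    print(f"{name}: ${price / gb:.2f}/GB")
```

So the rumored Pro would land around $0.70/GB, nearly double the $0.40/GB target where NVMe starts looking like a sensible upgrade from a SATA 850 EVO.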
So buy the 27", 144Hz, 2560x1440 1ms G-Sync TN screen, but...
...don't buy the 27", 144Hz, 2560x1440 1ms G-Sync IPS screen?
I mean, that's heartening, because it's $130 cheaper. But why?
EDIT: What's an extra '4' here or there?