
[PC Build Thread] NVIDIA can't stop releasing GPUs like Oprah can't stop releasing bees


Posts

  • übergeek · Sector 2814 · Registered User regular
    So, I figured out why my memory wouldn't overclock: this card was built with Micron memory instead of the better Samsung chips. Now it makes perfect sense. I can live with it.

  • mojojoeo · A block off the park, living the dream. · Registered User regular
    I have a used 780 Ti Gigabyte Windforce edition for sale but no idea how to price it.

    You guys know what I should ask for it?

    Chief Wiggum: "Ladies, please. All our founding fathers, astronauts, and World Series heroes have been either drunk or on cocaine."
  • Foomy · Registered User regular
    mojojoeo wrote: »
    I have a used 780 Ti Gigabyte Windforce edition for sale but no idea how to price it.

    You guys know what I should ask for it?

    Sold eBay listings show somewhere around $200 +/- $30.

    Steam Profile: FoomyFooms
  • mojojoeo · A block off the park, living the dream. · Registered User regular
    Wow! That's way better than I thought.

    Chief Wiggum: "Ladies, please. All our founding fathers, astronauts, and World Series heroes have been either drunk or on cocaine."
  • Santa Claustrophobia · Ho Ho Ho Disconnecting from Xbox LIVE · Registered User regular
    I think those cards are still highly thought of.

  • mere_immortal · So tasty! · Registered User regular
    edited September 2016
    Ordered my parts! Went with the EVGA 1080 instead of the Palit; probably would have been no difference, but I know EVGA's return policy is rock solid, so the peace of mind is good. Pics incoming tomorrow!

    edit: in my foolish excitement I forgot to get cables. I've used DVI for the past few years; is it worth switching to DisplayPort?

    mere_immortal on
    Steam: mere_immortal - PSN: mere_immortal - XBL: lego pencil - Wii U: mimmortal - 3DS: 1521-7234-1642 - Bordgamegeek: mere_immortal
  • wunderbar · What Have I Done? · Registered User regular
    mere_immortal wrote: »
    Ordered my parts! Went with the EVGA 1080 instead of the Palit; probably would have been no difference, but I know EVGA's return policy is rock solid, so the peace of mind is good. Pics incoming tomorrow!

    edit: in my foolish excitement I forgot to get cables. I've used DVI for the past few years; is it worth switching to DisplayPort?

    Generally yes, though if you're at 1080p there isn't much of a practical difference.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Mugsley · Delaware · Registered User regular
    mojojoeo wrote: »
    Wow! That's way better than I thought.

    Go to eBay and search for the card. On the left-hand side, check the box that says "Completed Auctions." Any price in green was a successful auction (i.e., someone actually bid).

    I'm not sure how you're planning to sell it, but that's a fairly easy way to get a sense of the secondary market.
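Mugsley's completed-auctions approach boils down to: collect the sold (green) prices and take a typical value. A minimal sketch in Python, with made-up listing data standing in for real eBay results:

```python
import statistics

# Hypothetical completed-listing data: (final price, sold?) pairs.
# On eBay, green prices correspond to sold == True.
completed = [(210.0, True), (185.0, True), (250.0, False),
             (199.0, True), (230.0, True), (175.0, False)]

sold_prices = [price for price, sold in completed if sold]

# Median is more robust than the mean against a single outlier auction.
estimate = statistics.median(sold_prices)
print(f"Asking-price estimate: ${estimate:.2f}")  # → Asking-price estimate: $204.50
```

With these invented numbers the estimate lands at $204.50, consistent with the ~$200 +/- $30 ballpark quoted above.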

  • mere_immortal · So tasty! · Registered User regular
    wunderbar wrote: »
    Ordered my parts! Went with the EVGA 1080 instead of the Palit; probably would have been no difference, but I know EVGA's return policy is rock solid, so the peace of mind is good. Pics incoming tomorrow!

    edit: in my foolish excitement I forgot to get cables. I've used DVI for the past few years; is it worth switching to DisplayPort?

    Generally yes, though if you're at 1080p there isn't much of a practical difference.

    1440p, will grab a DisplayPort cable today. Thanks!

    Steam: mere_immortal - PSN: mere_immortal - XBL: lego pencil - Wii U: mimmortal - 3DS: 1521-7234-1642 - Bordgamegeek: mere_immortal
  • Iolo · Registered User regular
    So modern monitor stuff confuses the hell out of me. Despite a not insignificant amount of effort to figure out what all this alphabet soup means, I'm still at a loss. Like I read the stuff on Tom's Hardware about G-Sync, and it seems like folks in their blind test liked it. But does that make it worth the $150-200 premium it seems to add to monitors? Is FreeSync a viable alternative? What does WQHD mean (and does it matter)?

    :(

    Hypothetically, if I wanted a 27" monitor w/ 144Hz that's going to look nice for gaming, is this Acer XF270HU 27-inch WQHD Widescreen LCD Monitor a solid choice? It just had a 12% price drop on Amazon. (Oh, I guess that one's recertified. Well, is it a good monitor new?)

    Lt. Iolo's First Day
    Steam profile.
    Getting started with BATTLETECH: Part 1 / Part 2
  • Dhalphir · don't you open that trapdoor you're a fool if you dare · Registered User regular
    Either the Acer or ASUS monitors are good buys in the 1440p 144Hz range.

  • Campy · Registered User regular
    WQHD is just another name for 1440p; most standard resolutions have some kind of acronym associated with them, even if most folks never use them.

    G-Sync only works with Nvidia cards, whereas FreeSync is meant to be an open standard. In practice that just means only AMD cards work with it. Kinda sucks that you end up being locked to a manufacturer if you go the synced-refresh-rate route, but it is what it is.

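The acronym soup mostly maps to fixed pixel grids. A quick illustrative sketch (labels vary by vendor; this mapping is not an official standard):

```python
# Common marketing names for standard resolutions (illustrative, not exhaustive).
RESOLUTIONS = {
    "HD":   (1280, 720),
    "FHD":  (1920, 1080),  # "1080p"
    "QHD":  (2560, 1440),  # "1440p"
    "WQHD": (2560, 1440),  # same grid as QHD; the W is mostly marketing
    "UHD":  (3840, 2160),  # "4K"
}

w, h = RESOLUTIONS["WQHD"]
fw, fh = RESOLUTIONS["FHD"]
print(f"WQHD pushes {w * h / (fw * fh):.2f}x the pixels of 1080p")  # → 1.78x
```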
  • Mugsley · Delaware · Registered User regular
    Correct me if I'm wrong, but this type of stuff also happened when PhysX was first released. I believe the processor (or, at least, the ability) is baked into both nVidia and AMD now, whereas when it was first released it was nVidia-only.

  • Iolo · Registered User regular
    Thanks, folks.
    Campy wrote: »
    WQHD is just another name for 1440p; most standard resolutions have some kind of acronym associated with them, even if most folks never use them.

    G-Sync only works with Nvidia cards, whereas FreeSync is meant to be an open standard. In practice that just means only AMD cards work with it. Kinda sucks that you end up being locked to a manufacturer if you go the synced-refresh-rate route, but it is what it is.

    So is FreeSync not a good option if I have an Nvidia card?

    Lt. Iolo's First Day
    Steam profile.
    Getting started with BATTLETECH: Part 1 / Part 2
  • wunderbar · What Have I Done? · Registered User regular
    Iolo wrote: »
    Thanks, Folks.
    Campy wrote: »
    WQHD is just another name for 1440p; most standard resolutions have some kind of acronym associated with them, even if most folks never use them.

    G-Sync only works with Nvidia cards, whereas FreeSync is meant to be an open standard. In practice that just means only AMD cards work with it. Kinda sucks that you end up being locked to a manufacturer if you go the synced-refresh-rate route, but it is what it is.

    So is FreeSync not a good option if I have an Nvidia card?

    Yeah, nVidia doesn't support it on their cards.

    It really is a mess. G-Sync is proprietary to nVidia, so AMD doesn't have access to it, so AMD uses FreeSync, which is an open standard, but nVidia doesn't support that because they have their own proprietary standard.

    And it even makes monitor shopping less fun. A monitor usually supports one or the other, so if you buy a G-Sync monitor and want to take advantage of it you're kind of locked into nVidia cards, and vice versa for FreeSync/AMD.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Campy · Registered User regular
    Mugsley wrote: »
    Correct me if I'm wrong, but this type of stuff also happened when PhysX was first released. I believe the processor (or, at least, the ability) is baked into both nVidia and AMD now, whereas when it was first released it was nVidia-only.

    I think the difference here is that PhysX was software-related, whereas G-Sync/FreeSync support is hardware, built into the screens themselves. Thus I imagine it will be a lot harder for AMD to overcome the closed proprietary system that Nvidia has built.

    @Iolo, a FreeSync monitor will still work as a normal monitor with an Nvidia card, but you won't be able to use the FreeSync.

  • Cormac · Registered User regular
    PhysX is still an Nvidia thing; AMD systems can still run it, but the processing, I think, is offloaded to the CPU. The last time I played a game with PhysX was Borderlands 2, and enabling it on an AMD card came with a heavy, noticeable performance loss.

    Steam: Gridlynk | PSN: Gridlynk | FFXIV: Jarvellis Mika
  • Iolo · Registered User regular
    edited September 2016
    Okay, let me see if I've gotten this right: If I decide to take the plunge on a monitor with adaptive refresh (which lets refresh rates be controlled by the GPU rather than be throttled by the monitor and which G-sync and Freesync are both flavors of), it should be one with G-sync if I have an Nvidia card.

    Iolo on
    Lt. Iolo's First Day
    Steam profile.
    Getting started with BATTLETECH: Part 1 / Part 2
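Iolo's summary can be illustrated with a toy timing model: with fixed-rate vsync a late frame waits for the next refresh tick, while adaptive refresh (G-Sync/FreeSync) lets the monitor refresh the moment the frame is ready. The numbers below are hypothetical, just to show the mechanism:

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz monitor: one tick every ~16.7 ms

def vsync_display(render_ms):
    """Frame is shown at the next refresh tick at or after it finishes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def adaptive_display(render_ms):
    """Monitor refreshes as soon as the GPU delivers the frame."""
    return render_ms

for render in (10.0, 20.0, 30.0):
    print(f"render {render} ms -> vsync {vsync_display(render):.1f} ms, "
          f"adaptive {adaptive_display(render):.1f} ms")
```

A 20 ms frame misses the 16.7 ms tick and waits until 33.3 ms under vsync, but displays at 20 ms with adaptive refresh; that wait is the stutter adaptive sync removes.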
  • Ed Gruberman · Registered User regular
    Iolo wrote: »
    Okay, let me see if I've gotten this right: If I decide to take the plunge on a monitor with adaptive refresh (which lets refresh rates be controlled by the GPU rather than throttled by the monitor and which G-sync and Freesync are both flavors of), it should be one with G-sync if I have an Nvidia card.

    This is a correct statement. To be more specific, the GeForce 970, which I believe is your stated card, supports G-Sync as well. So a 970 paired with a G-Sync monitor will provide an adaptive refresh rate.

    SteamID: edgruberman GOG Galaxy: EdGruberman
  • wunderbar · What Have I Done? · Registered User regular
    Iolo wrote: »
    Okay, let me see if I've gotten this right: If I decide to take the plunge on a monitor with adaptive refresh (which lets refresh rates be controlled by the GPU rather than throttled by the monitor and which G-sync and Freesync are both flavors of), it should be one with G-sync if I have an Nvidia card.

    Correct. And then be prepared to only ever buy nVidia cards again as long as you own that monitor.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Mugsley · Delaware · Registered User regular
    Yes

  • Iolo · Registered User regular
    Hooray for learning! Thanks, Build Thread.

    Now to get back to pricewatching. I really need to see all. the. frames. for Solitairica, Warlock of Firetop Mountain and Magic Online!

    Lt. Iolo's First Day
    Steam profile.
    Getting started with BATTLETECH: Part 1 / Part 2
  • Santa Claustrophobia · Ho Ho Ho Disconnecting from Xbox LIVE · Registered User regular
    Frankly, unless they shit the bed, I'm okay with sticking with Nvidia GPUs.

  • Donovan Puppyfucker · A dagger in the dark is worth a thousand swords in the morning · Registered User regular
    Santa Claustrophobia wrote: »
    Frankly, unless they shit the bed, I'm okay with sticking with Nvidia GPUs.

    Yeah, there would have to be some sort of serious ass-fuckery by Nvidia to get me to change over to AMD at this point. Their cards have been just that little bit better for so long now, just like with Intel vs. AMD processors. At least AMD managed to get the supply contracts for both major gaming consoles, so they won't be going broke any time soon, hopefully!

  • bowen · How you doin'? · Registered User regular
    Are Kaby Lake CPUs going to be worth it over Skylake?

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • Foomy · Registered User regular
    edited September 2016
    bowen wrote: »
    Are Kaby Lake CPUs going to be worth it over Skylake?

    It will probably be the same ol' 10-15% performance increase, maybe run a little cooler.

    Don't upgrade if you're on Skylake already, but if you were going to build a new computer soon and can wait the ~3 months, I'd do so.

    But if you don't want to wait, it's not that much of an upgrade to worry about.

    Foomy on
    Steam Profile: FoomyFooms
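Foomy's 10-15% figure is also why skipping a single generation rarely hurts but skipping several adds up: the gains compound. A rough back-of-envelope, assuming a flat 12% per generation (the midpoint of that estimate):

```python
# How assumed per-generation CPU gains compound over multiple generations.
PER_GEN_GAIN = 1.12  # assumed midpoint of the 10-15% estimate above

for generations in range(1, 5):
    speedup = PER_GEN_GAIN ** generations
    print(f"after {generations} generation(s): {speedup:.2f}x")
```

Four such generations compound to roughly 1.57x, which is when an upgrade starts feeling worthwhile.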
  • bowen · How you doin'? · Registered User regular
    Yeah I'm still trying to find the right case for me!

    I might just snag a corsair because fuck it at this point.

    Was really hoping for something with USB-C/3 and thunderbolt.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • wunderbar · What Have I Done? · Registered User regular
    bowen wrote: »
    Are Kaby Lake CPUs going to be worth it over Skylake?

    On the desktop side, you'll probably see the standard 10%-ish improvement in performance. Most of the innovation right now is still in power efficiency on mobile parts. If someone has to buy right now, I have no problem saying buy Skylake. If it can wait... eh, then maybe wait and see.

    It really is sad, and I do harp on this a bunch in the thread, but I'm more excited about the chipsets than the processors now. I really want to see more/faster USB 3.1 adoption in the chipset along with more USB-C support, but we're probably at least a year or so away from things like that.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Zxerol · for the smaller pieces, my shovel wouldn't do so i took off my boot and used my shoe · Registered User regular
    Sounds like it's going to be more of the same. Some more chipset features (notably USB 3.1 Gen 2), faster doodads, the usual.

    Cannonlake, though, is going to be on 10nm, so that should be more interesting.

  • bowen · How you doin'? · Registered User regular
    Yeah I'm going to wait for it to drop I think.

    The 1TB Samsung Pro M.2 PCI-E is dropping in October.

    Probably a $700+ hard drive.

    Happy fucking birthday to me.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • That_Guy · I don't wanna be that guy · Registered User regular
    I think I'll be able to sit on my 2600k and 980 until Cannonlake and Volta. There's some exciting stuff coming with those chips.

  • Cormac · Registered User regular
    I debated waiting a few months on Kaby Lake but nothing stood out enough for my uses to warrant waiting. Optane sounds good on paper, but much like NVMe SSDs, it's not going to be noticeable for gaming and day-to-day use (at least for me). Once prices on NVMe M.2 SSDs drop to $200-ish for 500GB I will consider upgrading from my 850 EVO. I also don't have much use for USB 3.1 Gen 2 since my only USB 3 devices are a flash drive and a hard drive dock.

    Steam: Gridlynk | PSN: Gridlynk | FFXIV: Jarvellis Mika
  • That_Guy · I don't wanna be that guy · Registered User regular
    edited September 2016
    2018's going to be a landmark year for personal computing. I am getting excited just thinking about it. We're going to see the culmination of a decade of processor development and component interconnection. TDP is about all we can improve from there without a radical shift in the materials we're using for transistors. This isn't the end for silicon, but desktop computers won't be getting any faster for a while.

    That_Guy on
  • Iolo · Registered User regular
    Still obsessing about monitors over here...

    So, IPS vs. TN. Sounds like IPS gets you better colors and viewing angles generally, but TN is cheaper and has lower response times on the whole. There seems to be a lot of IPS love, but viewing angles aren't a concern for me (it's my desktop monitor; I sit in front of it), and something like the Dell S2716DG checks all the other boxes aside from IPS: 27", 144Hz and G-Sync (and a nice understated design that doesn't look like a pterodactyl. Looking at you, Acer Predator!). And the Dell is $200-ish cheaper than something like the ASUS ROG Swift PG279Q, which is all that and also IPS.

    Does anyone happen to have experience with the Dell S2716DG? I thought maybe I'd go look at one at Best Buy, but they all seem to be out of stock until mid-October around here.

    Lt. Iolo's First Day
    Steam profile.
    Getting started with BATTLETECH: Part 1 / Part 2
  • Foomy · Registered User regular
    This video gives a good idea of how an IPS panel looks compared to a TN, so you can judge whether the pinker whites and greyish blacks bother you.

    https://youtu.be/79r5rxS276Y

    The other thing is that the "response time" listed by companies is a mostly useless spec; it's just the time the monitor takes to change from one grey to a different grey, or from black to white. And companies will cherry-pick whatever test gives them the lowest number. Once you get below about 8ms, any ghosting effects are unnoticeable.

    What really matters is input lag, the time from a mouse/keyboard input to it showing on the monitor, and that spec is never listed. You need to go hunt it down somewhere like http://www.displaylag.com/

    Steam Profile: FoomyFooms
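The ~8 ms rule of thumb falls out of simple frame-time arithmetic: if pixels settle about as fast as the monitor draws frames, ghosting has nowhere to accumulate. A quick check at common refresh rates:

```python
# Frame time (ms) at common refresh rates: the window a pixel has to settle.
for hz in (60, 120, 144):
    frame_ms = 1000 / hz
    print(f"{hz} Hz -> {frame_ms:.2f} ms per frame")
```

At 144 Hz a frame lasts about 6.94 ms, so panel response in the single-digit milliseconds is already on the order of one frame; input lag, as noted above, is the spec that actually varies between monitors.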
  • Dhalphir · don't you open that trapdoor you're a fool if you dare · Registered User regular
    For what it's worth, I picked up the ROG Swift PG278Q and adore it.

  • Dhalphir · don't you open that trapdoor you're a fool if you dare · Registered User regular
    edited October 2016
    Don't buy the PG27*9*Q, though.

    Dhalphir on
  • LD50 · Registered User regular
    That_Guy wrote: »
    2018's going to be a landmark year for personal computing. I am getting excited just thinking about it. We're going to see the culmination of a decade of processor development and component interconnection. TDP is about all we can improve from there without a radical shift in the materials we're using for transistors. This isn't the end for silicon, but desktop computers won't be getting any faster for a while.

    I doubt that. People have been saying the same thing for years and nobody's been right yet.

  • That_Guy · I don't wanna be that guy · Registered User regular
    LD50 wrote: »
    That_Guy wrote: »
    2018's going to be a landmark year for personal computing. I am getting excited just thinking about it. We're going to see the culmination of a decade of processor development and component interconnection. TDP is about all we can improve from there without a radical shift in the materials we're using for transistors. This isn't the end for silicon, but desktop computers won't be getting any faster for a while.

    I doubt that. People have been saying the same thing for years and nobody's been right yet.

    There's a physical limit to the size at which you can manufacture silicon-based transistors before quantum tunneling starts becoming a problem. 10nm is that limit. You can push it to 7nm with some more exotic materials, but such materials are not currently ready for mass production. I fully expect us to push the limits of what's possible per watt, but transistors can't be packed much more densely.

  • Iolo · Registered User regular
    edited October 2016
    Dhalphir wrote: »
    For what it's worth, I picked up the ROG Swift PG278Q and adore it.

    So buy the 27", 144Hz, 2560x1440 1ms G-Sync TN screen, but...
    Dhalphir wrote: »
    Don't buy the PG27*9*Q, though.

    ...don't buy the 27", 144Hz, 2560x1440 1ms G-Sync IPS screen?

    I mean, that's heartening, because it's $130 cheaper. But why?

    EDIT: What's an extra '4' here or there?

    Iolo on
    Lt. Iolo's First Day
    Steam profile.
    Getting started with BATTLETECH: Part 1 / Part 2
This discussion has been closed.