[PC Build Thread] Keep your human antivirus up to date

Posts

  • jdarksunjdarksun Struggler VARegistered User regular
    One thing's for sure from all these videos: they make me feel like a sucker for buying a 2080. :lol:

  • Trajan45Trajan45 Registered User regular
    ED! wrote: »
    Fucking hilarious (I know it's at Ultra but come on MS).

    My understanding is that Flight Simulator has bottlenecks all over the place, including the CPU as a big one. People use it because it's a modern game, but it's probably not a great GPU benchmark.

    Origin ID\ Steam ID: Warder45
  • NamrokNamrok Registered User regular
    jdarksun wrote: »
    TL;DR is the performance gain over the 2080 is more like 50% at 1440p... which is still way better than 1080 -> 2080, but the "80%" performance boost advertised really only shows up at 4K.

    Digital Foundry's review was interesting. They put a lot of the responsibility for how much of a boost different games got out of the 3080 on how well the game engine supports multiple cores, with well-threaded DX12 or Vulkan games seeing the most improvement. And I saw on Linus Tech Tips that when they load up raw productivity tasks, the 2x CUDA core count pretty much directly translates into a 2x gain.

    I'm getting the picture that the 3080 is such a performance monster that it's running into issues with the CPU bottlenecking it, or with games that are sloppy about efficiently loading data onto the GPU and minimizing draw calls.

    I'm also getting the picture that this is really a 4K card. I have kind of a crazy 3840x1080@144Hz monitor, a 2K monitor if you will; it has a pixel count in the ballpark of 1440p. And outside of ray tracing, it's sounding like the RTX 3080 would be wasted on such a monitor.
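
    A quick pixel-count sanity check, in plain Python (just the two resolutions named above, nothing else assumed):

        # 3840x1080 ultrawide vs. standard 2560x1440 (1440p)
        ultrawide = 3840 * 1080  # 4,147,200 pixels
        qhd = 2560 * 1440        # 3,686,400 pixels
        print(ultrawide / qhd)   # 1.125, i.e. 12.5% more pixels than 1440p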

  • jdarksunjdarksun Struggler VARegistered User regular
    Namrok wrote: »
    jdarksun wrote: »
    TL;DR is the performance gain over the 2080 is more like 50% at 1440p... which is still way better than 1080 -> 2080, but the "80%" performance boost advertised really only shows up at 4K.

    Digital Foundry's review was interesting. They put a lot of the responsibility for how much of a boost different games got out of the 3080 on how well the game engine supports multiple cores, with well-threaded DX12 or Vulkan games seeing the most improvement. And I saw on Linus Tech Tips that when they load up raw productivity tasks, the 2x CUDA core count pretty much directly translates into a 2x gain.

    I'm getting the picture that the 3080 is such a performance monster that it's running into issues with the CPU bottlenecking it, or with games that are sloppy about efficiently loading data onto the GPU and minimizing draw calls.

    I'm also getting the picture that this is really a 4K card. I have kind of a crazy 3840x1080@144Hz monitor, a 2K monitor if you will; it has a pixel count in the ballpark of 1440p. And outside of ray tracing, it's sounding like the RTX 3080 would be wasted on such a monitor.
    Define "wasted," right? The 3080 is pushing the same performance at 1440p as the 2080 Ti was pushing at 1080p, for ... $400-500 less?

    55% faster (average) than the 1080 Ti @ 1440p
    47% faster (average) than the 2080 @ 1440p
    21% faster (average) than the 2080 Ti @ 1440p

    74% faster (average) than the 1080 Ti @ 4K
    68% faster (average) than the 2080 @ 4K
    31% faster (average) than the 2080 Ti @ 4K

    The 3080 is absolutely wasted at 1080p (JayZTwoCents goes so far as to say "pick up the 3080 and spend what you would have on the 3090 on a 1440p panel"), but for 1440p - especially with refresh rates > 60Hz - it sure seems like some pretty impressive performance gains.
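
    To make those averages concrete, a tiny sketch; the baseline frame rates here are hypothetical, purely for illustration, while the uplift percentages are the 1440p averages quoted above:

        # Convert "X% faster (average)" into rough FPS numbers.
        baselines_1440p = {"1080 Ti": 80, "2080": 90, "2080 Ti": 110}    # hypothetical FPS
        uplift_1440p = {"1080 Ti": 0.55, "2080": 0.47, "2080 Ti": 0.21}  # averages above

        for card, fps in baselines_1440p.items():
            est = fps * (1 + uplift_1440p[card])
            print(f"{card}: {fps} fps -> 3080: ~{est:.0f} fps")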

  • DixonDixon Screwed...possibly doomed CanadaRegistered User regular
    Yeah, this was my feeling as well. I noticed some reviewers doing benchmarks with an older CPU; I believe PC Gamer used an Intel 7700K. It definitely had an impact on the FPS compared to other reviewers using top-tier chips.

    Games like Control, where the DLSS and RTX implementations seem to be top notch, hit that 2x level at 4K.

    All in all I am very happy, and will be trying to grab one tomorrow at 9am lol.

    I want a FE but we'll see what happens.

  • ED!ED! Registered User regular
    I'm hoping to get 4K gaming out of this card, but the real appeal is 1440p (or 3K) gaming at the highest graphical settings, which it seems the 3080 is more than capable of. Seeing RDR2 at 70 FPS at Ultra 4K is definitely nice, but it's cranking the game's other bells and whistles up high that has me excited to jump back in.

    "Get the hell out of me" - [ex]girlfriend
  • NamrokNamrok Registered User regular
    edited September 2020
    jdarksun wrote: »
    Namrok wrote: »
    jdarksun wrote: »
    TL;DR is the performance gain over the 2080 is more like 50% at 1440p... which is still way better than 1080 -> 2080, but the "80%" performance boost advertised really only shows up at 4K.

    Digital Foundry's review was interesting. They put a lot of the responsibility for how much of a boost different games got out of the 3080 on how well the game engine supports multiple cores, with well-threaded DX12 or Vulkan games seeing the most improvement. And I saw on Linus Tech Tips that when they load up raw productivity tasks, the 2x CUDA core count pretty much directly translates into a 2x gain.

    I'm getting the picture that the 3080 is such a performance monster that it's running into issues with the CPU bottlenecking it, or with games that are sloppy about efficiently loading data onto the GPU and minimizing draw calls.

    I'm also getting the picture that this is really a 4K card. I have kind of a crazy 3840x1080@144Hz monitor, a 2K monitor if you will; it has a pixel count in the ballpark of 1440p. And outside of ray tracing, it's sounding like the RTX 3080 would be wasted on such a monitor.
    Define "wasted," right? The 3080 is pushing the same performance at 1440p as the 2080 Ti was pushing at 1080p, for ... $400-500 less?

    55% faster (average) than the 1080 Ti @ 1440p
    47% faster (average) than the 2080 @ 1440p
    21% faster (average) than the 2080 Ti @ 1440p

    74% faster (average) than the 1080 Ti @ 4K
    68% faster (average) than the 2080 @ 4K
    31% faster (average) than the 2080 Ti @ 4K

    The 3080 is absolutely wasted at 1080p (JayZTwoCents goes so far as to say "pick up the 3080 and spend what you would have on the 3090 on a 1440p panel"), but for 1440p - especially with refresh rates > 60Hz - it sure seems like some pretty impressive performance gains.

    I guess to be more specific, it's not really worth it for me, except for RTX-enabled games. Everything I play regularly already runs at 100+ fps (Control w/ RTX, Quake 2 RTX, MechWarrior 5 w/ RTX, and Minecraft RTX excepted). Perhaps when the next killer game with RTX comes out, I'll give the 3080 another thought.

    Namrok on
  • SoggybiscuitSoggybiscuit Tandem Electrostatic Accelerator Registered User regular
    One thing that would be an absolute RTX 3000-series sell for me: SR-IOV support. It's apparently built into all of these cards but not enabled in the drivers.



    I've basically become a Linux user by default; I only have Windows for games and Office. If I could just use Linux and do passthrough to a Windows VM, that would be a killer app for me. Most of the software I use daily runs best on Linux, and while WSL is good, I would rather run things natively.
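
    For the curious, the usual first step toward that setup today (no SR-IOV needed) is checking your IOMMU groups before binding the GPU to vfio-pci. A minimal sketch, assuming a Linux host with the IOMMU enabled and lspci installed:

        #!/usr/bin/env python3
        # List IOMMU groups and their devices; clean isolation of the GPU's
        # group is what makes passthrough to a Windows VM practical.
        # Assumes intel_iommu=on or amd_iommu=on on the kernel command line.
        from pathlib import Path
        import subprocess

        groups = Path("/sys/kernel/iommu_groups")
        if not groups.exists():
            raise SystemExit("No IOMMU groups; is the IOMMU enabled in BIOS/kernel?")

        for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
            for dev in sorted((group / "devices").iterdir()):
                # lspci -nns <addr> prints the device name plus [vendor:device] IDs
                desc = subprocess.run(["lspci", "-nns", dev.name],
                                      capture_output=True, text=True).stdout.strip()
                print(f"group {group.name}: {desc}")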

    Steam - Synthetic Violence | XBOX Live - Cannonfuse | PSN - CastleBravo | Twitch - SoggybiscuitPA
  • LD50LD50 Registered User regular
    ED! wrote: »
    Fucking hilarious (I know it's at Ultra but come on MS).

    Does it have a 50 fps cap?

  • SpoitSpoit *twitch twitch* Registered User regular
    Trajan45 wrote: »
    ED! wrote: »
    Fucking hilarious (I know it's at Ultra but come on MS).

    My understanding is that Flight Simulator has bottlenecks all over the place, including the CPU as a big one. People use it because it's a modern game, but it's probably not a great GPU benchmark.

    It's not even a good CPU benchmark, since it's not taking advantage of additional cores very well.

  • GnomeTankGnomeTank What the what? Portland, OregonRegistered User regular
    So Nvidia's "2x the 2080" claim was a biiitttt overhyped (though it can be seen in very specific scenarios, especially productivity ones), but putting that aside, the 3080 really does seem to be the monster they claimed it was. The generational uplift over the original 2080 is massive, even at its lowest points. Its advantage over the 2080 Ti, at half the cost, is equally impressive.

    I'm pretty much sold on trying to get a 3090 next week, given that I expect it to scale pretty much in line with its specs.

    Good luck to everyone trying to get a 3080 tomorrow. I really hope as many of you as want one, get one, though I expect the rush to be absurd.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    So I posted a little PSA on Reddit about Nvidia not having pre-orders and how people should not be buying them on eBay, and while I figured I might get a few upvotes while doing a public good (people hate scalpers), I didn't realize it would explode like this:

    [screenshot of the Reddit post's stats]

  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    GnomeTank wrote: »
    So Nvidia's "2x the 2080" claim was a biiitttt overhyped (though it can be seen in very specific scenarios, especially productivity ones), but putting that aside, the 3080 really does seem to be the monster they claimed it was. The generational uplift over the original 2080 is massive, even at its lowest points. Its advantage over the 2080 Ti, at half the cost, is equally impressive.

    I'm pretty much sold on trying to get a 3090 next week, given that I expect it to scale pretty much in line with its specs.

    Good luck to everyone trying to get a 3080 tomorrow. I really hope as many of you as want one, get one, though I expect the rush to be absurd.

    I'm gonna say, it being a generational leap above the next tier was genuinely something I was not expecting. I figured it would be good, and I figured the RTX gains would be massive, but I thought the pure rasterization wouldn't be a 20-40% leap over the 2080 Ti like we've been seeing.

  • BouwsTBouwsT Wanna come to a super soft birthday party? Registered User regular
    GnomeTank wrote: »
    ...
    Good luck to everyone trying to get a 3080 tomorrow. I really hope as many of you as want one, get one, though I expect the rush to be absurd.

    [reaction gif]

    Between you and me, Peggy, I smoked this Juul and it did UNTHINKABLE things to my mind and body...
  • ThawmusThawmus +Jackface Registered User regular
    Soggybiscuit wrote: »
    One thing that would be an absolute RTX 3000-series sell for me: SR-IOV support. It's apparently built into all of these cards but not enabled in the drivers.



    I've basically become a Linux user by default; I only have Windows for games and Office. If I could just use Linux and do passthrough to a Windows VM, that would be a killer app for me. Most of the software I use daily runs best on Linux, and while WSL is good, I would rather run things natively.

    Yeah, just gonna chime in here and say that the positive gains made for Linux gaming over the past few years have been shit on at a steady pace lately by developers turning on EAC*. Fall Guys was the latest in a string of games that people bought to play on Linux because, hey, Proton works great, only to be met this week with error messages and having to refund the game because EAC got turned on.

    This is all to say that while I've been gaming on Linux for years now, and love it to death, the idea of getting SR-IOV for GPU passthrough to a Windows VM as insurance against this shit would be amazing.



    *To be clear, there is a Linux version of EAC, but more and more developers have abandoned Linux client development because Proton is such an easy out. I can't blame them; Proton is amazing and keeps getting better, and now you can just skip that stage of your development cycle (though many developers actively support their Linux users when there are issues with Proton). But the Windows version of EAC does not work with Proton at all and makes the game suddenly unplayable. It's been happening a lot lately.

    Twitch: Thawmus83
  • A duck!A duck! Moderator, ClubPA mod
    Getting my thoughts together on the reviews (I need them AIB reviews), but I kind of appreciate Hardware Canucks trying to consider what a 1080 Ti owner's situation might be and evaluating it from there.

  • SpoitSpoit *twitch twitch* Registered User regular
    The 3080 benches look good; I'm quite curious to see how the 3090 stacks up. I'm trying hard not to give in to the hype and to wait until we get benches on the Radeon 6000 cards.

    Nvidia's software suite is at the point where Big Navi would have to either be $200 less or a generational leap in raster performance for me to even consider it. And even then I'd have to think hard about it, since Nvidia is pretty much the only game in town for ML.
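
    On the ML point, the lock-in is mostly CUDA, which the mainstream frameworks treat as the default GPU backend. A minimal check, assuming a CUDA-enabled PyTorch build:

        import torch  # assumes a PyTorch build with CUDA support

        # True only when the NVIDIA driver and CUDA runtime are both working.
        print(torch.cuda.is_available())
        if torch.cuda.is_available():
            print(torch.cuda.get_device_name(0))  # e.g. "GeForce RTX 3080"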

  • LD50LD50 Registered User regular
    I hope we see AIB reviews of the 3080 before the 3090 FE goes on sale.

  • ThawmusThawmus +Jackface Registered User regular
    edited September 2020
    jungleroomx wrote: »
    So I posted a little PSA on Reddit about Nvidia not having pre-orders and how people should not be buying them on eBay, and while I figured I might get a few upvotes while doing a public good (people hate scalpers), I didn't realize it would explode like this:

    [screenshot of the Reddit post's stats]

    Favorite response in the thread (Yes, I searched for it and found it and it's awesome):

    "TRYING TO KEEP EM ALL TO YOURSELF EHHH? I SEE YOUR TRICKS!"


    EDIT: Also saw more than a few comments asking for the thread to be pinned, and I saw at least one person genuinely thankful for the PSA because they thought the preorders were real and were legitimately about to make a mistake.

    So uh, here's to jungleroomx!

    Thawmus on
    Twitch: Thawmus83
  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Thawmus wrote: »
    jungleroomx wrote: »
    So I posted a little PSA on Reddit about Nvidia not having pre-orders and how people should not be buying them on eBay, and while I figured I might get a few upvotes while doing a public good (people hate scalpers), I didn't realize it would explode like this:

    [screenshot of the Reddit post's stats]

    Favorite response in the thread (Yes, I searched for it and found it and it's awesome):

    "TRYING TO KEEP EM ALL TO YOURSELF EHHH? I SEE YOUR TRICKS!"


    EDIT: Also saw more than a few comments asking for the thread to be pinned, and I saw at least one person genuinely thankful for the PSA because they thought the preorders were real and were legitimately about to make a mistake.

    So uh, here's to jungleroomx!

    I've seen a few people who thought there were legit preorders because it's a very common thing. I felt it was a point that wasn't being explained enough.

    I absolutely hate scalpers and how they take advantage of people.

  • 3cl1ps33cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    Thawmus wrote: »
    jungleroomx wrote: »
    So I posted a little PSA on Reddit about Nvidia not having pre-orders and how people should not be buying them on eBay, and while I figured I might get a few upvotes while doing a public good (people hate scalpers), I didn't realize it would explode like this:

    [screenshot of the Reddit post's stats]

    Favorite response in the thread (Yes, I searched for it and found it and it's awesome):

    "TRYING TO KEEP EM ALL TO YOURSELF EHHH? I SEE YOUR TRICKS!"


    EDIT: Also saw more than a few comments asking for the thread to be pinned, and I saw at least one person genuinely thankful for the PSA because they thought the preorders were real and were legitimately about to make a mistake.

    So uh, here's to jungleroomx!

    Speaking as a mod of multiple subreddits, pins don't do a damn thing because no one reads them anyway :D

  • CormacCormac Registered User regular
    LD50 wrote: »
    I hope we see AIB reviews of the 3080 before the 3090 FE goes on sale.

    Indeed. I kind of already convinced myself I want an FE card because I like how it looks. Like JayzTwoCents, I'm also debating not watercooling it because of how good its unique appearance is. However, if the Watercool Heatkiller blocks look as good as they usually do, then I will put a block on it. That leads to me being very curious to see Steve's teardown of the FE cooler.

    Then again, if stock of the FE is so low that it's out of stock for months while AIB cards are more available, then that decision becomes harder.

    Can AMD announce a launch date for Ryzen 3, and can motherboard partners announce whether or not there's going to be a new chipset, so I can stop holding myself back from ordering the rest of the parts? October seems so far away now, and that's just the announcement. The actual release of the CPUs and sufficient stock could be weeks from then.

    Steam: Gridlynk | PSN: Gridlynk | FFXIV: Jarvellis Mika
  • GnomeTankGnomeTank What the what? Portland, OregonRegistered User regular
    Ryzen 3 news is coming 10/8. You can be sure there will be new chipsets, but it will work with X570...so the question is what X670 brings to the table, if anything.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • 3cl1ps33cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    GnomeTank wrote: »
    Ryzen 3 news is coming 10/8. You can be sure there will be new chipsets, but it will work with X570...so the question is what X670 brings to the table, if anything.

    Another improvement in VRMs, maybe?

  • GnomeTankGnomeTank What the what? Portland, OregonRegistered User regular
    But...why? The VRMs on most X570 boards are already overkill for even the 3950X. Unless Ryzen 3's clockspeed vs. power scaling is horrendous, I can't imagine people needing much better VRM components to run even the highest-end chips.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    GnomeTank wrote: »
    But...why? The VRMs on most X570 boards are already overkill for even the 3950X. Unless Ryzen 3's clockspeed vs. power scaling is horrendous, I can't imagine people needing much better VRM components to run even the highest-end chips.

    Maybe the same power level but without active cooling required

  • GnomeTankGnomeTank What the what? Portland, OregonRegistered User regular
    edited September 2020
    jungleroomx wrote: »
    GnomeTank wrote: »
    But...why? The VRMs on most X570 boards are already overkill for even the 3950X. Unless Ryzen 3's clockspeed vs. power scaling is horrendous, I can't imagine people needing much better VRM components to run even the highest-end chips.

    Maybe the same power level but without active cooling required

    That wouldn't make sense, though. Most X570 boards don't have active VRM cooling. Maybe you mean the active chipset cooling? In most cases it barely ever activates, and it's there because of how much warmer PCIe 4.0 can make the chip, not really anything VRM-related. Though I suppose that could be the big feature of a possible X670. If they make the chipset on a smaller node it might run cooler *shrugs*.

    All very speculative at this point.

    GnomeTank on
    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Namrok wrote: »
    jdarksun wrote: »
    Namrok wrote: »
    jdarksun wrote: »
    TL;DR is the performance gain over the 2080 is more like 50% at 1440p... which is still way better than 1080 -> 2080, but the "80%" performance boost advertised really only shows up at 4K.

    Digital Foundry's review was interesting. They put a lot of the responsibility for how much of a boost different games got out of the 3080 on how well the game engine supports multiple cores, with well-threaded DX12 or Vulkan games seeing the most improvement. And I saw on Linus Tech Tips that when they load up raw productivity tasks, the 2x CUDA core count pretty much directly translates into a 2x gain.

    I'm getting the picture that the 3080 is such a performance monster that it's running into issues with the CPU bottlenecking it, or with games that are sloppy about efficiently loading data onto the GPU and minimizing draw calls.

    I'm also getting the picture that this is really a 4K card. I have kind of a crazy 3840x1080@144Hz monitor, a 2K monitor if you will; it has a pixel count in the ballpark of 1440p. And outside of ray tracing, it's sounding like the RTX 3080 would be wasted on such a monitor.
    Define "wasted," right? The 3080 is pushing the same performance at 1440p as the 2080 Ti was pushing at 1080p, for ... $400-500 less?

    55% faster (average) than the 1080 Ti @ 1440p
    47% faster (average) than the 2080 @ 1440p
    21% faster (average) than the 2080 Ti @ 1440p

    74% faster (average) than the 1080 Ti @ 4K
    68% faster (average) than the 2080 @ 4K
    31% faster (average) than the 2080 Ti @ 4K

    The 3080 is absolutely wasted at 1080p (JayZTwoCents goes so far as to say "pick up the 3080 and spend what you would have on the 3090 on a 1440p panel"), but for 1440p - especially with refresh rates > 60Hz - it sure seems like some pretty impressive performance gains.

    I guess to be more specific, it's not really worth it for me, except for RTX-enabled games. Everything I play regularly already runs at 100+ fps (Control w/ RTX, Quake 2 RTX, MechWarrior 5 w/ RTX, and Minecraft RTX excepted). Perhaps when the next killer game with RTX comes out, I'll give the 3080 another thought.

    Cyberpunk, Witcher 3, and Doom Eternal are supposedly in the works.

  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Demon's Souls Remake is a timed exclusive and will be coming to PC.

  • DhalphirDhalphir don't you open that trapdoor you're a fool if you dareRegistered User regular
    A duck! wrote: »
    Getting my thoughts together on the reviews (I need them AIB reviews), but I kind of appreciate Hardware Canucks trying to consider what a 1080ti owner's situation might be and evaluating it from there.

    I watched all of their video and it was really good, especially the 7700K vs. 10900K comparison. I'm on an 8700K, so it's pretty comparable. The conclusion seems to be: don't feel like you need to touch your CPU, you'll be fine.

  • Stabbity StyleStabbity Style He/Him | Warning: Mothership Reporting Kennewick, WARegistered User regular
    jungleroomx wrote: »
    Demon's Souls Remake is a timed exclusive and will be coming to PC.

    It's not, they just came out and confirmed it's not.



    Anyway, another thing to keep in mind for people thinking it's "wasted" on a 1440p monitor is that you're probably gonna keep that card longer than a generation and, as much as future proofing isn't really a thing, you'll get a longer lifespan out of it performing at a high level than you would out of a 3070. I'm expecting it to keep me til the 6000 series at least, maybe longer.

  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Stabbity Style wrote: »
    jungleroomx wrote: »
    Demon's Souls Remake is a timed exclusive and will be coming to PC.

    It's not, they just came out and confirmed it's not.



    Anyway, another thing to keep in mind for people thinking it's "wasted" on a 1440p monitor is that you're probably gonna keep that card longer than a generation and, as much as future proofing isn't really a thing, you'll get a longer lifespan out of it performing at a high level than you would out of a 3070. I'm expecting it to keep me til the 6000 series at least, maybe longer.

    It's definitely not being wasted on a 1440p monitor. You'll deffo get that 144Hz locked with it.





    It may be wasted on a 1080p screen tho.

  • GnomeTankGnomeTank What the what? Portland, OregonRegistered User regular
    edited September 2020
    Just got an email from EVGA saying they're doing a 10 AM PT livestream tomorrow to introduce their 3080 cards. If I had to guess, near that time they will go live for ordering, if you want to try to get an AIB card.

    GnomeTank on
    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • ED!ED! Registered User regular
    GnomeTank wrote: »
    Just got an email from EVGA saying they're doing a 10 AM PT livestream tomorrow to introduce their 3080 cards. If I had to guess, near that time they will go live for ordering, if you want to try to get an AIB card.

    Bonerjam. So late after everyone else goes live. EVGA is who I wanted to go with. Maybe this will just be a "take a look at our cards in action" kind of reveal. Although if my APMs while trying to snag a digital PS5 are any indication, I'm not getting a card tomorrow anyway.

    "Get the hell out of me" - [ex]girlfriend
  • XeddicusXeddicus Registered User regular
    Just a nitpick, but just about ALL video cards are "AIB".

    And it's not ironic, either! :P

  • LD50LD50 Registered User regular
    What I need to know about the AIB cards is what the power regulation/delivery looks like compared to the FE cards. The 3080 reviews pretty much all say the card settles at ~75°C under max load, which is fine, but GN had issues getting it to go faster, not because of thermals but because of power. I think I'll try to go with an FE 3090 if they have similar power ceilings.
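
    For anyone who wants to watch that power ceiling themselves once a card lands, a rough sketch that polls nvidia-smi during a load test (assumes nvidia-smi is on PATH; it ships with the NVIDIA driver):

        import subprocess, time

        # Poll board power draw, the power limit, and GPU temperature once a
        # second while a benchmark runs in another window.
        for _ in range(60):
            out = subprocess.run(
                ["nvidia-smi",
                 "--query-gpu=power.draw,power.limit,temperature.gpu",
                 "--format=csv,noheader"],
                capture_output=True, text=True,
            ).stdout.strip()
            print(out)  # e.g. "320.45 W, 320.00 W, 74"
            time.sleep(1)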

  • DixonDixon Screwed...possibly doomed CanadaRegistered User regular
    11 hours to go

  • OrcaOrca Also known as Espressosaurus WrexRegistered User regular
    Xeddicus wrote: »
    Just a nitpick, but ALL video cards are "AIB" just about.

    And it's not ironic, either! :P

    Yeah, I still don't understand why people use that terminology to distinguish the reference design from the customized versions. As terminology goes, it just seems silly!

  • Pixelated PixiePixelated Pixie They/Them Registered User regular
    [image]

    ~~ Pixie on Steam ~~
    ironzerg wrote: »
    Chipmunks are like nature's nipple clamps, I guess?
  • Trajan45Trajan45 Registered User regular
    Feels so weird not having pre-orders. Like, are these concert tickets? It almost feels malicious, forcing folks to F5 on sites that'll probably be crashing, just to get a $600 item.

    Origin ID\ Steam ID: Warder45
This discussion has been closed.