
[PC Build Thread] Nope, you still can't buy anything


Posts

  • Namrok Registered User regular
    Namrok wrote: »
    Namrok wrote: »
    Namrok wrote: »
    Epic deprecated Ray Tracing so they could sell their Lumen system in UE5.

    Jesus Christ I hate this company

    Even among scummy tech companies they feel particularly scum-loaded

    Is that actually true? I immediately googled it, and only found
    Lumen implements efficient Software Ray Tracing, allowing Global Illumination and Reflections to run on a wide range of video cards, while supporting Hardware Ray Tracing for high end visuals.

    It sounds like they didn't deprecate raytracing, but included a software fallback for it?

    Edit: Furthermore

    https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Lumen/TechOverview/
    Lumen uses Software Ray Tracing through Signed Distance Fields by default, but can achieve higher quality on supporting video cards when Hardware Ray Tracing is enabled.
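For reference, the "Software Ray Tracing through Signed Distance Fields" the docs describe is, at its core, sphere tracing: step a ray forward by exactly the distance the SDF reports, because no surface can be closer than that. A minimal sketch of the idea (illustrative only, not Epic's implementation; the single-sphere scene and constants are made up):

```python
import math

def sdf_sphere(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to a sphere: negative inside, positive outside."""
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def sphere_trace(origin, direction, sdf, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    """March along the ray by the SDF value each step; that step is always safe."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sdf(p)
        if d < hit_eps:      # close enough to a surface: count it as a hit
            return t
        t += d               # no surface is nearer than d, so advance by d
        if t > max_dist:
            break
    return None              # ray escaped the scene

# A ray along +z hits the unit sphere centered at z=5 at t = 4.0
print(sphere_trace((0, 0, 0), (0, 0, 1), sdf_sphere))
```

This is why it runs on cards without RT hardware: it's just arithmetic per step, traded against the precision of true triangle intersection.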

    It’s been deprecated and none of the hardware in RT cards gets used unless it’s specifically enabled. They’re really pushing their own custom solution.

    So just like their store they half ass their engine now that they’ve got Fortnite money.

    I'm skeptical you can read the documentation I linked, understand it, and then write what you wrote.

    Every engine supports software lighting unless you specifically enable raytracing, if it supports raytracing at all. It's baffling to me that you can see Epic bringing some of the benefits of RT to a subset of geometry in a game scene, while still preserving the really good stuff for hardware raytracing, and your conclusion is that Epic is somehow taking something away from you, instead of giving you more.

    Are you angry they keep trying to give you free games too? Like they are specifically stealing them from your Steam library?

    Meanwhile, in the Unreal Engine 5 interface

    e3pvk3o88u4b.png

    And I really think you need to look up what deprecation specifically means: officially discouraging a feature's use ahead of its eventual removal. That's what deprecation is. That is its use.

    They aren't deprecating the functionality. They deprecated the namespace. This is normal and typical. The Lumen system encompasses hardware RT, with software fallbacks for the viable subset of its functionality.
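For what it's worth, deprecating a namespace while keeping the functionality is a routine software pattern. A hypothetical Python sketch (all names here are invented for illustration, not Epic's actual API): the old entry point still works, it just warns and delegates to the new one.

```python
import warnings

def lumen_reflections(scene, use_hardware=True):
    """New, preferred entry point (hypothetical name)."""
    backend = "hardware RT" if use_hardware else "software SDF tracing"
    return f"rendered {scene} with {backend}"

def legacy_raytraced_reflections(scene):
    """Old entry point: deprecated, but still fully functional via the new path."""
    warnings.warn(
        "legacy_raytraced_reflections is deprecated; use lumen_reflections",
        DeprecationWarning,
        stacklevel=2,
    )
    return lumen_reflections(scene, use_hardware=True)

# The old call still produces the same result; it just warns.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = legacy_raytraced_reflections("castle")

print(out)                          # rendered castle with hardware RT
print(caught[0].category.__name__)  # DeprecationWarning
```

Nothing is removed from the caller's point of view; the warning is the whole mechanism.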

    I ask again

    If that’s the case, why is Nvidia “working” with Epic?

    And what you’re basically saying is their system man-in-the-middles RT sessions. Which makes things worse, pretty much always. And why did it get done? Why did they insert a layer of their proprietary software that, according to you, functionally does nothing?

    Can’t say for sure, but the fact they can’t patent ray tracing seems high on the list.

    This has a similar stink on it as needing to buy commercial Nvidia cards for Linux.

    Where are you even getting this from? "Man in the middle"? It's an off-the-shelf game engine. The whole thing is a man in the middle. Always has been, for all engines. And who knows if it's implemented as another "layer", or if this is just a minor tweak to the workflow to streamline creating raster and RT content with less duplicated work. And Nvidia has been working with everyone on RTX. What do you think Quake 2 RTX and Minecraft RTX are?

    You act like Epic has jacked up Nvidia's very drivers.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Namrok wrote: »
    Where are you even getting this from? "Man in the middle"? It's an off-the-shelf game engine. The whole thing is a man in the middle. Always has been, for all engines. And who knows if it's implemented as another "layer", or if this is just a minor tweak to the workflow to streamline creating raster and RT content with less duplicated work. And Nvidia has been working with everyone on RTX. What do you think Quake 2 RTX and Minecraft RTX are?

    You act like Epic has jacked up Nvidia's very drivers.

    I’m acting like Epic is acting like a multi-billion-dollar company. And that is very likely not good. Like managing to give away millions in software for a store that doesn’t even have a cart.

    And if path tracing worked before, why does Lumen exist at all? They said it wasn’t performant enough and wanted people to use their Lumen stuff instead, which is why it’s getting deprecated and why RT has been effectively pushed into the background. It can be used, but it’s not the default.

    I’ll be very interested to see how it runs in Unreal compared to engines not futzing around with it, like CryEngine, RE Engine, and id Tech.

  • Elaro Apologetic Registered User regular
    jungleroomx wrote: »
    I’m acting like Epic is acting like a multi-billion-dollar company. And that is very likely not good. Like managing to give away millions in software for a store that doesn’t even have a cart.

    And if path tracing worked before, why does Lumen exist at all? They said it wasn’t performant enough and wanted people to use their Lumen stuff instead, which is why it’s getting deprecated and why RT has been effectively pushed into the background. It can be used, but it’s not the default.

    I’ll be very interested to see how it runs in Unreal compared to engines not futzing around with it, like CryEngine, RE Engine, and id Tech.

    Unrustle your jimmies, my friend. Nobody's taking away your raytracing toys.

    They're just putting them in a box with the software raytracing toys, a big box labeled "Lumen". That's it.

    Stop stirring up shit in a desert.

    Children's rights are human rights.
  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited June 2021
    Elaro wrote: »
    Unrustle your jimmies, my friend. Nobody's taking away your raytracing toys.

    They're just putting them in a box with the software raytracing toys, a big box labeled "Lumen". That's it.

    Stop stirring up shit in a desert.

    It's only stirring up shit if you've got some kind of emotional connection to a company.

    Which, lol, go on king

  • Namrok Registered User regular
    You know, I got a kick out of Linus defending his take on the 3080 ti on The WAN Show. In short, $1200 "ti" priced SKU is not unprecedented. We saw the same thing with the RTX 2080 ti. It's just that nobody wanted one. It's also likely increasing supply of GPUs because it's a 3090 class card with half the ram. Meaning if production of 3090's is RAM limited (which it very well could be), they can now produce twice as many 3080 ti's as they could 3090s. Also, it's a halo product. If you are frustrated you can't get your hands on a 3060 or a 3070, what's it matter to you if the 3080 ti you weren't going to get anyways costs more than you weren't going to spend in the first place?

    I'm sympathetic to the argument. Then again, I also have no feelings of entitlement to a GPU at all. I've done the lion's share of my gaming lately on, no shit, a Geforce 2 MX 400 I paid $8 for.

  • Gilgaron Registered User regular
    Friend's son watched a video on building a gaming PC and asked if I could help him build something to play Hunter Call of the Wild, but looking at the minimum specs I don't think my old ATI 5770 is going to be as useful as I'd hoped, and there really don't appear to be any GPUs that would fit in what I think they were hoping would be an $800 build. I guess when I was thinking about replacing my own PC it wasn't too surprising that the high-end stuff was hard to come by, but at least the integrators have high-end systems if you want to pay through the nose. There's just no mid-tier stuff at all. At what point do the AAA software companies go bust because none of the gamers can get any hardware? I suppose I should tell him he needs to buy $800 of 2x4s first and then sell it in a few months...

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited June 2021
    Gilgaron wrote: »
    Friend's son watched a video on building a gaming PC and asked if I could help him build something to play Hunter Call of the Wild, but looking at the minimum specs I don't think my old ATI 5770 is going to be as useful as I'd hoped, and there really don't appear to be any GPUs that would fit in what I think they were hoping would be an $800 build. I guess when I was thinking about replacing my own PC it wasn't too surprising that the high-end stuff was hard to come by, but at least the integrators have high-end systems if you want to pay through the nose. There's just no mid-tier stuff at all. At what point do the AAA software companies go bust because none of the gamers can get any hardware? I suppose I should tell him he needs to buy $800 of 2x4s first and then sell it in a few months...

    The supply problem probably will/might/maybe will be/could be fixed in a few years, which is kind of scary for devs but not overly bad. You're going to see a lot of multiplat releases I think.

    If anything, I think graphical jumps above the norm like CP2077 could end up being reeled in, because the market just ain't there yet for it. A lot of companies are already experimenting with all of the good stuff (direct access to high-speed storage, etc.) we have in this post-spinning-rust era. And let's be honest, we're really damn close to photorealism as it is. Right now we're at something like 2012 Pixar levels of animated realism.

    Square's Forspoken, with its super-fast loading massive outdoor maps?

    https://www.youtube.com/watch?v=9Rcke8rOiIQ very little gameplay, but it starts at 1:45

    Hell, with this stuff we might finally get a 3D Sonic game that excels at being a 3D Sonic game.

    We may see companies stop treating blunt-force pixel pushing, with its already diminishing returns, as the only advance. Which is kind of exciting.

  • wunderbar What Have I Done? Registered User regular
    Namrok wrote: »
    You know, I got a kick out of Linus defending his take on the 3080 ti on The WAN Show. In short, $1200 "ti" priced SKU is not unprecedented. We saw the same thing with the RTX 2080 ti. It's just that nobody wanted one. It's also likely increasing supply of GPUs because it's a 3090 class card with half the ram. Meaning if production of 3090's is RAM limited (which it very well could be), they can now produce twice as many 3080 ti's as they could 3090s. Also, it's a halo product. If you are frustrated you can't get your hands on a 3060 or a 3070, what's it matter to you if the 3080 ti you weren't going to get anyways costs more than you weren't going to spend in the first place?

    I'm sympathetic to the argument. Then again, I also have no feelings of entitlement to a GPU at all. I've done the lion's share of my gaming lately on, no shit, a Geforce 2 MX 400 I paid $8 for.

    Couple things there. While the Founders Edition of the 2080 Ti was $1200, a "basic" 2080 Ti's MSRP was $1000, and it was possible to find some cards for less than $1200.

    And I bet these 3080 Ti cards aren't going to be fully functional 3090 dies that ended up as 3080 Tis because of RAM shortages. These are likely 3090 dies that got binned out because a shader block or two is defective, so this is an opportunity to sell those dies instead of throwing them out, which is fine. I'm OK with that. But Nvidia did raise the price compared to the 2080 Ti.

    Also, if the goal were to sell as many cards as possible, Nvidia would stop fabbing those dies entirely and focus on the 3060 and 3070 dies. The 3060 die is about 1/3 the size of the 3090 die, and the 3070 die is half the size. So they could probably get 2.5x the number of viable 3060s for every 3090 wafer, and maybe 1.75x the number of 3070s.

    But the 3080 Ti and 3090 are higher margin cards, so they'd rather sell those. That's a business decision, not one meant to appease the most number of gamers.
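That wafer arithmetic is easy to sanity-check. Using the post's rough area ratios (3070 die ≈ 1/2 of the 3090's, 3060 ≈ 1/3) and a textbook Poisson yield model, smaller dies win twice over: more candidates per wafer and a higher defect-free fraction, so the advantage is, if anything, larger than the conservative 2.5x/1.75x above. The wafer area and defect density here are assumed illustrative values, not Nvidia's numbers.

```python
import math

WAFER_AREA = 70_000.0    # mm^2, ~300 mm wafer (illustrative; ignores edge loss)
DEFECT_DENSITY = 0.001   # defects per mm^2 -- assumed, for illustration only

# Relative die areas from the post, scaled to a ~628 mm^2 flagship die
dies = {"3090": 628.0, "3070": 628.0 / 2, "3060": 628.0 / 3}

def good_dies_per_wafer(area):
    candidates = WAFER_AREA / area                 # geometric upper bound
    yield_frac = math.exp(-DEFECT_DENSITY * area)  # Poisson: P(zero defects)
    return candidates * yield_frac

base = good_dies_per_wafer(dies["3090"])
for name, area in dies.items():
    g = good_dies_per_wafer(area)
    print(f"{name}: {g:6.1f} good dies/wafer ({g / base:.2f}x the 3090)")
```

With these numbers the 3070 comes out well above 2x and the 3060 above 3x the 3090's good dies per wafer, which supports the margin argument: selling big dies at all is a choice, not a yield necessity.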

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Spoit *twitch twitch* Registered User regular
    Yeah, if you don't want to pay through the nose for GPUs, a prebuilt is probably your best option. But at $800, that might be a bit difficult.

  • nexuscrawler Registered User regular
    Namrok wrote: »
    You know, I got a kick out of Linus defending his take on the 3080 ti on The WAN Show. In short, $1200 "ti" priced SKU is not unprecedented. We saw the same thing with the RTX 2080 ti. It's just that nobody wanted one. It's also likely increasing supply of GPUs because it's a 3090 class card with half the ram. Meaning if production of 3090's is RAM limited (which it very well could be), they can now produce twice as many 3080 ti's as they could 3090s. Also, it's a halo product. If you are frustrated you can't get your hands on a 3060 or a 3070, what's it matter to you if the 3080 ti you weren't going to get anyways costs more than you weren't going to spend in the first place?

    I'm sympathetic to the argument. Then again, I also have no feelings of entitlement to a GPU at all. I've done the lion's share of my gaming lately on, no shit, a Geforce 2 MX 400 I paid $8 for.

    My guess is the Tis exist for two reasons: the people who will pay just about anything to have the "best", and making everyone else feel better about what they pay for the regular model.

  • Gilgaron Registered User regular
    That video looks like what my mind embellished on top of Morrowind all those years ago, it'll be exciting to play stuff like that! I suppose I'll warn them that they may be puttering along with that 5770 longer than would be ideal if they want to go that route and see if they want to adjust the rest of the build or just shoot for 'severely GPU limited for the next year or so' in functionality.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Gilgaron wrote: »
    That video looks like what my mind embellished on top of Morrowind all those years ago, it'll be exciting to play stuff like that! I suppose I'll warn them that they may be puttering along with that 5770 longer than would be ideal if they want to go that route and see if they want to adjust the rest of the build or just shoot for 'severely GPU limited for the next year or so' in functionality.

    Dude yeah.

    Access to data has been the biggest bottleneck for the longest time. Finally removing all mechanical components is as big a step forward as installing to a hard drive was in the late '80s.

  • Spoit *twitch twitch* Registered User regular
    But yeah, the GPU companies have been giving the budget lines next to no attention for a few generations now. This used to be okay, since last-gen parts could be the budget option in a pinch. But now that supplies are so bad, even those are impossible to get anywhere near MSRP, much less below it.

  • wunderbar What Have I Done? Registered User regular
    Spoit wrote: »
    But yeah, the GPU companies have been giving the budget lines next to no attention for a few generations now. This used to be okay, since last-gen parts could be the budget option in a pinch. But now that supplies are so bad, even those are impossible to get anywhere near MSRP, much less below it.

    I mean, the problem is that the entire line of cards has been creeping up in price, leaving nothing in the budget class. The Nvidia x60-class cards are a good indication of that. The GTX 960, which was the budget card of its generation, had an MSRP of $200. Now the 3060 is a $330 card at MSRP, with most retailers listing it north of $350. Yes, inflation is a thing, but we're talking a 65-75% increase in price for the 60-class card from the 960 to the 3060.

    Nvidia did have the 16xx series of cards to complement the 20xx lineup as budget cards, but we're 8 months into the 3xxx generation and there's nothing in the budget segment. I know that's as much because of supply issues as anything, but Nvidia is putting its priority on the higher-margin high-end cards, which is a business decision.
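Those percentages check out; a quick calculation (the ~12% cumulative 2015-to-2021 CPI figure is an assumed round number for illustration):

```python
gtx960_msrp = 200.0      # 2015 launch MSRP
rtx3060_msrp = 330.0     # 2021 launch MSRP
rtx3060_street = 350.0   # typical retailer price per the post
cum_inflation = 1.12     # ~12% cumulative CPI 2015 -> 2021 (assumed figure)

nominal = (rtx3060_msrp / gtx960_msrp - 1) * 100
street = (rtx3060_street / gtx960_msrp - 1) * 100
real = (rtx3060_msrp / (gtx960_msrp * cum_inflation) - 1) * 100

print(f"nominal increase:   {nominal:.0f}%")   # 65%
print(f"vs street price:    {street:.0f}%")    # 75%
print(f"inflation-adjusted: {real:.0f}%")      # ~47%
```

So even after inflation, the 60-class card got roughly 50% more expensive in real terms.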

  • Mugsley Delaware Registered User regular
    Gilgaron wrote: »
    That video looks like what my mind embellished on top of Morrowind all those years ago, it'll be exciting to play stuff like that! I suppose I'll warn them that they may be puttering along with that 5770 longer than would be ideal if they want to go that route and see if they want to adjust the rest of the build or just shoot for 'severely GPU limited for the next year or so' in functionality.

    Does something like a 5300G or 5600G have comparable graphics? Get them a solid air cooler and it should tide the boy over until things clean up a bit

  • Namrok Registered User regular
    wunderbar wrote: »
    I mean, the problem is that the entire line of cards has been creeping up in price, leaving nothing in the budget class. The Nvidia x60-class cards are a good indication of that. The GTX 960, which was the budget card of its generation, had an MSRP of $200. Now the 3060 is a $330 card at MSRP, with most retailers listing it north of $350. Yes, inflation is a thing, but we're talking a 65-75% increase in price for the 60-class card from the 960 to the 3060.

    Nvidia did have the 16xx series of cards to complement the 20xx lineup as budget cards, but we're 8 months into the 3xxx generation and there's nothing in the budget segment. I know that's as much because of supply issues as anything, but Nvidia is putting its priority on the higher-margin high-end cards, which is a business decision.

    Inflation is such a... weird thing. You can look at the CPI and see 2% a year. But in some categories it happens in wild fits and starts. Electronics have rapidly gotten way, way cheaper. And how do you even properly track that against inflation? I mean, a steak is a steak is a steak, whether it's 1967, 1984, or 2021. But a computer? Moore's law has been a thing so long, we kind of take for granted that more betterer computers aren't necessarily more expensive year after year. And there has been a general trend of computers getting more affordable, not less...

    But is there really a reason for that to continue on forever? Especially in an era where computers are in everything and there aren't enough to go around? Are we going to return to an era of technical wizardry? Where game programmers actually have to optimize to meet the consumer where they are?

    Another, let's say funny, inflation anecdote. I bought a home recently! In a very hot housing market. I paid almost 4x what my parents paid for their home in 1984. The homes are broadly comparable. If housing prices had roughly followed inflation, I should have only paid 2.5x as much. One might think I'm worse off than my parents, having to pay so much more for housing.

    But mortgage rates in 1984 were 13 fucking percent! Adjusted for inflation, my parents' mortgage payments were almost twice mine in 2021 dollars! That sure explains why there were so many fights about money growing up.
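The payment comparison follows from the standard amortization formula. With assumed round numbers (a 3% rate in 2021, loans for the full price, and the 4x price ratio and ~2.5x cumulative inflation stated above), the 1984 payment lands around 1.6x the 2021 one in real terms; push the 1984 rate toward its mid-80s peaks or assume a larger loan and it approaches twice.

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate amortization: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

price_1984 = 100_000           # illustrative 1984 price
inflation_factor = 2.5         # 1984 -> 2021 cumulative inflation (post's figure)
price_2021 = 4 * price_1984    # "almost 4x" per the post

pay_1984 = monthly_payment(price_1984, 0.13)   # 13% rate in 1984
pay_2021 = monthly_payment(price_2021, 0.03)   # assumed 3% rate in 2021

# Express the 1984 payment in 2021 dollars before comparing
ratio = (pay_1984 * inflation_factor) / pay_2021
print(f"1984 payment in 2021 dollars is {ratio:.2f}x the 2021 payment")  # ~1.64x
```

The lesson is that the rate dominates: a 13% mortgage roughly triples the payment per borrowed dollar versus a 3% one, swamping the difference in sticker price.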

    For a long time, I'd been critical of how the CPI is calculated. Notoriously, it uses equivalent rent as its metric instead of raw home values. I used to think that was complete bullshit, but I kind of get it now. I'm not sure any equivalent normalizing has occurred for how they include electronics in the CPI. Maybe it should. Maybe it shouldn't. Maybe electronics have gotten so much better and cheaper specifically because the Fed includes them in the CPI in such a crass, borderline meaningless way. Maybe we're all better off without monetary policy putting a finger on the scale. Maybe those days are coming to an end, though.

  • Gilgaron Registered User regular
    Mugsley wrote: »
    Does something like a 5300G or 5600G have comparable graphics? Get them a solid air cooler and it should tide the boy over until things clean up a bit

    Well the game he wants to play has minimum GPU requirements of:
    NVIDIA GTX 560Ti / ATI HD7870 – 1 GB VRAM, DirectX 11 compatible card

    and Recommended of:
    NVIDIA GTX 760 / R9 270x – 4 GB VRAM, DirectX 11 compatible card

    So the 5770 I had in a drawer appears to be much weaker relative to the 7870 than I'd hoped, and while I have an R9 380 that would exceed that, that's what I'm using right now. The Newegg resellers indicate it has, improbably, not depreciated appreciably since it was new, but maybe he can snag one of his own on eBay for a more reasonable price. That does remind me, though... a coworker of mine whose PC/woodworking budget is flipflopped from mine scored a 3080 at Microcenter a while back, I should see if he has any hand-me-downs in a drawer that are up to snuff! If not I'll just try to convince my wife that my personal upgrade/replacement budget is obviously an impediment to our dear family friends and I really shouldn't wait until we're done throwing money into the basement finish I'm in the middle of...

  • BahamutZEROBahamutZERO Registered User regular
    edited June 2021
    Very old hardware components actually often get more expensive because while the demand for them drops, the supply also dwindles to just a handful of units with almost certainly none more ever going to be made. So the sellers of those last few know they can charge an arm and a leg for them because the only people buying specific obsolete parts absolutely need that specific part for whatever reason.

    Aside from that though, even recently obsolete graphics cards are full original price or more right now because people are desperate for any GPU to stick into their systems while there's this run on graphics cards going on.

    BahamutZERO on
  • DhalphirDhalphir don't you open that trapdoor you're a fool if you dareRegistered User regular
    edited June 2021
    I sold my 3.5 year old 1080Ti for more than I paid for it. It's absurd. I was literally paid to own one of the greatest GPUs ever made for almost 4 years.

    Dhalphir on
  • zagdrobzagdrob Registered User regular
    Dhalphir wrote: »
    I sold my 3.5 year old 1080Ti for more than I paid for it. It's absurd. I was literally paid to own one of the greatest GPUs ever made for almost 4 years.

    Yeah, when I listed the 1070 Ti I bought in 2018 for $380 with a starting bid of $375, I gave a 'buy it now' price of $600 just as a shits-and-giggles, sure-why-not. eBay suggested $725 for the Buy It Now, which just seemed insultingly absurd, which is why I went with $600.

    The day before I listed it I was going to sell it to a fellow forumer for $375 shipped, but it fell outside their budget.

    I sold it in a day and a half for $600 + shipping. eBay took fucking $80 in fees, but I still got to use that card for 3.5 years and ended up over $100 ahead when all is said and done. I'm still not entirely convinced the buyer isn't trying to scam me, it feels too stupid to be true but **gestures at everything**.

    Shit, the guy paid more for my old 1070 Ti than I paid for the 3070 I was lucky enough to snag off Best Buy back in February.
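    Sanity-checking the math in that post, here's a quick sketch using only the figures as quoted above (the $80 fee is the number from the post, not eBay's published rate):

    ```python
    # Net result of the 1070 Ti flip, using the figures quoted in the post.
    purchase_price = 380  # paid new in 2018
    sale_price = 600      # Buy It Now price it sold at
    ebay_fees = 80        # fees as quoted in the post

    net_ahead = sale_price - ebay_fees - purchase_price
    print(net_ahead)  # 140, i.e. "over $100 ahead" checks out
    ```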

  • MegaMan001MegaMan001 CRNA Rochester, MNRegistered User regular
    Desktop started flashing "CPU Fan Error!" at me. I haven't opened this desktop in close to a decade.

    A year or so ago the USB ports started acting weirdly.

    Coming to the end of this rig's life? Built in 2012.

    I am in the business of saving lives.
  • zagdrobzagdrob Registered User regular
    MegaMan001 wrote: »
    Desktop started flashing "CPU Fan Error!" at me. I haven't opened this desktop in close to a decade.

    A year or so ago the USB ports started acting weirdly.

    Coming to the end of this rig's life? Built in 2012.

    Have you checked for dust buildup?

    If you haven't cracked the case it's probably bad in there.

  • GilgaronGilgaron Registered User regular
    MegaMan001 wrote: »
    Desktop started flashing "CPU Fan Error!" at me. I haven't opened this desktop in close to a decade.

    A year or so ago the USB ports started acting weirdly.

    Coming to the end of this rig's life? Built in 2012.

    I'd open it up and dust it out real well with a can of compressed air; if the northbridge's heatsink fins are clogged it might make the USB ports act up? Shouldn't be hard to replace the CPU fan if its bearings are shot. At this rate you'd be able to have some gamers fight to the death over its half-functional carcass in order to have something that can play Starcraft 2 for less than $2k...

  • nexuscrawlernexuscrawler Registered User regular
    Could also be grime or dust in the connector or pins for the fan.

  • danxdanx Registered User regular
    Might also want to check if the CMOS battery is going. I had all sorts of weird intermittent errors on my old rig when it died.

  • BullheadBullhead Registered User regular
    danx wrote: »
    Might also want to check if the CMOS battery is going. I had all sorts of weird intermittent errors on my old rig when it died.

    Yup, check to see if your date/time is off.

  • BronzeKoopaBronzeKoopa Registered User regular
    What the hell, I went and looked up my email invoice for the Asus 1070 I bought in 2016 and I paid $450 new.

  • wunderbarwunderbar What Have I Done? Registered User regular
    3070 Ti reviews are basically "in a world where cards are available at MSRP there's little reason to buy this card over the other options". It's barely faster than a 3070 for $100 more, and the 3080 is a significant increase in price/performance that makes the 3070 Ti a weird card in the portfolio.

    But none of this matters since no one can actually buy anything.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • ErlkönigErlkönig Seattle, WARegistered User regular
    wunderbar wrote: »
    3070 Ti reviews are basically "in a world where cards are available at MSRP there's little reason to buy this card over the other options". It's barely faster than a 3070 for $100 more, and the 3080 is a significant increase in price/performance that makes the 3070 Ti a weird card in the portfolio.

    But none of this matters since no one can actually buy anything.

    Wait...I thought MSRP on standard 3070s was $599 (lol) and that the 3070ti also is going to have an MSRP of $599. Wouldn't that make the statement "why buy a 3070 when you get a slight performance boost at the same price"?

    | Origin/R*SC: Ein7919 | Battle.net: Erlkonig#1448 | XBL: Lexicanum | Steam: Der Erlkönig (the umlaut is important) |
  • EtheaEthea Registered User regular
    Erlkönig wrote: »
    wunderbar wrote: »
    3070 Ti reviews are basically "in a world where cards are available at MSRP there's little reason to buy this card over the other options". It's barely faster than a 3070 for $100 more, and the 3080 is a significant increase in price/performance that makes the 3070 Ti a weird card in the portfolio.

    But none of this matters since no one can actually buy anything.

    Wait...I thought MSRP on standard 3070s was $599 (lol) and that the 3070ti also is going to have an MSRP of $599. Wouldn't that make the statement "why buy a 3070 when you get a slight performance boost at the same price"?

    3070 FE MSRP is $499. No clue what other vendors like EVGA/MSI charge for the 3070.

  • CormacCormac Registered User regular
    edited June 2021
    This might be more of a Tech Thread thing but I need to get a NAS. I feel like I've narrowed it down to the Asustor AS6604T and Synology DS920+. I could get by with a 2 bay but I think having a 4 bay would allow for future proofing to add more drives when the prices come back down in 6-12 months (hopefully).

    I'm making my primary PC into a dedicated gaming only computer because of how much heat it puts out which becomes a real issue during the summer even with AC. I've bought a M1 Mac Mini to use as my normal/daily computer needs and want to be able to access the files on my two 8TB hard drives without having my gaming PC on. If I end up not liking MacOS I'll return it and get an Intel NUC instead.

    The Asustor has better hardware for the money but the Synology has better native apps and support. The iOS apps for both are universally terrible, but that's less of a concern than the native apps being good. I want something that's easy enough to set up, has good support both direct and through other users/reddit, and has expandability for additional storage and network speeds. The Synology might be a better option for a first NAS as I'm not a fan of delving into or dealing with complex network setup or management.

    Right now the NAS is going to be used for local network storage/file sharing and as a Plex server. Backup isn't really something I'm concerned with right now as my important files are in cloud backup, but having the ability for local backup when hard drive prices come back down is the long term plan.

    I'm not looking forward to trying to figure out how to move around the files already on the two drives between multiple old hard drives so I can format them in the NAS for its file system. I'll get it done one way or another.

    Cormac on
    Steam: Gridlynk | PSN: Gridlynk | FFXIV: Jarvellis Mika
  • MegaMan001MegaMan001 CRNA Rochester, MNRegistered User regular
    Thanks for the tips I'll open it up and take a look

  • wunderbarwunderbar What Have I Done? Registered User regular
    edited June 2021
    Erlkönig wrote: »
    wunderbar wrote: »
    3070 Ti reviews are basically "in a world where cards are available at MSRP there's little reason to buy this card over the other options". It's barely faster than a 3070 for $100 more, and the 3080 is a significant increase in price/performance that makes the 3070 Ti a weird card in the portfolio.

    But none of this matters since no one can actually buy anything.

    Wait...I thought MSRP on standard 3070s was $599 (lol) and that the 3070ti also is going to have an MSRP of $599. Wouldn't that make the statement "why buy a 3070 when you get a slight performance boost at the same price"?

    As said, 3070 MSRP is $500; vendors often charge a bit more for their beefed-up cards, but the product stack now looks like this:

    3060: $330
    3060 Ti: $400
    3070: $500
    3070 Ti: $600
    3080: $700
    3080 Ti: $1200
    3090: $1500

    The product stack is super weird at MSRP. The cheapest card has more VRAM than all but the 3080 Ti and 3090. The middle cards offer very similar performance for their $100 jumps. The 3080 isn't super well future proofed for 4K gaming considering there are already games that can fill more than 8GB with 4K assets, and the 3080 Ti is a FIVE HUNDRED DOLLAR jump from the 3080.
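    To make those $100 steps concrete, here's a quick dollars-per-performance sketch. Note that the performance numbers below are made-up relative indices (3070 = 100) purely for illustration, not measured benchmarks; swap in figures from actual reviews before drawing any conclusions.

    ```python
    # Dollars per relative performance point across the 3xxx stack at MSRP.
    # NOTE: perf values are hypothetical placeholder indices (3070 = 100),
    # not real benchmark data -- replace them with review numbers.
    msrp = {"3060": 330, "3060 Ti": 400, "3070": 500, "3070 Ti": 600,
            "3080": 700, "3080 Ti": 1200, "3090": 1500}
    perf = {"3060": 70, "3060 Ti": 90, "3070": 100, "3070 Ti": 107,
            "3080": 130, "3080 Ti": 150, "3090": 155}  # hypothetical

    for card, price in msrp.items():
        print(f"{card:8s} ${price:>4}  ->  ${price / perf[card]:.2f} per perf point")
    ```

    Even with placeholder numbers, the shape of the argument shows up: the cost per performance point climbs sharply past the 3080, which is the "$500 jump" complaint in numeric form.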

    Again, none of this really matters because no one can buy anything but I just find the 3xxx series to be really weird.

    wunderbar on
  • MugsleyMugsley DelawareRegistered User regular
    @Cormac I feel like we talked about this recently. Do you have an old laptop or old hardware that you can use to set up a running rig?

    My old Haswell build is running 4 drives via TrueNAS. The OS type stuff is being run from an old Intel 40GB SSD.

  • 3cl1ps33cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    Dhalphir wrote: »
    I sold my 3.5 year old 1080Ti for more than I paid for it. It's absurd. I was literally paid to own one of the greatest GPUs ever made for almost 4 years.

    I know I probably should sell my 1080 Ti and make some cash while the market is so hot, but I am weirdly paranoid about my 3080 dying in this market and being unable to get a new one and ASUS doesn't have the most consistent customer service.

  • Pixelated PixiePixelated Pixie They/Them Registered User regular
    3cl1ps3 wrote: »
    Dhalphir wrote: »
    I sold my 3.5 year old 1080Ti for more than I paid for it. It's absurd. I was literally paid to own one of the greatest GPUs ever made for almost 4 years.

    I know I probably should sell my 1080 Ti and make some cash while the market is so hot, but I am weirdly paranoid about my 3080 dying in this market and being unable to get a new one and ASUS doesn't have the most consistent customer service.

    1. Sell your old card at the current used-card-scalping prices
    2. Bank the money
    3. If your new card dies before the market settles (and you can't RMA for some reason?), use saved money to buy used replacement from another used-card scalper
    4. If your new card doesn't die, profit!

    ~~ Pixie on Steam ~~
    ironzerg wrote: »
    Chipmunks are like nature's nipple clamps, I guess?
  • V1mV1m Registered User regular
    Got around to installing the new SSD today. Now my CPU fan is running perceptibly quieter. I didn't even go into the BIOS; I didn't touch anything, I didn't change anything.

    Computers be fucking weird.

  • CormacCormac Registered User regular
    edited June 2021
    Mugsley wrote: »
    @Cormac I feel like we talked about this recently. Do you have an old laptop or old hardware that you can use to set up a running rig?

    My old Haswell build is running 4 drives via TrueNAS. The OS type stuff is being run from an old Intel 40GB SSD.

    It wasn't me, I don't think, but I do actually have a retired HTPC with an i5-3470. Building one myself would save a lot of money. It's worth looking into, but I'm also looking for a simple solution. Unraid may be another option to consider and might be better suited to my needs.

    Cormac on
  • jungleroomxjungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited June 2021
    V1m wrote: »
    Got around to installing the new SSD today. Now my CPU fan is running perceptibly quieter. I didn't even go into the BIOS; I didn't touch anything, I didn't change anything.

    Computers be fucking weird.

    HDDs put out a little heat, maybe it was just enough for your CPU fan to kick into high?

    I dunno but like you said, computers be fucking weird.

    jungleroomx on
  • V1mV1m Registered User regular
    I added an SSD. Nothing was taken out. Shit is voodoo.

This discussion has been closed.