
[PC Build Thread] Don't wanna buy our $600 GPU? Well fine, we're not making any!


Posts

  • Mugsley Delaware Registered User regular
    Yes that should be correct

  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    edited August 2022
    That didn't work. Maybe because I installed Windows before doing that.

    I went to Install again, and it wouldn't let me install to that SSD again. It's still fine, but I'm guessing I fucked something up.

    So I put the old drive back in, changed it back to the default boot drive, and formatted the other one. I'll try again on the weekend when I've got more time.

    As an aside, I realized Wi-Fi wasn't auto-connecting because... I unchecked auto connect. Clever me.

    Just so I get the steps down:

    1. Unplug old drive.
    2. Change BIOS boot sequence to boot from USB first.
    3. Restart, go through the steps to install Windows to the correct drive.
    4. Change BIOS boot sequence to boot from the new drive (this may happen automatically?)
    5. Windows will now have the new drive as C.

    At this point, do I just plug the old drive back in? Wouldn't it still have drive C assigned and cause problems?

  • BahamutZERO Registered User regular
    The auto connect checkbox unchecks itself sometimes for whatever reason, I find, so you might not be crazy on that one

  • SiliconStew Registered User regular
    -Loki- wrote: »
    That didn't work. Maybe because I installed Windows before doing that.

    I went to Install again, and it wouldn't let me install to that SSD again. It's still fine, but I'm guessing I fucked something up.

    So I put the old drive back in, changed it back to the default boot drive, and formatted the other one. I'll try again on the weekend when I've got more time.

    As an aside, I realized Wi-Fi wasn't auto-connecting because... I unchecked auto connect. Clever me.

    Just so I get the steps down:

    1. Unplug old drive.
    2. Change BIOS boot sequence to boot from USB first.
    3. Restart, go through the steps to install Windows to the correct drive.
    4. Change BIOS boot sequence to boot from the new drive (this may happen automatically?)
    5. Windows will now have the new drive as C.

    At this point, do I just plug the old drive back in? Wouldn't it still have drive C assigned and cause problems?

    What may have happened on the first try is that the installer didn't create the EFI system partition (the thing Windows actually boots from) on the second drive. So while you could select either OS to run, it would only be able to boot with the first drive installed. Also, Windows won't (re)install to an existing volume; you have to delete/format it first.

    Your steps are correct. Windows will just assign the other drive a different letter after boot.
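
    If you want to double-check the lettering once both drives are back in, a few lines of Python will list every mounted volume with its device name, filesystem, and sizes. This is just a sketch (it assumes Python 3 and the third-party psutil package, neither of which is mentioned above); Disk Management shows the same information graphically.

        # Minimal sketch: list mounted volumes so you can confirm the new SSD kept C:
        # and the old drive got bumped to a different letter. Assumes the third-party
        # psutil package (pip install psutil).
        import psutil

        for part in psutil.disk_partitions(all=False):
            try:
                usage = psutil.disk_usage(part.mountpoint)
            except OSError:
                continue  # e.g. an empty card-reader slot or a locked volume
            print(f"{part.device:<12} {part.fstype:<6} "
                  f"{usage.total / 1e9:8.1f} GB total, {usage.free / 1e9:8.1f} GB free")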

    Just remember that half the people you meet are below average intelligence.
  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    edited August 2022
    -Loki- wrote: »
    That didn't work. Maybe because I installed Windows before doing that.

    I went to Install again, and it wouldn't let me install to that SSD again. It's still fine, but I'm guessing I fucked something up.

    So I put the old drive back in, changed it back to the default boot drive, and formatted the other one. I'll try again on the weekend when I've got more time.

    As an aside, I realized Wi-Fi wasn't auto-connecting because... I unchecked auto connect. Clever me.

    Just so I get the steps down:

    1. Unplug old drive.
    2. Change BIOS boot sequence to boot from USB first.
    3. Restart, go through the steps to install Windows to the correct drive.
    4. Change BIOS boot sequence to boot from the new drive (this may happen automatically?)
    5. Windows will now have the new drive as C.

    At this point, do I just plug the old drive back in? Wouldn't it still have drive C assigned and cause problems?

    What may have happened on the first try is that the installer didn't create the EFI system partition (the thing Windows actually boots from) on the second drive. So while you could select either OS to run, it would only be able to boot with the first drive installed. Also, Windows won't (re)install to an existing volume; you have to delete/format it first.

    Your steps are correct. Windows will just assign the other drive a different letter after boot.

    By delete/format the volume to reinstall, do you mean go into Disk Management, delete the volume, then format the disk?

    I formatted the volume, but I did not delete it first.

  • syndalis Getting Classy On the Wall Registered User, Loves Apple Products, Transition Team regular
    Metal 3, the latest version of the software that powers the gaming experience across Apple platforms, introduces new features that take the gaming experience on Mac to new heights and unleash the full potential of Apple silicon for years to come. MetalFX Upscaling enables developers to quickly render complex scenes by using less compute-intensive frames, and then apply resolution scaling and temporal anti-aliasing. The result is accelerated performance that provides gamers with a more responsive feel and graphics that look stunning. Game developers also benefit from a new Fast Resource Loading API that minimizes wait time by providing a more direct path from storage to the GPU, so games can easily access high-quality textures and geometry needed to create expansive worlds for realistic and immersive gameplay.

    Sure sounds like Apple is working on DLSS/FSR and DirectStorage for Apple things (iOS, iPad, Mx Macs). Interesting to see the whole industry moving in this direction. Maybe this stuff will help close the performance gap Mx hardware has for gaming workloads, because despite being badass for work stuff, their GPUs are mid tier at best for gaming.
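
    The core trick behind all of these upscalers is the same, and the savings are easy to ballpark. The numbers below are purely illustrative (the scale factors are assumptions, not published MetalFX presets):

        # Back-of-envelope pixel math for temporal upscaling. Scale factors here are
        # illustrative assumptions, not MetalFX's (or DLSS's/FSR's) actual presets.
        TARGET_W, TARGET_H = 3840, 2160   # 4K output
        for label, scale in [("native", 1.0), ("1.5x upscale", 1.5), ("2x upscale", 2.0)]:
            render_w, render_h = TARGET_W / scale, TARGET_H / scale
            shaded = (render_w * render_h) / (TARGET_W * TARGET_H)
            print(f"{label:>12}: render {render_w:.0f}x{render_h:.0f}, "
                  f"shade ~{shaded:.0%} of the output pixels")

    The temporal part then reconstructs the remaining detail from previous frames, which is why the result can look close to native while shading only a fraction of the pixels.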

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • Mugsley Delaware Registered User regular
    Sounds a lot like a collection of buzzwords so I'm curious how it shows up in actual performance.

    Also for all the shit people give Nvidia for not playing nice with open source like AMD is, they seem to give Apple a pass. And it confuses me.

  • syndalis Getting Classy On the Wall Registered User, Loves Apple Products, Transition Team regular
    Mugsley wrote: »
    Sounds a lot like a collection of buzzwords so I'm curious how it shows up in actual performance.

    Also for all the shit people give Nvidia for not playing nice with open source like AMD is, they seem to give Apple a pass. And it confuses me.

    Probably not enough to take the crown away from anyone, but it would be nice if their Ultra and Max series chips could at least be at the heels of the competition.

    And regarding open source, Apple does contribute to and maintain a rather large set of open source projects they are responsible for.

    WebKit is probably the most-used open source thing they produce, but they are also significant contributors to K8s, Cassandra/Spark/Solr, etc.

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • Thawmus +Jackface Registered User regular
    syndalis wrote: »
    Mugsley wrote: »
    Sounds a lot like a collection of buzzwords so I'm curious how it shows up in actual performance.

    Also for all the shit people give Nvidia for not playing nice with open source like AMD is, they seem to give Apple a pass. And it confuses me.

    Probably not enough to take the crown away from anyone, but it would be nice if their Ultra and Max series chips could at least be at the heels of the competition.

    And regarding open source, Apple does contribute to and maintain a rather large set of open source projects they are responsible for.

    WebKit is probably the most-used open source thing they produce, but they are also significant contributors to K8s, Cassandra/Spark/Solr, etc.

    The biggest one that comes to mind is CUPS, which is responsible for pretty much every distro's Unix-based printing, and that footprint increased a couple of years ago when the fix for the Windows print spooler exploit was to disable it. People started spinning up CUPS servers instead.

    Twitch: Thawmus83
  • LD50 Registered User regular
    The real question is will they ever go back to including real gpus in their hardware.

    The current generation of Macs and iOS devices are using mobile GPUs that are some seriously trash tier hardware. The performance is passable for the power draw, but they are the buggiest hardware I have ever seen in wide production, and they don't conform to the openly available documentation.

    They might be able to marry upscaling tech to the output of their rebranded PowerVR GPUs, but that won't change the GPU choking any time one of its various vertex/drawing buffers fills beyond an undocumented capacity.

  • syndalis Getting Classy On the Wall Registered User, Loves Apple Products, Transition Team regular
    LD50 wrote: »
    The real question is will they ever go back to including real gpus in their hardware.

    The current generation of Macs and iOS devices are using mobile GPUs that are some seriously trash tier hardware. The performance is passable for the power draw, but they are the buggiest hardware I have ever seen in wide production, and they don't conform to the openly available documentation.

    They might be able to marry upscaling tech to the output of their rebranded PowerVR GPUs, but that won't change the GPU choking any time one of its various vertex/drawing buffers fills beyond an undocumented capacity.

    I think the answer is going to be a hard no on them using external GPUs, and the only reason I say think instead of “know” is because the Mac Pro hasn’t relaunched on M chips yet.

    If the Mac Pro doesn’t have PCIe slots we can assume that Apple intends to own the chip designs for the whole product line going forward.

    That said, the chips in the M series laptops are quite a bit better than you are describing here… mid tier 30XX level (RTX 3050ish) for raster out of a 13” laptop is pretty wild.

    No ray tracing, no DLSS, etc. etc., but it's not a weak GPU, just not a top tier one, which is pretty remarkable for their first pass and the fact that it is effectively an integrated SoC GPU.

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • LD50 Registered User regular
    syndalis wrote: »
    LD50 wrote: »
    The real question is will they ever go back to including real gpus in their hardware.

    The current generation of Macs and iOS devices are using mobile GPUs that are some seriously trash tier hardware. The performance is passable for the power draw, but they are the buggiest hardware I have ever seen in wide production, and they don't conform to the openly available documentation.

    They might be able to marry upscaling tech to the output of their rebranded PowerVR GPUs, but that won't change the GPU choking any time one of its various vertex/drawing buffers fills beyond an undocumented capacity.

    I think the answer is going to be a hard no on them using external GPUs, and the only reason I say think instead of “know” is because the Mac Pro hasn’t relaunched on M chips yet.

    If the Mac Pro doesn’t have PCIe slots we can assume that Apple intends to own the chip designs for the whole product line going forward.

    That said, the chips in the M series laptops are quite a bit better than you are describing here… mid tier 30XX level (RTX 3050ish) for raster out of a 13” laptop is pretty wild.

    No ray tracing, no DLSS, etc. etc., but it's not a weak GPU, just not a top tier one, which is pretty remarkable for their first pass and the fact that it is effectively an integrated SoC GPU.

    It isn't their performance that is the issue, it is that the graphics pipeline for the M1 SoC is buggy as hell, and has a bunch of undocumented limitations/quirks. It works fine for ultrabook-style workloads plus pretty desktop decorations, but trying to do more than that can be maddening.

  • BahamutZERO Registered User regular
    edited August 2022
    RTX 3050 chipsets are actually the lowest-power chipsets in the 3000 series line, not midrange at all.
    I see how the numbering might make it sound like they're the midrange chip, but no, that's just marketing tricks to make it sound more premium than it is; start counting halfway through the set.

  • wunderbar What Have I Done? Registered User regular
    I think what you'll see from Apple is that their GPUs will be heavily optimized for their software, and will run very very well on their software. You already see that with Final Cut Pro on the Mac Studio. That actually gets pretty incredible performance. But other apps, yeah, it looks like a mobile GPU in terms of performance.

    That's not to say that it's terrible. A mobile 3080 is a pretty good part, and can run pretty fast if you pump enough power through it. Apple is probably going to take a similar route here, just scaling up what is otherwise a mobile part.

    And we're back to the reality that comparing performance between Macs and PCs is returning to comparing apples to oranges (I'm so sorry). With the architectures now being different between the platforms, most benchmarks/performance comparisons don't have a ton of meaning anymore.

    Macs will be for people who want to live in that ecosystem, and can take full advantage of the platform and the hardware that platform runs on. Everyone else will continue to run x86 PCs. Each will have advantages and disadvantages, and it turns into a decision based on individual needs.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited August 2022
    BahamutZERO wrote: »
    RTX 3050 chipsets are actually the lowest-power chipsets in the 3000 series line, not midrange at all.
    I see how the numbering might make it sound like they're the midrange chip, but no, that's just marketing tricks to make it sound more premium than it is; start counting halfway through the set.

    On a technical level, the M1 and M2 have been pretty great if you ignore absolutely everything Apple says about them, i.e. the famous "fastest discrete graphics" slide.

    The M2's GPU is slower than the M1 Max/Ultra, too, and the M1 Ultra topped out just at a 3050's performance. So maybe they'll hit 3060 performance numbers with the M2 Ultra stuff, which would put them fairly midrange. Even so, the 4000 series (including the mobile variants, which would be a better comparison point to the M2 GPUs) is right around the corner, which will redefine precisely what "midrange" actually is. Probably somewhere like a 3070.

    The increased performance has also increased heat from Apple Silicon, which makes a lot of sense.

  • Mugsley Delaware Registered User regular
    I guess what I really meant was "fuck you Nvidia why don't you use open source FSR like AMD" vs "....well Apple's chip is full of voodoo and demons so they have to use their own graphics firmware so it's okay"

    Whatever; it's nuance and I'm not dying on this hill.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Mugsley wrote: »
    I guess what I really meant was "fuck you Nvidia why don't you use open source FSR like AMD" vs "....well Apple's chip is full of voodoo and demons so they have to use their own graphics firmware so it's okay"

    Whatever; it's nuance and I'm not dying on this hill.

    Oh no, Nvidia can go fuck itself right off a cliff for its anticompetitive bullshit. I am begging AMD or Intel to make a GPU that isn’t a compromise in some fashion so I can at least feel like I’m getting out from the Great Leather Jacket of Jensen

  • Mulletude Registered User regular
    5700X and AIO installed. CPU hasn't gone above about 56 degrees during benchmark and game testing.

    Pretty happy with that.

    Alyx runs noticeably smoother as well

    XBL-Dug Danger WiiU-DugDanger Steam-http://steamcommunity.com/id/DugDanger/
  • V1m Registered User regular
    edited August 2022
    Mugsley wrote: »
    I guess what I really meant was "fuck you Nvidia why don't you use open source FSR like AMD" vs "....well Apple's chip is full of voodoo and demons so they have to use their own graphics firmware so it's okay"

    Whatever; it's nuance and I'm not dying on this hill.

    Oh no, Nvidia can go fuck itself right off a cliff for its anticompetitive bullshit. I am begging AMD or Intel to make a GPU that isn’t a compromise in some fashion so I can at least feel like I’m getting out from the Great Leather Jacket of Jensen

    I think you will be waiting at least 2 more generations from Intel. If rDNA3 has improved raytracing then AMD might qualify this time, although I will be most surprised if Nvidia do not retain the raytracing crown.

    But Nvidia consistently introduce new GPU features that their cards support and no one else's really can, to make sure that there are lots of benchmark charts showing theirs on top. Even if that new feature brings the performance of their cards way down, that's fine as long as it brings down AMD cards more. E.g. raytracing on the 2000 series or mk1.0a DLSS on the 3000s.
    When you're the market leader you can force response to your new features in a way AMD and Intel can't.

    So likely there will always be "compromise" in the way you're thinking of, and you keep buying green.

  • Frem Registered User regular
    V1m wrote: »
    Mugsley wrote: »
    I guess what I really meant was "fuck you Nvidia why don't you use open source FSR like AMD" vs "....well Apple's chip is full of voodoo and demons so they have to use their own graphics firmware so it's okay"

    Whatever; it's nuance and I'm not dying on this hill.

    Oh no, Nvidia can go fuck itself right off a cliff for its anticompetitive bullshit. I am begging AMD or Intel to make a GPU that isn’t a compromise in some fashion so I can at least feel like I’m getting out from the Great Leather Jacket of Jensen

    I think you will be waiting at least 2 more generations from Intel. If rDNA3 has improved raytracing then AMD might qualify this time, although I will be most surprised if Nvidia do not retain the raytracing crown.

    But Nvidia consistently introduce new GPU features that their cards support and no one else's really can, to make sure that there are lots of benchmark charts showing theirs on top. Even if that new feature brings the performance of their cards way down, that's fine as long as it brings down AMD cards more. E.g. raytracing on the 2000 series or mk1.0a DLSS on the 3000s.
    When you're the market leader you can force response to your new features in a way AMD and Intel can't.

    So likely there will always be "compromise" in the way you're thinking of, and you keep buying green.

    Intel could theoretically do things that would be hard for NVidia to duplicate if they built features that leveraged their control over the GPU, CPU, and iGPU.

  • RightfulSin Registered User regular
    Hello thread. I am thinking of getting my hands on a new rig, as what I have is massively old and outdated now (i.e. CPU is an AMD Phenom II X6 1075T and GPU is a 970; ya I know).

    I am seeking a place to order a PC from that will be quite good for a while to come. I will be using it to run games, modern games. I have heard of places such as Ibuypower, NZXT, Origin, and maybe one or two others. I do not have any preexisting experience with any of these companies. The budget I have for the PC is up to $4-4.5k, not counting any peripherals like kb+m and new monitors. I have been looking at the above sites and doing a few custom builds and saving them via email for comparison. Just seeking some additional input. Thank you.

    "If nothing is impossible, than would it not be impossible to find something that you could not do?" - Me

  • V1m Registered User regular
    Frem wrote: »
    V1m wrote: »
    Mugsley wrote: »
    I guess what I really meant was "fuck you Nvidia why don't you use open source FSR like AMD" vs "....well Apple's chip is full of voodoo and demons so they have to use their own graphics firmware so it's okay"

    Whatever; it's nuance and I'm not dying on this hill.

    Oh no, Nvidia can go fuck itself right off a cliff for its anticompetitive bullshit. I am begging AMD or Intel to make a GPU that isn’t a compromise in some fashion so I can at least feel like I’m getting out from the Great Leather Jacket of Jensen

    I think you will be waiting at least 2 more generations from Intel. If rDNA3 has improved raytracing then AMD might qualify this time, although I will be most surprised if Nvidia do not retain the raytracing crown.

    But Nvidia consistently introduce new GPU features that their cards support and no one else's really can, to make sure that there are lots of benchmark charts showing theirs on top. Even if that new feature brings the performance of their cards way down, that's fine as long as it brings down AMD cards more. E.g. raytracing on the 2000 series or mk1.0a DLSS on the 3000s.
    When you're the market leader you can force response to your new features in a way AMD and Intel can't.

    So likely there will always be "compromise" in the way you're thinking of, and you keep buying green.

    Intel could theoretically do things that would be hard for NVidia to duplicate if they built features that leveraged their control over the GPU, CPU, and iGPU.

    Sure, if they wanted to hand the gamer/home market to AMD even more than they already have.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited August 2022
    V1m wrote: »
    Mugsley wrote: »
    I guess what I really meant was "fuck you Nvidia why don't you use open source FSR like AMD" vs "....well Apple's chip is full of voodoo and demons so they have to use their own graphics firmware so it's okay"

    Whatever; it's nuance and I'm not dying on this hill.

    Oh no, Nvidia can go fuck itself right off a cliff for its anticompetitive bullshit. I am begging AMD or Intel to make a GPU that isn’t a compromise in some fashion so I can at least feel like I’m getting out from the Great Leather Jacket of Jensen

    I think you will be waiting at least 2 more generations from Intel. If rDNA3 has improved raytracing then AMD might qualify this time, although I will be most surprised if Nvidia do not retain the raytracing crown.

    But Nvidia consistently introduce new GPU features that their cards support and no one else's really can, to make sure that there are lots of benchmark charts showing theirs on top. Even if that new feature brings the performance of their cards way down, that's fine as long as it brings down AMD cards more. E.g. raytracing on the 2000 series or mk1.0a DLSS on the 3000s.
    When you're the market leader you can force response to your new features in a way AMD and Intel can't.

    So likely there will always be "compromise" in the way you're thinking of, and you keep buying green.

    To a point, yes.

    But also, AMD made massive strides in terms of raster grunt in a very short period of time, and DLSS isn't quite the unassailable thing it once was.

    FSR still isn't as good, mind, but it's better than it was by miles. It could probably get to a similar place as FreeSync vs G-Sync, where the difference is minimal.

  • V1m Registered User regular
    We can but hope. My expectation is that the 4000s will still be perceptibly faster running the Nvidia-defined RTX mode, but that AMD will go a long way to narrow the gap. But I would be most surprised if Nvidia did not also expect this and so is not countering with DLSS Duplo Plus or Super Flesh Tone-O-Rama or WrinkleWorks(tm) or whatever else it takes this time around to have something where they're hardware accelerated and AMD isn't.

  • wunderbar What Have I Done? Registered User regular
    V1m wrote: »
    We can but hope. My expectation is that the 4000s will still be perceptibly faster running the Nvidia-defined RTX mode, but that AMD will go a long way to narrow the gap. But I would be most surprised if Nvidia did not also expect this and so is not countering with DLSS Duplo Plus or Super Flesh Tone-O-Rama or WrinkleWorks(tm) or whatever else it takes this time around to have something where they're hardware accelerated and AMD isn't.

    If the rumours are to be believed, Nvidia is going to make the 4000 series run faster at the high end by including a micro-scale nuclear reactor with each 4090 in order to provide enough power to it.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Thawmus +Jackface Registered User regular
    Guys we talked about this two years ago, jesus, I can't believe we're still going on about this.


    They're going to make them Dorito Nacho Supreme flavored.

    Twitch: Thawmus83
  • Drovek Registered User regular
    Thawmus wrote: »
    Guys we talked about this two years ago, jesus, I can't believe we're still going on about this.


    They're going to make them Dorito Nacho Supreme flavored.

    Ugh, but is it going to be all over my fingers after installing it?

  • Thawmus +Jackface Registered User regular
    Drovek wrote: »
    Thawmus wrote: »
    Guys we talked about this two years ago, jesus, I can't believe we're still going on about this.


    They're going to make them Dorito Nacho Supreme flavored.

    Ugh, but is it going to be all over my fingers after installing it?

    Yes.

    That's what you're paying for! Finger-lickin' good! Wash the taste down with our Gamer Fuel Cooling Solution!

    Twitch: Thawmus83
  • That_Guy I don't wanna be that guy Registered User regular
    Thawmus wrote: »
    Drovek wrote: »
    Thawmus wrote: »
    Guys we talked about this two years ago, jesus, I can't believe we're still going on about this.


    They're going to make them Dorito Nacho Supreme flavored.

    Ugh, but is it going to be all over my fingers after installing it?

    Yes.

    That's what you're paying for! Finger-lickin' good! Wash the taste down with our Gamer Fuel Cooling Solution!

    Give me cooler ranch or get outta here.

  • Thawmus +Jackface Registered User regular
    That_Guy wrote: »
    Thawmus wrote: »
    Drovek wrote: »
    Thawmus wrote: »
    Guys we talked about this two years ago, jesus, I can't believe we're still going on about this.


    They're going to make them Dorito Nacho Supreme flavored.

    Ugh, but is it going to be all over my fingers after installing it?

    Yes.

    That's what you're paying for! Finger-lickin' good! Wash the taste down with our Gamer Fuel Cooling Solution!

    Give me cooler ranch or get outta here.

    You'll have to wait for the 4060 series.

    Twitch: Thawmus83
  • Prohass Registered User regular
    edited August 2022
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

  • übergeek Sector 2814 Registered User regular
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    Part of that is over-provisioned space, which could be around 10% of the size. It's reserved so that when blocks start failing, the drive starts using that 10% of space to give you time to make a backup or transfer the files to a new drive. Then there's the smaller portion used by the system for the boot files that you can't see.
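
    For a rough sense of scale (purely illustrative: the 256 GB size is an example and the ~10% spare-area figure is the estimate above, not any particular drive's spec), the decimal-vs-binary unit difference alone already eats a chunk of the number on the box:

        # Rough capacity arithmetic for a drive sold as "256 GB". The ~10% spare-area
        # figure echoes the estimate in the post above, not a published spec.
        advertised_bytes = 256 * 1_000_000_000      # marketing gigabytes are decimal
        reported_gib = advertised_bytes / 2**30     # Windows reports binary GiB as "GB"
        after_spare = reported_gib * (1 - 0.10)     # minus the ~10% reserved spare area
        print(f"256 GB on the box -> {reported_gib:.1f} GB shown in Windows, "
              f"~{after_spare:.1f} GB if ~10% is held back")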

  • minor incident expert in a dying field Registered User, Transition Team regular
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    If this is your OS drive, Windows takes up a minimum of like 30GB itself, and will often balloon up to 50+GB, depending on hibernation files, caches, reserved space for updates, etc.

    Ah, it stinks, it sucks, it's anthropologically unjust
  • Mulletude Registered User regular
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    You could always add a SATA SSD instead of trying to replace that (what I'm guessing is an) NVMe boot drive. They're cheaper, and just for game stuff they're plenty fast.

    XBL-Dug Danger WiiU-DugDanger Steam-http://steamcommunity.com/id/DugDanger/
  • Prohass Registered User regular
    edited August 2022
    Mulletude wrote: »
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    You could always add a SATA SSD instead of trying to replace that (what I'm guessing is an) NVMe boot drive. They're cheaper, and just for game stuff they're plenty fast.

    Are they like external ssds? I might do that actually. Any recommendations on what I should look for/a good one?

  • übergeek Sector 2814 Registered User regular
    Prohass wrote: »
    Mulletude wrote: »
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    You could always add a SATA SSD instead of trying to replace that (what I'm guessing is an) NVMe boot drive. They're cheaper, and just for game stuff they're plenty fast.

    Are they like external ssds? I might do that actually. Any recommendations on what I should look for/a good one?

    If you go SATA SSD, Samsung makes the best but they tend to be pricier. Crucial is a quality brand and doesn't break the bank. Mushkin used to be good, but I haven't kept up with them. ADATA is pretty good, but their NVMe drives are better. Then there's the usual Seagate and Western Digital brands. Just get something with good write endurance.
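
    On the endurance point, the TBW (terabytes written) number on the datasheet is the one to look at, and it translates into a lifespan estimate with one line of arithmetic; the figures below are made up for illustration, not specs for any of the drives named above:

        # Quick write-endurance sanity check. Both numbers are hypothetical examples,
        # not specs for any drive mentioned in this thread.
        tbw_rating = 600        # terabytes written, from a hypothetical datasheet
        daily_writes_gb = 40    # rough daily writes for a game/boot drive
        years = tbw_rating * 1000 / daily_writes_gb / 365
        print(f"{tbw_rating} TBW at ~{daily_writes_gb} GB/day -> ~{years:.0f} years of writes")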

  • tsmvengy Registered User regular
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    For a graphical representation of your space:
    https://windirstat.net/
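
    If installing something isn't appealing, a rough equivalent is a handful of lines of Python (just a sketch, assuming Python 3 is installed; WinDirStat itself is more thorough):

        # Minimal WinDirStat-style sketch: print the biggest top-level folders under a
        # root so you can see where the space went. It only totals folders, so
        # root-level files like pagefile.sys/hiberfil.sys won't show up here.
        import os
        import sys

        def tree_size(path):
            total = 0
            for dirpath, _dirnames, filenames in os.walk(path, onerror=lambda err: None):
                for name in filenames:
                    try:
                        total += os.path.getsize(os.path.join(dirpath, name))
                    except OSError:
                        pass  # skip files we can't stat (in use, no permission, etc.)
            return total

        root = sys.argv[1] if len(sys.argv) > 1 else "C:\\"
        folders = [e for e in os.scandir(root) if e.is_dir(follow_symlinks=False)]
        for size, name in sorted(((tree_size(f.path), f.name) for f in folders), reverse=True)[:15]:
            print(f"{size / 1e9:8.2f} GB  {name}")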


  • Mulletude Registered User regular
    Prohass wrote: »
    Mulletude wrote: »
    Prohass wrote: »
    Gah my tiny ssd is finally catching up with me. It only ever has 150 gig available even with nothing installed on it. I’d add another but there’s not enough space on the motherboard or something? I’ll have to replace it, which I have no idea how to do, time to start researching

    Alternatively why the heck does this presumably 225 gig drive or whatever always have so much space taken up by default? Is there any way to check what’s taking up that space

    You could always add a SATA SSD instead of trying to replace that (what I'm guessing is an) NVMe boot drive. They're cheaper, and just for game stuff they're plenty fast.

    Are they like external ssds? I might do that actually. Any recommendations on what I should look for/a good one?

    Out of curiosity, do you know what your motherboard is, or, if it's a prebuilt machine, which one?

    Maybe we could tailor recommendations based on what your system can support.

    And the SATA SSD wouldn't be external, no. I've never used an external SSD, but a quick googling tells me you can play games from them OK. Looks like one with USB-C would give you the best performance if your PC has that port (mine doesn't).

    XBL-Dug Danger WiiU-DugDanger Steam-http://steamcommunity.com/id/DugDanger/