
Is 4 GB graphics memory enough for gaming a little down the line?

I know the future is the hardest thing to predict, but I'd like to hear some thoughts on the subject anyway 8-)

I'm thinking about replacing my beloved 7970 with a Fury X, but I can't stop wondering if the 4 GB of memory on it is going to be a major bottleneck sometime soon.

I'm gaming in 4K and I have found that my 7970 actually runs Far Cry 4 surprisingly playably in 4K (3840x2160), so I'm sure current games will be more than fine with the Fury X, and the review sites say the same thing. But what about when DX12 titles start to appear? The Fury X is expensive enough that I'd rather not have it be barely a year old and already due for replacement.

Thoughts?


The rig the card is to live in has these other specs:
Core i7-870
Gigabyte GA-P55 UD6 (Rev. 1.0) - Intel® P55 Express chipset, PCI Express x16 slot at x16...
8 GB RAM
Windows 8.1 (will likely be Windows 10 once I get round to it).
Primary monitor is 40" 3840x2160
Secondary monitor is a 27" 2560x1440
i.e. not the newest hardware except the 40", but unless I'm mistaken it's fast enough for the GPU to be the bottleneck in 4K regardless of which GPU it is. Do say if I got that wrong.


Posts

  • Buttcleft Registered User regular
    In my opinion, you're gonna want at least 8 gigs of RAM.

    Source: I had a computer with 4 GB of RAM and it was a big reason why I built a new one, with 8 gigs.

  • Gaslight Registered User regular
    Buttcleft wrote: »
    In my opinion, you're gonna want at least 8 gigs of RAM.

    Source: I had a computer with 4 GB of RAM and it was a big reason why I built a new one, with 8 gigs.

    The question is about video memory, not system memory.

  • Buttcleft Registered User regular
    My bad

  • Gaslight Registered User regular
    In answer to the question, I would say no, 4GB is not going to be sufficient for very long if you're committed to gaming at 4K. I'm a little surprised to hear FC4 runs well at 4K with 4GB of VRAM. What kind of frame rates do you get? I guess it depends on what you mean by "a little down the line," though. But the 980 Ti has 6GB of VRAM for about the same price as the Fury X, and I would feel a lot more confident about getting longevity out of that.

  • BlindZenDriver Registered User regular
    Gaslight wrote: »
    In answer to the question, I would say no, 4GB is not going to be sufficient for very long if you're committed to gaming at 4K. I'm a little surprised to hear FC4 runs well at 4K with 4GB of VRAM. What kind of frame rates do you get? I guess it depends on what you mean by "a little down the line," though. But the 980 Ti has 6GB of VRAM for about the same price as the Fury X, and I would feel a lot more confident about getting longevity out of that.

    I won't go so far as to say Far Cry 4 is running well in 4K on the 7970, but it is playable, which is much better than the slide show I expected. Unfortunately I don't know how to get data on the framerate, so I can't put numbers on that. It might be that, having gamed since before graphics cards had 3D hardware functions, I am more forgiving of lower frame rates.

    Now, on the possible issue with 4 GB of VRAM for gaming: I have also been thinking about the 980 and its 6 GB as a Fury X alternative. There are three main reasons why I'm still considering the Fury X and have simply not gone that way. One is that I like that the Fury X is so quiet. Another is that I want to support the underdog, i.e. give my money to AMD in the hope that they'll stay in business (as consumers we need competition to keep seeing progress in GPUs). And the final reason is DX12, where I'm thinking the Fury X could turn out to be the one to have - unless the 4 GB thing ruins it.

    Decisions, decisions... :?

  • acidlacedpenguin Institutionalized Safe in jail. Registered User regular
    Just so it's clear, I'm pretty sure it's only 980 Tis that are 6 GB; regular 980s are still 4 GB, though they're at least a "real 4 GB," not the "3.5 GB and another 0.5 GB of slower memory" that the 970s have.

  • wunderbar What Have I Done? Registered User regular
    acidlacedpenguin wrote: »
    Just so it's clear, I'm pretty sure it's only 980 Tis that are 6 GB; regular 980s are still 4 GB, though they're at least a "real 4 GB," not the "3.5 GB and another 0.5 GB of slower memory" that the 970s have.

    And just to be clear, the difference from the "slower" 0.5 GB of memory in the 970 is nearly imperceptible in anything but a synthetic benchmark.

  • wunderbar What Have I Done? Registered User regular
    Seriously though, re: how much video memory you want or need - how long do you plan to keep a video card? If you're replacing them every 2-ish years, then the Fury X is an acceptable choice now. If this is something you want to invest in and have for 4 or so years, I'd get something with a bigger frame buffer.

    Also, 4K gaming is acceptable with current high-end cards, but I really think it's going to be a generation or two yet before we see really good 4K performance from high-end video cards. Remember, you're trying to push 4 times as many pixels as 1080p, and most current video card architectures were designed before anyone was really trying to do that. I don't think *any* high-end card today will be super future-proofed for 4K gaming 3 or so years from now.

    On a personal level, I think the 980 Ti is a better card than the Fury X. If I were spending that much money on a video card today, that's what I'd get.

  • BahamutZERO Registered User regular
    what actual resolution is "4k"? since 1080 refers to 1920x1080 is 4k 7680x4320?

  • Gaslight Registered User regular
    BahamutZERO wrote: »
    what actual resolution is "4k"? since 1080 refers to 1920x1080 is 4k 7680x4320?

    4K is 3840x2160.

  • BahamutZERO Registered User regular
    hurfdurf it should be called 2k then yarglebargle stop changing the naming convention for no good reason *yelling at clouds*

  • LD50 Registered User regular
    They changed the naming convention because bigger numbers sound better, and they're a bunch of lying fucks.

  • tsmvengy Registered User regular
    4K is called 4K because it's twice the horizontal and vertical resolution of 2K. 2K is roughly 2000 pixels wide. Both were originally coined by digital cinema companies.

    4K is roughly 4X the total pixels of 1080p.
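    For anyone who wants to check the arithmetic, here's a throwaway Python sketch (the resolution figures are the standard ones; nothing beyond that is assumed):

    ```python
    # Pixel-count comparison of common display resolutions.
    resolutions = {
        "1080p (Full HD)": (1920, 1080),
        "2K (DCI)": (2048, 1080),
        "1440p (QHD)": (2560, 1440),
        "4K (UHD)": (3840, 2160),
        "8K": (7680, 4320),
    }

    base = 1920 * 1080  # 1080p as the reference point
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {w}x{h} = {pixels:,} px ({pixels / base:.2f}x 1080p)")
    # 4K (UHD) comes out to 8,294,400 px, exactly 4.00x 1080p.
    ```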

  • BahamutZERO Registered User regular
    why is the cinema industry naming convention being used this time instead of the TV industry naming convention though

    that's a rhetorical question, I understand why, it's just annoying and confusing

  • wunderbar What Have I Done? Registered User regular
    tsmvengy wrote: »
    4K is called 4K because it's twice the horizontal and vertical resolution of 2K. 2K is roughly 2000 pixels wide. Both were originally coined by digital cinema companies.

    4K is roughly 4X the total pixels of 1080p.

    well, 2k *is* 1080p.



    I think naming conventions changed for one main reason: it's hard-ish to explain to a "normal person" that 2160p is actually 4x as big as 1080p. The number 2160 is 2x as much, not 4x. You'd have to explain the actual vertical and horizontal resolutions and how they're both double, which is why it's 4x as much, but we just don't say the whole resolution - trust us, it really is 4x as much.

    4K has the benefit of being (relatively) close to the number of horizontal pixels, as well as sounding 4x bigger than 1080p.

    tl;dr: it's mostly marketing speak for the layman, because people don't want to do math while buying a TV at Best Buy.

  • wunderbar What Have I Done? Registered User regular
    Sidenote: I love getting people who are still obsessed with megapixel counts in cameras to tell me how good they think their TV's picture looks, and then pointing out that it's only a 2-megapixel screen.

  • tsmvengy Registered User regular
    BahamutZERO wrote: »
    why is the cinema industry naming convention being used this time instead of the TV industry naming convention though

    that's a rhetorical question, I understand why, it's just annoying and confusing

    Come back with me to when it was CGA/SVGA/WXGA/whatever and tell me what's confusing!

  • wunderbar What Have I Done? Registered User regular
    They still do it to a degree, especially on phones.

    which leads us to the fact that right now there are qHD phones and QHD phones in the world, with very different resolutions.

    qHD is quarter HD, or 960x540, and QHD stands for Quad HD, which is 1440p.

    And they somehow think that it makes sense.
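    Spelled out, since the names are this easy to mix up (a trivial Python sketch, using the resolutions from the post above):

    ```python
    # qHD vs QHD: nearly the same name, wildly different resolutions.
    variants = {
        "qHD (quarter HD)": (960, 540),   # one quarter of 1920x1080
        "QHD (Quad HD)": (2560, 1440),    # four times 1280x720
    }
    for name, (w, h) in variants.items():
        print(f"{name}: {w}x{h} = {w * h:,} pixels")
    # QHD has roughly 7x the pixels of qHD, despite the near-identical name.
    ```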

  • BahamutZERO Registered User regular
    tsmvengy wrote: »
    why is the cinema industry naming convention being used this time instead of the TV industry naming convention though

    that's a rhetorical question, I understand why, it's just annoying and confusing

    Come back with me to when it was CGA/SVGA/WXGA/whatever and tell me what's confusing!

    no thank you

  • BlindZenDriver Registered User regular
    BahamutZERO wrote: »
    no thank you

    No. I am the one saying "no thank you" to the hijacking of the thread.


    Now let's get back to the question of whether or not 4 GB of graphics memory is going to become a bottleneck real soon - like, for example, once we see DX12 games coming out.

    I have come to the realization that since so few graphics cards sold currently have more than 4 GB, that fact ought to hold back the need for 4+ GB of memory. On the other hand, much of the memory used goes to textures, and it is rather easy for developers to include different resolution textures in their game, so some will surely still push the memory.

    Plus then there is the 4K thing, where it may or may not matter that I'm gaming at that resolution as opposed to more widely used resolutions like 1920x1080 or even below. The question here is whether the screen resolution matters that much in the greater scheme of things when talking graphics memory. I mean, we have come a very long way since the amount of graphics memory decided whether you could run 1024x768 in 256 colours or 16-bit colour. A 4K 32-bit image is less than 32 MB, so the frame buffer can't really make that much of a difference whatever the resolution used.
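    To sanity-check that 32 MB figure, here is a minimal Python sketch (assuming 4 bytes per pixel for 32-bit colour; a real GPU also keeps depth and back buffers, but those are similarly small):

    ```python
    # Raw framebuffer size at various resolutions, 32-bit colour assumed.
    def framebuffer_mb(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / (1024 ** 2)

    for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
        mb = framebuffer_mb(w, h)
        print(f"{name}: {mb:.1f} MB ({mb / 4096:.2%} of a 4 GB card)")
    # 4K works out to ~31.6 MB - well under 1% of 4 GB, so the framebuffer
    # alone can't be what eats video memory at high resolutions.
    ```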

  • jothki Registered User regular
    Can we safely assume that PC graphics requirements are going to be heavily restricted by the capabilities of the XBone and PS4 for their lifespans? It may be that if you can play anything out now at whatever quality you prefer, you'll be able to play absolutely everything else even after DX12.

  • BouwsT Wanna come to a super soft birthday party? Registered User regular
    BlindZenDriver wrote: »
    no thank you

    No. I am the one saying "no thank you" to the hijacking of the thread.


    Now let's get back to the question of whether or not 4 GB of graphics memory is going to become a bottleneck real soon - like, for example, once we see DX12 games coming out.

    I have come to the realization that since so few graphics cards sold currently have more than 4 GB, that fact ought to hold back the need for 4+ GB of memory. On the other hand, much of the memory used goes to textures, and it is rather easy for developers to include different resolution textures in their game, so some will surely still push the memory.

    Plus then there is the 4K thing, where it may or may not matter that I'm gaming at that resolution as opposed to more widely used resolutions like 1920x1080 or even below. The question here is whether the screen resolution matters that much in the greater scheme of things when talking graphics memory. I mean, we have come a very long way since the amount of graphics memory decided whether you could run 1024x768 in 256 colours or 16-bit colour. A 4K 32-bit image is less than 32 MB, so the frame buffer can't really make that much of a difference whatever the resolution used.

    I think your logic is sound that 4 GB+ of VRAM isn't going to be necessary for all gaming going forward; however, in 4K it absolutely will be. VRAM is critical for the huge textures that need to be rendered, and that ABSOLUTELY does scale with resolution.

  • BlindZenDriver Registered User regular
    BouwsT wrote: »
    I think your logic is sound that 4 GB+ of VRAM isn't going to be necessary for all gaming going forward; however, in 4K it absolutely will be. VRAM is critical for the huge textures that need to be rendered, and that ABSOLUTELY does scale with resolution.

    Well, I am not so sure about that. The scaling thing is exactly the question, because I would think the textures are the same regardless of resolution, and the frame buffer even in 4K is not going to surpass 32 MB even if it is 32-bit (and I would think it is really 24-bit). In other words, the difference in frame buffer memory size may be big percentage-wise when comparing, say, 4K and 1080p, but 32 MB out of 4096 MB is not a lot, and in that context the frame buffer should only affect the memory requirements a little bit.

    Maybe something else is happening that explains the drop in rendering speed seen in the article here, which looks at the 4 GB question to try to see if it is a real issue:
    techreport.com/blog/28800/how-much-video-memory-is-enough
    This is from the article - it is a good read:
    [Chart from the article: Middle-earth: Shadow of Mordor results for the 980 Ti, Fury X, and 980]

    The main thing in this example and the other ones is that the drop comes at resolutions above 4K, but on the other hand there may be other issues in some games with all the AMD cards, so there is that. It could simply be that AMD has optimized for 4K at most while Nvidia has done things differently.

  • wunderbar What Have I Done? Registered User regular
    But you're not rendering a single 4K image. You're rendering hundreds or thousands of objects, which all require their own texture images, and those images get bigger with more resolution. And remember that a 3D object doesn't just have a single texture - all of its surfaces have their own. A static 2D image of, say, a 400x400 pixel square has 160,000 pixels to render; put that into a 3D space and suddenly you have a cube with 6 sides that are each 400x400 pixels. So that 400x400 square now needs almost a million pixels to render as a cube, and that doesn't account for internal surfaces or the possible contents of said cube.

    4K will not only require bigger and more detailed textures, it'll require more of them, because at a higher resolution the card is physically rendering more information at any time, which means having more actively loaded into video memory. And if a 400x400 cube needs a million pixels to render just its external surfaces, you're going to need a lot of memory to render a 4K or 8K screen.
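    To put rough numbers on that (a purely illustrative Python sketch - the texture sizes are hypothetical, not from any real game, and real engines compress textures, so treat these as upper bounds):

    ```python
    # Uncompressed memory for one cube with a unique texture per face.
    def cube_texture_mb(tex_size, faces=6, bytes_per_pixel=4):
        return faces * tex_size * tex_size * bytes_per_pixel / (1024 ** 2)

    for tex in [512, 1024, 2048, 4096]:  # per-face texture resolutions
        print(f"{tex}x{tex} per face: {cube_texture_mb(tex):.0f} MB per cube")
    # Each doubling of texture resolution quadruples the memory; multiply
    # by thousands of objects and a 4 GB card fills up quickly.
    ```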

    And I don't know if you can really look at a lot of current games as good measuring sticks. I doubt many developers are targeting 4K as a common resolution right now, so current games probably aren't the best examples of how quickly you can fill up video memory. What we need is something akin to the first Crysis, something that looks past what the high end is at the time, to really see the differences.

  • Tofystedeth Registered User regular
    Why is it rendering nonvisible surfaces? Shouldn't that get culled?

  • wunderbar What Have I Done? Registered User regular
    Tofystedeth wrote: »
    Why is it rendering nonvisible surfaces? Shouldn't that get culled?

    Games want to keep all surfaces of all objects, especially those close to the player in game, in the buffer. Otherwise you'd get constant redraws of objects even when you're standing right beside them.

  • BouwsT Wanna come to a super soft birthday party? Registered User regular
    Absolutely. Think of a modern FPS and the detail of the gun models alone. Just because only one gun out of your inventory is being used doesn't mean you're not going to have the other models (and texture sets) loaded and ready to go in case you decide to swap mid-firefight.

    ALL surfaces may not be loaded into the buffer at all times, but the texture sizes needed for 4K are going to be huge. And the smaller the amount of VRAM installed, the fewer textures can be loaded simultaneously. That means the PC has to swap textures in and out of VRAM from the hard drive, which is PAINFULLY slow compared to already having them in video memory.
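    To see why that swapping hurts, here are some rough transfer-time estimates (a hedged Python sketch - the bandwidth figures are ballpark assumptions for 2015-era hardware, not measurements):

    ```python
    # Approximate time to move a 96 MB texture set over various links.
    texture_mb = 96
    bandwidth_mb_per_s = {
        "on-card VRAM (GDDR5/HBM)": 300_000,  # hundreds of GB/s
        "PCIe 3.0 x16": 15_000,               # ~15 GB/s in practice
        "SATA SSD": 500,
        "hard drive": 150,
    }
    for link, bw in bandwidth_mb_per_s.items():
        print(f"{link}: ~{texture_mb / bw * 1000:.2f} ms")
    # Textures already resident in VRAM cost essentially nothing to fetch;
    # pulling them from a hard drive mid-game takes hundreds of
    # milliseconds - a visible stutter.
    ```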

    I think this was pretty well spot-on a week ago, so I'll reiterate.
    wunderbar wrote: »
    Seriously though, re: how much video memory you want or need - how long do you plan to keep a video card? If you're replacing them every 2-ish years, then the Fury X is an acceptable choice now. If this is something you want to invest in and have for 4 or so years, I'd get something with a bigger frame buffer.

    Also, 4K gaming is acceptable with current high-end cards, but I really think it's going to be a generation or two yet before we see really good 4K performance from high-end video cards. Remember, you're trying to push 4 times as many pixels as 1080p, and most current video card architectures were designed before anyone was really trying to do that. I don't think *any* high-end card today will be super future-proofed for 4K gaming 3 or so years from now.

    On a personal level, I think the 980 Ti is a better card than the Fury X. If I were spending that much money on a video card today, that's what I'd get.

  • Stormwatcher Blegh Blugh Registered User regular
    Not to mention crappy programming like Batman: Arkham Knight, which is only playable on Win10 with 12 GB of RAM, even AFTER the re-release! Way to go, Warner!
