
Cloud Computing: Is it really the future of Gaming?

13 Posts

  • Salvation122, Registered User regular
    One thing that the cloud unquestionably has an advantage in is creating very large random environments.

    Last time I brought it up (in significantly more detail, I'll try to find the post), the rejoinder was "well, just make a bunch of those environments and put them on the disc." Which is fine, but, well... then it's not random. Even if you make a whole bunch of them, you'll eventually start getting repeats.

    Again, I'd like to see more devs actually make these games and prove the value of the cloud. Are they not making many games with huge random environments specifically because they aren't able to generate the world quickly enough, or is it more because gamers want visual spectacles designed for maximum awesomeness, and well-designed stories/campaigns that aren't conducive to random environments?

    Minecraft doesn't need the cloud to generate its nearly infinite world. It's a form of fractals where the entire world is already defined by a series of variables and equations. You just ask your world chunk generator for the next logical chunk.
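Since the thread keeps coming back to seeded generation, here is a minimal Python sketch of the "ask the chunk generator for the next logical chunk" idea. It is an illustrative toy, not Minecraft's actual algorithm: hashing the world seed together with the chunk coordinates gives each chunk its own deterministic random stream, so the same terrain can be regenerated locally on demand, with no cloud round-trip.

```python
import hashlib
import random

def chunk_rng(world_seed: int, cx: int, cz: int) -> random.Random:
    """Derive a deterministic per-chunk RNG from the world seed and coordinates."""
    digest = hashlib.sha256(f"{world_seed}:{cx}:{cz}".encode()).digest()
    return random.Random(int.from_bytes(digest[:8], "big"))

def generate_chunk(world_seed: int, cx: int, cz: int, size: int = 16):
    """Build a size x size heightmap for one chunk, purely from the seed."""
    rng = chunk_rng(world_seed, cx, cz)
    return [[rng.randint(60, 70) for _ in range(size)] for _ in range(size)]

# Requesting the same chunk twice always yields identical terrain:
assert generate_chunk(12345, 3, -7) == generate_chunk(12345, 3, -7)
```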

    The only game I can think of with a very large, randomly generated world where generation is prohibitively long is Dwarf Fortress.

    Sure. But you could also have, for example, FPS roguelikes, which I'm fairly certain are simply not viable on today's hardware for world-gen reasons. Loadtimes would be really obnoxious.
    Same problem the new XCOM game has - they were convinced that no one would see the same map twice, but I assure you, over the seven or so campaigns I've played, I've seen them all many, many times.

    The problem is that most players don't even beat a game once, let alone play it 7 times. They may very well have been justified in believing that over 80% of their player base would never see the same map twice. That's the primary argument for putting generated maps on disc: that most players will not notice.

    Let's be clear, I've only beaten XCOM twice

    The other five times were Ironman runs where XCOM happened

  • lowlylowlycook, Registered User regular
    One thing that the cloud unquestionably has an advantage in is creating very large random environments.

    Last time I brought it up (in significantly more detail, I'll try to find the post), the rejoinder was "well, just make a bunch of those environments and put them on the disc." Which is fine, but, well... then it's not random. Even if you make a whole bunch of them, you'll eventually start getting repeats. Same problem the new XCOM game has - they were convinced that no one would see the same map twice, but I assure you, over the seven or so campaigns I've played, I've seen them all many, many times.

    My point at the time wasn't that quickly generated maps wouldn't be cool. I'm all for CiV using 1 second of supercomputing time to just plop a huge random map out for me. Just that the application to mainstream gaming was limited. At least in the short term. In the future a truly dedicated Cloud Box would be very interesting as would getting serious computing power available to things like phones and Google Glass. Heck, if MS were talking up Cloud Gaming on WP8 I'd be listening much more closely.

    Every time I sit down and think about how cloud stuff could add to gaming, it always seems to end up as some kind of MMO-lite idea or some ridiculous PC-style sim. Or both. For instance, I was thinking of a sailboat "round the world" racing game, something like the Vendee Globe or Volvo Ocean Race. The cloud computing would be calculating the weather dynamically as the race went on. The sailing sim would run locally, so latency would be a minimal problem. Cool, right? But if it's single player, it's probably a waste to cook up your own weather sim, and if it's multiplayer then it's basically a really specific MMO.

    Anyway, in the OP I was working on, I made a point to tell people to avoid too much talk of the XBO and MS's PR talk about it. Might not be a bad idea.

    (Please do not gift. My game bank is already full.)
  • UncleSporky, Registered User regular
    Sure. But you could also have, for example, FPS roguelikes, which I'm fairly certain are simply not viable on today's hardware for world-gen reasons. Loadtimes would be really obnoxious.

    No, not at all. Roguelike maps are extremely simple and quick to generate.

    Diablo 3, Torchlight 2 and other such games all generate random maps very quickly, and an FPS would be the same. I know I've mentioned games that aren't necessarily lookers, but complexity of graphics doesn't mean complexity of map generation.

    Switch Friend Code: SW - 5443 - 2358 - 9118 || 3DS Friend Code: 0989 - 1731 - 9504 || NNID: unclesporky
  • Symtex, Registered User regular
    I think cloud computing will be used to blend single player and multiplayer the way "The Division" is doing. There will be no such thing as the single-player mode/campaign gameplay we have seen this generation. You are always connected; people can come in and out of your game and interact with you while you're still playing your single-player experience.

  • Phyphor, Building Planet Busters Tasting Fruit, Registered User regular
    Symtex wrote: »
    I think cloud computing will be used to blend single player and multiplayer the way "The Division" is doing. There will be no such thing as the single-player mode/campaign gameplay we have seen this generation. You are always connected; people can come in and out of your game and interact with you while you're still playing your single-player experience.

    Yeah that's terrible. Sometimes I just don't want to play with other assholes people

  • UncleSporky, Registered User regular
    edited June 2013
    To show what I mean, here, click construct.

    And then you can define some things like if the room is 2x4, it has a potted plant in this corner and some cubicles in the middle, if the room is 5x5 it has a big round area in the middle with steps leading down to it, or whatever.

    Any complaint you can conceive of with regard to this simple generator can be fixed without changing how quickly the map is generated. It's easy to eliminate dead ends, or pull from a list of pre-defined rooms, or instead randomly place objects in each room in complex and interesting ways.

    It's easy as hell. Like I said, only Dwarf Fortress has this generation speed problem, because the creator is slightly insane and demands an extremely realistic world with rivers formed naturally and hundreds of years of simulated history.

    UncleSporky on
  • lowlylowlycook, Registered User regular
    Uncle Sporky. You might want to look up the word "slightly". I'm pretty sure that it doesn't mean what you think it means.

  • Darkewolfe, Registered User regular
    No, not at all. Roguelike maps are extremely simple and quick to generate.

    Diablo 3, Torchlight 2 and other such games all generate random maps very quickly, and an FPS would be the same. I know I've mentioned games that aren't necessarily lookers, but complexity of graphics doesn't mean complexity of map generation.

    Both those roguelikes take place on an essentially "flat" map where all the assets load in grid formats. I would have to imagine that generating random maps that operate in 3D space, with a less simple, grid-like allotment of locational objects, could get much more complicated VERY fast.

    What is this I don't even.
  • UncleSporky, Registered User regular
    lowlylowlycook wrote: »
    Uncle Sporky. You might want to look up the word "slightly". I'm pretty sure that it doesn't mean what you think it means.

    I said it because in a very odd turn of events he comes off as a totally normal person in interviews and such.

  • Darkewolfe, Registered User regular
    That 40k FPS had some advertised procedurally generated maps. I'm pretty sure it was announced for all three platforms, though, so I don't know if there's some ability to leverage Azure uniquely there or not. Lowest common denominator of the platforms and all that.

  • Phyphor, Building Planet Busters Tasting Fruit, Registered User regular
    No, not at all. Roguelike maps are extremely simple and quick to generate.

    Diablo 3, Torchlight 2 and other such games all generate random maps very quickly, and an FPS would be the same. I know I've mentioned games that aren't necessarily lookers, but complexity of graphics doesn't mean complexity of map generation.

    Well, those maps are mostly built out of blocks; however, they are very well done. That's certainly a viable method of creating maps and I wish more games did that at that quality level.

    The DF generation process is excessive, but it's also quite good. I remember someone setting it up to generate Europe, and they managed to get something that was very, very close: same mountains, rivers, etc. And there is a quick-generate mode that only takes a few seconds. Keep in mind that you generate a world and then play on <0.1% of it during the game, so most of that number crunching is wasted.

  • mrt144, King of the Numbernames, Registered User regular
    phyphor, you nailed some of it. it doesn't matter if you create giant open worlds procedurally if there is no game narrative that actually utilizes it.

  • UncleSporky, Registered User regular
    Darkewolfe wrote: »
    Both those rogue likes take place on an essentially "flat" map where all the assets load in grid formats. I would have to imagine that generating random maps that operate in 3D space and have less simple grid-like allotment of locational objects could get much more complicated VERY fast.

    I'm sorry, but it doesn't! It's as complex as you make it but ultimately generation is extremely quick.

    Now, Salvation specifically said roguelike so I am picturing roguelike-style generation, corridors and rooms.

    Traditional roguelikes typically work by placing a room, digging a tunnel out from a random wall, placing another room (maybe this one is oval-shaped), digging two tunnels out from its walls, etc.
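That room-and-tunnel loop really is as cheap as described. A minimal Python sketch of the classic approach (all parameters arbitrary): carve rectangular rooms at random positions, then connect successive room centers with L-shaped tunnels.

```python
import random

def generate_dungeon(w=60, h=20, rooms=6, seed=None):
    """Classic roguelike generation: carve rooms, then tunnel between them."""
    rng = random.Random(seed)
    grid = [["#"] * w for _ in range(h)]
    centers = []
    for _ in range(rooms):
        rw, rh = rng.randint(4, 8), rng.randint(3, 5)
        x, y = rng.randint(1, w - rw - 1), rng.randint(1, h - rh - 1)
        for j in range(y, y + rh):              # carve the room
            for i in range(x, x + rw):          # (overlapping rooms just merge)
                grid[j][i] = "."
        centers.append((x + rw // 2, y + rh // 2))
    for (x1, y1), (x2, y2) in zip(centers, centers[1:]):
        for i in range(min(x1, x2), max(x1, x2) + 1):   # horizontal leg
            grid[y1][i] = "."
        for j in range(min(y1, y2), max(y1, y2) + 1):   # vertical leg
            grid[j][x2] = "."
    return ["".join(row) for row in grid]

print("\n".join(generate_dungeon(seed=1)))
```

Even scaled up to much larger grids, this runs in milliseconds; load time is dominated by asset streaming, not layout.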

    Games like Diablo 3 and Torchlight work by having a large set of possible areas, so you place the big "tile" with the blue shrine and where passage is blocked to the south by trees, then you place the big tile where there's a short ramp downward to a monster pit and the only exit is back where you came, then you place the big tile that contains the graveyard, etc.

    Roguelike FPS map generation could work either way, but these days it would probably be done the second way, where every time you generate a map, there is a 25% chance you'll encounter the "two large holding tanks" setpiece, a 25% chance of encountering the "office with enemy ambush" setpiece, and so on. Corridors in between, maybe chosen from a variety of interesting corridors.
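The setpiece-tile approach described above amounts to a weighted draw from a prefab pool. A hypothetical sketch follows; the setpiece names and 25% weights are taken from the examples in the post, and nothing here comes from a real game's data.

```python
import random

# Hypothetical prefab pool; names and weights mirror the examples above.
SETPIECES = [
    ("two large holding tanks", 25),
    ("office with enemy ambush", 25),
    ("graveyard", 25),
    ("monster pit with ramp", 25),
]
CORRIDORS = ["straight hall", "L-bend", "catwalk"]

def layout_level(n_setpieces=4, seed=None):
    """Alternate weighted setpiece draws with randomly chosen corridors."""
    rng = random.Random(seed)
    names, weights = zip(*SETPIECES)
    level = []
    for piece in rng.choices(names, weights=weights, k=n_setpieces):
        level.append(piece)
        level.append(rng.choice(CORRIDORS))
    return level[:-1]  # drop the trailing corridor

print(layout_level(seed=7))
```

The expensive part in a real game is authoring the prefabs, not choosing among them; the selection itself is effectively free.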

    If you're thinking of something more freewheeling, say a big Far Cry-style island, then you do it with fractals, Minecraft style, and maybe place some preset areas with buildings in the same way that Minecraft generates villages.
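For the "fractals Minecraft style" route, one common technique is octave value noise: a deterministic pseudo-random value per lattice point, bilinearly interpolated, summed at doubling frequencies and halving amplitudes. A toy Python sketch, with arbitrary constants:

```python
import math
import random

def lattice_value(seed, x, y):
    """Deterministic pseudo-random value in [0, 1) for lattice point (x, y)."""
    return random.Random(f"{seed}:{x}:{y}").random()

def smooth_noise(seed, x, y):
    """Bilinear interpolation between the four surrounding lattice values."""
    x0, y0 = math.floor(x), math.floor(y)
    fx, fy = x - x0, y - y0
    v00 = lattice_value(seed, x0, y0)
    v10 = lattice_value(seed, x0 + 1, y0)
    v01 = lattice_value(seed, x0, y0 + 1)
    v11 = lattice_value(seed, x0 + 1, y0 + 1)
    top = v00 + (v10 - v00) * fx
    bottom = v01 + (v11 - v01) * fx
    return top + (bottom - top) * fy

def fractal_height(seed, x, y, octaves=4):
    """Sum octaves of noise at doubling frequency and halving amplitude."""
    total, amp, freq = 0.0, 1.0, 1 / 32
    for _ in range(octaves):
        total += amp * smooth_noise(seed, x * freq, y * freq)
        amp, freq = amp / 2, freq * 2
    return total  # threshold this value to decide water vs. land
```

Because the height at any coordinate depends only on the seed, an island of any size can be evaluated lazily, one region at a time, on the local machine.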

    Either way I swear generation is quick and does not need the cloud.

  • lowlylowlycook, Registered User regular
    edited June 2013
    Yeah, I think the game "Sir, You Are Being Hunted" is going to procedurally generate all its maps. It will be interesting to see.

    That said, just because something can be done without the cloud doesn't mean that it couldn't be better with it.

    [edit]

    I'm almost positive that Daggerfall had procedurally made maps. Anyone know if they were made once and stored, or generated at runtime? Either way, they were as huge as could be needed in a game.

    lowlylowlycook on
  • Phyphor, Building Planet Busters Tasting Fruit, Registered User regular
    Athenor wrote: »
    As technology ages, it gets harder to maintain, though.

    I mean.. there's going to come a day when we leave the x86 architecture behind. It's just GOING to happen. May take 20 years, may take the Crash 1.0, but it will happen. And after that point, what then?

    x86 is just a way of encoding and decoding instructions now and has very few ties to the actual implementation beyond the bit of silicon that decodes them. This exact argument was floated about 20 years ago too, and also more recently with the 32/64-bit stuff and Itanium until AMD finished up with their version.

    x86 contains a ton of cruft, sure, and there are architectures that are better, but there's no incentive to ever switch to anything else. Itanium is basically better than x86, but the compilers we needed were too hard to write properly. ARM/MIPS/RISC have problems of their own. x86 instructions can be reclaimed and repurposed as needed, and this has already been done in the initial 64-bit implementation.

    If we get to the point where all the apps are running in data centers and accessed through phones/web, then I could see it maybe happening there, but there's too much expectation of compatibility in the consumer market

  • Aegeri, Tiny wee bacteriums, Plateau of Leng, Registered User regular
    edited June 2013
    You know, a lot of people are bringing up Forza 5, but I honestly don't think it's a good or even valid example of cloud computing. For one thing, several games already take single-player data, translate that data in some way, and then send it back when you reconnect. For example, Dragon's Dogma allows other players to hire out your pawn and take them along on their adventures. As they do so, your pawn learns things like what enemies are weak to what, where quest items are, and similar, becoming "smarter". While Forza 5 is using this same concept in a more complex and interesting way, I fail entirely to see how the "cloud" was relevant to it. A dedicated server or three could do the same thing for the most part (in fact, I suspect that's pretty much how the cloud is being used here: as a fancier buzzword than "dedicated server"). I honestly don't feel this is a strong or particularly relevant example of anything, or at least not some unique feature that couldn't be done before.

    I think it's going to be far more interesting to see how larger persistent online worlds and such use this kind of technology going forward, as opposed to things that more or less mimic the functions of dedicated servers.

    Aegeri on
    The Roleplayer's Guild: My blog for roleplaying games, advice and adventuring.
  • Symtex, Registered User regular
    Phyphor wrote: »
    If we get to the point where all the apps are running in data centers and accessed through phones/web, then I could see it maybe happening there, but there's too much expectation of compatibility in the consumer market

    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

  • Salvation122, Registered User regular
    Phyphor wrote: »
    x86 contains a ton of cruft, sure, and there are architectures that are better, but there's no incentive to ever switch to anything else.

    Moreover, there is an extraordinarily small list of things that are processor-bound anymore, and it's more cost-efficient to just throw more processors at the problem than to engineer a more elegant solution.

  • Aegeri, Tiny wee bacteriums, Plateau of Leng, Registered User regular
    edited June 2013
    Symtex wrote: »
    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I am not sure about The Division, but your point about BF4 is outright wrong. All versions of the game (PS3, 360, PS4 and PC) enable Commander Mode, where a person playing on a tablet is capable of calling down missile strikes and directing players.

    Aegeri on
  • Symtex, Registered User regular
    Aegeri wrote: »
    Symtex wrote: »
    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I am not sure about the Division, but your point about BF4 is outright wrong. All versions of the game, including the PS3, 360, PS4 and PC versions all enable Commander Mode, which is where a person playing on a tablet is capable of calling down missile strikes and directing players.

    You are correct, but it's easier for a developer to use it on Xbox One because the cloud is at their disposal. On PS4, you have to build your own infrastructure to support it. It makes sense for EA to invest, but for a smaller developer, having those assets at your disposal is great. It makes it available to all developers.

  • Aegeri, Tiny wee bacteriums, Plateau of Leng, Registered User regular
    edited June 2013
    Symtex wrote: »
    Aegeri wrote: »
    Symtex wrote: »
    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I am not sure about the Division, but your point about BF4 is outright wrong. All versions of the game, including the PS3, 360, PS4 and PC versions all enable Commander Mode, which is where a person playing on a tablet is capable of calling down missile strikes and directing players.

    You are correct but it easier for developer to use it on Xbox One because the cloud is at their disposal.

    Do you have any actual evidence of this other than your own speculation? Just before, you were saying it wasn't possible without the cloud, yet it seems to work fine on plenty of platforms that don't have one. Also, from everything I have been reading, the tablet version of The Division is compatible with both the Xbox One and PS4 versions. In fact, I sincerely doubt that either tablet-based app uses Microsoft's cloud whatsoever, so this seems like pure speculation. I have no idea about Dead Rising 3 because I lost interest in that game a long time ago, but it also isn't multiplatform, so it's impossible to demonstrate what it is or isn't using.

    Aegeri on
  • CorriganX, Jacksonville, FL, Registered User regular
    Aegeri wrote: »
    Symtex wrote: »
    Aegeri wrote: »
    Symtex wrote: »
    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I am not sure about the Division, but your point about BF4 is outright wrong. All versions of the game, including the PS3, 360, PS4 and PC versions all enable Commander Mode, which is where a person playing on a tablet is capable of calling down missile strikes and directing players.

    You are correct but it easier for developer to use it on Xbox One because the cloud is at their disposal.

    Have you any actual evidence of this other than your own speculation, because just before you were saying it wasn't possible without the cloud yet seems to work fine on tons of platforms that don't have one. Also from everything I have been reading, the tablet version of The Division is compatible with both the Xbox One and PS4 versions. In fact, I sincerely doubt that either tablet based app uses Microsofts cloud whatsoever, so this seems pure speculation. I have no idea about Dead Rising 3 because I lost interest in that game a long time ago, but it also isn't multiplatform so it's impossible to demonstrate what it is or isn't using because of that.

    Shh.. just let Symtex make up more of his 'facts'. It's so amusing to watch him try to re-state them in different ways every time he's proven wrong. It's more compelling than most TV shows. So good.

    CorriganX on Steam and just about everywhere else.
  • Options
    BlindPsychicBlindPsychic Registered User regular
    Receiver used a very primitive sort of Lego-block construction for its stages. I don't think it worked very well, since there were only about 4 buildings but, say, 16 were used in a game, so it was very easy to get lost. That sort of construction is good for quickly producing a layout, but FPS games really need the human touch to go back in and make things more specifically handcrafted, mostly because in FPS games you're dealing with balancing issues like LOS, stage "flow", choke points, and all sorts of other things that make a good FPS map. I think the best approach would be using it to quickly create a more natural environment (like Crysis' island, if we have those sorts of abilities), and then going back in and refining it by hand. Minecraft is kinda good at faking it, but I feel like things need to be somewhat bespoke, otherwise your mind will just look at it all and see randomness.

    So basically I don't think this is a cloud job on a user level. Perhaps it could be employed by developers to generate things easier, but in that specific case I don't see it happening much.
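    The repetition problem described above is easy to see in a toy sketch. All names here are invented for illustration, but the math is not: drawing 16 stage tiles independently from a pool of only 4 prefabs guarantees heavy reuse by the pigeonhole principle.

```python
import random
from collections import Counter

def build_stage(prefab_pool, tile_count, seed):
    """Receiver-style stage assembly: pick each tile independently
    from a small prefab pool. With a pool this small, heavy reuse is
    unavoidable and the layout quickly feels samey."""
    rng = random.Random(seed)
    return [rng.choice(prefab_pool) for _ in range(tile_count)]

# Hypothetical prefab names, roughly the "only like 4 buildings" case.
prefabs = ["warehouse", "office", "apartment", "parking"]
stage = build_stage(prefabs, tile_count=16, seed=1)

# 16 tiles from 4 prefabs: by the pigeonhole principle, some prefab
# appears at least 4 times, which is why it is so easy to get lost.
counts = Counter(stage)
```

    No amount of shuffling avoids repeats with a pool that small; only a bigger pool, or a hand pass afterwards, does.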

  • Options
    SymtexSymtex Registered User regular
    Aegeri wrote: »
    Symtex wrote: »
    Aegeri wrote: »
    Symtex wrote: »
    Phyphor wrote: »
    Athenor wrote: »
    As technology ages, it gets harder to maintain, though.

    I mean.. there's going to come a day when we leave the x86 architecture behind. It's just GOING to happen. May take 20 years, may take the Crash 1.0, but it will happen. And after that point, what then?

    x86 is just a way of encoding and decoding instructions now and has very few ties to the actual implementation beyond the bit of silicon that decodes them. This exact argument was floated about 20 years ago too, and also more recently with the 32/64-bit stuff and Itanium until AMD finished up with their version.

    x86 contains a ton of cruft sure, and there are architectures that are better, however there's no incentive to ever switch to anything else. Itanium is basically better than x86, but the compilers we needed were too hard to write properly. ARM/MIPS/RISC have problems of their own. x86 instructions can be reclaimed and repurposed as needed and this has already been done in the initial 64-bit implementation.

    If we get to the point where all the apps are running in data centers and accessed through phones/web, then I could see it maybe happening there, but there's too much expectation of compatibility in the consumer market

    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I am not sure about The Division, but your point about BF4 is outright wrong. All versions of the game (PS3, 360, PS4 and PC) enable Commander Mode, in which a person playing on a tablet can call down missile strikes and direct players.

    You are correct, but it's easier for a developer to use it on Xbox One because the cloud is at their disposal.

    Do you have any actual evidence of this, other than your own speculation? Just before, you were saying it wasn't possible without the cloud, yet it seems to work fine on plenty of platforms that don't have one. Also, from everything I have been reading, the tablet version of The Division is compatible with both the Xbox One and PS4 versions. In fact, I sincerely doubt that either tablet app uses Microsoft's cloud whatsoever, so this seems like pure speculation. I have no idea about Dead Rising 3, because I lost interest in that game a long time ago, but it also isn't multiplatform, so it's impossible to tell what it is or isn't using.

    Let me try to explain in more detail. It's not the cloud itself that is revolutionary; it's that it is provided at no cost for the developer to use. It's all managed by Microsoft's Azure solution. So in a game like BF4, which is multiplatform, the Xbox One version will use the cloud for Commander Mode at no cost, while they will have to spend money to manage their own infrastructure for the other platforms. There are many ways to use "cloud processing". This is only one of them.

    There is nothing here that Sony/Nintendo cannot reproduce, but the difference is that Microsoft has the infrastructure and the OS was built from the ground up with the cloud in mind. That's the difference here.

  • Options
    AegeriAegeri Tiny wee bacteriums Plateau of LengRegistered User regular
    edited June 2013
    Yeah, I think the game "Sir, You are Being Hunted" is going to procedurally generate all its maps. It will be interesting to see.

    That said, just because something can be done without the cloud doesn't mean that it couldn't be better with it.

    [edit]

    I'm almost positive that Daggerfall had procedurally made maps. Anyone know if they were made once and stored, or generated at runtime? Either way, they were as huge as a game could need.

    Daggerfall made a procedurally generated world actually. It was kind of janky and such, but it was extremely endearing. I'm not sure what kind of trickery they used to pull it off, such as maybe placing pre-fab things or similar, but it was pretty random for the most part. Using cloud computing to procedurally generate a world faster than a single machine, then streaming/downloading that world to the user to explore is such a great idea though. I really hope someone tries this.
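    For what it's worth, the "trickery" doesn't have to be stored anywhere: if each map block is derived deterministically from a world seed plus its coordinates, a huge world never has to be generated up front, stored, or streamed in full at all. A minimal sketch, with an invented hash scheme and block list (not Daggerfall's actual pipeline):

```python
import hashlib

# Hypothetical block types, purely for illustration.
BLOCK_TYPES = ["village", "dungeon", "wilderness", "coast"]

def block_at(world_seed, x, y):
    """Derive a map block purely from the world seed and coordinates.
    The same (seed, x, y) always yields the same block, so the world
    can be 'huge' without ever existing in full anywhere."""
    digest = hashlib.sha256(f"{world_seed}:{x}:{y}".encode()).digest()
    return BLOCK_TYPES[digest[0] % len(BLOCK_TYPES)]

# Deterministic: regenerating the same block gives an identical result,
# whether it runs on a console, a PC, or a cloud server.
a = block_at(1996, 10, -3)
b = block_at(1996, 10, -3)
```

    This is why the "just generate it in the cloud and stream it down" pitch is a hard sell for this class of world: the client can recompute any block on demand from a few bytes of seed.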

    Edit: Symtex, I've asked you three times now: where is your evidence that they are using the cloud for these apps in the first place? Have they actually said that's what they are doing, or is this just speculation on your part? You keep moving the goalposts: first saying it wasn't possible without the cloud (it is), then saying it would be better with it (no evidence of that), and now saying that it's just a cost saving for the developer because it's already there. Seriously, provide some evidence of developers saying this, or we can accept it as speculation and move on. It is completely possible, after all, that the Xbox One version uses its cloud servers while the PS4/PC versions just use a dedicated server; in that case there is no inherent additional cost to the tablet app for any version.

    Edit2: And I am not convinced EA will even use Microsoft's cloud servers. They made a ton from the console versions of BF3 by charging people to have their own dedicated servers on EA's own hardware. So I am not even convinced EA will be using Microsoft's cloud services for BF4's servers anyway (unless EA can charge for them).

    Aegeri on
    The Roleplayer's Guild: My blog for roleplaying games, advice and adventuring.
  • Options
    DehumanizedDehumanized Registered User regular
    Let's be clear here: BF4 is using some sort of cloud, which is not necessarily Azure. Had they used Azure to handle whatever game servers they wanted, they could have done so on more platforms than just the XOne, but they would almost certainly be outside the license for whatever compute time Microsoft is allocating to XOne users. In that case, EA would likely have to foot the bill for running that infrastructure on Azure.

    So, instead, they will probably be running the game's servers on the existing model that BF3 PC has: somewhat in-house, somewhat servers you can purchase from official server providers.

  • Options
    DarkewolfeDarkewolfe Registered User regular
    Receiver used a very primitive sort of Lego-block construction for its stages. I don't think it worked very well, since there were only about 4 buildings but, say, 16 were used in a game, so it was very easy to get lost. That sort of construction is good for quickly producing a layout, but FPS games really need the human touch to go back in and make things more specifically handcrafted, mostly because in FPS games you're dealing with balancing issues like LOS, stage "flow", choke points, and all sorts of other things that make a good FPS map. I think the best approach would be using it to quickly create a more natural environment (like Crysis' island, if we have those sorts of abilities), and then going back in and refining it by hand. Minecraft is kinda good at faking it, but I feel like things need to be somewhat bespoke, otherwise your mind will just look at it all and see randomness.

    So basically I don't think this is a cloud job on a user level. Perhaps it could be employed by developers to generate things easier, but in that specific case I don't see it happening much.

    Even in Minecraft you totally come across those times when, like, a Tree is on some dirt on sand hanging over a lava pool or something. They've made great strides in not having crazy shit, but you still get some jarringly wonky terrain here and there.

    But it's Minecraft, so it's adorable. It'd look MUCH stranger in the gritty realism artscape.

    What is this I don't even.
  • Options
    lowlylowlycooklowlylowlycook Registered User regular
    Yeah, by maps I mean the world, sorry. I think the random dungeons were made by taking pieces of the story dungeons and splicing them together. I'm sure the number of dungeons and villages in Daggerfall was completely absurd. I doubt even the buildings could have been placed by hand.

    (Please do not gift. My game bank is already full.)
  • Options
    AegeriAegeri Tiny wee bacteriums Plateau of LengRegistered User regular
    Also as Azure isn't in Australia, I would be really really worried about using it for any kind of multiplayer game like Battlefield. Having to ping to Singapore or Hong Kong would produce considerably more latency and it simply wouldn't make sense. With BF3, because local aussies could rent out servers I could find a server with 18-28 ping and play incredibly smoothly. Using cloud based virtualized servers not even in my own country would just be pants on head for a twitch based multiplayer game like BF4.

    Which is one thing that still concerns me about the idea of using virtualized servers for multiplayer functionality: I think they can use it for some things, but dedicated servers closer to customers around the place are still going to be essential.

    The Roleplayer's Guild: My blog for roleplaying games, advice and adventuring.
  • Options
    UncleSporkyUncleSporky Registered User regular
    Darkewolfe wrote: »
    Even in Minecraft you totally come across those times when, like, a Tree is on some dirt on sand hanging over a lava pool or something. They've made great strides in not having crazy shit, but you still get some jarringly wonky terrain here and there.

    But it's Minecraft, so it's adorable. It'd look MUCH stranger in the gritty realism artscape.

    It's an easily fixable quirk of Minecraft that I think they haven't fixed because it's endearing at this point.

    In fact, if you were doing a Far Cry-style island, the easiest method is a fractal heightmap that you populate with foliage and buildings and such. No floating stuff, and it still doesn't require the cloud. :P
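    A minimal sketch of that idea, with invented parameters: a 1-D midpoint-displacement heightmap (the simplest fractal terrain) that only places foliage on actual ground above a tree line, so nothing floats.

```python
import random

def midpoint_displacement(length, roughness, seed):
    """1-D fractal heightmap via midpoint displacement: start with two
    endpoints, repeatedly insert displaced midpoints, and shrink the
    displacement range by `roughness` each pass."""
    rng = random.Random(seed)
    heights = [0.0, 0.0]
    spread = 1.0
    while len(heights) < length:
        nxt = []
        for a, b in zip(heights, heights[1:]):
            nxt.append(a)
            nxt.append((a + b) / 2 + rng.uniform(-spread, spread))
        nxt.append(heights[-1])
        heights = nxt
        spread *= roughness
    return heights[:length]

def populate(heights, tree_line=0.3):
    """Place foliage on the terrain itself: a tree only goes where
    there is ground above the tree line, so nothing can float."""
    return ["tree" if h > tree_line else "grass" for h in heights]

terrain = midpoint_displacement(length=129, roughness=0.5, seed=42)
cover = populate(terrain)
```

    The same seed always reproduces the same island, which is the other half of the "doesn't require the cloud" point.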

    Switch Friend Code: SW - 5443 - 2358 - 9118 || 3DS Friend Code: 0989 - 1731 - 9504 || NNID: unclesporky
  • Options
    Salvation122Salvation122 Registered User regular
    edited June 2013
    Aegeri wrote: »
    Yeah, I think the game "Sir, You are Being Hunted" is going to procedurally generate all its maps. It will be interesting to see.

    That said, just because something can be done without the cloud doesn't mean that it couldn't be better with it.

    [edit]

    I'm almost positive that Daggerfall had procedurally made maps. Anyone know if they were made once and stored, or generated at runtime? Either way, they were as huge as a game could need.

    Daggerfall made a procedurally generated world actually. It was kind of janky and such, but it was extremely endearing. I'm not sure what kind of trickery they used to pull it off, such as maybe placing pre-fab things or similar, but it was pretty random for the most part. Using cloud computing to procedurally generate a world faster than a single machine, then streaming/downloading that world to the user to explore is such a great idea though. I really hope someone tries this.

    I mean granted it was like twenty freakin' years ago, but let's be real clear, Daggerfall was the buggiest piece of bug that ever bugged

    SO MANY weird CTD/scripting/graphics/sound/whatever glitches that may not have actually been glitches. The sound ones, in particular, I'm fairly certain were because the game managed to spawn a room full of monsters with no doors and I was hearing their idle noises. Creepy as fuck when you're ten.

    Edit: Then again, it was a Bethesda game, soooooooo

    Salvation122 on
  • Options
    SymtexSymtex Registered User regular
    CorriganX wrote: »
    Aegeri wrote: »
    Symtex wrote: »
    Aegeri wrote: »
    Symtex wrote: »
    Phyphor wrote: »
    Athenor wrote: »
    As technology ages, it gets harder to maintain, though.

    I mean.. there's going to come a day when we leave the x86 architecture behind. It's just GOING to happen. May take 20 years, may take the Crash 1.0, but it will happen. And after that point, what then?

    x86 is just a way of encoding and decoding instructions now and has very few ties to the actual implementation beyond the bit of silicon that decodes them. This exact argument was floated about 20 years ago too, and also more recently with the 32/64-bit stuff and Itanium until AMD finished up with their version.

    x86 contains a ton of cruft sure, and there are architectures that are better, however there's no incentive to ever switch to anything else. Itanium is basically better than x86, but the compilers we needed were too hard to write properly. ARM/MIPS/RISC have problems of their own. x86 instructions can be reclaimed and repurposed as needed and this has already been done in the initial 64-bit implementation.

    If we get to the point where all the apps are running in data centers and accessed through phones/web, then I could see it maybe happening there, but there's too much expectation of compatibility in the consumer market

    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I am not sure about The Division, but your point about BF4 is outright wrong. All versions of the game (PS3, 360, PS4 and PC) enable Commander Mode, in which a person playing on a tablet can call down missile strikes and direct players.

    You are correct, but it's easier for a developer to use it on Xbox One because the cloud is at their disposal.

    Do you have any actual evidence of this, other than your own speculation? Just before, you were saying it wasn't possible without the cloud, yet it seems to work fine on plenty of platforms that don't have one. Also, from everything I have been reading, the tablet version of The Division is compatible with both the Xbox One and PS4 versions. In fact, I sincerely doubt that either tablet app uses Microsoft's cloud whatsoever, so this seems like pure speculation. I have no idea about Dead Rising 3, because I lost interest in that game a long time ago, but it also isn't multiplatform, so it's impossible to tell what it is or isn't using.

    Shh.. just let Symtex make up more of his 'facts'. It's so amusing to watch him try to restate them in different ways every time he's proven wrong. It's more compelling than most TV shows. So good.

    I am guilty of giving a half-assed answer the first time, and he called me on it. I am fine with that.

  • Options
    syndalissyndalis Getting Classy On the WallRegistered User, Loves Apple Products regular
    Aegeri wrote: »
    Also as Azure isn't in Australia, I would be really really worried about using it for any kind of multiplayer game like Battlefield. Having to ping to Singapore or Hong Kong would produce considerably more latency and it simply wouldn't make sense. With BF3, because local aussies could rent out servers I could find a server with 18-28 ping and play incredibly smoothly. Using cloud based virtualized servers not even in my own country would just be pants on head for a twitch based multiplayer game like BF4.

    Which is one thing that still concerns me about the idea of using virtualized servers for multiplayer functionality: I think they can use it for some things, but dedicated servers closer to customers around the place are still going to be essential.

    http://www.lifehacker.com.au/2013/05/windows-azure-in-australia-everything-you-need-to-know/

    Before the end of 2014, there will be two major Azure datacenters in Australia. Likely before, according to the guy in the article, as he said the rollout usually happens within 12-18 months of the announcement.

    Sooo.... on a console with a 5-10 year life cycle, there may be 6-12 months where the ping is 100ms longer than it needs to be.

    This isn't really doom and gloom for Australia.

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • Options
    AegeriAegeri Tiny wee bacteriums Plateau of LengRegistered User regular
    edited June 2013
    I wonder if cloud computing would enable companies to do things like procedurally generate viable, interesting maps in games like XCOM. The new XCOM doesn't use any procedural generation because it was apparently too difficult to do in the Unreal Engine with current hardware (I suspect there is a little bit of "hard to get working properly on consoles" in there). The original game didn't use true random generation either; it actually had chunks of level with prefab bits that it stuck together in a semi-random way to appear "random". I would love to investigate different office building layouts, supermarkets and such every game (and barns with cabbages). Especially if the processing was powerful enough to always ensure viable/sensible exits and entrances.
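    The "always ensure viable exits" part doesn't actually need much processing power; reject-and-retry handles it. Assemble a semi-random layout, run a reachability check, and reroll until it passes. A toy sketch (grid size, wall odds and so on are invented for illustration, not how XCOM works):

```python
import random
from collections import deque

def generate_grid(size, wall_chance, seed):
    """Semi-random map: each cell is either open or a wall chunk,
    with forced entrance and exit cells."""
    rng = random.Random(seed)
    grid = [[rng.random() >= wall_chance for _ in range(size)]
            for _ in range(size)]
    grid[0][0] = grid[size - 1][size - 1] = True  # entrance / exit
    return grid

def connected(grid):
    """BFS from the entrance; True if the exit is reachable."""
    size = len(grid)
    seen = {(0, 0)}
    queue = deque([(0, 0)])
    while queue:
        x, y = queue.popleft()
        if (x, y) == (size - 1, size - 1):
            return True
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size \
                    and grid[nx][ny] and (nx, ny) not in seen:
                seen.add((nx, ny))
                queue.append((nx, ny))
    return False

def viable_map(size=8, wall_chance=0.3, seed=0):
    """Reject-and-retry: keep rolling layouts until one is traversable."""
    while True:
        grid = generate_grid(size, wall_chance, seed)
        if connected(grid):
            return grid
        seed += 1

m = viable_map()
```

    The expensive version of this (many candidates, scoring for cover and flow, not just reachability) is where offloading to bigger hardware might plausibly help.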
    syndalis wrote: »
    Aegeri wrote: »
    Also as Azure isn't in Australia, I would be really really worried about using it for any kind of multiplayer game like Battlefield. Having to ping to Singapore or Hong Kong would produce considerably more latency and it simply wouldn't make sense. With BF3, because local aussies could rent out servers I could find a server with 18-28 ping and play incredibly smoothly. Using cloud based virtualized servers not even in my own country would just be pants on head for a twitch based multiplayer game like BF4.

    Which is one thing that still concerns me about the idea of using virtualized servers for multiplayer functionality: I think they can use it for some things, but dedicated servers closer to customers around the place are still going to be essential.

    http://www.lifehacker.com.au/2013/05/windows-azure-in-australia-everything-you-need-to-know/

    Before the end of 2014, there will be two major Azure datacenters in Australia. Likely before, according to the guy in the article, as he said the rollout usually happens within 12-18 months of the announcement.

    Sooo.... on a console with a 5-10 year life cycle, there may be 6-12 months where the ping is 100ms longer than it needs to be.

    This isn't really doom and gloom for Australia.

    Not saying it is, but then again NZ doesn't get a good ping to Australia, so I don't see this being viable for multiplayer gaming for everyone at all. I will always prefer a local dedicated server to dedicated servers located on Azure. Especially with the way EA handled BF3, which was absolutely perfect and the best MP shooter I have ever played on a console. In fact, I'm not even sure how viable this is in the US, because there aren't that many Azure data clusters in the US either, now I think of it. I really don't see it being a viable solution for dedicated servers for MP games at all, unless the game was inherently latency-resistant (which, face it, CoD and BF are not).

    Why should I put up with 100ms of ping when I can be playing the game on PC/PS4 with EAs dedicated server renting at 18ms? In fact, I suspect EA won't use Azure whatsoever and everyone here in Australia will still be renting servers from EA and playing with 18ms of ping.

    Which was my point.

    Aegeri on
    The Roleplayer's Guild: My blog for roleplaying games, advice and adventuring.
  • Options
    DehumanizedDehumanized Registered User regular
    edited June 2013
    I don't know if I'd trust the XCom team with procedural maps, given how many issues their hand-designed maps had with stuff like LOS calculations. :rotate:

    Dehumanized on
  • Options
    lowlylowlycooklowlylowlycook Registered User regular
    Aegeri wrote: »
    Yeah, I think the game "Sir, You are Being Hunted" is going to procedurally generate all its maps. It will be interesting to see.

    That said, just because something can be done without the cloud doesn't mean that it couldn't be better with it.

    [edit]

    I'm almost positive that Daggerfall had procedurally made maps. Anyone know if they were made once and stored, or generated at runtime? Either way, they were as huge as a game could need.

    Daggerfall made a procedurally generated world actually. It was kind of janky and such, but it was extremely endearing. I'm not sure what kind of trickery they used to pull it off, such as maybe placing pre-fab things or similar, but it was pretty random for the most part. Using cloud computing to procedurally generate a world faster than a single machine, then streaming/downloading that world to the user to explore is such a great idea though. I really hope someone tries this.

    I mean granted it was like twenty freakin' years ago, but let's be real clear, Daggerfall was the buggiest piece of bug that ever bugged

    SO MANY weird CTD/scripting/graphics/sound/whatever glitches that may not have actually been glitches. The sound ones, in particular, I'm fairly certain were because the game managed to spawn a room full of monsters with no doors and I was hearing their idle noises. Creepy as fuck when you're ten.

    Edit: Then again, it was a Bethesda game, soooooooo

    In my experience, coding things to run in parallel does not lead to fewer bugs, either as the code is written or in the time it takes to track them down and fix them. A Bethesda game written for the Cloud could be terrifying to those much older than ten.

    Also, speaking of Daggerfall, if the cloud enables randomly generated quests that are interesting and fit into the world then I'm all for cloud computing.

    (Please do not gift. My game bank is already full.)
  • Options
    PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    edited June 2013
    Phyphor wrote: »
    Athenor wrote: »
    As technology ages, it gets harder to maintain, though.

    I mean.. there's going to come a day when we leave the x86 architecture behind. It's just GOING to happen. May take 20 years, may take the Crash 1.0, but it will happen. And after that point, what then?

    x86 is just a way of encoding and decoding instructions now and has very few ties to the actual implementation beyond the bit of silicon that decodes them. This exact argument was floated about 20 years ago too, and also more recently with the 32/64-bit stuff and Itanium until AMD finished up with their version.

    x86 contains a ton of cruft sure, and there are architectures that are better, however there's no incentive to ever switch to anything else. Itanium is basically better than x86, but the compilers we needed were too hard to write properly. ARM/MIPS/RISC have problems of their own. x86 instructions can be reclaimed and repurposed as needed and this has already been done in the initial 64-bit implementation.

    If we get to the point where all the apps are running in data centers and accessed through phones/web, then I could see it maybe happening there, but there's too much expectation of compatibility in the consumer market

    Moreover there is an extraordinarily small list of things that are processor-bound anymore and it's more cost-efficient to just throw more processors at it than it is to try to engineer a more elegant solution.

    There are actually some very real scalability problems with just throwing more processors at it. Amusingly, the issues come about due to certain nice features of x86: specifically, enforced DMA-CPU cache coherency, addressability of interrupts, and the difficulty of maintaining a coherent page mapping across all processors. At some point you hit a problem with the processors just spending too much time doing bookkeeping. But that's a battle for the next decade, and there are ways around it (NUMA, new OS designs, and the like).
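    Python's GIL makes this an extreme case, but a tiny sketch shows the shape of the problem: when every "processor" funnels through one piece of shared bookkeeping state, adding threads adds queueing rather than throughput. (The counter and thread counts are invented for illustration.)

```python
import threading

def contended_count(n_threads, increments_per_thread):
    """Every thread funnels through a single lock: the shared-state
    bookkeeping problem in miniature. The answer comes out correct,
    but each increment serialises on the lock, so adding threads
    adds waiting rather than speed."""
    total = 0
    lock = threading.Lock()

    def worker():
        nonlocal total
        for _ in range(increments_per_thread):
            with lock:  # every "core" waits its turn here
                total += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total

# Correct result, fully serialised work:
result = contended_count(n_threads=8, increments_per_thread=1000)
```

    The hardware analogues (cache-line ping-pong, TLB shootdowns for that shared page mapping) have the same structure: correctness is preserved, but the coordination cost grows with the number of participants.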

    Phyphor on
  • Options
    PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    Symtex wrote: »
    Phyphor wrote: »
    Athenor wrote: »
    As technology ages, it gets harder to maintain, though.

    I mean.. there's going to come a day when we leave the x86 architecture behind. It's just GOING to happen. May take 20 years, may take the Crash 1.0, but it will happen. And after that point, what then?

    x86 is just a way of encoding and decoding instructions now and has very few ties to the actual implementation beyond the bit of silicon that decodes them. This exact argument was floated about 20 years ago too, and also more recently with the 32/64-bit stuff and Itanium until AMD finished up with their version.

    x86 contains a ton of cruft sure, and there are architectures that are better, however there's no incentive to ever switch to anything else. Itanium is basically better than x86, but the compilers we needed were too hard to write properly. ARM/MIPS/RISC have problems of their own. x86 instructions can be reclaimed and repurposed as needed and this has already been done in the initial 64-bit implementation.

    If we get to the point where all the apps are running in data centers and accessed through phones/web, then I could see it maybe happening there, but there's too much expectation of compatibility in the consumer market

    The Division, BF4 and DR3 will all be using Smartglass technology to affect someone playing on the console. That's not possible without the cloud.

    I don't know why you responded to a post about processor architectures with cloud buzzwords, but from what I can tell, SmartGlass itself could be done over local wifi just as well.

  • Options
    Salvation122Salvation122 Registered User regular
    Aegeri wrote: »
    Yeah, I think the game "Sir, You are Being Hunted" is going to procedurally generate all its maps. It will be interesting to see.

    That said, just because something can be done without the cloud doesn't mean that it couldn't be better with it.

    [edit]

    I'm almost positive that Daggerfall had procedurally made maps. Anyone know if they were made once and stored, or generated at runtime? Either way, they were as huge as a game could need.

    Daggerfall made a procedurally generated world actually. It was kind of janky and such, but it was extremely endearing. I'm not sure what kind of trickery they used to pull it off, such as maybe placing pre-fab things or similar, but it was pretty random for the most part. Using cloud computing to procedurally generate a world faster than a single machine, then streaming/downloading that world to the user to explore is such a great idea though. I really hope someone tries this.

    I mean granted it was like twenty freakin' years ago, but let's be real clear, Daggerfall was the buggiest piece of bug that ever bugged

    SO MANY weird CTD/scripting/graphics/sound/whatever glitches that may not have actually been glitches. The sound ones, in particular, I'm fairly certain were because the game managed to spawn a room full of monsters with no doors and I was hearing their idle noises. Creepy as fuck when you're ten.

    Edit: Then again, it was a Bethesda game, soooooooo

    In my experience, coding things to run in parallel does not lead to fewer bugs, either as the code is written or in the time it takes to track them down and fix them. A Bethesda game written for the Cloud could be terrifying to those much older than ten.

    Also, speaking of Daggerfall, if the cloud enables randomly generated quests that are interesting and fit into the world then I'm all for cloud computing.

    Skyrim was already creepy as hell and I am considerably older than ten now

    Their spider models were entirely too detailed

    I ended up leveling archery specifically so I could snipe spiders from miles away, fuck those too-many-eyed assholes

  • Options
    mrt144mrt144 King of the Numbernames Registered User regular
    I don't know if I'd trust the XCom team with procedural maps, given how many issues their hand-designed maps had with stuff like LOS calculations. :rotate:

    I think only the one graveyard map has any sort of egregious LOS issues (the low half-cover ledge in front of the truck obstructs LOS from the truck).
