
[Nintendo] The best January the Wii U has ever had


Posts

  • cloudeagle Registered User regular
    edited February 2012
    shryke wrote: »
    ME1 and ME2 are two different examples, not one game improving on the other.

    In ME1, hardware limitations are obviously responsible for the nature of things like the Citadel. It's broken up into sections because the 360 literally couldn't load environments that large all at once.

    The ME2 issue is related to loadscreens. Specifically, on the PC version you can disable them and transitions become almost instantaneous, because a decent computer can handle loading ME2's levels way, way faster than the 360 can.

    Both are examples of how hardware enforces limitations and compromises on game design.

    The issue is, as I said, the only way your argument holds water is if developers aren't pushing the limits of current technology (which they are, as examples like the 2 above show) or they are just not interested in pushing those limits.

    And considering developers have historically always been chomping at the bit to push limits on things like level design, detail, physics, etc. (as evidenced by the fact that games have drastically changed in more than just graphics over the past ... any time frame), I find the second argument kinda ridiculous and would require some explanation/evidence for why that's suddenly changed.

    I'm still a little confused as to how that disproves my argument, considering my argument is that horsepower improvements from one console to the next don't really result in tangible stuff besides visuals that most people will notice. Mass Effect 1 and 2 were on the same system. I promise I am open to examples, but that one doesn't really apply in this case.

    Well, have developers really been chomping at the bit for things besides an increase in visuals? I mean, yes, we got more physics, but that was possible last generation. Sure, they all talk about how game X is the best thing in every single way, but that's part of advertising. Yes, they all say they have awesome AI, but have we really gotten more awesome AI that was, for whatever reason, not possible on last-gen systems? I'm not saying anything has suddenly changed as far as what developers say, I'm just saying that they haven't truly delivered.

    cloudeagle on
    Switch: 3947-4890-9293
  • HamHamJ Registered User regular
    I for one have not noticed any real improvement in AI since Half-Life.

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • Ultimanecat Registered User regular
    There was the example of Dead Rising - a game with hundreds of enemies on screen at once on the 360, and horribly bastardized so as to lose a reason for existing when it was ported to the Wii. Now, I'm sort of charitable and want to attribute at least some of the difference to Capcom just being lazy as shit, but I have no evidence of that, and that's only based on my opinion that Capcom is pretty much always lazy as shit.

    AI is kind of a weird case - I'd agree that I haven't noticed AI get appreciably better this generation. Enemies mostly just run right at you like always, or now, with the hotness of the military shooter, they just duck behind cover for a while and then run at you. Having played Halo: Anniversary recently (a remake of the first Halo) as well as the most recent entry, Reach, I can pretty much say that not much has changed in that series in the past 10 years.

    The constraint isn't really processing power when talking about AI for most games, but intelligent scripting around emergent gameplay. Since the trend these days in many shooters is away from open gameplay to more tightly scripted experiences, what we'd call AI is more of an afterthought or only considered within the larger scheme of game "scenes" and set-pieces.

    SteamID : same as my PA forum name
  • shryke Member of the Beast Registered User regular
    cloudeagle wrote: »
    shryke wrote: »
    ME1 and ME2 are two different examples, not one game improving on the other.

    In ME1, hardware limitations are obviously responsible for the nature of things like the Citadel. It's broken up into sections because the 360 literally couldn't load environments that large all at once.

    The ME2 issue is related to loadscreens. Specifically, on the PC version you can disable them and transitions become almost instantaneous, because a decent computer can handle loading ME2's levels way, way faster than the 360 can.

    Both are examples of how hardware enforces limitations and compromises on game design.

    The issue is, as I said, the only way your argument holds water is if developers aren't pushing the limits of current technology (which they are, as examples like the 2 above show) or they are just not interested in pushing those limits.

    And considering developers have historically always been chomping at the bit to push limits on things like level design, detail, physics, etc. (as evidenced by the fact that games have drastically changed in more than just graphics over the past ... any time frame), I find the second argument kinda ridiculous and would require some explanation/evidence for why that's suddenly changed.

    I'm still a little confused as to how that disproves my argument, considering my argument is that horsepower improvements from one console to the next don't really result in tangible stuff besides visuals that most people will notice. Mass Effect 1 and 2 were on the same system. I promise I am open to examples, but that one doesn't really apply in this case.

    /facepalm

    Did you really just miss how I just explained in the post you are replying to right now why the fact that they are both on the same system is irrelevant since I'm not comparing the two games to one another?

    Well, have developers really been chomping at the bit for things besides an increase in visuals? I mean, yes, we got more physics, but that was possible last generation. Sure, they all talk about how game X is the best thing in every single way, but that's part of advertising. Yes, they all say they have awesome AI, but have we really gotten more awesome AI that was, for whatever reason, not possible on last-gen systems? I'm not saying anything has suddenly changed as far as what developers say, I'm just saying that they haven't truly delivered.

    According to whom? According to what?

    I mean, even ignoring visuals (which we shouldn't since they are very important for immersion), are you really saying you can't tell the difference between a game on the PS2 and one on the PS3?

  • mcdermott Registered User regular
    Thank fuck somebody mentioned Dead Rising.

    Easily the go-to example of how greater horsepower actually allowed for better gameplay. Also the most hilarious example of a terribad Wii port in existence. Capcom may have been lazy, true, but at the end of the day there's no getting around the fact that the Wii simply doesn't have the balls to render the zombie apocalypse. The 360 could barely manage it.

    I'm also of the opinion that ridiculous action games (like, say, Bayonetta or Devil May Cry) benefit greatly from improved visuals, enough that it actually benefits gameplay. The more smoothly the graphics can move, and the better you can see what things actually are in a fast-moving environment, the better those games get.

  • Linespider5 ALL HAIL KING KILLMONGER Registered User regular
    edited February 2012
    shryke wrote: »

    Do you or do you not acknowledge that better hardware allows a game to do more? ("more" here defined as more detailed environments, larger environments, better physics, better graphics, better AI, etc.)

    If you don't acknowledge this well ... honestly, I wouldn't even know where to begin. Not acknowledging that point would be an amazing willful denial of reality.

    Let me in. I want this one.

    You know what? They do make games prettier these days. PS1 and N64 had many, many third person games with level design you couldn't even begin to say had any semblance of architectural style or setting. Crooked corridors with 'lava zones,' bad ladders, invisible walls, giant garage-door sized doorways, you know the deal. Caveman 3D.

    You know what else? A lot, and I mean, a lot of games these days use identical gameplay mechanics from our Caveman 3D days.

    Same camera control.

    Same jump/run/climb mechanics.

    Same 'run forever against an enemy standing still' entity systems.

    Same world functionally made of impenetrable plastic.

    I'm still seeing character models with more polys in their body than a single N64 level could hold, but still having the hairstyle clip through the sides of the helmet they've equipped, or the cape on their back, or the gun under their arm that can't actually fit there and just goes through the arm, because, whoops. Ugly mesh contusions on the joints, if the arm is folded over the torso. Characters that don't fall over unless they're dead, and then become a springy mess that will ragdoll six feet if you so much as walk past the corpse. Big, pretty worlds where we can't interact with 95% of the environment. Nope, you can't climb that hill. Nope, you can't climb that tree. Nope, we're not swimming today. No, don't climb that wall, climb this one that's glowing slightly. That's an impenetrable wooden door. No, you can't get back in the car. Yes, he's the bad guy, but he hasn't betrayed you just yet, hold on. Here's a cutscene that will make you impatient and change nothing about what you're doing. Actually, now that you've seen the cutscene, let's respawn you 25 feet back from where you were standing before you climbed the difficult wall area, so you can't save after the cutscene and have to sit through it every time you die when you get to the top of the wall. Oh, you wanted an ending? Why did you want that? It's all just an excuse for draping multiplayer features over the top of the setting.

    If I can look at your game and replace all the characters with platonic shapes and reduce the world into a system of rooms and walls because everything moves the same way and interacts the same way it did fifteen years ago, you've fucking failed your fucking job, genius. Better hardware allows a game to do more. Here defined as more detailed environments, larger environments, better physics, better graphics, better AI, etc. Of course, AI is usually a very subjective thing, and 90% of AI is astute level design. We can rebuild it bigger, prettier, etceterier, but that doesn't keep it from handling exactly the same way as its grandfather.

    Linespider5 on
  • Atomika Live fast and get fucked or whatever Registered User regular
    AI seems generally tied to difficulty settings.

    With the new Mass Effect 3 demo, on maximum difficulty it's an absolute nightmare. Enemies coordinate their positions, smoke you out of cover with grenades, and catch you in crossfires left and right.

    On "normal," they just stand there and catch bullets.

  • Julius Captain of Serenity on my ship Registered User regular
    Julius wrote: »
    wrote:
    What prankery? Just plug in a memory card and a 'cube controller, and you're ready to go.

    No, that's not the whole story. I can't access my console menu while playing a GC game, and I have to use a corded controller; since the Wii needs to be played a good distance from the TV, I don't have comfortable seating near enough for the cord to reach.

    You'd think, since a Classic Controller attachment exists, Nintendo would opt for that, but nope.

    Why do you expect to play your cube-game in any other way than a cube-game? What you have to go through to play your GC game on the Wii is exactly what you had to go through on the GC itself.

    True, but like I said already, due to the logistical demands of A) the Wii requiring a good deal of empty space between the TV and wherever the player is, and B) the cords not being very long on the GC controllers, it's a situation that's less than optimal.

    The onus is on Nintendo to provide me with a reasonably accessible set-up. If I have to keep pretending the Wii is no better than my Cube, it won't be long before I just break out the 'Cube.

    And you know what? I DO expect the next generation console to be more accessible than the last, especially since the foundation for that added ease of functionality already exists with the classic controller for the Wiimote. Call me greedy.

    Well, I never had a problem with cords not being long enough for my Wii. But I do understand the problems. Though the original complaint you worked off was by mcdermott complaining about selling his GC for it, which means all your legit problems don't apply. (It's not as if keeping your old GC deals with the cords or the switching to a different mode.)

    I think Nintendo should think about making backwards compatibility awesome, but I also think that if they don't do it, it's not a big deal. You didn't lose anything here, you just didn't gain anything either.
    Julius wrote: »
    cloudeagle wrote:
    cloudeagle wrote: »
    I don't see any of that in there, and you flat-out said that "gamers are not looking for new input experiences," a broad and sweeping generalization, with no evidence whatsoever.

    Ross, c'mon. You can't say "what I actually said was this!" when what you actually said is easily accessible a few pages up. Give me a little more respect than that.
    What Nintendo seems to have gotten all backwards is that gamers are largely looking for new gameplay experiences, not new input experiences.

    Still speaking for the entire gaming audience there with zero evidence to back it up, plus you didn't bother to contradict all the other stuff I caught you revising.

    ...you do realize there's an easy, 100% guaranteed way to keep me from going this route, right? :P

    Have you got anything to back up your counter-assertion that gamers do want new 'input experiences' besides sales data of specious utility in actually supporting either side of the argument? Perhaps rave reviews of the manner in which motion controls improved the games that were released on both the Wii and GameCube, or the few titles that were released on both Wii and non-Nintendo consoles?

    I'm not saying that such things don't exist, just asking 'cause I've never seen them. I've read a lot of reviews wherein Wii games are slammed for the waggle, and up-thread someone posted about the MetaCritic ranking for Twilight Princess being a point higher on the GameCube than on the Wii. Considering that the games looked alike and had the same story and content, I'd expect the Wii version to be more highly acclaimed if the general public were after the new input experience.

    Given the massive sales of the Wii I think it's hard to claim that people didn't want new input experiences. I think it's rather hard to argue that people just went out and bought en masse something they didn't want.

    Now, whether the Wii actually gave them an experience that didn't suck is up for debate. But Ross isn't saying that people were disappointed, he's saying that they didn't even want that shit.

    You're a little late to this argument, I'd advise reading back a few pages. We've kinda been over all this already.

    The big rhetorical arguments against that position being that, when accounting for certain factors, Wii game sales (especially games in the traditional gaming sense) don't seem to show a statistically relevant boost from involving motion controls.

    You can argue that the games that utilize those factors sold like hotcakes, and I'd agree, but the Wii has seemed somewhat hindered in consistently replicating those figures without some kind of caveat.

    I'm not late, I'm directly engaging the argument you seem to think hasn't been addressed.

    The sales of the Wii console cannot be interpreted as anything but a want for cool new ways of playing. People thought motion control was cool. They still think it's cool. The sales of Just Dance and such, and Move, clearly show that people like that new way of playing games.

    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

  • shryke Member of the Beast Registered User regular
    Julius wrote: »
    The sales of the Wii console cannot be interpreted as anything but a want for cool new ways of playing. People thought motion control was cool. They still think it's cool. The sales of Just Dance and such, and Move, clearly show that people like that new way of playing games.

    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    The complication here is that the people buying the Wii aren't necessarily the same group complaining about gimmicky motion controls.

    Basically, who was really jumping at the chance for motion controls? What market did the Wii tap in to?

  • Linespider5 ALL HAIL KING KILLMONGER Registered User regular
    shryke wrote: »
    Julius wrote: »
    The sales of the Wii console cannot be interpreted as anything but a want for cool new ways of playing. People thought motion control was cool. They still think it's cool. The sales of Just Dance and such, and Move, clearly show that people like that new way of playing games.

    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    The complication here is that the people buying the Wii aren't necessarily the same group complaining about gimmicky motion controls.

    Basically, who was really jumping at the chance for motion controls? What market did the Wii tap in to?

    BLUE OCEAN.

  • Atomika Live fast and get fucked or whatever Registered User regular
    Julius wrote: »
    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    See, I think that if Nintendo would have just said from the get-go, "Oh, if you're playing traditional games, you can definitely use the classic controller if you want to," we wouldn't be having many of these arguments right now. They could have even innovated some of the Wiimote features (vibrational feedback, shaking gyro, limited motion capture) into that classic controller and still tell the market that it's all still part of their big overall plan, and I think people would have caught on in droves.

    It could be something as simple as having an option box in the settings that said, "Wii Remote Yes/No?" and that would have been that. Then, all your third-party developers would have taken a sigh of relief, traditional gaming aficionados wouldn't be grumbling about waggles for FPSers and whatnot, and you could still have the Wiimote as a kickass peripheral for games where its use is apparent, like sports games.

    Which, correct me if I'm wrong, it seems that's the approach Nintendo is taking with the Wii-U: traditional controller scheme input with support for many different peripherals (including the Wiimote).

  • Cantido Registered User regular
    The lack of Classic Controller compatibility stopped me from buying DKC Returns. I love that franchise, haters gonna hate, but waggle for attacking is rubbish.

    3DS Friendcode 5413-1311-3767
  • Atomika Live fast and get fucked or whatever Registered User regular
    Cantido wrote: »
    The lack of Classic Controller compatibility stopped me from buying DKC Returns. I love that franchise, haters gonna hate, but waggle for attacking is rubbish.

    And that's exactly what I'm talking about. The game itself is (someone correct me if I'm offbase here) not significantly different than any of its precursors; it's still DK and Diddy side-scrolling through the jungle and breaking shit. Why Nintendo would lock that experience into a totally unnecessary new input mechanism is, I'll say, pretty stupid. That's a case where the player is now forced into using a waggle just because.

  • EggyToast Jersey City Registered User regular
    Julius wrote: »
    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    See, I think that if Nintendo would have just said from the get-go, "Oh, if you're playing traditional games, you can definitely use the classic controller if you want to," we wouldn't be having many of these arguments right now. They could have even innovated some of the Wiimote features (vibrational feedback, shaking gyro, limited motion capture) into that classic controller and still tell the market that it's all still part of their big overall plan, and I think people would have caught on in droves.

    It sounds like advocating for the original Sixaxis controller, a normal controller with some motion built into it.

    I mean, playing the PSN game Eden, which has "thrust the controller down to do a butt-stomp," essentially, and playing NSMBWii, the use of motion control is basically the same. Actually, the PS3 does it better because it's "one motion makes you do the same thing on the screen" instead of "shake the controller to do a couple different things we've mapped to 'shake'."

    || Flickr — || PSN: EggyToast
  • Atomika Live fast and get fucked or whatever Registered User regular
    EggyToast wrote: »
    Julius wrote: »
    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    See, I think that if Nintendo would have just said from the get-go, "Oh, if you're playing traditional games, you can definitely use the classic controller if you want to," we wouldn't be having many of these arguments right now. They could have even innovated some of the Wiimote features (vibrational feedback, shaking gyro, limited motion capture) into that classic controller and still tell the market that it's all still part of their big overall plan, and I think people would have caught on in droves.

    It sounds like advocating for the original Sixaxis controller, a normal controller with some motion built into it.

    I mean, playing the PSN game Eden, which has "thrust the controller down to do a butt-stomp," essentially, and playing NSMBWii, the use of motion control is basically the same. Actually, the PS3 does it better because it's "one motion makes you do the same thing on the screen" instead of "shake the controller to do a couple different things we've mapped to 'shake'."

    Yeah, I would advocate for a controller that was a hybrid of the SixAxis and the Wiimote.

    Take the balance gyro, button pressure sensitivity, and variable vibration function from the SixAxis Dual-Shock, and pair that with the shake function, auditory feedback, flash memory, and some of the motion capture of the Wiimote, all of it packed into either of the Classic Controller form factors (minus the tethering):

    [images: Wii Classic Controller Pro and Classic Controller]



    I still think the tablet screen for the Wii-U is a non-starter, but I'm willing to wait it out and see what they end up doing with it. I have to say, though, I'm a little sick and tired of Nintendo "innovating" these form factors and then punting them out to developers with no clear sense of how they're meant to be used. Hopefully the tablet won't share the fate of the Wiimote, or Wii Balance Board, or Vitality Sensor, or R.O.B., or Game Boy Printer, or the Transfer Pak, or the Bongo Controller, or the WiiSpeak, or the . . . . . jesus, Nintendo makes a lot of shitty peripherals.

  • cloudeagle Registered User regular
    edited February 2012
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    ME1 and ME2 are two different examples, not one game improving on the other.

    In ME1, hardware limitations are obviously responsible for the nature of things like the Citadel. It's broken up into sections because the 360 literally couldn't load environments that large all at once.

    The ME2 issue is related to loadscreens. Specifically, on the PC version you can disable them and transitions become almost instantaneous, because a decent computer can handle loading ME2's levels way, way faster than the 360 can.

    Both are examples of how hardware enforces limitations and compromises on game design.

    The issue is, as I said, the only way your argument holds water is if developers aren't pushing the limits of current technology (which they are, as examples like the 2 above show) or they are just not interested in pushing those limits.

    And considering developers have historically always been chomping at the bit to push limits on things like level design, detail, physics, etc. (as evidenced by the fact that games have drastically changed in more than just graphics over the past ... any time frame), I find the second argument kinda ridiculous and would require some explanation/evidence for why that's suddenly changed.

    I'm still a little confused as to how that disproves my argument, considering my argument is that horsepower improvements from one console to the next don't really result in tangible stuff besides visuals that most people will notice. Mass Effect 1 and 2 were on the same system. I promise I am open to examples, but that one doesn't really apply in this case.

    /facepalm

    Did you really just miss how I just explained in the post you are replying to right now why the fact that they are both on the same system is irrelevant since I'm not comparing the two games to one another?

    ...and did you really miss that I'm talking about the progression from one console to another? That's what I've been talking about this entire time. It really sounds like we're having two different arguments here.
    Well, have developers really been chomping at the bit for things besides an increase in visuals? I mean, yes, we got more physics, but that was possible last generation. Sure, they all talk about how game X is the best thing in every single way, but that's part of advertising. Yes, they all say they have awesome AI, but have we really gotten more awesome AI that was, for whatever reason, not possible on last-gen systems? I'm not saying anything has suddenly changed as far as what developers say, I'm just saying that they haven't truly delivered.

    According to whom? According to what?

    I mean, even ignoring visuals (which we shouldn't since they are very important for immersion), are you really saying you can't tell the difference between a game on the PS2 and one on the PS3?

    Ignoring visuals (and, of course, online, which is an important caveat I brought up earlier, since it really has changed the way games are produced), then I'd say yes, with the exceptions brought up above (Ninety Nine Nights, Dead Rising, and L.A. Noire).

    I'm happy to be proven wrong, though.

    cloudeagle on
    Switch: 3947-4890-9293
  • cloudeagle Registered User regular
    Julius wrote: »
    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    See, I think that if Nintendo would have just said from the get-go, "Oh, if you're playing traditional games, you can definitely use the classic controller if you want to," we wouldn't be having many of these arguments right now. They could have even innovated some of the Wiimote features (vibrational feedback, shaking gyro, limited motion capture) into that classic controller and still tell the market that it's all still part of their big overall plan, and I think people would have caught on in droves.

    It could be something as simple as having an option box in the settings that said, "Wii Remote Yes/No?" and that would have been that. Then, all your third-party developers would have taken a sigh of relief, traditional gaming aficionados wouldn't be grumbling about waggles for FPSers and whatnot, and you could still have the Wiimote as a kickass peripheral for games where its use is apparent, like sports games.

    Which, correct me if I'm wrong, it seems that's the approach Nintendo is taking with the Wii-U: traditional controller scheme input with support for many different peripherals (including the Wiimote).

    Pretty much, yes.

    Switch: 3947-4890-9293
  • Atomika Live fast and get fucked or whatever Registered User regular
    cloudeagle wrote: »
    Julius wrote: »
    Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    See, I think that if Nintendo would have just said from the get-go, "Oh, if you're playing traditional games, you can definitely use the classic controller if you want to," we wouldn't be having many of these arguments right now. They could have even innovated some of the Wiimote features (vibrational feedback, shaking gyro, limited motion capture) into that classic controller and still tell the market that it's all still part of their big overall plan, and I think people would have caught on in droves.

    It could be something as simple as having an option box in the settings that said, "Wii Remote Yes/No?" and that would have been that. Then, all your third-party developers would have taken a sigh of relief, traditional gaming aficionados wouldn't be grumbling about waggles for FPSers and whatnot, and you could still have the Wiimote as a kickass peripheral for games where its use is apparent, like sports games.

    Which, correct me if I'm wrong, it seems that's the approach Nintendo is taking with the Wii-U: traditional controller scheme input with support for many different peripherals (including the Wiimote).

    Pretty much, yes.

    I know that only time will tell if the Wii-U catches on, but it really seems to be in many ways the system the Wii should have been.

    Hindsight's a motherfucker and all that, but I'm thinking if they could have just branded the Wii as a must-have peripheral device for the GameCube it would have at least bought them a year or two to develop a legitimate HD console more like what the Wii-U is shaping up to look like, but with the bonus of not risking looking like a Johnny-come-lately to the HD/online gaming party.

    If the Wii-U had come out two-to-four years ago and had offered a competitive next-gen gaming experience PLUS all the same Nintendo-y goodness that people love to love, there's every chance Nintendo would be at the top of the console heap.

  • Linespider5Linespider5 ALL HAIL KING KILLMONGER Registered User regular
    cloudeagle wrote: »
    Julius wrote: »
Nintendo just didn't do it well. Games didn't do well because the motion control was gimmicky, not because motion control sucks. The "games in the traditional gaming sense" don't get a boost from involving motion controls because why on earth would they? They're traditional games, games where the old controllers still make sense. Motion control needs to get a lot better to get over that.

    See, I think that if Nintendo would have just said from the get-go, "Oh, if you're playing traditional games, you can definitely use the classic controller if you want to," we wouldn't be having many of these arguments right now. They could have even innovated some of the Wiimote features (vibrational feedback, shaking gyro, limited motion capture) into that classic controller and still tell the market that it's all still part of their big overall plan, and I think people would have caught on in droves.

    It could be something as simple as having an option box in the settings that said, "Wii Remote Yes/No?" and that would have been that. Then, all your third-party developers would have taken a sigh of relief, traditional gaming aficionados wouldn't be grumbling about waggles for FPSers and whatnot, and you could still have the Wiimote as a kickass peripheral for games where its use is apparent, like sports games.

    Which, correct me if I'm wrong, it seems that's the approach Nintendo is taking with the Wii-U: traditional controller scheme input with support for many different peripherals (including the Wiimote).

    Pretty much, yes.

    I know that only time will tell if the Wii-U catches on, but it really seems to be in many ways the system the Wii should have been.

    Hindsight's a motherfucker and all that, but I'm thinking if they could have just branded the Wii as a must-have peripheral device for the GameCube it would have at least bought them a year or two to develop a legitimate HD console more like what the Wii-U is shaping up to look like, but with the bonus of not risking looking like a Johnny-come-lately to the HD/online gaming party.

    If the Wii-U had come out two-to-four years ago and had offered a competitive next-gen gaming experience PLUS all the same Nintendo-y goodness that people love to love, there's every chance Nintendo would be at the top of the console heap.

    Yeah, well, you know what they say. If my aunt had balls she'd be my uncle too.

The Gamecube didn't sell enough to make a serious case for a motion control attachment being released later in its life. Nintendo was wise to rebrand that shit as a fresh start. Releasing a major upgrade for a struggling system is Sega Talk, in the 'let's take the anchor with us in the lifeboat' rationale of doing things.

    The Wii on the other hand did just fine outside of the context of third party sales. We didn't all get the games we wanted, but it did more than well enough to demonstrate Nintendo still Had It. They know how to make stuff that sells, and make more stuff that sells for the stuff that sells. Was it the advanced followthrough and philosophical dedication to a concept a lot of people wanted? Hell no. Did it vaporize those IGN laying-of-entrails pronouncements of being bought up by Microsoft/Sony/Sega/EA? Yes, yes it did that. It got a new system in the door of a lot of people that hadn't had one in a while too. Now, what they elected to do for those new customers is debatable, especially since we'd need a few bits of input from gamers of that stripe and I don't know if any are around these parts.

I don't think it's a bad thing for them to be making an HD console now. Would it have been better to have the Wii be HD? ...Maybe. I say maybe because I'm looking at the body of work of the Wii and wondering how much of it might look even more threadbare in high definition. Maybe the switch to HD development would've proven too expensive for many of those Carnival/Party games that, while being the knock-off Ramen of the gaming world, sold tons. Maybe it would've been better not to have those titles as success stories, but who knows? They happened, the people that have wanted them are getting them, and it was a respectable chunk of the Wii's business for a major period of its life. Do I want to see shit like that another time around? Not really.

  • shrykeshryke Member of the Beast Registered User regular
    cloudeagle wrote: »
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    ME1 and ME2 are two different examples, not one game improving on the other.

    In ME1, hardware limitations are obviously responsible for the nature of things like the Citadel. It's broken up into sections because the 360 literally couldn't load environments that large all at once.

The ME2 issue is related to load screens. Specifically, on the PC version you can disable them and transitions become almost instantaneous because a decent computer can handle loading ME2's levels way, way faster than the 360 can.

    Both are examples of how hardware enforces limitations and compromises on game design.

    The issue is, as I said, the only way your argument holds water is if developers aren't pushing the limits of current technology (which they are, as examples like the 2 above show) or they are just not interested in pushing those limits.

And considering developers have historically always been chomping at the bit to push limits on things like level design, detail, physics, etc. (as evidenced by the fact that games have drastically changed in more than just graphics over the past ... any time frame), I find the second argument kinda ridiculous and would require some explanation/evidence for why that's suddenly changed.

I'm still a little confused as to how that disproves my argument, considering my argument is that horsepower improvements from one console to the next don't really result in tangible stuff besides visuals that most people will notice. Mass Effect 1 and 2 were on the same system. I promise I am open to examples, but that one doesn't really apply in this case.

    /facepalm

    Did you really just miss how I just explained in the post you are replying to right now why the fact that they are both on the same system is irrelevant since I'm not comparing the two games to one another?

    ...and did you really miss that I'm talking about the progression from one console to another? That's what I've been talking about this entire time. It really sounds like we're having two different arguments here.

    We are because you are practically trying to miss the point it seems.

Well, have developers really been chomping at the bit for things besides an increase in visuals? I mean, yes, we got more physics, but that was possible last generation. Sure, they all talk about how game X is the best thing in every single way, but that's part of advertising. Yes, they all say they have awesome AI, but have we really gotten more awesome AI that was, for whatever reason, not possible on last-gen systems? I'm not saying anything has suddenly changed as far as what developers say, I'm just saying that they haven't truly delivered.

    According to whom? According to what?

    I mean, even ignoring visuals (which we shouldn't since they are very important for immersion), are you really saying you can't tell the difference between a game on the PS2 and one on the PS3?

    Ignoring visuals (and, of course, online, which is an important caveat I brought up earlier, since it really has changed the way games are produced), then I'd say yes, with the exceptions brought up above (Ninety Nine Nights, Dead Rising, and L.A. Noire).

    I'm happy to be proven wrong, though.

So basically "other than the games where you can tell the difference, no I can't".

    Um ... ok.


    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

  • UltimanecatUltimanecat Registered User regular
    edited February 2012
    I know that only time will tell if the Wii-U catches on, but it really seems to be in many ways the system the Wii should have been.

    Hindsight's a motherfucker and all that, but I'm thinking if they could have just branded the Wii as a must-have peripheral device for the GameCube it would have at least bought them a year or two to develop a legitimate HD console more like what the Wii-U is shaping up to look like, but with the bonus of not risking looking like a Johnny-come-lately to the HD/online gaming party.

    If the Wii-U had come out two-to-four years ago and had offered a competitive next-gen gaming experience PLUS all the same Nintendo-y goodness that people love to love, there's every chance Nintendo would be at the top of the console heap.

    But like I pointed out, the other HD consoles only relatively recently became profitable, and only one of them is now pulling itself out of the hole it dug for its company's bottom line...

...and both really only after they risked looking like Johnny-come-latelys to the motion/hyper-casual gaming party...

    ...except of course if you include Sixaxis-functionality, which apropos of your other point, was included with the PS3 like you suggest the WiiU should do and is now pretty much ignored along with analogue buttons (except for the cost it adds to controllers)...

    ...plus Wii Remotes, which the WiiU is compatible with, continue to function as normal (w/ IR Tracking, motion, and limited data storage) even with a Classic Controller plugged in.

    Edit:
    shryke wrote: »
    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    By the way, if Cloudeagle's point is that there are few games this generation that simply couldn't have been done on previous hardware even with reduced visual impact, naming a game that was actually done on previous hardware with less visual impact isn't refuting him.

    Ultimanecat on
    SteamID : same as my PA forum name
  • mcdermottmcdermott Registered User regular
    The lack of Classic Controller compatibility stopped me from buying DKC Returns. I love that franchise, haters gonna hate, but waggle for attacking is rubbish.

    This started for me with the Metal Slug Anthology. One of the first games out for the Wii, IIRC, and no Classic support...so you got to choose between having to use the 'mote/'chuk, or wagglewagglewaggle.

    Pretty much set the ball rolling on my disappointments with the Wii.
    ...except of course if you include Sixaxis-functionality, which apropos of your other point, was included with the PS3 like you suggest the WiiU should do and is now pretty much ignored along with analogue buttons (except for the cost it adds to controllers)...

This is a better outcome. I'd much rather that gimmicky bullshit merely add a little cost and be fucking ignored by developers than have it add cost and be forced to use it in like every fucking game.

    There's a long tradition of gimmicky bullshit being released as add-on peripherals, and being largely ignored except for a handful of games. Nintendo decided that no, they were going to include the gimmicky bullshit and then encourage every game to use it.

    Feel free to substitute "innovative" for "gimmicky," depending on taste. I guess.

  • UltimanecatUltimanecat Registered User regular
    I think you're in luck then as far as the WiiU is concerned, because I'm not aware of anything gimmicky/innovative developers will be forced to use (arguably the touch/second screen, but if the DS is any indicator, developers can get away with treating those like an afterthought if they want).

    SteamID : same as my PA forum name
  • shrykeshryke Member of the Beast Registered User regular
    shryke wrote: »
    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    By the way, if Cloudeagle's point is that there are few games this generation that simply couldn't have been done on previous hardware even with reduced visual impact, naming a game that was actually done on previous hardware with less visual impact isn't refuting him.

    Except it was barely done on previous hardware and only by some tricks that make the PS2 chug like a little bitch. I mean, unless you are saying chugging and the associated framerate issues are not bad things, this is a perfect example of the limitations of hardware. Shit, with SotC there's even a PS3 rerelease of sorts that solves these issues. Hey, look at that, better hardware ftw.

    This is really getting silly. I never thought I'd see people denying that better hardware allows better games.

  • shrykeshryke Member of the Beast Registered User regular
    mcdermott wrote: »
    ...except of course if you include Sixaxis-functionality, which apropos of your other point, was included with the PS3 like you suggest the WiiU should do and is now pretty much ignored along with analogue buttons (except for the cost it adds to controllers)...

This is a better outcome. I'd much rather that gimmicky bullshit merely add a little cost and be fucking ignored by developers than have it add cost and be forced to use it in like every fucking game.

    There's a long tradition of gimmicky bullshit being released as add-on peripherals, and being largely ignored except for a handful of games. Nintendo decided that no, they were going to include the gimmicky bullshit and then encourage every game to use it.

    Feel free to substitute "innovative" for "gimmicky," depending on taste. I guess.

    And there were a few good uses of the motion sensing.

I thought Resistance 1 had one of the best uses of motion controls in a traditional game. When an enemy grabbed onto your face, you shook the controller to dislodge it. I think it was the same if you got set on fire or something.

    It felt really good. It was a visceral connection between what you wanted to do and the action you took to do it. "There's something on my face, OMG get it off!!!" It's the same reason Wii Sports worked.

  • AtomikaAtomika Live fast and get fucked or whatever Registered User regular
Maybe the switch to HD development would've proven too expensive for many of those Carnival/Party games that, while being the knock-off Ramen of the gaming world, sold tons. Maybe it would've been better not to have those titles as success stories, but who knows? They happened, the people that have wanted them are getting them, and it was a respectable chunk of the Wii's business for a major period of its life. Do I want to see shit like that another time around? Not really.

This is the live-or-die question for the Wii-U, I feel, and mostly because I don't think Nintendo has any real plans to deviate too far from the current business model, regardless of what they tell the press. For starters, they're selling this product as the Wii 2.0. "Wii" is even in the title of the console (unless/until that changes), despite the fact that the "gamer" community that Nintendo is allegedly chasing pretty roundly rejected the Wii as its preferred console for those types of games.

    Hindsight is much more accurate than foresight, but I'm highly suspicious Nintendo is going to successfully court the traditional gaming community with this new box, simply because I can't see that they're going to be offering A) anything remarkably different (graphics and power-wise) from the current gen, or B) a gameplay experience remarkably different and still apparent enough to justify its purchase.

  • cloudeaglecloudeagle Registered User regular
    edited February 2012
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    ME1 and ME2 are two different examples, not one game improving on the other.

    In ME1, hardware limitations are obviously responsible for the nature of things like the Citadel. It's broken up into sections because the 360 literally couldn't load environments that large all at once.

The ME2 issue is related to load screens. Specifically, on the PC version you can disable them and transitions become almost instantaneous because a decent computer can handle loading ME2's levels way, way faster than the 360 can.

    Both are examples of how hardware enforces limitations and compromises on game design.

    The issue is, as I said, the only way your argument holds water is if developers aren't pushing the limits of current technology (which they are, as examples like the 2 above show) or they are just not interested in pushing those limits.

And considering developers have historically always been chomping at the bit to push limits on things like level design, detail, physics, etc. (as evidenced by the fact that games have drastically changed in more than just graphics over the past ... any time frame), I find the second argument kinda ridiculous and would require some explanation/evidence for why that's suddenly changed.

I'm still a little confused as to how that disproves my argument, considering my argument is that horsepower improvements from one console to the next don't really result in tangible stuff besides visuals that most people will notice. Mass Effect 1 and 2 were on the same system. I promise I am open to examples, but that one doesn't really apply in this case.

    /facepalm

    Did you really just miss how I just explained in the post you are replying to right now why the fact that they are both on the same system is irrelevant since I'm not comparing the two games to one another?

    ...and did you really miss that I'm talking about the progression from one console to another? That's what I've been talking about this entire time. It really sounds like we're having two different arguments here.

    We are because you are practically trying to miss the point it seems.

    ...what point? Look, here's the argument as I presented it:
    cloudeagle wrote:
You know, I saw the argument that more power for consoles brings better AI, physics, etc. a lot when the 360 and PS3 came out. But, in all honesty, did all that power truly give us gaming experiences that weren't possible on the oXbox and PS2? The only games I've seen that I think are literally impossible to replicate on the older systems are Ninety Nine Nights, with its umptizillion enemies, and L.A. Noire, with its insanely detailed facial animation (which was absolutely necessary for its game mechanics). Everything else could be perfectly replicated by just downgrading the graphics, I'd argue. I mean, I played Half-Life 2 on the oXbox, and its physics were utterly amazing for its time, even though it was technically downgraded from the PC. Or am I missing something? Have there been games that have without a shadow of a doubt demonstrated AI and physics that were completely impossible to replicate on older consoles? I get the feeling this will be impossible to settle outside of personal opinion, but hey.

    Clearly, from the very beginning, I'm talking about the move from last gen to this gen. And the example you gave is between two games in this gen... which resulted in an improvement that was available on the PS2 early in its life. You're apparently arguing something completely different. Which is fine, but it doesn't disprove my point.
Well, have developers really been chomping at the bit for things besides an increase in visuals? I mean, yes, we got more physics, but that was possible last generation. Sure, they all talk about how game X is the best thing in every single way, but that's part of advertising. Yes, they all say they have awesome AI, but have we really gotten more awesome AI that was, for whatever reason, not possible on last-gen systems? I'm not saying anything has suddenly changed as far as what developers say, I'm just saying that they haven't truly delivered.

    According to whom? According to what?

    I mean, even ignoring visuals (which we shouldn't since they are very important for immersion), are you really saying you can't tell the difference between a game on the PS2 and one on the PS3?

    Ignoring visuals (and, of course, online, which is an important caveat I brought up earlier, since it really has changed the way games are produced), then I'd say yes, with the exceptions brought up above (Ninety Nine Nights, Dead Rising, and L.A. Noire).

    I'm happy to be proven wrong, though.

So basically "other than the games where you can tell the difference, no I can't".

    Um ... ok.

    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    cloudeagle on
    Switch: 3947-4890-9293
  • OptyOpty Registered User regular
You could have Dynasty Warriors games with tons of characters on screen on the PS2, but Dead Rising Wii shit the bed when it came to having multiple characters on screen, to the point it made the system look weaker than a PS2. Where Nintendo went wrong was overestimating third-party companies' willingness to work with them/on their underpowered system. The first thing a third-party company sees when they look at upgraded hardware isn't what they'll be able to do, it's what they'll be able to stop doing. They go "now I won't have to pack textures in some stupid way" or "now I don't have to optimize my code/mesh so tightly" or other similar reliefs. Only after that do they bother trying to learn the ins and outs of the new system and trying to eke new power from it.

    The Wii didn't offer that, which is mainly what turned them away. Companies were stuck having to do extra coding to do things that just "worked" on the new consoles. If the system had the power of the competitors then third parties would have more actively engaged with trying to figure out new ways to use the remote than "replace button with shake." As it was though, they just threw their intern teams on a PS2->Wii port while their big boys were busy on the newer systems.

But if the Wii was as powerful as the competitors, then would it have been as cheap? Would the Wii have been a runaway success without being over $100 cheaper than the next console? I doubt it. That presents an awful rock-and-a-hard-place situation for Nintendo. If they had a powerful enough system and a comparable online infrastructure that was good enough to entice third parties to actually make games on their system, then they wouldn't have had a hugeass audience and would have taken a financial bath, and probably still would have ended up in third place due to third parties' documented anti-Nintendo bias. On the other hand, by releasing an underpowered cheap console they expanded their gaming audience, achieved record sales and are now sitting on a huge warchest, but in the process they ended up pushing third parties away to the point that without massive money hats there's no way they'll develop for the WiiU and they'll all jump ship the second a MS/Sony next-gen console comes out.

They're pretty screwed at this point, and this next generation is not going to be kind to them in the least, and I have no idea what they could do to make things better. They've already shot themselves in the foot by announcing the WiiU way too early, not only killing off Wii sales but underwhelming the industry. This E3 is make-or-break for them and I'm predicting break, to be honest.

  • cloudeaglecloudeagle Registered User regular
Maybe the switch to HD development would've proven too expensive for many of those Carnival/Party games that, while being the knock-off Ramen of the gaming world, sold tons. Maybe it would've been better not to have those titles as success stories, but who knows? They happened, the people that have wanted them are getting them, and it was a respectable chunk of the Wii's business for a major period of its life. Do I want to see shit like that another time around? Not really.

This is the live-or-die question for the Wii-U, I feel, and mostly because I don't think Nintendo has any real plans to deviate too far from the current business model, regardless of what they tell the press. For starters, they're selling this product as the Wii 2.0. "Wii" is even in the title of the console (unless/until that changes), despite the fact that the "gamer" community that Nintendo is allegedly chasing pretty roundly rejected the Wii as its preferred console for those types of games.

    Hindsight is much more accurate than foresight, but I'm highly suspicious Nintendo is going to successfully court the traditional gaming community with this new box, simply because I can't see that they're going to be offering A) anything remarkably different (graphics and power-wise) from the current gen, or B) a gameplay experience remarkably different and still apparent enough to justify its purchase.

Actually, because the Wii was so relatively underpowered, they have room to seem like they've improved a lot more than the jump from the 360 to the PS3, which already looked great to the non tech-nerd, even if they don't actually match the 720 and PS4. So in this case, the Wii's lack of power may be an advantage when it comes to the next gen. And honestly, the Zelda demo looks a LOT better than anything the Wii was capable of:

    http://www.youtube.com/watch?v=27Lf4uVuE50

    And yes, I do believe that the 720 and PS4 have room to look better than the 360 and PS3. But as the level of graphics continue to improve, the improvements become harder and harder to notice unless you're fairly versed in what to look for, as we are.

    But... why does the Wii U have to offer a gameplay experience remarkably different and still apparent enough to justify its purchase? I thought you've been arguing from the beginning that gamers didn't want different gameplay experiences?

    Switch: 3947-4890-9293
  • UltimanecatUltimanecat Registered User regular
    shryke wrote: »
    Except it was barely done on previous hardware and only by some tricks that make the PS2 chug like a little bitch. I mean, unless you are saying chugging and the associated framerate issues are not bad things, this is a perfect example of the limitations of hardware. Shit, with SotC there's even a PS3 rerelease of sorts that solves these issues. Hey, look at that, better hardware ftw.

    This is really getting silly. I never thought I'd see people denying that better hardware allows better games.

I don't think he's arguing that improved visuals are bad, or that visual issues are not bad. I know I am definitely not saying either of those things (feel free to find where I did). His point is that you could have even knocked down the graphical fidelity of PS2 SoTC to something that would have run decently on that system and it would have been the same game, just with worse graphics, just like you probably could do that with most games this generation.

    As far as gameplay is concerned, I do agree with him - I haven't seen much that couldn't have been done somehow years ago (like I said before Halo is very much the same game it was years ago in almost all regards other than gameplay tweaks and graphics). That said, I don't agree with him that we should discount visual fidelity altogether - for one, it does increase immersion significantly, which can be important for many games. In that case, it'd be hard to say that the same game with worse graphics is actually the "same" at all.

    Secondly, love it or leave it, but the better graphics are, the less they need to appear as abstractions or heavily stylized representations of things, which is important for gaming to move into and be accepted in the mainstream (my relatives and significant others may not understand the appeal at all when I fire up Nethack or the first Legend of Zelda, but despite being uninterested they don't really question why I play modern FPS games since nearly every guy my age does that).

    SteamID : same as my PA forum name
  • mcdermottmcdermott Registered User regular
    cloudeagle wrote: »
    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    I'm skeptical that at that point in the PS2's lifecycle it was just an optimization issue. And sure, if it was designed differently it would have run fine. But it's arguable that this design was better, and thus better hardware would have allowed for a better game (one with the superior design, and without the performance issues).

  • cloudeaglecloudeagle Registered User regular
    edited February 2012
    mcdermott wrote: »
    cloudeagle wrote: »
    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    I'm skeptical that at that point in the PS2's lifecycle it was just an optimization issue. And sure, if it was designed differently it would have run fine. But it's arguable that this design was better, and thus better hardware would have allowed for a better game (one with the superior design, and without the performance issues).

Oh, certainly. And I think it would certainly have suffered visually if it were downgraded at all, which would be a shame. But still -- that's a graphics issue. And I've never argued that the jump from the PSOne/N64 generation didn't provide enough of a boost to make new gameplay possible. I'm just saying that the second jump to the current gen didn't.

    cloudeagle on
    Switch: 3947-4890-9293
  • shrykeshryke Member of the Beast Registered User regular
    cloudeagle wrote: »
    shryke wrote: »
    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

    Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    DINGDINGDINGDINGDING!

    Do you get it now? Cause you just fucking said it. Hardware impacting game design.

    SotC is a game whose mood depends on the hugeness of the colossi in their large environments. This is key to the game's design, and in order to accomplish that, the developers had to do things that made the PS2 chug like Chris Christie crossing the street to get to a donut shop.

    The hardware was not adequate (or, alternatively, barely adequate) to accomplish their design goals. The system's capabilities are completely to blame for this because they are the only thing holding back the game.

    Now they could have changed the design, but by saying that you are admitting that the inferior hardware is limiting the design possibilities of the developers. And thus you have made my point, that better hardware is good for more than just fancier graphics.

  • shryke Member of the Beast Registered User regular
    Opty wrote: »
    You could have Dynasty Warrior games with tons of characters on screen on the PS2, but Dead Rising Wii shit the bed when it came to having multiple characters on screen, to the point it made the system look weaker than a PS2. Where Nintendo went wrong was overestimating third-party companies' willingness to work with them/on their underpowered system. The first thing a third-party company sees when they look at upgraded hardware isn't what they'll be able to do, it's what they'll be able to stop doing. They go "now I won't have to pack textures in some stupid way" or "now I don't have to optimize my code/mesh so tightly" or other similar reliefs. Only after that do they bother trying to learn the ins and outs of the new system and trying to eke new power from it.

    And funnily enough, as someone pointed out recently in another thread, this is a HUGE boon for independent developers. Getting a system with dated hardware to do some stuff requires tricks and lots of optimization and the like. Better hardware lets developers ignore a lot of that stuff, and that's good! It makes games, especially games without cutting-edge graphics and the like, easier to develop.

  • shryke Member of the Beast Registered User regular
    Opty wrote: »
    But if the Wii was as powerful as the competitors, then would it have been as cheap? Would the Wii have been a runaway success without being over $100 cheaper than the next console? I doubt it. That presents an awful rock-and-a-hard-place situation for Nintendo. If they had a powerful enough system and a comparable online infrastructure that was good enough to entice third parties to actually make games on their system, then they wouldn't have had a hugeass audience and would have taken a financial bath and probably still end up in third place due to third parties' documented anti-Nintendo bias. On the other hand, by releasing an underpowered cheap console they expanded their gaming audience, achieved record sales and are now sitting on a huge warchest, but in the process they ended up pushing third parties away to the point that without massive money hats there's no way they'll develop for the WiiU, and they'll all jump ship the second a MS/Sony next-gen console comes out.

    They're pretty screwed at this point and this next generation is not going to be kind for them in the least and I have no idea what they could do to make things better. They've already shot themselves in the foot by announcing the WiiU way too early, not only killing off Wii sales but underwhelming the industry. This E3 is make or break for them and I'm predicting break to be honest.

    I don't agree. I think Nintendo recognized this issue and is working to fix it with the WiiU. They've talked a lot about how "It'll be powerful enough to compete with MS and Sony" and stuff like that.

    They've made it really obvious a big goal of theirs is to make the WiiU powerful enough so that it's eligible for cross-platform development with the PS4 and the Xbox1000.

    I think they are trying to hit that sweet spot between "cheap" and "just powerful enough for cross-platform development".

  • Absalon Lands of Always Winter Registered User regular
    edited February 2012
    I wonder how much of the contention here would have been avoided if we accepted some kind of distinction between those who had previously bought a GC/PS2/XB before buying a Wii and those who bought a Wii pretty much fresh.

    Because getting millions of Wiis and the Wii-oriented games sold is different from furthering game design and game mechanics quality for the benefit of those of us who have been around since the N64/PS/SEGA and before, you know? There is business/marketing success and then there is artistic/quality success, and it is not crazy to suggest the two are separate.

    We will see a convergence of accessibility power and entertainment power this next generation, but currently I feel you are talking about two separate things.

    Absalon on
  • cloudeagle Registered User regular
    edited February 2012
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

    Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    DINGDINGDINGDINGDING!

    Do you get it now? Cause you just fucking said it. Hardware impacting game design.

    SotC is a game whose mood depends on the hugeness of the colossi in their large environments. This is key to the game's design, and in order to accomplish that, the developers had to do things that made the PS2 chug like Chris Christie crossing the street to get to a donut shop.

    The hardware was not adequate (or, alternatively, barely adequate) to accomplish their design goals. The system's capabilities are completely to blame for this because they are the only thing holding back the game.

    Now they could have changed the design, but by saying that you are admitting that the inferior hardware is limiting the design possibilities of the developers. And thus you have made my point, that better hardware is good for more than just fancier graphics.

    Right. That was because of the PS2.

    Now: have we had the 360 and PS3 hardware impact design? Which is what I was specifically asking before. (See above quote.)

    I'm not denying that hardware has the potential to impact design; I'm asking if it did, to any large degree, in the case of the 360 and PS3.

    cloudeagle on
    Switch: 3947-4890-9293
  • reVerse Attack and Dethrone God Registered User regular
    edited February 2012
    cloudeagle wrote: »
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

    Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    DINGDINGDINGDINGDING!

    Do you get it now? Cause you just fucking said it. Hardware impacting game design.

    SotC is a game whose mood depends on the hugeness of the colossi in their large environments. This is key to the game's design, and in order to accomplish that, the developers had to do things that made the PS2 chug like Chris Christie crossing the street to get to a donut shop.

    The hardware was not adequate (or, alternatively, barely adequate) to accomplish their design goals. The system's capabilities are completely to blame for this because they are the only thing holding back the game.

    Now they could have changed the design, but by saying that you are admitting that the inferior hardware is limiting the design possibilities of the developers. And thus you have made my point, that better hardware is good for more than just fancier graphics.

    Right. That was because of the PS2.

    Now: have we had the 360 and PS3 hardware impact design? Which is what I was specifically asking before. (See above quote.)

    I'm not denying that hardware has the potential to impact design; I'm asking if it did, to any large degree, in the case of the 360 and PS3.

    There are no female Turians in Mass Effect because the 360 doesn't have enough RAM.

    reVerse on
  • shryke Member of the Beast Registered User regular
    cloudeagle wrote: »
    shryke wrote: »
    cloudeagle wrote: »
    shryke wrote: »
    Oh, that reminds me, Shadow of the Colossus, another game that chugged because of the PS2's hardware.

    And you're saying that you can tell the difference. Okay then. So.... how does that make me inherently wrong? Like I said, I get the feeling this is all subjective, but I really am curious to see if I'm missing something.

    Shadow of the Colossus chugged, for sure... but that was due to the game's design. If it was optimized or designed differently, it would have run fine. The system's capabilities aren't to blame for that.

    DINGDINGDINGDINGDING!

    Do you get it now? Cause you just fucking said it. Hardware impacting game design.

    SotC is a game whose mood depends on the hugeness of the colossi in their large environments. This is key to the game's design, and in order to accomplish that, the developers had to do things that made the PS2 chug like Chris Christie crossing the street to get to a donut shop.

    The hardware was not adequate (or, alternatively, barely adequate) to accomplish their design goals. The system's capabilities are completely to blame for this because they are the only thing holding back the game.

    Now they could have changed the design, but by saying that you are admitting that the inferior hardware is limiting the design possibilities of the developers. And thus you have made my point, that better hardware is good for more than just fancier graphics.

    Right. That was because of the PS2.

    Now: have we had the 360 and PS3 hardware impact design? Which is what I was specifically asking before. (See above quote.)

    I'm not denying that hardware has the potential to impact design; I'm asking if it did, to any large degree, in the case of the 360 and PS3.

    Wait, so you are saying "Yes, it impacts design, but not this time"?

    So why don't you explain why this time is different from every other time.

  • Krathoon Registered User regular
    It would be great if the Wii U Zelda has the OoT Link. The fairy showing up makes me wonder if it does.
