
[Game Dev] I don't have a publisher. What I do have are a very particular set of skills.


Posts

  • KoopahTroopah The koopas, the troopas. Philadelphia, PA Registered User regular
    edited July 2018
    Unity 2018.2 has been released. Lots of cool features, including a pretty sick HDR pipeline and a Pixel Perfect camera preview for 2D games. They're boasting 184 improvements and over 1k bug fixes.
    One of the goals for Unity 2018.2 has been to build on the Scriptable Render Pipelines (SRPs) in order to enable next-level rendering. Another focus area has been to develop a range of features and improvements that will help you succeed in mobile. Let’s take a brief look at what we’ve done in these two areas before going into more detail on the entire release.

    Unity 2018.2 optimizes the performance of the Lightweight Render Pipeline (LWRP) and enhances the High Definition Render Pipeline (HDRP) to help you achieve high-end visual quality, including multiple improvements to the Shader Graph, which now supports both pipelines (please note that both the LWRP and HDRP are currently in preview.)

    We also added support for managed code debugging with IL2CPP on iOS, Android, Windows, macOS, UWP, and PS4, and we started adding some mobile optimizations to the Lightweight Render Pipeline (LWRP).

    For Android projects, 64-bit (ARM64) support gets its final release, and we now let you add Java code to your Unity plugins folder without needing to create libraries in advance.

    Finally, several new 2D features are available as Preview packages, including the Vector Graphics importer and Pixel Perfect. The Vector Graphics importer makes it easier for you to work with SVG graphics, and Pixel Perfect makes it easier for you to achieve a perfect retro look across different resolutions on a wide range of devices.

    KoopahTroopah on
  • halkun Registered User regular
    Yay! I have my beta done for the game I posted about above. It has an installer and a demo version all set up. Here's my conundrum: the game is feature complete and has a few cosmetic bugs, so it functions almost how the final release is going to be. However, things like themes, tutorials, and other non-critical content still need to be completed, so it's not really for public consumption just yet. That being said, I could use a beta tester or two. I do have a functioning demo, so I could ask Tube to post me on the Indie subforum... but I don't want to take the leap just yet. Any advice?

    Also, I'm trying to keep my nose clean, so I'm not posting the demo here. What should my next step be, though?

  • rembrandtqeinstein Registered User regular
    halkun wrote: »
    Also, I'm trying to keep my nose clean so I'm not posting the demo here.

    Wait I'm confused, we aren't supposed to post demos here?

  • Handkor Registered User regular
    halkun wrote: »
    Also, I'm trying to keep my nose clean so I'm not posting the demo here.

    Wait I'm confused, we aren't supposed to post demos here?

    There is a forum rule to prevent people from shilling whatever product they have all over the forum. This thread is a bit special: if you've been posting the development of your game or app here for months, it is usually allowed to give us a final link when it is released, as many of us are interested in seeing the final product. Mod approval is still a good idea.

    Personally, I think @halkun has already shared enough of his project here, and it clearly comes off as a work of passion. If the post is to request help with testing or to gather feedback, it is totally acceptable.

  • LD50 Registered User regular
    edited July 2018
    Halkun can always ask Tube. I'm sure he'll be happy to chime in and give an answer.

    LD50 on
  • halkun Registered User regular
    I sent a message to Tube asking if it's OK to post the demo here and also in the programming subforum for some feedback. It's feature complete and doesn't crash (?), but it's ugly as sin and badly needs an art pass. I even have an artist budget and can't find someone to take my money :(

  • halkun Registered User regular
    Tube got back to me. The rules stand. I'm going to do a YouTube walkthrough of the demo and write up a pitch for posting in the Indie forum. The game itself, though being from 1991, is fairly complex and comes with a 200-page manual. You are literally sitting at the conn of a Star Trek: The Next Generation-type ship. (The manual is included in the demo as a PDF.) However, a walkthrough of the demo mission may help out a bit. It's easy once you learn all the subsystems, but as someone who spent the last six months with the code, I may have climbed the difficulty curve differently. This would normally be handled by the tutorials, but those can't be completed until the art is done.

  • halkun Registered User regular
    The pitch has been sent and is awaiting Tube's approval. It will have a link to the demo. I also recorded a video walkthrough of the demo mission that's included; I'll post that here for you guys to peek at. It's a bit off the cuff and I made a few errors in my commentary. Feel free to ask any questions :)

    https://youtu.be/Spdu6FH-MM8

  • Lilnoobs Alpha Queue Registered User regular
    edited July 2018
    Brainstorm time!

    I've been working on an RPG battle system in Unity for a bit. Basically, it emulates traditional JRPG battles: issue commands, then watch them play out; rinse and repeat. I'm thinking of releasing a version on the Unity Asset Store that people can use as the 'battle phase' for their own RPGs. What would you like to see in such a system?

    Some features that are already included or are planned:
    - A basic sequencer that orders the actors by X attribute (duh) (see the sketch after this list)
    - Ability to work with 2D and 3D equally
    - Camera system: intro cam, command cameras (if I select 'Attack', the camera focuses on things I can attack), sequence cameras (the action cameras during the battle phase)
    - UI system that includes:
      - Command menu UI (Attack -> attack choices, Defend -> defend choices, Items, Abilities, etc.)
      - UI health with portrait, name, health, mana/other bar
      - UI movement order (shows who is next)
      - Floating text damage numbers
    - Ability to define Attack/Defend/Item/etc. types (number of attacks, multiplier, hit %, range)
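
    Since the sequencer bullet is really just "sort by an attribute", here is a minimal sketch of that piece (the generic shape and the speedOf selector are placeholders, not anything from an existing asset):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        public static class TurnSequencer
        {
            // Build the turn order for a round by sorting on whatever attribute
            // the designer picks (speed, agility, an initiative roll, ...).
            public static List<T> BuildTurnOrder<T>(IEnumerable<T> actors, Func<T, int> speedOf)
            {
                return actors.OrderByDescending(speedOf).ToList();
            }
        }

    A tie-breaker (a random roll, or a player-goes-first rule) is the obvious next thing to bolt on.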

    I also plan on making a "Battle Planner", basically a class that creates the battle scene dynamically based on dropped prefabs. I will also need some class that transfers the stats to the battle scene, e.g. a character's name from the game should transfer over to the appropriate character that appears in the battle scene.
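
    For the stat-transfer class, one simple shape (sketched under assumed names, not an existing API) is a plain serializable snapshot per actor plus a carrier object that survives the scene load into the battle scene:

        using UnityEngine;

        [System.Serializable]
        public class BattleActorData
        {
            public string displayName;
            public int maxHealth;
            public int currentHealth;
            public int speed;        // whatever attribute the sequencer sorts on
            public Sprite portrait;  // used by the UI health element
        }

        public class BattleTransfer : MonoBehaviour
        {
            public static BattleTransfer Instance { get; private set; }
            public BattleActorData[] party;

            void Awake()
            {
                // Keep a single carrier alive across the overworld -> battle scene load.
                if (Instance != null) { Destroy(gameObject); return; }
                Instance = this;
                DontDestroyOnLoad(gameObject);
            }
        }

    The battle scene then reads BattleTransfer.Instance.party when it spawns its actors.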

    I'm debating whether to include a sound system, and I am leaning towards adding a Unity Editor feature that allows people to play animations without entering Play Mode (since the system is based around animations). This would help people define attack and defend attributes more conveniently.

    What other options would you like to see in such a system? I have some of the features showcased on my Instagram, if that helps to visualize it:
    https://www.instagram.com/greg_w_lyons/




    Lilnoobs on
  • halkun Registered User regular
    edited July 2018
    I know that you are just doing the battle bit, but I would kill for a spreadsheet that plots out the progression of characters throughout the course of the game. For example, in Final Fantasy 7 (a game I reverse-engineered), there is a file that holds the experience curves for the characters from level 0 to 99. However, the experience curves are logarithmic, and between each level there is a "zone" that corresponds to the experience you need to gain from enemies. But figuring out that you need to kill x enemies for y experience to hit the next level in z minutes is the magic formula that was pre-calculated by the game designers. This is not the only part of the secret sauce. Either within the zone or at the end when you level, you get level bonuses such as Speed+, HP+, and other things. These have to be pre-calculated to make sure that you don't get power creep. Along with that, you also have other things that are acquired that just happen to fit the zone you are in. Hey, I'm in a swamp now with lots of monsters that can suddenly cast poison; good thing that last level-up gave me the Depoison spell and a heal bonus!

    Oh and each job has their own progression.

    With FF10 and FF13 you can unwrap the skill trees and somewhat work the math backwards. FF12 is a little trickier as the skills are blocked by actual blocks with a cost.

    It's just that whole experience curve to game alignment is magic every time I see it.
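
    As a worked example of that kind of pre-calculation (the numbers and curve shape below are made up for illustration, not FF7's actual tables): if the XP needed for the next level grows roughly quadratically and an average enemy in the current zone gives a known amount of XP, the "kill x enemies in z minutes" figure falls straight out:

        using System;

        public static class LevelingMath
        {
            // Assumed curve: XP to go from `level` to `level + 1`.
            public static int XpToNextLevel(int level) => 30 * level * level;

            // How many kills, and roughly how many minutes, the next level costs,
            // given the zone's average XP per kill and seconds per fight.
            public static (int kills, double minutes) CostOfNextLevel(
                int level, int xpPerKill, double secondsPerKill)
            {
                int kills = (int)Math.Ceiling(XpToNextLevel(level) / (double)xpPerKill);
                return (kills, kills * secondsPerKill / 60.0);
            }
        }

    Run that over every level and every zone's bestiary and you have the spreadsheet.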

    halkun on
  • RoyceSraphim Registered User regular
    What did you think of similar curves in RPG Maker?

  • halkun Registered User regular
    I never really played with the PC RPG Makers that much. The console versions had prefabs, which were, once again, precalculated. I think I still have the PS1 version of RPG Maker hanging about. I remember not being able to make much of a game, as you only had 128 KB (1-megabit) memory cards as storage media.

  • agoaj Top Tier One Fear Registered User regular
    AnimationCurve in Unity is a general-purpose curve datatype with a decent curve editor. If you're doing an RPG, they're perfect for tweaking leveling curves.
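
    A minimal sketch (field and class names assumed, not from any particular project): expose the curve in the Inspector and sample it for the XP table, so you can drag handles around instead of re-deriving a formula:

        using UnityEngine;

        public class LevelingCurve : MonoBehaviour
        {
            // x = level, y = XP required to reach it; tweak the handles in the Inspector.
            public AnimationCurve xpForLevel = AnimationCurve.EaseInOut(1f, 100f, 99f, 500000f);

            public int XpRequired(int level)
            {
                return Mathf.RoundToInt(xpForLevel.Evaluate(level));
            }
        }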

  • Lilnoobs Alpha Queue Registered User regular
    edited July 2018
    halkun wrote: »
    I know that you are just doing the battle bit, but I would kill for a spreadsheet that plots out the progression of characters throughout the course of the game. For example, in Final Fantasy 7 (a game I reverse-engineered), there is a file that holds the experience curves for the characters from level 0 to 99. However, the experience curves are logarithmic, and between each level there is a "zone" that corresponds to the experience you need to gain from enemies. But figuring out that you need to kill x enemies for y experience to hit the next level in z minutes is the magic formula that was pre-calculated by the game designers. This is not the only part of the secret sauce. Either within the zone or at the end when you level, you get level bonuses such as Speed+, HP+, and other things. These have to be pre-calculated to make sure that you don't get power creep. Along with that, you also have other things that are acquired that just happen to fit the zone you are in. Hey, I'm in a swamp now with lots of monsters that can suddenly cast poison; good thing that last level-up gave me the Depoison spell and a heal bonus!

    Oh and each job has their own progression.

    With FF10 and FF13 you can unwrap the skill trees and somewhat work the math backwards. FF12 is a little trickier as the skills are blocked by actual blocks with a cost.

    It's just that whole experience curve to game alignment is magic every time I see it.

    I never thought of that, but it sounds interesting. It's impressive that you reverse-engineered FF7; do you have any resources or links that explain this system in some detail? Using some stuff in Unity, as agoaj keenly observed, I wonder if I can incorporate a 'template' that bends the curve relative to the number of 'zones' that are predefined by the user. But maybe it would be better to bend the curve based on the expected level for each zone and then use that to determine leveling rates. I think this would be pretty useful.
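
    A rough sketch of that "expected level per zone" idea (every name and number here is an illustrative assumption, nothing from the actual system): the designer specifies what level the player should be at the end of each zone and how much XP the zone's enemies hand out in total, and the per-level XP thresholds fall out of those two lists:

        public static class ZoneCurve
        {
            // Returns xpToNext[L] = XP needed to go from level L to L + 1.
            public static int[] BuildXpTable(int[] expectedLevelAtZoneEnd, int[] xpEarnedPerZone)
            {
                int maxLevel = expectedLevelAtZoneEnd[expectedLevelAtZoneEnd.Length - 1];
                var xpToNext = new int[maxLevel + 1];
                int previousLevel = 1;
                for (int zone = 0; zone < expectedLevelAtZoneEnd.Length; zone++)
                {
                    int levelsGained = expectedLevelAtZoneEnd[zone] - previousLevel;
                    if (levelsGained <= 0) continue;
                    // Spread the zone's XP budget evenly over the levels gained there;
                    // sampling a curve here instead would "bend" the progression.
                    int xpPerLevel = xpEarnedPerZone[zone] / levelsGained;
                    for (int level = previousLevel; level < expectedLevelAtZoneEnd[zone]; level++)
                        xpToNext[level] = xpPerLevel;
                    previousLevel = expectedLevelAtZoneEnd[zone];
                }
                return xpToNext;
            }
        }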

    Lilnoobs on
  • KoopahTroopah The koopas, the troopas. Philadelphia, PA Registered User regular
    Unreal released their new Unreal Engine Blaze-It Edition:
    Unreal Engine 4.20 delivers on our promises to give developers the scalable tools they need to succeed. Create a future-focused mobile game, explore the impact of Niagara, breathe life into compelling, believable digital humans, and take advantage of workflow optimizations on all platforms.

    You can now build life-like digital characters and believable worlds with unparalleled realism. Take your visual effects to the next level with Unreal Engine’s new Niagara particle editor to add amazing detail to all aspects of your project. Use the new Digital Humans technology powering the “Meet Mike” and “Siren” demos to raise the bar on realism. With the new Cinematic Depth of Field, you can achieve cinema quality camera effects in real-time.

    Unreal Engine empowers you to make things your way by giving you the tools to customize the creation process to your preferred style and workflow. With the new Editor Scripting and Automation Libraries, you can create completely customized tools and workflows. Make the lives of designers and artists easier by adding new actions to apply to Actors or assets thanks to scripted extensions for Actor and Content Browser context menus.

    Battle-tested mobile and console support means you can create once and play on any device to deliver experiences anywhere users want to enjoy them. Epic has rallied around the mobile release of Fortnite to optimize Unreal Engine for mobile game development. We have made tons of performance improvements including implementing both hardware and software occlusion queries to limit the amount of work the hardware needs to do. Proxy LOD is now production-ready and can further reduce the complexity of the geometry that needs to be rendered at any time.

    In addition to all of the updates from Epic, this release includes 165 improvements submitted by the incredible community of Unreal Engine developers on GitHub!

  • halkun Registered User regular
    edited July 2018
    Lilnoobs wrote: »
    halkun wrote: »
    [...]

    I never thought of that, but it sounds interesting. It's impressive that you reverse-engineered FF7; do you have any resources or links that explain this system in some detail? Using some stuff in Unity, as agoaj keenly observed, I wonder if I can incorporate a 'template' that bends the curve relative to the number of 'zones' that are predefined by the user. But maybe it would be better to bend the curve based on the expected level for each zone and then use that to determine leveling rates. I think this would be pretty useful.

    You can find all that at http://forums.qhimm.com/index.php

    ---- Unrelated ----
    Oh hey, the post is up in the Indie forum with a beta demo available for Rules of Engagement. It's a Google Drive download, so I'm not sure how that is going to hold up, but I'm not expecting a lot of traffic while it's in beta... Feel free to contact me if you want to be a beta tester. :D

    -- Ninja Edit ---
    I think the D/L link was broken - Fixed it.

    halkun on
  • bowen How you doin'? Registered User regular
    KoopahTroopah wrote: »
    Unreal released their new Unreal Engine Blaze-It Edition:
    Unreal Engine 4.20 delivers on our promises to give developers the scalable tools they need to succeed. Create a future-focused mobile game, explore the impact of Niagara, breathe life into compelling, believable digital humans, and take advantage of workflow optimizations on all platforms.

    You can now build life-like digital characters and believable worlds with unparalleled realism. Take your visual effects to the next level with Unreal Engine’s new Niagara particle editor to add amazing detail to all aspects of your project. Use the new Digital Humans technology powering the “Meet Mike” and “Siren” demos to raise the bar on realism. With the new Cinematic Depth of Field, you can achieve cinema quality camera effects in real-time.

    Unreal Engine empowers you to make things your way by giving you the tools to customize the creation process to your preferred style and workflow. With the new Editor Scripting and Automation Libraries, you can create completely customized tools and workflows. Make the lives of designers and artists easier by adding new actions to apply to Actors or assets thanks to scripted extensions for Actor and Content Browser context menus.

    Battle-tested mobile and console support means you can create once and play on any device to deliver experiences anywhere users want to enjoy them. Epic has rallied around the mobile release of Fortnite to optimize Unreal Engine for mobile game development. We have made tons of performance improvements including implementing both hardware and software occlusion queries to limit the amount of work the hardware needs to do. Proxy LOD is now production-ready and can further reduce the complexity of the geometry that needs to be rendered at any time.

    In addition to all of the updates from Epic, this release includes 165 improvements submitted by the incredible community of Unreal Engine developers on GitHub!

    4.20 edition tl;dr - "you, too, can build a better battle royale than the makers of pubg"

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • LD50 Registered User regular
    bowen wrote: »
    [...]

    4.20 edition tl;dr - "you, too, can build a better battle royale than the makers of pubg"

    I thought that was already an asset store package.

  • Mc zany Registered User regular
    Upgrading Unreal sounds great until you realise it breaks half your game and you have to spend ages fixing it.

  • LaCabra Melbourne Registered User regular
    I've never found that to be the case. Nothin' ever breaks when I upgrade.

    I'm mostly using Blueprints, but that's the objectively correct way to operate.

  • Mc zany Registered User regular
    Might just be me then. Like, just recently they changed the way applying damage to a mesh worked, so none of the enemies in my game could be destroyed anymore until I went through and updated it.

    Mind you, I started this game on 4.12, so this is probably a sign that I am taking too long :)

  • LaCabra Melbourne Registered User regular
    Like a destructible mesh? That's probably an update on Nvidia's side getting rolled in, I guess.

  • Heartlash Registered User regular
    edited July 2018
    Thread question: I have been trying to figure out better methods of applying responsive thinking to mobile game assets. The game I just finished relies pretty heavily on assets that involve just kind of assuming a relatively standard resolution and DP and creating assets accordingly. Does anyone have any experience with asset generation for mobile that is, I guess, more adaptive across different screens? Or is the world really pretty stuck in this one asset for tablet another for phone kind of mentality?

    Heartlash on
    My indie mobile gaming studio: Elder Aeons
    Our first game is now available for free on Google Play: Frontier: Isle of the Seven Gods
  • rembrandtqeinstein Registered User regular
    Heartlash wrote: »
    Thread question: I have been trying to figure out better methods of applying responsive thinking to mobile game assets. The game I just finished relies pretty heavily on assets that involve just kind of assuming a relatively standard resolution and DP and creating assets accordingly. Does anyone have any experience with asset generation for mobile that is, I guess, more adaptive across different screens? Or is the world really pretty stuck in this one asset for tablet another for phone kind of mentality?

    Another poster found what seems like a good technique: you convert your sprites to meshes using a voxel tool, then use those meshes as your assets. The rendering will scale with the screen resolution. As long as you aren't using mesh collisions, it shouldn't significantly affect performance.
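
    Another angle, sketched purely as one possible approach rather than established practice: pick an asset scale bucket at startup from Screen.dpi (Unity reports 0 when it can't detect the DPI) and load 1x/2x/4x art accordingly. The thresholds below are illustrative guesses:

        using UnityEngine;

        public static class AssetScalePicker
        {
            public enum ScaleBucket { X1, X2, X4 }

            public static ScaleBucket Pick()
            {
                float dpi = Screen.dpi;
                if (dpi <= 0f) dpi = 160f;   // fall back to the classic Android baseline density
                if (dpi < 240f) return ScaleBucket.X1;
                if (dpi < 480f) return ScaleBucket.X2;
                return ScaleBucket.X4;
            }
        }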

  • Khavall British Columbia Registered User regular
    So I don't know if anyone remembers, but one of my white whales was finding easy ways of synthesizing music in Unity. My last solution was basically just to build an FM synth using OnAudioFilterRead(), but it wasn't great, it wasn't very flexible, and it wasn't very easy to use. Everyone else I've talked to about doing generative music or synthesis within Unity had also either kludged together a solution or run an external program.
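
    For anyone curious what the OnAudioFilterRead() route looks like, here is a stripped-down sketch of the plumbing: a single sine oscillator rather than a full FM patch, living on a GameObject with an AudioSource so Unity runs the filter:

        using UnityEngine;

        [RequireComponent(typeof(AudioSource))]
        public class TinySineSynth : MonoBehaviour
        {
            public double frequency = 440.0;   // A4
            public float gain = 0.1f;

            private double phase;
            private double sampleRate;

            void Awake()
            {
                sampleRate = AudioSettings.outputSampleRate;
            }

            // Runs on the audio thread; write samples straight into the buffer.
            void OnAudioFilterRead(float[] data, int channels)
            {
                double step = 2.0 * System.Math.PI * frequency / sampleRate;
                for (int i = 0; i < data.Length; i += channels)
                {
                    float sample = gain * (float)System.Math.Sin(phase);
                    phase += step;
                    if (phase > 2.0 * System.Math.PI) phase -= 2.0 * System.Math.PI;
                    for (int c = 0; c < channels; c++)
                        data[i + c] = sample;
                }
            }
        }

    An FM version modulates `phase` with a second oscillator, and that is roughly where the "not very flexible, not very easy to use" part starts.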

    The problems were many, such as some solutions working, but only on Windows. Or needing to use OSC and an external program. Or working, but first you had to generate a MIDI file and then play it back, which made linking things all together smoothly very difficult. There was an $80 thing on the asset store that was super old, but it only worked with Unity Web Player which, well....

    It seems this year is the year of people figuring out Unity synthesizers. A few months ago, without my noticing, the $80 one started supporting other platforms. It's some tie in with another synthesizer, and I haven't really had time to play with it yet.

    ...but even more amazingly, some, like, IT architecture dude at goddamn Volvo in his free time put together a MIDI synth that's cross-platform in Unity, that's super easy to use, and also has a free version. He posted it like 2 weeks ago and it basically changes my life. Also while I haven't delved too deep into it, it also seems to be able to read MIDI files and get a piano-roll version of them, which means that I don't have to go through some weird thing to get my training data, I can just have a folder full of MIDI and have this thing do the parsing of the MIDI file.

    I cannot emphasize how great it is to be able to go "Hey Unity, play me a C on Piano" and it just, like.... does it.

  • halkun Registered User regular
    edited July 2018
    MIDI seems to be a lost technology nowadays. Funny story: when porting my game from DOS to Windows, the original opening was not only MIDI, but Roland MT-32 MIDI, which predates the General MIDI standard. Originally it was piped through a Sound Blaster MIDI synth, but when I played it on Microsoft's built-in wavetable it sounded like butt. Turns out I had to get a MIDI editor that had an event editor (!), remove all the SysEx tags, and repatch all the instrument changes. The game lib that I use doesn't have a MIDI synth in it, and I wanted to make the game as device-independent as possible. What I wound up doing was render each track to a WAV file using a soundfont and then mix them in Audacity.

    Video links below:
    Original vs Fixed Wavetable version 1
    Final Version is here

    halkun on
  • Khavall British Columbia Registered User regular
    Yeah apparently this MIDI thing basically can just load soundfonts and play them, but comes standard with General MIDI as the stock soundfont, so you get MIDI without actually having to go through the normal MIDI shit. Which also means that in theory you can go "Hey I want this SNES game's soundfont" or whatever and use that instead if you don't want the MIDI sounds.

    It's old and not great, but nothing else has the flexibility for symbolic generative stuff. It's just so nice to be able to focus on the actual generative system and just go "when it's a note on, hit note-on, when it's a note-off, hit note-off." instead of having to go "Ok, well, I need to program in the idea of an envelope so that there's a difference between a sustain and an onset, blah, blah, blah".

    It sounds worse than other solutions, but it's so much more flexible and standard and easy to use that it's just fantastic.

    Also, we pretty much store everything in MIDI values anyways since the easiest way to get a corpus of music that can be read and parsed as note events and pitches is using a MIDI file. It's nice knowing that the input to the learning agents is MIDI values, and then they can just spit out MIDI values for synthesis.

    It's just great in general.
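
    One nice side effect of keeping everything as MIDI note numbers: turning a note number into a frequency for synthesis is a one-liner (equal temperament, with A4 = note 69 = 440 Hz), so the learning agents never need to know about Hz at all:

        public static class MidiMath
        {
            // Equal-temperament conversion: note 69 (A4) = 440 Hz.
            public static double NoteToFrequency(int note)
            {
                return 440.0 * System.Math.Pow(2.0, (note - 69) / 12.0);
            }
        }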

  • halkun Registered User regular
    Believe it or not, the MIDI protocol is still used for electronically controlled stage lighting. MIDI is initialized with a tempo and time signature, which defines speed and measure placement. Each event in MIDI is timestamped, so instead of playing a note via a piano roll, you play lights and color. So hey, I don't know if you can tap into the MIDI stream or not, but it would be fun to insert and commandeer a few SysEx commands and make the game do something at a particular MIDI mark.

  • LD50 Registered User regular
    It's used for other surprising things too. Multitouch in Windows was originally jury-rigged through multiple 'instruments' via the MIDI interface (this was back in the Vista-based Surface table days).

  • Khavall British Columbia Registered User regular
    Oh yeah there's tons of use for MIDI as a data interface sort of thing.

    That was actually one of the bits of noise that was tough to sort through during my monthly searches for MIDI synthesis. There were lots of tools for making Unity able to send and receive MIDI control messages for exterior devices and shit, but they didn't do anything about using General MIDI as sound synthesis.

  • rembrandtqeinstein Registered User regular
    Khavall wrote: »
    ...but even more amazingly, some, like, IT architecture dude at goddamn Volvo in his free time put together a MIDI synth that's cross-platform in Unity, that's super easy to use, and also has a free version

    got a link in case we want to check it out?

  • Kupi Registered User regular
    I'm Not Even Going To Pretend This Is In Continuity With Existing Or Future Posts

    Because I am some combination of imbecile, madman, and genius*, heavily weighted toward the former two, I've taken it upon myself to write an academically pure ECS library that I can integrate with MonoGame. Meaning, all Components are implemented as structs kept in tightly-packed arrays with no behavior other than enforcing legal transformations of their state. So far I've been able to work my way around the worst of the constraints that arise from the rules I've set out, like being able to do interface calls on structs without causing boxing.

    * Also because my day job leaves me stranded without access to creative software and I can write an ECS architecture in a paper notebook.
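
    For the curious, those two tricks (tightly packed struct components, and interface calls on structs without boxing) look roughly like this in a sketch; all names are made up for illustration:

        public interface IComponent
        {
            void Validate();   // "enforce legal transformations of state"
        }

        public struct Position : IComponent
        {
            public float X, Y;
            public void Validate() { /* clamp, assert invariants, etc. */ }
        }

        public sealed class ComponentStore<T> where T : struct, IComponent
        {
            private T[] items = new T[256];   // tightly packed, grown as needed
            public int Count { get; private set; }

            public void Add(in T value)
            {
                if (Count == items.Length)
                    System.Array.Resize(ref items, items.Length * 2);
                items[Count] = value;
                items[Count].Validate();      // constrained call on a struct: no boxing
                Count++;
            }
        }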

    My favorite musical instrument is the air-raid siren.
  • halkun Registered User regular
    Oh, BTW, I didn't know this, but I saw a pretty cool asset in the Unity Asset Store that I wanted for my game. I bought it, and was denied the download because I didn't have Unity installed. Now I know you guys are all "well, duh! It's the Unity store!", but in my defense it was some pointer cursors. I just needed the raw .png files for my own game.

    The seller sent me a copy of the files after I gave him the invoice. But yeah, that was something to be learned.

  • Khavall British Columbia Registered User regular
    edited July 2018
    So, in addition to the fun MIDI stuff I was doing, I'm in the last few days of my one-week vacation, and I finally got around to prototyping out an idea that's been kicking around in my head for a while. It's going to be a phone game I think.



    I'm calling it "A Bad MMO (that you play by yourself)"

    It's basically exactly what it sounds like... you play as the MMO trinity of Tank/DPS/Healer, and each class only has 3 abilities. Tanks can do a high-aggro strike, a defensive CD, and a taunt. DPS has 3 chained hits that do more damage if they're done 1-2-3 near each other. Healers have a filler/strike for dealing extra damage, a HoT, and an AoE heal.

    Then the monsters have a normal attack that just slowly whittles down the health of whoever they're attacking, a 2x attack, an AoE attack, and the most insidious: a 2x attack that also moves positioning.

    I'm planning on adding some more MMO tropes, like starting with leveling up from 1-60 where each class gets exp together, and then once you're at level 60 having an arbitrary item level that only one class increases after each boss. And also doing bosses with patterns and whatnot.

    It's... surprisingly fun, though will need quite a bit of tweaking for balance, and a lot of work on giving more feedback.

    Khavall on
  • Handkor Registered User regular
    Nice idea, I like that you end up with a rhythm to the actions based on the cooldowns.

  • RoyceSraphim Registered User regular
    Implemented side portrait talking for my visual novel.

    Currently planning to scrap all that as this doesn't really reflect the tone I was going for.

    Ergo, going back and rewriting every damn thing I did for the main character's dialog.

  • Handkor Registered User regular
    edited July 2018
    I have Ninjas



    Enemy AI is gonna need some work to increase variance, and I need to put in more animations for various weapons, but: I have Ninjas. I also need to reduce the i-frame duration for the enemies; right now they are invincible for the whole duration of their flickering, which is too long. The knockdown animation also needs to be less frequent.
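
    One way to untangle that, sketched with illustrative durations and names: run the invincibility window and the hit flicker as two separate coroutines, so the i-frames can end while the flicker keeps playing:

        using System.Collections;
        using UnityEngine;

        public class HitReaction : MonoBehaviour
        {
            public float invincibleSeconds = 0.3f;
            public float flickerSeconds = 0.8f;
            public SpriteRenderer sprite;

            public bool Invincible { get; private set; }

            public void OnHit()
            {
                StopAllCoroutines();
                StartCoroutine(InvincibilityTimer());
                StartCoroutine(Flicker());
            }

            IEnumerator InvincibilityTimer()
            {
                Invincible = true;
                yield return new WaitForSeconds(invincibleSeconds);
                Invincible = false;
            }

            IEnumerator Flicker()
            {
                float end = Time.time + flickerSeconds;
                while (Time.time < end)
                {
                    sprite.enabled = !sprite.enabled;
                    yield return new WaitForSeconds(0.05f);
                }
                sprite.enabled = true;
            }
        }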

    Handkor on
  • Elvenshae Registered User regular
    ... and also, you have Ninjas!

  • Rend Registered User regular
    Good morning, game development thread!

    So I've been fiddling and hobby-ing around with game dev on and off for like seven years now, and in that time a lot has happened. Instead of writing up a big, emotional, personal post, suffice it to say that this time I've got to make this shit count. So, to help keep myself accountable I'm going to dev blog here to you lot while I'm doing the thing. In the best case, I finish it up and you guys care about it and cheer me on, and in the worst case it will have been a fun journey that I actually documented for once.

    The game is the one I've spent the most time designing over the last many years, by a good margin. It's a strategy RPG: grid-based, turn-based, like FFT, Fire Emblem, XCOM, etc. I have what I think is a pretty good set of mechanics, but we'll see once it gets actually playable (again). I did a prototype of this game a few years back as well, but I decided to rewrite it using Entitas, an ECS framework for Unity, since my prior prototype suffered from some suboptimal design decisions.

    For now, I've got a lot of the work done to make something technically playable. At this point I need to make a basic UI (right now it's all test buttons to manually fire events), add a basic attack ability, and implement a behavior tree to get some basic AI in there. Hopefully by the end of the week I'll have that much. Depending on how busy I am in general, it could be a bit less or even a bit more, though!

    We'll see how it goes, and I'll continue to update the thread periodically on my progress.
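
    Since the behavior tree is the next milestone, here is how small the core of a basic one can be; this is a generic sketch, not Entitas-specific, and all names are illustrative. A Sequence node plus a couple of leaf nodes (say, MoveIntoRange and Attack) is already enough for "some basic AI":

        using System.Collections.Generic;

        public enum NodeStatus { Success, Failure, Running }

        public interface IBehaviourNode
        {
            NodeStatus Tick();
        }

        // Runs children in order; stops at the first child that fails or is still running.
        public sealed class Sequence : IBehaviourNode
        {
            private readonly List<IBehaviourNode> children;

            public Sequence(params IBehaviourNode[] nodes)
            {
                children = new List<IBehaviourNode>(nodes);
            }

            public NodeStatus Tick()
            {
                foreach (var child in children)
                {
                    var status = child.Tick();
                    if (status != NodeStatus.Success) return status;
                }
                return NodeStatus.Success;
            }
        }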
