
[Game Dev] I don't have a publisher. What I do have are a very particular set of skills.


Posts

  • Peewi I'm a cube now Registered User regular
    I think I've fixed my collision bug. Or at least I haven't seen it happen since my last change, and I think I can explain why. It would probably help if I had a more reliable reproduction method than simply enclosing a ball in a small area and waiting.

    So, my collision system works on the assumption that a ball's position before moving does not collide with any walls and that moving back towards it will fix collisions. This works, but has the downside of potentially failing if the starting position is invalid. Not really a problem if the rest of the collision system works as intended.

    It turned out that when calculating how far back a ball should be moved from a wall, the result was sometimes very slightly inside the wall. And some of those times, bad things happened because the previously mentioned assumption wasn't actually true. My understanding of why amounts to "because floating-point precision".

    So to prevent this, instead of moving the ball to be "exactly" its radius away from the wall, I move it to its radius + 0.1. This does cause problems if a ball is in a gap with its exact width, but given that those don't occur in my current game unless the player does some very precise inputs I'm willing to live with that.
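
    The push-out-with-a-margin idea above can be sketched roughly like this (a minimal Python sketch with made-up names, treating the wall as a plane with an outward unit normal):

```python
# Hypothetical sketch of the margin trick from the post: when pushing a
# ball back out of a wall, separate it by radius + epsilon rather than
# exactly radius, so float error can't leave it slightly inside.
EPSILON = 0.1

def resolve_penetration(ball_pos, radius, wall_point, wall_normal):
    """Move ball_pos along wall_normal until the ball clears the wall.

    wall_normal is a unit vector pointing out of the wall; positions
    are (x, y) tuples.
    """
    # Signed distance from the wall plane to the ball center.
    dist = ((ball_pos[0] - wall_point[0]) * wall_normal[0]
            + (ball_pos[1] - wall_point[1]) * wall_normal[1])
    push = (radius + EPSILON) - dist
    if push <= 0:
        return ball_pos  # already separated by more than radius + epsilon
    return (ball_pos[0] + wall_normal[0] * push,
            ball_pos[1] + wall_normal[1] * push)
```

    As the post notes, the cost of the margin is that a ball can no longer rest in a gap of exactly its own width.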

    Switch: SW-6132-4331-5349 || Steam profile
  • Cornucopiist Registered User regular
    edited October 12
    So, here's a little rant about Slerp and Lerp.
    I've been using these, as indicated, mainly to dampen camera motion as well as the adherence of my player prefab (with the 3d model) to the player game object (with the code).

    These functions tend to break spectacularly once the framerate drops... it looks like buffeting and shaking, and is entirely unwanted. And the way they're typically used is weird.
    The function is usually set up like this:
    camera.transform.position = Vector3.Lerp(camera.transform.position, waypoint.transform.position, Time.deltaTime * multiplier);
    

    That doesn't make any sense, because that last value is not a 'step' but a fraction: Lerp's third argument is the interpolation parameter t, expected to run from 0 to 1.

    So it should be
    duration = 25.0f;
    counter += Time.deltaTime;
    percentage = counter / duration;
    camera.transform.position = Vector3.Lerp(camera.transform.position, waypoint.transform.position, percentage);
    

    But wait! It's still nonsense! Our percentage is now correct, but it's always a percentage of the last position! It should be
    camera.transform.position = Vector3.Lerp(waypoint1.transform.position, waypoint2.transform.position, percentage);
    
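
    As an aside, the usual fix for the framerate dependence of the `Lerp(current, target, deltaTime * multiplier)` pattern is to feed Lerp an exponential factor, `1 - exp(-rate * dt)`, which makes the damping independent of the frame rate. A minimal Python sketch (function names are illustrative):

```python
import math

def lerp(a, b, t):
    return a + (b - a) * t

def smooth_follow(current, target, rate, dt):
    # Framerate-independent damping: two 1/60 s steps land exactly
    # where one 1/30 s step does, unlike lerp(current, target, dt * rate).
    t = 1.0 - math.exp(-rate * dt)
    return lerp(current, target, t)
```

    The factor is always in (0, 1), so it is also safe for Unity's clamped Lerp.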

    Now, I actually have a use case for that with a roll-your-own animator.

    But for dampening camera motion, it obviously doesn't work very well. I'm trying to follow a moving object, the camerahook. I could store multiple previous positions and interpolate between them, but this gives increasing delay.
    So, my idea is the following:

    oldposition = camera.transform.position;
    camera.transform.position = Vector3.Lerp(camera.transform.position + offset, camerahook.transform.position, 0.5f);
    offset = oldposition - camera.transform.position;
    

    The idea is that I find the halfway point between where the camera would go if it had a specific inertia, and where the camerahook is.

    Of course, this has two drawbacks: there's only limited 'dampening' happening, and it can 'fall back' as the gap between the offset and the camerahook acceleration increases.

    So, like a bungee cord, there should be a limit to the give, and lots of slack in small motion. So we iterate to the next concept:

    oldposition = camera.transform.position;
    limit = 5f;
    difference = Vector3.Distance(camera.transform.position + offset, camerahook.transform.position);
    lerpval = Mathf.Min(difference / limit, 1f);
    camera.transform.position = Vector3.Lerp(camera.transform.position + offset, camerahook.transform.position, lerpval);
    offset = oldposition - camera.transform.position;
    
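
    Translated into a minimal Python sketch (names hypothetical; vectors as tuples), one step of that 'bungee' follow looks like:

```python
import math

def bungee_follow(cam, offset, hook, limit=5.0):
    """One step of the bungee-style camera follow described above.

    cam and hook are (x, y, z) tuples; returns (new_cam, new_offset).
    """
    # Where the camera would go on pure inertia.
    inertial = tuple(c + o for c, o in zip(cam, offset))
    diff = math.dist(inertial, hook)
    # Lots of slack for small motion, a hard pull once the gap hits the limit.
    t = min(diff / limit, 1.0)
    new_cam = tuple(a + (b - a) * t for a, b in zip(inertial, hook))
    new_offset = tuple(old - new for old, new in zip(cam, new_cam))
    return new_cam, new_offset
```

    With the gap at or beyond the limit, t clamps to 1 and the camera snaps straight to the hook.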

    Anyway, I can't free up time to actually implement and test this, so I thought I'd post it here for comments :)

  • Izzimach Fighter/Mage/Chef Registered User regular
    IMHO you might be better off tracking and explicitly controlling velocity. Something like:

    1. find direction vector from camera to target. To find this compute the vector (targetposition - cameraposition) and normalize it.
    2. multiply the direction vector by the velocity and deltaTime
    3. Add result of 2 to the camera position.

    This gives you explicit control over velocity. You can make the velocity constant, or make it proportional to the distance from camera to target (which is basically what lerp does). But you can also clamp the min/max velocity if you need to, or smooth it out over several timesteps. And if you're about to overshoot by a lot, you can heavily dampen the velocity.
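
    A minimal Python sketch of the three steps above (illustrative names; vectors as tuples):

```python
import math

def follow_step(cam, target, speed, dt):
    """Move cam toward target at an explicit, clamped speed.

    1. direction = normalize(target - cam)
    2. step = speed * dt (clamped so we never pass the target)
    3. new position = cam + direction * step
    """
    dx = tuple(t - c for t, c in zip(target, cam))
    dist = math.sqrt(sum(d * d for d in dx))
    if dist == 0:
        return cam  # already there
    step = min(speed * dt, dist)  # don't overshoot the target
    return tuple(c + d / dist * step for c, d in zip(cam, dx))
```

    The clamp in step 2 is where you would hang extra behavior: a minimum speed, distance-proportional speed, or smoothing over several frames.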

  • Elvenshae Registered User regular
    When I've done camera motion, I LERP while the distance between the camera target and the current camera position is large, but once I get close I snap the last bit of distance and then follow without damping.

  • Handkor Registered User regular
    Yeah, I was setting up a fighting game camera this weekend, just snapping it to the average position between the two players, and I tried smoothing the motion. It was horrible and felt wrong. You expect the camera to move only when you move, not to keep sliding a bit afterward.

  • Cornucopiist Registered User regular
    Handkor wrote: »
    Yeah, I was setting up a fighting game camera this weekend, just snapping it to the average position between the two players, and I tried smoothing the motion. It was horrible and felt wrong. You expect the camera to move only when you move, not to keep sliding a bit afterward.

    The Cinemachine take on fighting games (and others) is to have a 'safe' zone in which the characters can move without camera motion; once a character leaves the safe zone, the camera moves or zooms out to keep the action centered.

    As for my code, it sucked. I need to grok this at a fundamental level, but imho most of the things I've tried are OK; they just break down very easily when deltaTime is erratic in the editor. So I may look at this next. The main issue might be that updates (in the editor) can run at very uneven timings, while physics runs in FixedUpdate. Unity's FixedUpdate defaults to 50 updates per second (a fixed timestep of 0.02 s), but drawing still happens at the variable Update rate.
    I move the player in FixedUpdate, since I use physics to do so (it being an airplane that needs collision). The lerp code is from before I switched to physics, so it runs in the normal Update.
    Since I'm moving the camera to match the player, it might make sense to do that in FixedUpdate as well. Otherwise the movement scaled by the Update deltaTime will differ from the supposedly matching movement that runs in FixedUpdate...

  • Kupi Registered User regular
    I did a game dev for the first time in close to three months.

    Because I've had the subject on my mind for a while and wanted to see how it would look in actual practice, I decided to prototype the handling of rotation for the player character when they go airborne in my Sonic-like. Sonic has the advantage of his "rolling" form which turns him into a ball. When he jumps, he turns into a circle-- the rotation he had on the ground is irrelevant. In my game, the player character has to maintain a "neutral" stance at all times due to certain other abilities, which means that when you jump, somehow the rotation needs to be handled appropriately. The design I landed on has a few simple rules:

    - Above the threshold Y-velocity, the character is considered "rising". While rising, the character rotates to point in the direction of motion.
    - Below the negative of the threshold Y-velocity, the character is considered "falling". While falling, the character rotates to point opposite the direction of motion. (This is, generally speaking, still "upward", but reversed across the X-axis. The videos will make this clearer.)
    - Between those two values, the character is considered "somewhere in between rising and falling, I'm sure there's a better name for it". In that state, the rotation angle is calculated by lerping between the rising and falling angles based on how far between the rising and falling velocity threshold values they are.
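
    The three rules above could be sketched roughly like this in Python (names and blend details are assumptions; a real version would also handle angle wrap-around when blending):

```python
import math

def airborne_angle(vx, vy, threshold):
    """Facing angle (radians) for an airborne character with velocity (vx, vy)."""
    motion = math.atan2(vy, vx)
    rising_angle = motion              # face along the direction of travel
    falling_angle = motion + math.pi   # face opposite the direction of travel
    if vy >= threshold:
        return rising_angle            # "rising"
    if vy <= -threshold:
        return falling_angle           # "falling"
    # In between: blend by how far vy sits between +threshold and -threshold.
    t = (threshold - vy) / (2.0 * threshold)   # 0 at +threshold, 1 at -threshold
    return rising_angle + (falling_angle - rising_angle) * t
```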

    The video below shows how this looks in actual practice. Using Sonic's rolling animation for the transition is somewhat contrary to the premise in the first paragraph, but I work with the tools I have. This video also contains a small experiment in the sprite rendering where rotation is quantized to an interval (Pi / 12), making the animation "clickier" and more like the fixed rotations that Genesis games used in the absence of something like the SNES's Mode 7.



    And this video demonstrates it without the rotational snapping and the hitboxes visible, so you can see how the physics volume rotates around the pivot point (the bottom of the triangle).



    It also demonstrates some bugginess I ran into while testing it. These glitches occur for two reasons, suggesting possible avenues for further work:

    - The physics system only handles linear motion. Rotation changes can easily cause physics volumes to overlap, especially when they're significant changes in rotation. For instance, when you collide with a wall using the tip of the triangle (around the character's pivot point), the rotation update on the same frame more or less flips the volume around the x-axis, putting most of the volume inside the wall. The physics system has a "depenetration" phase to handle this, but it results in multiple subsequent collisions. This causes problems because...
    - The "platforming physics" system currently doesn't have logic to disregard "irrelevant" collisions yet. Meaning, let's say a platforming body strikes two wall tiles in the same frame, and is set to reflect off of them. The platforming physics system will perform the first reflection, and then reflect the reflected velocity over the surface normal for the second collision. Generally this results in the volume traveling in a completely new direction! And when you've just rotated yourself into a wall, some of those surface normals might actually be internal segments, making the behavior even weirder. Frankly, I'm astonished that I didn't get Sonic entirely on the other side of the wall here.

    The first issue can be worked around by only allowing the character's rotation to change by a certain amount each frame, giving the volume time to move along its new trajectory and clear the wall. The second issue can be fixed by adding a check in the platforming physics system that the volume's velocity still opposes the surface normal of the collision before accepting that collision as valid.
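
    The "velocity still opposes the surface normal" check described above is just a dot-product sign test; a minimal sketch:

```python
def collision_is_relevant(velocity, surface_normal):
    """Accept a collision only if the body is still moving into the surface,
    i.e. the velocity opposes the surface normal (negative dot product)."""
    dot = sum(v * n for v, n in zip(velocity, surface_normal))
    return dot < 0
```

    After the first reflection, the velocity points away from the second surface's normal, so the second (now irrelevant) collision gets rejected instead of reflecting the body into a new direction.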

    My favorite musical instrument is the air-raid siren.
  • Peewi I'm a cube now Registered User regular
    Back when Guilty Gear Strive came out I played a bunch of that and thought about making my own fighting game. I decided to not start a new project without finishing my current one, and since I finished that I decided to dive right in.

    The game part so far is mostly basic movement and input handling, using graphics stolen from the Game Boy version of Street Fighter 2.


    What I have spent the most time on so far is making a character editor, because defining everything in code or by manually editing text files would be annoying.

    It's got a sprite editor. Not an image editor, but merely a tool for marking sprites in a spritesheet. I had previously made a similar tool for another project, but I think this one is better.
    8BLaKqs.png

    It's got a hitbox editor. In both of these editors I can drag the boxes with the mouse (both dragging to move the whole box and dragging just a single side or corner) or enter numbers in the boxes.
    TU5Aq0l.png

    And I have some table views for editing all the character states and other data. Pictured data very incomplete.
    cRrt7up.png

    I've realized that the data to include in the character file is going to be highly dependent on the game design, and that I have more opinions about how to program a fighting game than about how to design one. I think I'll start by making a version of Ryu from Street Fighter and then see what makes sense after that.

    Next up: actually loading this data in the game and making use of it.

  • Peewi I'm a cube now Registered User regular
    It doesn't look much different, but I'm making decent progress on the fighting game. I'm loading character data made in the editor and I have working attacking and blocking.

    I wanted to avoid defining any universal inputs outside of my character file, so I ended up doing block inputs in a way that feels a little weird. I need to define SOMEWHERE that holding down-back can block low and mid attacks, but not high, and that it puts you into the crouching block state. So I added a block field to my input definitions and gave my block inputs a priority of -1, so they get ignored when checking all the other inputs.

    I added an input buffer, which turned out to be fairly easy. Instead of only keeping the input from the previous frame, I keep the inputs from all previous frames and check whether the button in question is pressed on the current frame and was not pressed on a recent frame. In most cases you'd probably only want to keep the inputs you're actually going to check, but later I'll be saving the history as replay data.

    I thought about maybe having to clear the input history if someone were to run training mode for a really long time, but some quick math shows that 24 hours of inputs would still only be a few MB.
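
    A minimal Python sketch of that buffered-press check (the window size and data layout are assumptions):

```python
class InputBuffer:
    """Keep every frame's held buttons and treat a button as 'pressed'
    if it is down now and was up on some frame within the buffer window."""

    def __init__(self, window=5):
        self.window = window
        self.history = []  # one frozenset of held buttons per frame, oldest first

    def push(self, held_buttons):
        self.history.append(frozenset(held_buttons))

    def pressed(self, button):
        if not self.history or button not in self.history[-1]:
            return False  # not held this frame
        recent = self.history[-1 - self.window:-1]
        # Was released (or never held) at some point within the window.
        return not recent or any(button not in frame for frame in recent)
```

    Keeping the full history also gives the replay data for free, as the post notes.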

  • Cornucopiist Registered User regular
    So I switched over to enum gamestates.
    Full disclosure: the gamemanager script was passed on from two earlier games, and I had a 'knocked out' enum gamestate all ready to go.
    The one issue is that I've serialized a class to set up animations, triggered by the gamestate.
    This way I can easily set these up in the editor.
    ziwk03xa1zlk.png
    HOWEVER!

    If I add an enum value, the animations I've already set up change their enum value, because they're stored as the index rather than the string.
    So after adding an enum value I need to go through all my animations and reset them.
    Would be useful to have a fix for that...
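
    A language-agnostic workaround is to serialize the enum by name rather than by index; a Python sketch of the idea (in Unity itself this would mean storing a string, or writing a custom property drawer, since its serializer stores enums as ints):

```python
from enum import Enum

class GameState(Enum):
    MENU = 0
    PLAYING = 1
    KNOCKED_OUT = 2

# Storing the name survives inserting or reordering members;
# storing the index does not.
saved = GameState.KNOCKED_OUT.name   # "KNOCKED_OUT"
restored = GameState[saved]          # lookup by name on load
```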

  • Zek Registered User regular
    I've been trying to work myself up to getting into game development again, but I can't bring myself to continue my earlier projects. I've made a number of prototypes now that I feel are pretty good quality, but when the newness wears off and the low-hanging fruit are all finished, I find myself no longer interested in plugging away at it for months to make a finished game. I'm thinking I can work with that tendency in myself by sticking to minimalistic games with a tight scope, and make a game that I can call finished in a month instead of just being a prototype of something bigger that never pans out. I'm still brainstorming what that looks like exactly - maybe I'll challenge myself to build many small games into a single generic project, and then I can preserve the boilerplate stuff between games and have the possibility of doing a Warioware style thing with them later on.

  • KoopahTroopah The koopas, the troopas. Philadelphia, PA Registered User regular

    Twitch: KoopahTroopah - Steam: Koopah
    Switch: 1639-6388-9968- PSN: Koopah089 - Extra-Life 2021 Info!
  • Incenjucar VChatter Seattle, WA Registered User regular
    Zek wrote: »
    I've been trying to work myself up to getting into game development again, but I can't bring myself to continue my earlier projects. I've made a number of prototypes now that I feel are pretty good quality, but when the newness wears off and the low-hanging fruit are all finished, I find myself no longer interested in plugging away at it for months to make a finished game. I'm thinking I can work with that tendency in myself by sticking to minimalistic games with a tight scope, and make a game that I can call finished in a month instead of just being a prototype of something bigger that never pans out. I'm still brainstorming what that looks like exactly - maybe I'll challenge myself to build many small games into a single generic project, and then I can preserve the boilerplate stuff between games and have the possibility of doing a Warioware style thing with them later on.

    If the games you like to make can be converted to text, I strongly advocate voice games for small scopes. It's a wide open space with a *need* for games to stay simple.

  • Peewi I'm a cube now Registered User regular

    I see that there's a link in the description, but I feel like that video was awfully light on the details you'd need to decide whether to use this thing.

    My very first question is if it's part of the Unity game engine or available for use with any engine.

    Clicking through, I see it advertised with games that have been out for a while, so I guess this is a rebranding of something that's been around for a while.

  • Kupi Registered User regular
    Kupi's Second Consecutive Friday Game Dev Status Report

    This week I did some bug-hunting regarding the glitchiness observed from last week's "match rotation to direction of travel" feature. In addition to what you can see in the video I posted, it was also possible to get Sonic to attach to a wall (which you're normally only supposed to be able to do by transitioning from a more traditionally walkable surface) or even squirt yourself entirely through the cracks between tiles and wind up embedded in the wall or falling through the sky on the other side. Ultimately, this boiled down to two cases with independent solutions, shown here in MSPaint-o-vision:

    ne8r2znrhug8.png

    In the first case (top two pictures), there's a basic airborne collision involving the moving character striking the wall and reflecting off of it. The velocity updates correctly, but the problem is that the character's pivot point sits on the point of the triangle. On the same frame that its velocity changes due to the collision, it immediately updates its rotation to match the new trajectory-- which puts the volume mostly inside the wall. As you'll see in the second case, it turns out that overlap collisions are unpredictable in terms of how they determine what the collision point actually is. In an ideal case, the volume only strikes a single volume-- but if the rotation placed it over the seam between two tiles, it might actually register the collision point as one of the upward-facing surfaces on the inside of the wall, call that a viable attachment point, and hey presto you're walking inside of a brick wall.

    The solution to that problem was not to allow extreme changes in rotation in a single frame. Instead, the system now calculates the target angle, determines whether rotating clockwise or counter-clockwise involves less total angular movement, and nudges the character's rotation in that direction by a specified step value. Most of the time you can't tell the difference, and now you can't get yourself stuck in a wall.
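
    The clamped, shortest-direction rotation step described above might look like this (a Python sketch with angles in radians; names are made up):

```python
import math

def rotate_toward(current, target, max_step):
    """Nudge an angle toward target by at most max_step radians,
    choosing whichever direction around the circle is shorter."""
    # Wrap the difference into (-pi, pi] so we always take the short way.
    diff = (target - current + math.pi) % (2 * math.pi) - math.pi
    if abs(diff) <= max_step:
        return target
    return current + math.copysign(max_step, diff)
```

    Called once per frame, this converges on the target angle without ever flipping the volume across a wall in a single step.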

    However, as I played around with the prototype a bit, I found that you could still get Sonic running along a wall if you just barely tapped him off the edge of a platform, which, while potentially interesting from a gameplay standpoint, wasn't my intent, so I set out to fix it. At first I thought the problem was something I'd left as a TODO item, namely, that walkability analysis (determining if the surface you've struck can be attached to) doesn't take the direction of gravity into account. But it turned out that that was making walkability analysis more strict, not less; to attach to a platform from the air, the surface normal has to be within a certain angle of objective up. So Sonic striking a vertical wall should never have been able to attach to it from the air.

    As it turned out, he wasn't attaching to it from the air! I set the game up to only advance the gameplay state every fifth frame, and noticed that Sonic was somehow winding up on the corner of the tile as he dropped past it. After looking into it a bit more, I determined that it was another case of rotation causing an overlap collision. Basically, after Sonic left the ground and went into the airborne state, the volume started to rotate. This rotation caused a very minor overlap with the tile's corner. Because of the way overlap collisions between polygons are handled in my engine, the collision point from the perspective of the triangle is the corner of the square tile-- and the surface normal of a vertex is the average of the normals of the two segments it helps form. That surface normal was within walkability of objective up, so Sonic attached to the platform at that point. On the next frame he stepped out into space, scanned for a walkable surface, and because walkability analysis is more lenient (no longer constrained to objective up) when already walking on a platform, he transitioned to the new surface-- the side of the tile.

    The solution to this second case is a bit dodgier, but I've made it such that overlap collisions (happening at t=0 within the frame) are always treated as non-walkable from the air. And so far, it seems to be working. Seems to be.

    My next goal is to create some half-pipe and ramp tiles and build a test level that isn't so angular.

  • Cornucopiist Registered User regular
    So, another request for some basic CS help...

    I have this class in my CommandManager script:
    [System.Serializable]
    public class l2gx_command
    {
        public string commandName;
        public bool isToggle;
        public float commandTimer;
    
        public l2gx_command(string command_name, bool toggle, float command_timer)
        {
    
            commandName = command_name;
            isToggle = toggle;
            commandTimer = command_timer;
        }
    
    }
    

    which is called in this class in the ButtonManager script:
    [System.Serializable]
    public class l2gx_button
    {
        public GameObject buttonObject;
        public Vector2Int xSize; // this shows the leftmost square, then the rightmost square
        public Vector2Int ySize; //this shows the lowest square, then the highest square.
        public string buttonName;
        public l2gx_command command;
    
    
        public l2gx_button(GameObject new_buttonobject, Vector2Int new_vector2Int_X, Vector2Int new_vector2Int_Y, string button_name, l2gx_command _command)
        {
            buttonObject = new_buttonobject;
            xSize = new_vector2Int_X;
            ySize = new_vector2Int_Y;
            buttonName = button_name;
            command = _command;
        }
    }
    

    What I get in the Unity inspector is this:

    ltx4tn0pox62.png


    But what I'd like to do is declare a list of l2gx_commands in the CommandManager and have them selectable in the inspector for the ButtonManager script. Any tips as to where to go would be very appreciated!

  • Kupi Registered User regular
    Kupi's Third Consecutive Game Dev Report On A Friday

    Not much impressive to present this week; it takes longer to describe the changes than to feel their effects.

    I set out to build some "pipe" and "half-pipe" tiles to augment my Sonic-like prototype with something like curved surfaces. This led me to realize that I had a pretty glaring error in my Body (non-overlapping, physics-driven collision volume) initialization and update logic. Essentially, I'd built all my static tiles to have sprite animations, which normally updates the collision volume information as well. However, despite having the underlying data updated by the sprite animation, the changes were never actually applied to the collision volume because that's the first step of the physics system, which only operates on dynamic (velocity-having) objects, not static ones. I had previously built in a fallback to have Bodies start with initial data, but that involved manually entering physics volume information unique to each character. I revised this to use a series of fallbacks while initializing a Body:

    - If the Body has a sibling SpriteAnimation component (which itself has a default animation setting), then it takes the physics information from the first frame of the SpriteAnimation component's default animation's first frame.
    - If not, the Body has a new setting for a named content object containing the collision volume data to use. (Essentially, taking its defaults from a reusable file rather than having to enter specific settings.)
    - If the Body's default is not set, then it uses a simple point in space at [0,0] relative coordinates and shame on the designer for not setting their defaults correctly.

    Once I had the actual physics volumes initializing correctly, I noticed three issues with the way definitely-not-Sonic-the-Hedgehog moved:

    - He could accelerate to arbitrary speeds while traveling up a wall, but not along the ground.
    - Sonic never actually detached from the ceiling and could just hang out there indefinitely.
    - Jumping in any circumstance other than flat land tended to put you somewhere in the wall.

    The first issue was because I had been capping his speed on individual axes; it was simple enough to say that while he's in the grounded state, the velocity cap applies to the velocity's overall magnitude, not strictly to the X axis.
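
    Capping the overall magnitude rather than each axis is a small change; a minimal Python sketch:

```python
import math

def clamp_speed(velocity, max_speed):
    """Cap a velocity's overall magnitude, preserving its direction
    (unlike a per-axis clamp, which can change the direction)."""
    speed = math.sqrt(sum(v * v for v in velocity))
    if speed <= max_speed:
        return velocity
    scale = max_speed / speed
    return tuple(v * scale for v in velocity)
```

    A per-axis clamp of (3, 4) to 2.5 would give (2.5, 2.5) and bend the direction; the magnitude clamp gives (1.5, 2.0) along the original heading.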

    I thought I'd put detachment from the ceiling in the base platforming logic, but it just wasn't implemented anywhere. I slapped a simple, not-entirely-realistic bit of ceiling-attachment logic in the Sonic character state machine. Essentially, to remain attached to a surface that's pointed in the direction of gravity, you must be moving above a certain velocity. The more the surface you're on points in the same direction as gravity, the faster this requirement increases, such that it's impossible to walk on the ceiling forever but you can manage it for longer periods the faster you're moving.
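
    That ceiling rule could be sketched like so (all constants are hypothetical; gravity assumed to be a unit vector pointing down, so only its Y component appears):

```python
def min_attach_speed(surface_normal_y, gravity_y=-1.0, base=0.0, rate=8.0):
    """Required speed to stay attached to a surface. The more the surface
    normal points along gravity (a ceiling), the higher the requirement;
    floors and walls require nothing."""
    # Alignment in [0, 1]: 0 for floors/walls, 1 for a flat ceiling.
    alignment = max(0.0, surface_normal_y * gravity_y)
    return base + rate * alignment

def stays_attached(speed, surface_normal_y):
    return speed >= min_attach_speed(surface_normal_y)
```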

    The third issue was a combination of factors; the simplest contributing factor was that my "jump" logic was implemented as an impulse on the velocity straight upward. So, if you were walking on the ceiling, jumping forced you toward the ceiling. I changed the jump impulse to be along the surface normal of the platform, but there were still cases where Sonic would wind up hitching on the wall. And the problem turned out to be, as it was last time, that the rotation logic was putting his body volume in the wall. I tried two things to fix this:

    - First, I scaled Sonic's rotation per-frame according to his velocity. Basically, the faster he was moving, the faster he'd rotate in midair. This would ostensibly permit some time to get distance from the wall before rotating. This fixed some but not all of the wall-hitches, but also made the rotation act wonky during gently-sideways jumps.
    - The more drastic solution was to use a shapecast during the rotation update to determine if the rotation will result in an overlap with a surface and reject the rotation if so. It took me a few attempts to make this visible, but you can see toward the end of the video below how Sonic's hitbox isn't rotating while he's falling next to the wall.



    I'm enjoying tinkering with this prototype, so my next project may be to get layer transitions implemented so we can have a proper loop-de-loop. Either that, or I'm going to switch tacks entirely and put the ideas I've had for an RPG battle engine into a prototype of its own.

  • Rend Registered User regular
    Cornucopiist wrote: »
    So, another request for some basic CS help...

    I have this class in my CommandManager script:
    [System.Serializable]
    public class l2gx_command
    {
        public string commandName;
        public bool isToggle;
        public float commandTimer;
    
        public l2gx_command(string command_name, bool toggle, float command_timer)
        {
    
            commandName = command_name;
            isToggle = toggle;
            commandTimer = command_timer;
        }
    
    }
    

    which is called in this class in the ButtonManager script:
    [System.Serializable]
    public class l2gx_button
    {
        public GameObject buttonObject;
        public Vector2Int xSize; // this shows the leftmost square, then the rightmost square
        public Vector2Int ySize; //this shows the lowest square, then the highest square.
        public string buttonName;
        public l2gx_command command;
    
    
        public l2gx_button(GameObject new_buttonobject, Vector2Int new_vector2Int_X, Vector2Int new_vector2Int_Y, string button_name, l2gx_command _command)
        {
            buttonObject = new_buttonobject;
            xSize = new_vector2Int_X;
            ySize = new_vector2Int_Y;
            buttonName = button_name;
            command = _command;
        }
    }
    

    What I get in the Unity inspector is this:

    ltx4tn0pox62.png


    But what I'd like to do is declare a list of l2gx_commands in the CommandManager and have them selectable in the inspector for the ButtonManager script. Any tips as to where to go would be very appreciated!

    I’m not certain whether or not you’ve solved this problem yet, but if not, based on your description, it seems like those classes are not the only classes in those files. Have you tried moving those classes into their own files, each with a name of <classname>.cs ?

  • Peewi I'm a cube now Registered User regular
    I haven't done any work on a specific game in the last week, but I have been working on a text rendering library for Monogame.

    a57bQ4r.png

    It's not completely from scratch; I'm basing it on someone else's proof of concept, but I've greatly improved performance and added features. It turns out that using a separate texture and draw call for every letter isn't great for performance.

  • Cornucopiist Registered User regular
    edited November 4
    Rend wrote: »
    I’m not certain whether or not you’ve solved this problem yet, but if not, based on your description, it seems like those classes are not the only classes in those files. Have you tried moving those classes into their own files, each with a name of <classname>.cs ?

    No, I haven't, but that sounds like it might be useful.

    I worked out something that works by using ScriptableObjects. I might still try your method, because I don't *love* having each command as a separate object in my project folder. However, what I have right now works, and creating all my commands this way is inarguably less work than a method that works only slightly better.

    All this was triggered by me needing a (start new game) menu on top of the regular game controls:
    -move game controls scripts onto the 3d game controller prefab
    -make that testable and independent of the gamemanager
    -create a better gamestate method and handling (move to enums)
    -remove the touch input from the controller
    -give controller an input queue targeted by the gamemanager touch input method based on the gamestate
    -revamp the commands setup so I can easily configure commands on controllers.

    Can't really remember when I got started, and real life interfered: my SO starting school again, a bad cold, and buying 600kg of IKEA furniture to assemble into a kids' room... but it was quite a lot, and I look forward to just finishing a few small details before my vertical slice is done :biggrin:


    Cornucopiist on
  • KupiKupi Registered User regular
    Kupi's Weekly Friday Game Dev Status Report

    This week's project was to implement the classic Sonic the Hedgehog loop. Even if they don't wind up being a major gameplay element, if you're going to do a Sonic send-up, you've got to include at least one loop. Fortunately I already handled the part of the physics engine that observes changes to a volume's collision layer, so this was mostly just a matter of creating some game objects with the appropriate settings and adding a few new behaviors to the Sonic character controller. And then, of course, debugging the issues that stressing my code in these new ways revealed.



    I started with a simple design for how layer transitions would be handled, realized that the simple design wasn't viable for various reasons, came up with a more complicated design, and then revisited the issue after a day or two of sketching out proposals in my notebook and determined that something more like the simple design was actually viable. Here are the core components to how it works:

    - Like in Unity or Unreal, the physics system has a set of "layers". Every collision volume is assigned exactly one physics layer at any time. Physics layers can only interact with the subset of other physics layers that they've been configured to.
    - There are three layers; internally, I refer to them as "A", "B", and "AB". You can also think of them as "Foreground", "Background", and "Both". A interacts with itself and AB, B interacts with itself and AB, and AB interacts with all three layers.
    - While in the grounded state, Sonic updates his layer to match that of whatever tile he's currently standing on. (In the air, he retains whatever his last layer was.)
    - There is a special "layer forcing" component which causes Sonic to assume the specified layer while standing on that tile rather than assuming the layer of the tile itself.
    - The loop tiles themselves consist of a single semi-solid surface each at layers arranged to guide Sonic's progress through the loop. Starting from the "on-ramp" in the lower-right quadrant and proceeding counter-clockwise, their layers are, A, AB, AB, and B.
    - The solid tiles immediately below the loop on either side have the layer-forcing component matching the ramp they approach, forcing the A layer on the approach and the B layer on the departure.
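    The layer rules above boil down to a tiny symmetric lookup table. As a loose illustration (a Python sketch with invented names, not Kupi's actual engine code), it might look like:

    ```python
    # Hypothetical sketch of the A/B/AB layer rules described above;
    # names are invented for illustration.
    INTERACTS = {
        "A":  {"A", "AB"},
        "B":  {"B", "AB"},
        "AB": {"A", "B", "AB"},
    }

    def layers_collide(a: str, b: str) -> bool:
        """True if volumes on layers a and b should test for collision."""
        return b in INTERACTS[a]
    ```

    Note the table is symmetric: A and B never see each other, while AB sees everything.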

    As ever, this probably works better visually than verbally, so here's a diagram of the construction:

    o63zt0mcu0i3.png

    The idea is that as Sonic approaches from the left, he's forced onto layer A by the solid tiles he's walking on. Even if you approach the loop by jumping from an AB tile, the semi-solid surfaces will permit him to enter the loop, and he'll come into contact with whatever surface his trajectory takes him toward from there. (Saying this made me go check; it is in fact possible to subvert the loop by carefully jumping from an AB tile such that you land on the walkable part of the "off-ramp". Ah well, call it speedrunner tech.) In any event, Sonic will eventually strike the "on-ramp" tile, switching to that tile as his parent object and remaining on layer A. He continues onto the ceiling tiles (which he can collide with because layer A interacts with layer AB). While on the ceiling, he's on layer AB, so he can now interact with the "off-ramp" tile on layer B. He can interact with the solid tiles past the end of the loop because layer B can interact with layer AB, and since he's on layer B he now ignores the "on-ramp" on layer A. Voila! Looping!

    I ran into two significant problems getting this set up. The first was with the nature of semisolid surfaces in my engine. Previously I'd implemented semisolids by having them reject any shape-cast attempted against them which began from an overlapping position. That is, you could only collide with a semisolid surface by moving into it. However, in actual practice what I discovered was that Sonic would arbitrarily detach from whatever part of the loop he was currently on. What I discovered was that the ray-cast the platforming system uses to determine if the character is still above a walkable surface was failing because the point was starting exactly on the surface. Basically, the collision volume was moving juuuuuuust enough that it didn't strike the next line segment on the curve, but also close enough for the code that checks if a point is on a line segment to come up true. Furthermore, it was also possible to drop out of the loop if you fell off the ceiling, for the same reason you could get stuck in the wall in the previous post-- Sonic's checking that his rotation won't cause him to overlap with a surface, but the semi-solid surfaces will never report an overlap! This vexed me for a while until I realized that the physics system is the only thing that actually cares about ignoring overlaps involving semi-solid surfaces. So I restored the ability of semi-solid surfaces to report an overlap, but now the physics engine "one layer up", so to speak, is the one responsible for saying "that collision involved an overlap with a semi-solid surface, so it doesn't count".
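    The shape of that fix is simple once stated: semi-solids now do report overlaps, and the layer above filters them out. A rough sketch (Python, with invented names; not the actual engine code):

    ```python
    # Hypothetical sketch: the narrow phase reports every hit, including
    # overlaps with semi-solid surfaces, and the physics layer above decides
    # which hits "count". Other systems (e.g. rotation checks) still see
    # the raw overlap.
    from collections import namedtuple

    Hit = namedtuple("Hit", "began_overlapping surface_is_semisolid")

    def collision_counts(hit: Hit) -> bool:
        """A hit that began overlapping a semi-solid surface doesn't count."""
        return not (hit.began_overlapping and hit.surface_is_semisolid)
    ```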

    After I fixed that, the other problem I encountered was that Sonic would randomly go back through the loop. The problem there was that I "float" collision volumes that come into contact to reduce the chance that they'll re-collide early in the next frame. And what happened was that the aforementioned raycast used to check if the character is still on a platform (any platform) was striking the start of the loop after attaching to the platform below it.

    bifhur23xvyr.png

    In the span of a single frame, what happens is Sonic (the blue triangle) moves along the end of the off-ramp while on layer B, strikes the solid tile on layer AB and attaches to it. He then floats above the platform which places the start of the raycast above the on-ramp tile. Attaching to the solid tile moves Sonic to the AB layer, which means that his platform-continuity raycast (that tiny little red line) takes place on AB. It catches the on-ramp, and Sonic attaches to it... again. So the loop... loops. That's why the layer-forcing component became necessary; it keeps Sonic on layer B so he doesn't collide with the on-ramp again before leaving the loop.

    This next week I'll be taking a break from gameplay prototyping and revisiting my project editor. The little papercuts involved in using the thing are starting to add up, so I think it's time for a quality-of-life pass on the tool I'm ostensibly going to be spending a lot of time in. See you next week!

    My favorite musical instrument is the air-raid siren.
    PeewiElvenshaeIanatorZavianKoopahTroopah
  • ScooterScooter Registered User regular
    Thought I'd give a bit of an update on my game sales since my release last month. I've been doing basically no additional marketing or anything, spending my time prepping my next game (and of course the day job), so I'm pretty much already into the long tail stage it seems.

    Steam sales have steadied out into an average of 2-4 sales a day, with an occasional peak of 10 or so. At that rate it'll take me 2-3 years to earn back what I spent (mostly art commissions), but hey, it's passive income now. Itch.io sales have dropped to almost nothing once the release week was over. And Gamejolt...still only has 1 sale on it. Honestly, GJ didn't take a lot of work to set up but it didn't even really pay off for that, I doubt I'll bother with it for my next game. And aside from that, the Patreon's been going fairly steady (if very modest) despite the fact that I won't be releasing any new game content til 2022 sometime. Overall my sales are at about 80/20 Steam/itch, but likely to skew way more towards Steam over time. I suppose Steam's algorithm actually manages to get people to look at games, who knew.

    As for my next game, I've been spending a lot of time setting up the new dialogue system and character generation. I've been working to make the dialogue system much more flexible than my last game, which had a static number of option buttons of a fairly small size, and scenes were limited in the number of parts they could have unless I did special case code for them. Scenes in Sequel are much more modular, with chains of text and interaction buttons, which can now be displayed dynamically and contain entire sentences or action descriptions in them. I've also been doing some extra work on the text switching system - as a text-based game with lots of variable character genders, being able to dynamically set pronouns and other words in scenes has always been a major task. In First Game I had to set up the full list of switches for every scene, but in Sequel I'm setting things up to handle a lot of the most common words as part of the system, which will hopefully make writing scenes a bit easier. It's all a bit more complicated to set up, but I'm hoping Future Me will appreciate it.
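    The common-word switching described above can be sketched as a template-substitution pass. This is only a guess at the shape of such a system (Python, invented placeholder syntax and names), not Scooter's actual code:

    ```python
    # Hypothetical sketch of dynamic pronoun switching for variable-gender
    # characters; the pronoun sets and {placeholder} syntax are invented.
    PRONOUN_SETS = {
        "she":  {"they": "she",  "them": "her",  "their": "her"},
        "he":   {"they": "he",   "them": "him",  "their": "his"},
        "they": {"they": "they", "them": "them", "their": "their"},
    }

    def render(template: str, pronoun_key: str) -> str:
        """Replace {they}/{them}/{their} placeholders for one character."""
        out = template
        for placeholder, word in PRONOUN_SETS[pronoun_key].items():
            out = out.replace("{" + placeholder + "}", word)
        return out
    ```

    Handling the most common words in the system like this, rather than per-scene switch lists, is exactly the kind of thing Future Me tends to appreciate.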

    CorsiniLilnoobs
  • PeewiPeewi I'm a cube now Registered User regular
    I actually put my text rendering in my game.

    Before (almost, I had already updated the FPS display. But the HUD text is still pixelated.):
    bjgPhaL.png

    After:
    9pWAfPV.png

    Switch: SW-6132-4331-5349 || Steam profile
    Nz3ekNM.png
    DrovekElvenshaeLilnoobsZavianKoopahTroopahDisruptedCapitalist
  • ZavianZavian universal peace sounds better than forever war Registered User regular
    Unity 3D Asset Store is doing a FREE giveaway of three assets, including a pretty neat looking Love/Hate system
    8KS1CBh.jpg
    https://assetstore.unity.com/publisher-sale#free-asset-giveaway-MGg6

    KoopahTroopahElvenshaeDisruptedCapitalist
  • ScooterScooter Registered User regular
    Was just planning a relationship system for my next game, so that could come in well-timed.

    ElvenshaeZavian
  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    edited November 8
    How does the first load of a UE5 project take 15 minutes? Why are there like 5000 shaders to compile to start the editor? This is ridiculous

    Why are there 10000 shaders to prepare once it has launched???

    Phyphor on
  • KupiKupi Registered User regular
    Phyphor wrote: »
    How does the first load of a UE5 project take 15 minutes? Why are there like 5000 shaders to compile to start the editor? This is ridiculous

    Why are there 10000 shaders to prepare once it has launched???
    Carl Sagan wrote:
    If you wish to make an apple pie from scratch, you must first invent the universe.

    :lol:

    But, more seriously, that's sort of why I think of Unreal Engine as a "built for professional applications" engine. The default assumption is that you're building a AAA title on some kind of 128-gig RAM machine, and that 15-minute startup is peanuts compared to the four-year dev cycle.

    (It is also entirely possible that UE5 is actually just hilariously inefficient at this juncture.)

    My favorite musical instrument is the air-raid siren.
  • ScooterScooter Registered User regular
    I recently updated to the latest Unity for my new project, from 2017, and I've noticed it's a lot slower now too. Not really sure what it needs to do now that it didn't before.

  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    I'm mostly curious as to what it is even doing. There are no assets at this point other than the editor's own assets, and surely it doesn't need that much.

    I want to try UE5 because I think the features will be very helpful. I want to prototype a Factorio-like, and I've been playing a lot of DSP/Satisfactory lately; it's clear where there are concessions to them being 3D.

    Factorio is mostly Just A Bunch Of Quads, and it has an order of magnitude or two more things going on than the 3D games. I'm curious whether Nanite can allow Factorio-scale object counts.

  • PeewiPeewi I'm a cube now Registered User regular
    At least shader compilation isn't something that has to be done from scratch on every launch. I assume. I don't actually have UE4/5 experience.

    I just learned that I can specify my own vertex format in Monogame. This lets me use vertex colors to specify separate fill and stroke colors on my text.

    Switch: SW-6132-4331-5349 || Steam profile
    Nz3ekNM.png
  • PhyphorPhyphor Building Planet Busters Tasting FruitRegistered User regular
    Peewi wrote: »
    At least shader compilation isn't something that has to be done from scratch on every launch. I assume. I don't actually have UE4/5 experience.

    I just learned that I can specify my own vertex format in Monogame. This lets me use vertex colors to specify separate fill and stroke colors on my text.

    Fortunately not. I can deal with the 20 seconds launch-to-editor time especially since it can do dynamic reloads

  • Mc zanyMc zany Registered User regular
    Probably the starter assets.

  • HandkorHandkor Registered User regular
    The first time you launch Unreal Engine it's painful, as it builds its full cache of shaders; after that it loads very quickly. What it's doing is setting up all the shader variants for your current settings. If you go and turn on raytracing, it's going to rebuild everything for use with raytracing. Switching back, though, won't take any time, since the cache is already built. That is, until you delete that cache because it's taking 60GB just because you loaded some asset pack, and then you have to wait all over again.

  • KupiKupi Registered User regular
    Kupi's Weekly Friday Game Dev Status Report

    An apology in advance: I've been posting a lot of (comparatively) fun video footage; this week I actually did a lot of work but it's all in that ivory-tower bullshit space that takes a lot of words to describe and doesn't have a lot of presentable outcomes. So, I'll try to be brief in describing what I tried to accomplish this week (interjection from Future Kupi: I was not brief).

    Essentially, I spent the entire week white-whaling in the serialization systems. In a previous game project, I used binary input and output for content files, which had a lot of little pain points (and ultimately led me to using something more standard for this project). But I thought I might have a way to work around those pain points and get all the file-size advantages of a binary format, so I got myself into a clean state on the ol' git repo and launched an exploratory branch.

    In my previous project, any object capable of going into or out of a file was obligated to implement two functions: one to write itself out to the file, and one to read itself in from the file. There were helper methods for reading various objects in or out, but by and large it was a pretty manual process. This structure was the source of two major recurring types of errors:
    - A mismatch between the input and output functions resulting in data corruption or EOF-prompted crashes because you weren't at the point in the binary stream you thought you were.
    - Adding a new data member and then forgetting to add it to the serialization method(s).

    The solution I came up with to the first problem was instead of having methods on the objects to directly save or load them, a serializable object is obligated to return a set of data contracts, each of which is an ordered set of individual value serializer objects, which have a getter and a setter function that sets the value on whatever object you're trying to load. The trick is that you can't initialize the individual value serializer without having both the getter and setter in hand, so there's no chance of forgetting to add a new property to either the saving or loading function. And separating the data contracts into individual versions means you can load a previously-saved object into your most current version of the object.
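    A rough sketch of that getter/setter pairing idea (Python with invented names; Kupi's actual version is C# and adds versioned contracts):

    ```python
    # Hypothetical sketch: a value serializer must be constructed with BOTH
    # a getter and a setter, so neither the save nor the load side can be
    # forgotten when a new field is added.
    class ValueSerializer:
        def __init__(self, key, getter, setter):
            self.key, self.getter, self.setter = key, getter, setter

    def save(obj, contract):
        return {s.key: s.getter(obj) for s in contract}

    def load(obj, contract, data):
        for s in contract:
            s.setter(obj, data[s.key])
        return obj

    # Example: a two-field contract for a made-up Ball object.
    class Ball:
        def __init__(self):
            self.x = 0.0
            self.r = 0.0

    BALL_CONTRACT = [
        ValueSerializer("x", lambda b: b.x, lambda b, v: setattr(b, "x", v)),
        ValueSerializer("r", lambda b: b.r, lambda b, v: setattr(b, "r", v)),
    ]
    ```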

    However, when I ran into collection types (especially some of the weird dictionary structures in my project) I realized that it was going to take far too much effort to get this binary-serialization engine going for too little payoff, so I abandoned that line of reasoning entirely.

    From there, I moved on to trying to replace the Newtonsoft.Json library I'm using for handling JSON files with the equivalents from System.Text.Json. My three main motivations for that are 1) I'd rather use a built-in library when I can, and System.Text.Json was architected by the guy who wrote Newtonsoft.Json anyway, 2) System.Text.Json is built with a number of optimizations designed to keep allocations low (for instance, it has systems that can deserialize to struct types without boxing, and I use struct types heavily), which is important because I'll frequently be loading files from disc during gameplay and want to trigger as few GCs as possible, and 3) System.Text.Json is more standards-compliant than Newtonsoft.Json, meaning it lets you do fewer screwy things with your JSON structures.

    ... unfortunately, it turns out that MonoGame's main console-compatible version only works with .NET Core 3.1, and the version of System.Text.Json that shipped with that version of .NET Core has a very minimal feature-set. For instance, it can only handle public properties of objects, when I've been working predominantly in public fields on structs*. This is one of those roadblocks that isn't so much difficult as it involves touching a pretty agonizing number of lines of code. Which, in fairness, resulted from my use of non-idiomatic practices in the first place.

    On an entirely different subject, I'm finally getting some wind in my sails on the retraining-for-paid-labor front; this week I spent close to two hours a day working on my other game-dev project, a low-featured Pokemon knockoff played in the console (for now). The idea is that games touch a large enough portion of the fundamentals of coding (console I/O, data structures, file I/O, building, including content files in a project, networking (in some cases), and so on and so forth) that building a small game is a good way to get familiar with the way a given language or environment handles those things. I'm making this game first in C#, and then I figure I can move on to other languages I want to learn once that's finished. So far, I've built out most of the core game data classes, a data-driven event system, and a basic console-based here's-the-situation-pick-your-option player interface. As that project progresses, I'll continue to report back on it.

    For the next week, I'm continuing the transition to the new JSON library, and really feeling like I should put some time into art practice.


    * It is hilarious to me, in a frustrating way, how "don't use mutable structs" has become something of a religious utterance in the C# questions at StackOverflow more than a reasoned position.

    My favorite musical instrument is the air-raid siren.
    IncenjucarIanatorElvenshaeDisruptedCapitalistCornucopiist
  • halkunhalkun Registered User regular
    edited November 14
    So I picked up a project I abandoned about a year and some change ago, and have invested at least three weeks of work into it. After I get over my few bugs and get a stable base, I'll be reorganizing the source files and uploading to my private GitHub to work on it "for reals" with an allocated time bank during the week.

    It's called (for now) Blind, and here is a screenshot.
    pkGD3f2.png

    I am not joking...

    This game is (are you ready for this?) a 3D first-person action adventure game set in a techno-fantasy world ala "Phantasy Star". You play as "Alex", a neophyte adventurer who is utterly and completely blind. The plot is slowly morphing as I add technology to the engine.

    "But Halkun," I imagine you asking, "how on Earth do you play a 3D first-person game with no graphics?"
    I was inspired by this YouTube video of a blind guy playing Zelda: OoT. The tricks he uses to navigate are really neat, but you know... what if you made a game just a *touch* more blind-friendly?

    So that's what I'm doing. The whole system will be based on 3D audio cues (OpenAL) for navigating the world.

    I'm also writing my own engine. Right now you can walk into different rooms, align to the cardinal directions, slide against the walls, and fall. Nothing earth-shattering yet. Audio isn't a thing yet either, as I have some scaling issues to work out, but once things are stable enough that I can make rooms in my editor and have my friend navigate around on the ground with obstacles, I'll be adding in the sound and game objects.

    Wish me luck.. This is going to be fun!

    ==NINJA EDIT==

    Here is an update from another thread of my progress..

    I have my sectors all good (for the most part) now I have to do.... scaling.

    I'm using the Duke 3D "build" map format for my game. I like the "build" map format because it is so simple. Doom's WADs are too complicated: this way I don't have to worry about nodes, linedefs, sidedefs, etc., and I get a level editor for free! (As long as I'm not using any of the Duke code and only following the map spec, I have a clear copyright too :) The base engine I'm using is CC0, so I guess it's my engine now. Public domain is really rad!)

    Now here comes the fun part.. Scaling.

    My engine came with units in... feet(!) and I'm going to convert that to meters. Its coordinate system is Y+ = north, X+ = east, and Z+ = up. Angles are in terms of tau (2*pi, counter-clockwise).
    In Build, 1 meter = 512 units, and its axes are Y- = north, X+ = east, and Z- = up. Oh, and the Z axis is on a different scale than the X and Y (8,192 units per meter). Angles are 0-2047 units, clockwise.

    The angle conversion is done, and I already flip and scale the Z axis on load. However, when flipping the Y axis, my sector math breaks down, because the sector line-crossing algorithm assumes the sector vertexes are wound clockwise. I'll make do with my levels being flipped.
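    For what it's worth, the conversions described above can be sketched like this (a Python illustration; the constants are the ones quoted in the post, the function names are made up):

    ```python
    import math

    XY_UNITS_PER_METER = 512.0   # Build: 512 horizontal units per meter
    Z_UNITS_PER_METER = 8192.0   # Build: Z uses its own, finer scale
    BUILD_ANGLE_UNITS = 2048     # Build angles run 0-2047, clockwise

    def build_pos_to_meters(x, y, z):
        # Build is Y- = north, Z- = up; the target engine is Y+ = north,
        # Z+ = up, so both axes get flipped as well as rescaled.
        return (x / XY_UNITS_PER_METER,
                -y / XY_UNITS_PER_METER,
                -z / Z_UNITS_PER_METER)

    def build_angle_to_tau(a):
        # Clockwise 0-2047 -> counter-clockwise radians in [0, tau).
        return (-a % BUILD_ANGLE_UNITS) * math.tau / BUILD_ANGLE_UNITS
    ```

    The Y flip is also what reverses the winding order of sector vertexes, which is exactly the clockwise assumption that breaks the line-crossing algorithm.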

    halkun on
    dA03mgx.png
    IanatorHandkorKupikimeElvenshaeDisruptedCapitalistCornucopiistMechMantisMrBlarneyCalica
  • HandkorHandkor Registered User regular
    I had started a similar blind tech demo with no graphics, to try out UE4's new sound engine with positional audio that also supported attenuation and sound absorption through materials. So a wall will actually dampen and filter the sound from a source that is occluded from the player.

    The audio system works well, but navigating with sonar pings and environmental sounds is very hard. Head tracking on a VR headset works well for this, though, as moving your head around to sweep is easier than mouse turning.

    The game was set in a small sub that you moved around a cave environment very slowly.

    ElvenshaeCornucopiist
  • halkunhalkun Registered User regular
    edited November 15
    There are a few "concessions" I added to make the game a little more navigable. First, with the arrow keys or the d-pad you can orient yourself to face north, south, east, or west at perfect 90 degree angles. From there you can use "mouselook" to stand in place and turn left or right to follow a sound. The arrow keys will always snap you to a cardinal direction. In fact, I've had to rely on this, as I can't see the levels I make either, and it has proven quite useful in navigating.
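    That snap-to-cardinal behavior is essentially a one-liner; a hypothetical sketch (Python, headings in degrees):

    ```python
    def snap_to_cardinal(heading_deg: float) -> int:
        # Round a heading to the nearest of 0/90/180/270 (N/E/S/W),
        # as the arrow keys / d-pad do in the description above.
        return round(heading_deg / 90.0) % 4 * 90
    ```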

    The second concession is that when a "game object" appears in the same sector as you, you will "alert". From there you can cycle targets using the bumpers, and an audio cue will tell you whether what you have targeted is hostile. (But with no range info.)

    The third concession is a very, very basic illumination spell that takes 0 MP to cast. On its face, it's an absolutely worthless spell, but you can fling the ball of light against a wall or at a targeted object to see how far away it is. However, it does next to no damage, and it does a great job of revealing where you are to enemies, so there are drawbacks.

    Also, to help with navigation, every footfall while moving is exactly one meter apart, so you can use that for pace-counting. Crouch-walking will keep you from going over edges (while moving at a slower pace), and while crouched, swinging a melee weapon will swing downward into the space in front of you.

    The floors and walls will have materials, so they will sound different when struck (doors for opening, vines for climbing, wood bridges, stone paths). Also, while next to a wall, you can slide against it and it makes a noise.

    For the most part, moving your head up and down does nothing, except while swimming, where it sets whether you are diving down or surfacing. The middle mouse button, or walking, will snap you back to looking straight ahead.

    If you are feeling lucky, you can "fire blind" with no targeting.

    Oh, and one last thing: inventory rotation is on the other bumpers. Some items will have "subsections" while selecting them. For example, rotating through your weapon inventory will play the sound of each item, and while holding an item (before you release the bumper) you can manipulate some aspect of it. For example, while "holding" the bow, you can press action and it will click through the arrows in your quiver so you can count them.

    Little ideas like that keep coming up. My story is still a little light, though; I'm sure each challenge I come across while developing will inform the plot.

    One more mechanic is a "Lighting Save" and "Lighting Load" which is the equivalent of a save state.

    I'm very tempted to make it a comedy.



    halkun on
    dA03mgx.png
    ElvenshaeHandkorCalica
  • CornucopiistCornucopiist Registered User regular
    halkun wrote: »

    The third concession is a very, very basic illumination spell that takes 0 MP to cast. On its face, it's an absolutely worthless spell, but you can fling the ball of light against a wall or at a targeted object to see how far away it is. However, it does next to no damage, and it does a great job of revealing where you are to enemies, so there are drawbacks.
    ....

    The floors and walls will have materials, so they will sound different when struck (doors for opening, vines for climbing, wood bridges, stone paths). Also, while next to a wall, you can slide against it and it makes a noise.

    Combine those to make 'throw a pebble'. The pebble will hit the wall and make a sound informing the player of timing and the material. If you really need it for seeing players, you can make a luminescence on impact.

  • HandkorHandkor Registered User regular
    I wonder if there would be a market for audio-only games where the input is limited to tapping/jostling your phone while it's essentially in your pocket or hand. Audio games for the bus: all you need is to put on earbuds and close your eyes while you tap on your phone in your hand, without needing to look.

    Easy to do for choose-your-own-adventure audio books, but something more involved where the accelerometers pick up the direction of your tap.

    IncenjucarIzzimach
  • IncenjucarIncenjucar VChatter Seattle, WARegistered User regular
    Handkor wrote: »
    I wonder if there would be a market for audio-only games where the input is limited to tapping/jostling your phone while it's essentially in your pocket or hand. Audio games for the bus: all you need is to put on earbuds and close your eyes while you tap on your phone in your hand, without needing to look.

    Easy to do for choose-your-own-adventure audio books, but something more involved where the accelerometers pick up the direction of your tap.

    I've done a lot of theory crafting on this one while developing voice games. Not sure about the phone, but you can add devices to Alexa skills like keyboards and controllers.
