Note that the license states you can only use them with CryEngine, so don't buy it thinking you can just rip them into your UE4/Unity project or whatever.
That was a nice write-up of separating state from game objects and TDD.
One thing to add: if you are writing Unity code this way for testing purposes, you are going to have to think about how time is used and tested in Unity and your game. Any dependency on Unity's built-in time has to be abstracted so you can ensure consistency in tests, i.e. never call Time.deltaTime directly. I use a separate interface and class with one time instance per scene. The nice thing is that once it's abstract, a timer can be sped up, slowed down, or potentially reversed, or depend on another source of time (say, if you want to pause time globally but have more than one clock/time source, or just have a simple switch which elapses after a period of time).
It does add a bit of bookkeeping to game objects, but having testable components is worth it. They have to find the global time source via either a singleton or a parameter to the game object, and then pass that to the state object (via its constructor). The timer scripts also have to run prior to the game objects which depend on them, which can be set in the editor (under Edit > Project Settings > Script Execution Order).
This is what I use atm.
public interface ITimeKeeper {
    /**
     * Time passed since the last update.
     */
    float DeltaTime { get; }

    /**
     * Time since the source started.
     */
    float Time { get; }

    /**
     * Unmodified time since the timer started.
     */
    float RawTime { get; }

    /**
     * Amount of time spent paused.
     */
    float PauseTime { get; }

    /**
     * Scale at which time passes. By default it is 1.0f.
     *
     * Setting this to 0 will cause no time to pass when running or paused.
     *
     * This affects Time, PauseTime & DeltaTime.
     */
    float Scale { set; get; }

    /**
     * Pause regular time and keep track of time spent paused.
     *
     * This is not the same as stopping.
     */
    void Pause();

    /**
     * Resume keeping time.
     *
     * Does nothing for stopped or new timers.
     */
    void Resume();

    /**
     * Start counting time.
     *
     * Does nothing if running or paused.
     */
    void Start();

    /**
     * Stop counting time altogether.
     *
     * Properties of the source such as time elapsed will be frozen
     * until Start is called.
     */
    void Stop();

    /**
     * Called once per frame before any dependencies.
     */
    void OnUpdate();
}
There is a default time keeper which is updated once per frame, before any other scripts are run.
An alternative to this approach would be to pass elapsed time from Unity to the state object whenever an Update call happens. I prefer the approach above because it makes pausing, and having separate timers progressing at different rates, much easier.
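For reference, here's a minimal sketch of what that default time keeper might look like. This is my own reconstruction, not the actual class (DefaultTimeKeeper and its internals are made up); it assumes the timer is a MonoBehaviour fed by Unity's clock once per frame:

using UnityEngine;

// Hypothetical default implementation of the ITimeKeeper interface above,
// driven by Unity's clock once per frame.
public class DefaultTimeKeeper : MonoBehaviour, ITimeKeeper
{
    private enum State { New, Running, Paused, Stopped }
    private State state = State.New;
    private float scale = 1.0f;

    public float DeltaTime { get; private set; }
    public float Time { get; private set; }
    public float RawTime { get; private set; }
    public float PauseTime { get; private set; }
    public float Scale { get { return scale; } set { scale = value; } }

    public void Pause() { if (state == State.Running) state = State.Paused; }
    public void Resume() { if (state == State.Paused) state = State.Running; }
    // Note: Unity also calls Start() automatically, so the timer starts with the scene.
    public void Start() { if (state == State.New || state == State.Stopped) state = State.Running; }
    public void Stop() { state = State.Stopped; }

    // Set to run before dependent scripts via Script Execution Order.
    public void OnUpdate()
    {
        DeltaTime = 0.0f;
        if (state == State.New || state == State.Stopped) return;

        float dt = UnityEngine.Time.deltaTime; // the only place Unity time is touched
        RawTime += dt;                         // unmodified, keeps counting while paused
        if (state == State.Paused)
        {
            PauseTime += dt * scale;           // Scale affects PauseTime too
        }
        else
        {
            DeltaTime = dt * scale;
            Time += DeltaTime;
        }
    }

    private void Update() { OnUpdate(); }
}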
A programmer here recently made a very compelling argument ~against~ test-driven development. The gist is that time spent writing tests is time spent not writing code, and there's no way you'll cover all possible states of your object(s) in your tests. He linked a study indicating that unit testing had virtually no impact on how many bugs a team produced for a given project size.
He suggested using asserts heavily and following design-by-contract (basically, asserting the pre- and post-conditions of a function before writing the function) instead: there's less back and forth between writing your program and writing the program to test your program, and by leaving the asserts in the release version of your software you're testing all the states your program will actually be in.
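To make that concrete, design-by-contract in C# boils down to asserting the contract at the function's boundaries. A throwaway example (mine, not from the study; all names made up):

using System.Diagnostics;

public static class Health
{
    // Contract: damage must be positive (precondition) and the returned
    // health must stay within [0, max] (postcondition). The asserts both
    // document the contract and fire the moment any caller violates it.
    public static int ApplyDamage(int current, int max, int damage)
    {
        Debug.Assert(damage > 0, "precondition: damage must be positive");
        Debug.Assert(current >= 0 && current <= max, "precondition: current health in range");

        int result = current - damage;
        if (result < 0) result = 0;

        Debug.Assert(result >= 0 && result <= max, "postcondition: result stays in range");
        return result;
    }
}

One wrinkle with the "leave them in release" advice: System.Diagnostics.Debug.Assert compiles out of builds that don't define DEBUG, so you'd want Trace.Assert or a hand-rolled always-on assert for that.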
I am not wholly surprised that the bug count produced is identical, to be honest. If you don't think to address a case, you probably won't think to test for it either.
That being said, having an automated test framework and suite makes regression tests immensely easier to implement, gives you very strong answers for how to test code which relies on big systems like databases and networks, and removes a significant amount of uncertainty and fear from the process of refactoring.
Saying "time spent writing the code to test your code is not time spent writing code" is nonsense, since everyone should be testing their code anyway, and that takes time. Whether you're using TDD or not you spend a ton of time testing code. The question is, do you test it with automated tests or do you run it through manually?
The argument isn't that you shouldn't test your code, it's that TDD specifically is more likely to produce code that's good at passing tests, but it's not particularly better at producing code that serves the stated purpose. You should deffo write tests--but the back and forth of write test, modify code to pass test, modify test to fail, etc. probably isn't the best way to test or write or structure your code.
IDK. The best approach is project and team specific, ultimately.
Okay, but assuming you're 100% right, isn't code that is easier to test and also serves its purpose strictly superior to code which serves its purpose the same way but is also more difficult to test?
I have actually found that my code adopts a far superior structure when I use TDD than when I don't. The thing is that structures which are easy to test and structures which follow good design are almost the exact same set.
As a person who resisted TDD for a pretty long time (for a lot of the reasons you've already mentioned) before finally giving it a shot at work, I have been nothing except impressed with every aspect of how it works, and how I work with it. I am pretty certain it's saved me WAY more hours than it's cost me on debugging alone.
At the end of the day for this thread I definitely advocate using whatever method you want, because for the most part this thread is full of people who are making games in their free time because we think it's awesome. If that method is TDD, then I'm excited to keep showing people how it works.
I much prefer integration tests to unit tests, because you can pass unit tests and still fail to actually work. Unit testing does force you to think about your interfaces, though.
I'm looking at that Humble CRYENGINE bundle and I'm wondering, if you were looking to try and make a realistic looking exploration-type game, would CRYENGINE be the way to go?
Not if you want to actually finish making a game at any point.
Granted, while this is rendered in realtime at 30fps and 1440p on a standard GTX 980, the camera shots they've chosen are obviously really specific, to keep excessive detail from causing frame drops. I highly doubt it can maintain this kind of quality across a fully realized world. So it's definitely not showcasing the power of the engine in terms of games, but it totally looks like it's breaking ground for film & CG sequences.
What's more interesting imho is that their rendering doesn't look crap for a change. That's a good looking demo!
https://youtu.be/3LsKw-OGRP4
Close to four hours of research and experiments today to get her limited to that wall, and having a walking animation of sorts.
All I have in my head is black office lady + final fight.
So after several hours of trying to make this work in GML to simulate brawler movement, I gave up and looked online, and decided to make my office lady game look like https://www.youtube.com/watch?v=h4ZFuU_BPOo
code for anyone curious.
x = BLShadow.x
if (grounded) y = BLShadow.y;
if (grounded) && (key_jump)
{
    grounded = !grounded
    y = (BLShadow.y - jumpspeed + grav); // bosslady's y = her shadow's y, minus jumpspeed, plus gravity, for every frame
    while !(grounded)                    // <-- the hang: nothing in this loop ever changes y, so when y < BLShadow.y GML spins here forever within a single step
    if (y >= BLShadow.y)
    {
        if (y > BLShadow.y) y = BLShadow.y;  // these two branches are redundant: a bare y = BLShadow.y; covers both
        if (y == BLShadow.y) y = BLShadow.y;
        grounded = true;
    }
}
line 8 is what shit the bed.
GDC IS OVER, fuck
I met so many cool indies and AAA devs alike! I had a brief conversation with Willy Chyr abt shaders, watched Zach Gage demo a sick card game, and was blown away by a talk by Katherine Cross
Drank a lot
I'll be transcribing the notes I took for the talks I attended at some point this week. I'll post 'em here!
Saying "time spent writing the code to test your code is not time spent writing code" is nonsense, since everyone should be testing their code anyway, and that takes time. Whether you're using TDD or not you spend a ton of time testing code. The question is, do you test it with automated tests or do you run it through manually?
Also, a big point of TDD is future-proofing: you make sure that new code doesn't break old code, because old tests will start to fail. I see writing a test as a very small time expense considering the automation it gives me.
But in the world of web apps, on a tiny team it's definitely a pretty large time investment. I know the generally-accepted wisdom is that tests are great, but my personal experience is that you end up with two code bases -- a small one, your application, and then another that's huge, your tests. Maintaining tests -- especially fixtures and UI-related tests -- ends up taking a ton of time.
My experience in the last five years of web app programming (half of that for a startup) is that it's a tradeoff. Tests ensure stability. But they cost you time. If you have the time, go for it. But if you don't, don't sweat it -- just work as quick as you can on the product itself and get it done.
I definitely do not buy the argument that it saves you time. That has not been my experience at least. It gives you confidence, it helps make your application more stable, but it does not save you any time, not even in the long run.
Edit: Also, you can decide to have tests for some parts of your application but not others. At my current job, we have pretty good test coverage of our API, because we want it to be ROCK SOLID in terms of stability. And it's much less time consuming to write tests for an API than for our website. For our API, we have valued stability over development speed.
Our website, though, used to have tests, but we decided they took too much damn time to maintain, and we'd rather be coding new features as quick as we could. There, we value time saved and quick iteration on new features over stability.
My previous job was on the backend platform for online games, and tests were definitely a godsend when we had them. If a test broke because things had changed too much since you wrote it, then that change probably would have broken production.
I get the feeling you're talking more about testing the GUI as an end user would see it, like with Selenium or Sikuli. Yes, those are horrible to maintain, but they can also be worth it compared to a manual test. I guess it depends on what parts of your product are changing, and how frequently you deploy.
Here's my GDC notes, somewhat literally transcribed. I've paraphrased the talk names because me lazy:
edit: FROM HELL'S HEART, I STAB AT THEE NEW SPOILERS. Just quote the post I guess if you want the content? fffffff
Animating Quadrupeds in The Flame in the Flood - Gwen Frey, T. Artist
Talked about 2 creatures needing different solutions: a stout, stubby boar and a slinking wolf. Major difference: the wolf's front and hind paws needed to function independently for turning. The wolf needed to be able to go many different speeds; the boar is a charging animal and would mostly go one speed
Biped techniques are no good
Major motions: Forward, Turning, and transitions from start/stop and stop/start
Forward Motion
Boar
Footfalls at same frame for all speeds! Okay since it didn't vary speed much. If the boar sped up/slowed down, speed the anims up/down appropriately.
Linear blending looked awful for just a walk/run. Had more animations for slower speeds. Ex. anim for 10% speed, 25% speed, 50% speed, 100% speed
Wolf
Foot phasing did not line up for all speeds. Solution: find a midpoint between walk/run where the phasing aligns, do a QUICK blend there. Didn't need transition anims as a result.
Turning
Look-ats are super unnatural if you just turn the head. Animals don't just turn their head! They turn their whole body, moving forward. Front paws turn immediately, back paws tend to "catch up" as the animal moves.
Boar
Desire was for the boar to always look at the player. No inter-spine animation really needed (the boar was stubby), so accomplished with blendshapes. Had a forward running shape, strafing-left shape, strafing-right.
Wolf
No special anims to separate front/rear movement. Measured the angle between the wolf's current facing and its desired facing (the player), distributed that angle through each spine bone. Required that the wolf be moving forward to turn; linked turn rate with movement rate to prevent foot slipping.
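If you want the flavor of that spine trick in code, here's a rough Unity-style reconstruction (mine, not Gwen's; every name here is made up):

using UnityEngine;

// Distribute the yaw error between current facing and desired facing
// evenly across the spine bones, so the body "catches up" along its length.
public class SpineTurner : MonoBehaviour
{
    public Transform[] spineBones; // ordered hips -> head
    public Transform target;       // e.g. the player

    void LateUpdate() // after animation, so we layer on top of the pose
    {
        Vector3 toTarget = target.position - transform.position;
        toTarget.y = 0f; // yaw only

        float angle = Vector3.Angle(transform.forward, toTarget);
        if (Vector3.Cross(transform.forward, toTarget).y < 0f)
        {
            angle = -angle; // signed: negative means turn left
        }

        float perBone = angle / spineBones.Length;
        foreach (Transform bone in spineBones)
        {
            bone.Rotate(Vector3.up, perBone, Space.World);
        }
    }
}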
Start/Stop Transitions
W/ bipeds, only need to worry about which foot is planted and transition the other foot downward. With quadrupeds, all sorts of foot combinations might be planted. Static anims for each transition were no good!
Boar
Wanted to avoid runtime IK. Solution was to author a "foot-down" version of each movement anim, identical but with the feet's vertical position locked to the ground. The anim is played in sync w/ the walk anim, 0% blend until the creature stops. On stopping, the foot-down is blended in and both animations are paused. When movement starts again, the normal walk anim is blended in and sped up to normal.
B/C foot placement when idle is determined by where the walk anim was paused, idle anims were additive and didn't include foot/leg movement. To avoid drifting legs, the boar's hooves were parented directly to the root and the leg hierarchy was reversed (the "hip" bone being the last in the chain). An odd solution, not viable across production.
Wolf
Used walk anim freezing like above, but used runtime IK instead of a foot-down anim so they could do better idles than with the boar. Conclusion: runtime IK is probably a necessity for good quadruped anims (or good transition anims)
Animating With Math - Natalie Burke, T. Artist (Destiny)
Summary: Vertex animation in shaders! Typically seen in vegetation (grass and trees blowing in wind) or cloth (tarps blowing in wind), but used all over the place in Destiny to good effect
Driven in Destiny w/ gameplay properties, basic math (sin, time, etc.) and COPIOUS usage of vertex colors. Like, 3+ sets of vertex colors on a model. Dang.
Usually applied a dampening function to gameplay properties to prevent instant value changes.
Vertex color uses:
Velocity vectors (RGB) or phase offsets (B) to wave functions.
Baked blendshapes! (vertex color acting as a vector displacement map). Limited to linear blends or pivots about a single point, but this worked well for certain things (jiggling bits on belts, dangling chains, etc.)
Very easy to mimic dangling rope/chain deformation w/ vertex colors. (Vex?) architecture had lots of mechanical objects w/ chains that go taut, lift an object, drop it down and go slack again. A bone at each end of the rope to attach it; the slackening is all vertex animation.
Wrote tools! Shader editor. Vertex color editor (ex. for authoring gradients easily, sampling color, managing different color sets)
Art of Destruction in Rainbow 6: Siege - Julien L'Heureux, Technical Lead at Ubisoft
Runtime procedural destruction!
Ubisoft History:
2012 - Procedural glass in primarily decals and basic geometry splitting.
2013 - In-house concept for complex geo destruction, boolean operations on geo. Not synchronous, not optimized.
2015 - Final tech for R6:S!
RealBlast, by Ubisoft's R&D group: consists of a code library for runtime destruction, tools for prebaked destruction, a debugger, and a "properties editor" (for authoring different destructible materials)
Runtime destruction limited to planar surfaces, which are projected to 2D, sliced by "cutters", reprojected to 3D space.
Types of destruction in-game categorized as "trapdoors", "barricades", "LOS floors", "breakable walls" (mostly differ in how AI deals with 'em)
Models must be made destruction-ready (airtight geo, interior materials defined). Changed the modelling workflow greatly (how many artists worry about cabinet wall thickness?)
AI needed to be aware of destruction (LOS and pathing changes), sound needed to be aware (a room's acoustics change when half the room is blown away), a mandate for state-based destruction.
An object's state is managed as a tree graph, branches are geo chunks with further chunks as potential leaf nodes. Graph could be pre-calculated (deterministic destruction on props like tables, vases, etc.) or dynamic (wall destruction). (Consistent data structure between baked and dynamic destruction?)
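Reading between the lines, that state tree probably looks something like this (my reconstruction in C#, not Ubisoft's actual types; DestructionNode is made up):

using System.Collections.Generic;
using UnityEngine;

// Each node is a chunk of geometry; destroying a node activates its children.
public class DestructionNode
{
    public Mesh Geometry;
    public bool Destroyed;
    public List<DestructionNode> Children = new List<DestructionNode>();

    // The visible surface is the set of live leaves under destroyed nodes.
    public IEnumerable<DestructionNode> Visible()
    {
        if (!Destroyed) { yield return this; yield break; }
        foreach (DestructionNode child in Children)
            foreach (DestructionNode node in child.Visible())
                yield return node;
    }
}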
"Cutters" authored by artists. Could combine mathematical shapes (voronoi cells, random ellipsoids, etc.) with custom shapes (ex. glass spider-cracking) on a per-material basis. Limited to 2D projection as above.
"Decals" and "Decorators" authored. Decals we know (projected textures), decorators are models place dynamically at site of destruction (pipes and studs in a wall, curled and burnt wallpaper along the edge of a wall hole, etc.). Bullet holes treated as a type of Decorator with stencil shading. Ability to link placement to geo in the scene (ensure studs placed regularly in world space), or to the local space of the destruction site, or random placement. Artist authored!
Performance considerations:
Big cost to updating collision meshes. Ignore small changes (bullet holes) and aggregate groups of changes when possible (if the player blows a bunch of holes into a wall, combine 'em into one big hole). Lots of triangulation into convex shapes.
Debris were not procedural; fragments were instanced and recycled for performance.
Event-driven. Sleeping object->destroy event->destruction sim-> sleeping object. Used FX to hide the resulting delay. Allowed for pre-calculating destruction in some cases (player plants C4? Pre-destruct the surroundings before they trigger it off).
"Time-sliced" functions. Destruction sim not finished in one cpu cycle? Save its state, postpone the continuation of the function to the next appropriate cpu cycle. (Obtrustive state saving written into eligable functions, state saved as a huge thing in memory. Was hard to do in C++. Mentioned something about "reschedulers" I didn't catch).
Need to limit conditions, avoid degenerate states. Don't let debris subdivide too many times (didn't let it subdivide at all for R6:S). Combine destruction objects when possible (See above comment on collision mesh changes)
Debris physics were asynchronous. Holes consistent across clients, but the debris they spawn were not.
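Since the talk was C++ but everything else in this thread is C#, here's my sketch of that time-slicing idea using an iterator as the saved state (not Ubisoft's code; Chunk is a stand-in):

using System.Collections.Generic;
using System.Diagnostics;

public class Chunk
{
    public void Recut() { /* expensive geometry work goes here */ }
}

public static class TimeSliced
{
    // A long destruction job written as an iterator: each MoveNext() does one
    // chunk of work, and the enumerator itself is the state saved between frames.
    public static IEnumerator<bool> DestroyWall(List<Chunk> chunks)
    {
        foreach (Chunk chunk in chunks)
        {
            chunk.Recut();
            yield return false; // safe point: pause here if we're out of budget
        }
        yield return true; // done
    }

    // Call once per frame with a time budget; returns true when the job finishes.
    public static bool Pump(IEnumerator<bool> job, double budgetMs)
    {
        Stopwatch sw = Stopwatch.StartNew();
        while (sw.Elapsed.TotalMilliseconds < budgetMs)
        {
            if (!job.MoveNext() || job.Current) return true; // finished
        }
        return false; // out of budget; resume next frame
    }
}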
Future plans:
More tools! Support for curved surfaces. More destruction types, ex. plastic deformation.
Comments:
Devote a full feature team to coding destruction! Train the artists to model for destruction. Production must be on-board for the long haul (this isn't a weekend project, impacts more than just the engineering team)
FX Roundtable Day 1
Billboards in VR?
- Large BBs close to the camera are bad. Push away with an offset; consider tessellating and warping to wrap around the viewer
Lighting FX in HDR pipelines?
- Author particle density in a channel, attenuate light as it passes through
- Search "Black Ops 3 Lighting" on google for their solve
- "Temporal blending" for low-density particles, fails for thick fog/smoke
Why isn't Overdraw a problem in Just Cause 3???
-They profiled overdraw extensively during development.
-Engine was built to handle it
-Profiling overdraw is difficult in Unity, nearly impossible on mobile
Is Substance relevant for FX pipelines yet?
-Generators are cool! (substance designer)
-Gimp and Paint.net have some cool generators, are free
-Filterforge is an awesome node-based generator! It's p. cheap for single users.
Where the heck can FX artists collaborate?
-FB group is hitting its limit, tech-artists.org is kinda dead
-There's a Slack channel
-Visual Effects Society had a rep, mentioned they wanted to make inroads into realtime
-FX isn't taught in schools! This is a problem, discussion continued on day 2/3.
Tech Artist Roundtable Day 1
What did we lose in the transition to PBR?
-Deferred rendering is weird. We get cheap lights, lose shader flexibility. Forward+ is rad, still being used
-What if we want to break lighting? PBR gurus say don't apply arbitrary multipliers, SCREW THOSE GUYS. Sometimes you wanna boost something's specular highlights for the hell of it
-PBR and VR is hard
Good tools for procedural generation?
-Lots of folks were like "HOUDINI IS LORD, LEARN IT". Python support means tooling support!
Shader authorship--should it be gated behind tech art? How much can we put in artist hands?
EA: limited to tech artists, but they confer with artists & engineers constantly
UE4 Dev: artists author freely, TAs get involved when performance becomes an issue
(Another UE4 dev concurred)
(Large Chinese team): 100+ artists, can't afford everybody authorship. TAs only.
Volition: Everybody makes shaders. Even contractors. Great as a prototyping tool?
Crystal Dynamics: Foisted authorship on one guy, who left team. Difficult to ramp somebody else up into his system.
Alim: Give senior artists access to shaders! (I see what you did there, Alim)
???: Visual Studio now has a node based shader editor. (Everybody was like "node based is the light", UE4 proves it baby)
???: Hide lighting and vertex shaders, but expose pixel shaders to artists
Volition: good debugging, costing tools are key. Got those? Be free with shaders.
Monolith: Exposed artists to full shader authorship, but wrote more limited tools which the artists preferred to use.
Certain Affinity: Shaders not exposed to artists! But their PBR material properties were, giving 'em some flexibility.
PBR Material libraries: purchase, or author?
???: In-house photogrammetry, but didn't use raw. Created materials manually in Substance based on photo results.
???: Their hard surface guy became their capture guy. Having a dedicated materials guy was handy.
(It was noted at this point that very few ppl were using NDO, Substance was vouched for widely)
???: Material charts are cool but rigid. Bend to what your art director wants! White-value restrictions per-material were useful, other restrictions were not.
???: Photoshop plugins to enforce PBR standards were handy. Scream at the user when a texture in a material makes no sense (invalid albedo for a metal, or whatever)
Stylized PBR?
-Should still define master materials, but stylize 'em!
No good homebrew photogrammetry systems. I mentioned polarizing a light with one lens of a pair of IMAX 3D glasses and polarizing your camera with the other lens to filter out specular lighting. People laughed. I was being serious
There's a Tech Artist Slack channel! Again, Tech-Artists.org is kinda garbo???
(Continued in the next post, hit the character limit)
Ripple Effect: Women in Games Initiatives
5 speakers. Talked abt what came after their first WIG events!
Game and coding initiatives targeting women exist! But they didn't until very recently; 10 yrs ago it was very hard to find one.
2011, "Difference Engine Initiative", targeted adult women, 6 wk program. Ran for 2 years, ended due to a lack of funding.
Common criticism of DEI: was more of a 'support group', taught more abt forming social bonds between women in gamedev vs actual coding skills? Participants disagree, point to the many events they participated in and organized, and successful funding raised afterwards as evidence of DEI's success. Short-term "bootcamps" like DEI are useful!
(At this point she showed a diagram of the web of events that came after DEI. Dozens!)
1) Sagan Yee - Hand Eye Society
DEI 1 participant, DEI 2 leader. "Artsy Games Incubator". "Game Curious", outreach to traditional non-gamers
2) Rebecca Cohen-Palacios - Pixelles cofounder, UI artist at Ubisoft
Was a programmer in Silicon Valley before DEI. Formed Pixelles in Montreal in 2014. 65+ participant applications! Game development mentorship programs. Another cool web diagram of things that came after DEI.
3) Zoe Quinn - You know her story
DEI after moving to Toronto. "Dame Game". "Sorting Hat", tool to help new devs find specific resources (here's where to find common use artwork and sounds! etc.). Depression Quest. Antholojam, itch.io work.
- Made point that it's one thing to get women in gamedev, another thing to keep them in gamedev. Women-in-gaming initiatives take time (it takes years to see results! Need constant support along the way). Encourage participants to lead future events, open their own spaces. Go beyond "starting conversations" and maintain those conversations!
4) Gemma Thompson - Diversi chairperson, XX Game Jam
(didn't catch if she participated in DEI)
2012 - did first all-women jam in Europe
Cofounder of LadyCade, GameCity (UK, Sweden). LadyCade has had 7 events, 450 participants, 15 games
LOTS of subsequent jams. Showed another sick web diagram. So sick.
Conclusions: WIG initiatives are super duper useful! Targeting adult women is useful, not just young girls. THIS SHOULD BE A NO-BRAINER.
Talked about creating safe spaces. Don't hold meetings at bars if it makes people uncomfortable. Pairing with non-development activities (dev and tea!) fosters community growth and bonding. Allies are good. Don't need to create an event on your own, look for local folks trying to start WIGs and help 'em out.
Tools Development: Build Systems, Asset Optimization Day 2
Outsourcing: Tools for managing assets?
???: Past, manual management. Now, bundle appropriate tools and pass to outsourcers. Web-based tools can be slow (internet is spottier outside the West), plan accordingly.
CI, Build intervals?
???: Allow individuals to trigger builds freely, distributed building.
???: We distribute build tools with our general toolset
Iteration times? Hot reloading: worth the effort?
Hot reloading (assets, scripts) is good! Not easy for console games.
Build assets as the engine requests them?
Console dev: console requests assets from host PC
???: Support varies by asset type. Shader hotloading was valuable for quick iteration, script hotloading less valuable and tougher to implement so we didn't.
Commit frequency?
???: "Bulk commits" w/ scripts to bisect commits when conflicts occur
Build on every commit? No consensus, some said yes and some said no. The folks who said yes agreed that this is only reasonable with a distributed build system.
(Tangent: Lots of proponents for intermediate file formats for assets)
Serious Games SIG Roundtable
What the heck is a serious game???
Many definitions given: Govt contracted games, "anything that isn't purely for entertainment", games that attempt to address real-world problems, "games for change", any game that tries to change a prevailing attitude or alter confidence in a held belief/convention
Where's the line? Can AAA be serious? Is Minecraft a serious game?
-Nonserious games can touch serious topics, or treat a topic with varying degrees of seriousness (I prefer the term fidelity)
How can the serious games dev community work together?
Collaboration between devs, but so much variety in what a srs game is (ARMA vs a game created for a university study, how can these devs work together?)
Sharing of data/user research
Clarification to funders of what srs game devs do differently from traditional gamedev (due diligence)
Collaborate with educators on teaching prospective devs that srs gamedev isn't all edutainment and military sims. (I say: teach "seriousness" as a design methodology, to be applied in varying degrees to subject matter depending on the outcome. Teaching orbital mechanics in detail = NASA game, teaching basic orbital mechanics = Kerbal Space Program, blowing spaceships up = Star Citizen)
Hard to find srs game devs and games. Academic papers relating to srs games are locked behind paywalls
Do srs games track their efficacy? Most funders aren't willing to pay for followup studies. We need to teach 'em to.
Building art tools in UE4 for Obduction - Eric Anderson, Cyan art director
This was mostly a talk about tricks in UE4's material editor. Couple tools things at the end.
Material Editor Tricks:
- Multilayer blend. Can do on individual textures, or use make- and break-material nodes to blend entire materials. Don't try to blend 3+ layers using 1 blend texture; use a diff texture for each set of blends (pack 'em together as one RGB, be wary of compression). Want to be able to control UVs independently per set.
- Rock shader tricks! Combined worldspace and UV'd maps for detail texturing, allowed copious model reusage without it being too obvious. UV'd maps mostly limited to ridge normals, AO map. Authored world maps for diffuse, normal, spec, etc. "Scale-aware tiling" (drive with object scale node) so you can size a rock up/down and maintain texel density. Worldspace-aligned details like wetness (around water level) and geological stratification, w/ vertex color as a multiplier.
- Architectural shader tricks! Same worldspace/UV tricks as above.
- UE4 doesn't like normal maps on UV1+. Generates 1 tangent basis for UV0 on asset import. Epic claims this can't be changed. Cyan modified UE4 to support multiple tangent bases (might push to the UE4 codebase after release). UE4.11 supposedly added multiple tangent bases, but it's probably not as optimized (Eric speculated it's generated every frame, whereas their solution occurs on asset import only)
- Solution to killing worldspace mapping seams/stretching: opposing spherical mappings in UV0 and 1 (pinch at top and bottom), mask the transition between. Basically: a weird biplanar texture implementation.
- Worldspace mapped textures and movement don't mix
- Authoring flowmaps is hard! Engineers modified vertex painter in UE4 to use brush speed (similar to the Unity app everybody uses). Added extra vertex color sets for multiple flows, blending between 'em used in gameplay (player flips switch, water flow changes in a complex way)
- Custom spline tools for pipe, wire placement. Liberal use of snapping points!
FX Roundtable Day 2
MAXIMUM OVERDRAW would make a good FX community name
Students - Where can they get FX info??? HOW 2 LEARNED FX PLS
- Tutorials exist, but their level of beginner-ness is hard to diagnose
- Contact professionals! They love talking about FX.
- Tool around in all those free engines we've got now! UE4, Unity, Lumberyard
- All education is good! Try doing something you like (stylized, realistic, sim)
Problem: good FX requires knowledge of animation, modelling. Not a "baseline" skill you're gonna find out about in 1st year of your game dev degree or whatever.
Schools lack FX classes. Good FX usually means math, and people shy away from math for some reason??? Gotta contextualize the math as a tool for making real cool art.
FX interns should be a position at studios. Tap students, tap QA for ppl who wanna be artists and have the ~spark~ to be FX artists
- GDC Tech Art bootcamp has caused an uptick in ppl interested in TA positions. Lots of discussion abt starting an FX bootcamp next year.
Shader knowledge is paramount for good FX! MORE SHADER TALKS. More sharing of shader tips/tricks in FX communities desired
Value of marketplace assets/doing work on marketplaces (UE4, Unity)?
- Marketplace asset creation as a structured learning tool (esp. for students)
- Purchase marketplace assets and tear 'em down, see what makes them tick. Good learning tool.
- Keijiro Takahashi mentioned as a good "open source" FX artist on the Unity store, Bitbucket. Look at what he does! LEARN, YOUNG PADAWAN
Katherine Cross's CRAZY GOOD Talk on writing Immoral Women in Games
I can't do this talk justice. Watch it on the vault when it comes out! IT'S REAL GOOD
"Neither saints, nor whores, but human." (why do we tend to make 'em saints or whores anyway???)
Good example: Kreia from KotOR 2. Driven to questionable actions by a life of experience. We're given a reason for WHY she's immoral, rather than just being told she is. Extensive exposure to her character (she's a mentor, the tutorial, a running commentary on the player's actions).
Not so good example: Madame from Remember Me. A compelling concept (master of memory and trauma, power, psychological control), aesthetically and somewhat mechanically realized, but not in plot or dialogue. Her fights are akin to a "hall of mirrors", emphasizing illusions and the fallibility of perception. But she's too Freudian and, unlike Kreia, there's nothing to like about her. She's evil for evil's sake. No obvious ideology to empathize with that would have driven her to become what she is. We don't know why she's immoral.
Good example: Hyun-Ae from Hate Plus. Similar to Madame in her position in a society, but she has an ethos (her motivation is to prevent social breakdown on a colony ship). We can imagine ourselves in her situation, follow her conclusions.
Characters must "show their work" (we must understand how they become who they are). Often afforded for male characters in media (ex. Walter White's evolution through Breaking Bad) but rarely afforded to female characters. House of Cards does it apparently (never seen it)
Easy to fall into the trap of making a female/minority character JUST their motivation/ethos/etc. Don't just make them symbols! Kreia has a unique personality, demands, reasons for believing what she does which extend beyond the "logic" of her beliefs. Don't say "we need a reason to make this character a woman" because then all they are in the context of the game is their woman-ness, THAT'S LAME MAN. Extends to race, gender, etc.
Almost-good example: Knight-Commander Meredith from Dragon Age 2. Pragmatic to a fault. The only failure is that she's the final boss and is relieved of responsibility by "oh, she was influenced by a cursed sword all along". Totally removes her agency in the story. LET YOUR CHARACTERS OWN THEIR IMMORALITY ON THEIR OWN TERMS (cough cough bad Kerrigan writing COUGH COUGH)
Tech Issues in Tools Development: Usability, Languages, etc. Day 3
There's a Slack! thetoolsmiths.slack.com
Worthwhile to hire a UX specialist for tools?
-Bungie: we have one, huge help for the engineers, provides usability feedback, makes tools better overall
-???: Have one, important that they have some tools dev experience and aren't just UX ppl (other studios concurred, but this is really hard to hire for)
-Might be reasonable to earmark a dev to become the local UX expert
Cross-Platform Scripting?
-Unity: funnel down to the fewest languages possible. Python isn't scripty enough, but Perl is
-???: "Whaat Perl is garbage, Python is great! (Unity guy was like "wat", luckily a nuclear-level argument was averted)
- Murmurs for Lua expressed. Lots of folks expressed support for node.js
Migration Frameworks (moving tools to a new language)
-Embed the old tool in a layer (ex. a Qt interface) and slowly rewrite. Qt support across languages, signals are generally reliable, good for this.
- Lots of people were doing this for old tools written in MFC? Everybody was like "MFC was bad, why'd we ever use it?"
- Overall this is a costly thing to do. Has to be justified by IMPROVING the tool during the rewrite, not just porting the functionality 1:1. Incentivize the transition.
Modular vs. Monolithic tools
-Me: Modular tools, monolithic deployment!
Nickelodeon: old monolithic tool being slowly phased out as its functionality is rewritten into smaller tools
Bungie: actually moving towards more monolithic development; inconvenient for users to need to constantly move between tools
(Lots of studios said this can be mitigated with a single access point to tools (ex. system tray icon), consistent visual semantics across tools, etc.)
Tools for Procedural Generation?
Ubisoft: needed tools for city generation, Houdini is the light. Didn't integrate directly into the engine, too costly.
DICE: tried to generate meshes from photogrammetry (procedural rocks, etc.), didn't get great results. Better for artists to work with the data. Did see some success in tools for art cleanup (ex. capping mesh bottoms)
Writing your own language?
- LISP, ocaml cited as good metaprogramming languages
- After lots of discussion on benefits and drawbacks, agreement not to make your own language unless it's similar to/built off of an existing language. Too costly to train new engineers to your in-house language they've never worked with. You lose access to open source libraries unless you're closely wrapping the appropriate language. Support might be strong for your language initially, but tends to wither as more folks need to work w/ it.
Deployment techniques?
-Tray icon as updater (aw yiss)
-???: Version tracking of tools, prompting user to update manually
-???: Symlinking of repos to odd directories for things like photoshop plugin distribution (b/c you probably don't want to version control the photoshop installation)
-???: Let IT handle installation of external software. Your tools guys shouldn't worry about photoshop beyond the plugins they've written for it.
-DICE: Autosync tools, user sets interval (10 minutes, once per day, etc.). Perforce commands executed in Python.
Deployment as a byproduct of CI reasonable? Not really any thoughts on this
How to express complex data with a simple UI?
-JS frameworks for visualizing data are out there
-Angular.js tool "UI Grid" mentioned
- Edward Tufte's "The Visual Display of Quantitative Information" cited as an excellent resource
I am trying to play a video file full screen before a level loads, but I can't find the command to do so. I thought the media player would do the trick, but that just seems to be for playing a movie onto a texture. Any ideas?
Heh, speaking of Destiny animating long dangly objects. My friend Mike had a cape that would freak out and start animating as if the end of it were stuck somewhere off the left of screen. The whole thing would get stretched out and distorted. It only happened to him for some reason. Other people in our group had no problem with that same cape.
So my friend is a long-time industry animator, and probably one of the best in the field. He created a video about using a center of mass script. Extremely informative and worth a watch if you're into animation at all.
Almost two hours of figuring out animations, tossed in a little killing of the secretary enemies, still stuck on stopping my angry office lady from punching continuously.......
*tries harder, goofs around*
Anyone want to see what happens when you combine a jackhammer office lady with evil secretaries that spawn every second and have kill code that works?
Hoping to do some more focused development shortly. Right now I'm working on some game design planning, particularly the progression pacing with upgrades. I set up a nifty spreadsheet to track how far I expect the player to be able to get with each run, which I can then use to tune the cost and efficacy of the upgrades:
The news from among the anthills on my end is that I created a basic AABB collision system in a fit of pique during a particularly content-free period. It's actually a regression from some collision detection work that I've already done involving moving points, line segments, circles, and polygons (in 2D), which already had a basic AABB heuristic built in for performance, but hey, progress is progress.
My motivation is some ongoing flabbergast with the way most commercial physics engines (like the ones you find in Unity and Unreal) always come with a caveat somewhere in the documentation that fast-moving objects may fail to detect a collision. It suggests that what they're doing is just advancing time in really small increments and then fine-tuning the results to maintain the illusion of continuity. I'm sure there are very good reasons for taking that approach (possibly performance deriving from mass culling operations, though more likely it's the easy guarantee that collisions happen in the proper order), but since my dream-game design (the kind that floats around in the head of people who will never actually accomplish what they set out to do) entails characters at least temporarily moving around at large multiples of the width of potential obstructions, that caveat is more like a serious prospective source of bugs. I want perfect continuity, hence this system.
After searching around the Internet for a starting point (certainly someone must have attempted this kind of thing before...!) and finding mostly either half-solutions or articles where the first several comments were "the code presented is buggy and doesn't work", I decided to just go ahead and start from first principles. The key insight that this entire function runs on is that when you move an AABB with a 2D velocity, you create a pair of time intervals (one per axis) during which the moving AABB overlaps the target AABB on that axis. If the two time intervals intersect, then the two bounding boxes collide. If they don't intersect, or either of the overlap intervals falls outside the range [0, 1], then there is no collision. Then, based on which interval started last and which direction the velocity is in, you can determine and report which side of the target box was hit. This could be useful to determine if, say, wall-jumping should be available, whether you stomped on the enemy or smashed your face into it, any number of things that you could want to ram boxes into each other for.
Here's some pictures of the test application in action, showing that it correctly detects collisions when it should and doesn't detect them when it shouldn't:
And if you're curious, the full text of the C# function (as a member function of a class called "Block"; it could be trivially modified to operate on two Blocks passed in as parameters) is beneath this spoiler. The property definitions are left as an exercise to the reader.
public CollideType TryCollide(Block other, float velocityX, float velocityY, out float collideTime)
{
    collideTime = float.NaN;
    bool movingRight = velocityX > 0;
    bool movingLeft = velocityX < 0;
    float xCollisionIntervalStart = float.NaN;
    float xCollisionIntervalEnd = float.NaN;
    float yCollisionIntervalStart = float.NaN;
    float yCollisionIntervalEnd = float.NaN;

    // If moving right, x collision interval is from time for right to hit target left (clamped to 0) to time for left to hit target right.
    if (movingRight)
    {
        xCollisionIntervalEnd = (other.RightX - LeftX) / velocityX;
        // If time for left to hit target right is negative, we never overlapped.
        if (xCollisionIntervalEnd < 0)
        {
            return CollideType.None;
        }
        xCollisionIntervalStart = Math.Max((other.LeftX - RightX) / velocityX, 0);
        if (xCollisionIntervalStart > 1)
        {
            return CollideType.None;
        }
    }
    // If moving left, x collision interval is from time for left to hit target right (clamped to 0) to time for right to hit target left.
    else if (movingLeft)
    {
        xCollisionIntervalEnd = (other.LeftX - RightX) / velocityX;
        // If time for right to hit target left is negative, we never overlapped.
        if (xCollisionIntervalEnd < 0)
        {
            return CollideType.None;
        }
        xCollisionIntervalStart = Math.Max((other.RightX - LeftX) / velocityX, 0);
        if (xCollisionIntervalStart > 1)
        {
            return CollideType.None;
        }
    }
    // If not moving horizontally, check raw overlap. If raw overlap, x collision interval is 0 to 1. Else no collide.
    else
    {
        if (!(LeftX > other.RightX || RightX < other.LeftX))
        {
            xCollisionIntervalStart = 0;
            xCollisionIntervalEnd = 1;
        }
        else
        {
            return CollideType.None;
        }
    }

    bool movingUp = velocityY > 0;
    bool movingDown = velocityY < 0;

    // If moving up, y collision interval is from time for top to hit target bottom (clamped to 0) to time for bottom to hit target top.
    if (movingUp)
    {
        yCollisionIntervalEnd = (other.TopY - BottomY) / velocityY;
        if (yCollisionIntervalEnd < 0)
        {
            return CollideType.None;
        }
        yCollisionIntervalStart = Math.Max((other.BottomY - TopY) / velocityY, 0);
        if (yCollisionIntervalStart > 1)
        {
            return CollideType.None;
        }
    }
    // If moving down, y collision interval is from time for bottom to hit target top (clamped to 0) to time for top to hit target bottom.
    else if (movingDown)
    {
        yCollisionIntervalEnd = (other.BottomY - TopY) / velocityY;
        if (yCollisionIntervalEnd < 0)
        {
            return CollideType.None;
        }
        yCollisionIntervalStart = Math.Max((other.TopY - BottomY) / velocityY, 0);
        if (yCollisionIntervalStart > 1)
        {
            return CollideType.None;
        }
    }
    // If not moving vertically, check raw overlap. If raw overlap, y collision interval is 0 to 1. Else no collide.
    else
    {
        if (!(BottomY > other.TopY || TopY < other.BottomY))
        {
            yCollisionIntervalStart = 0;
            yCollisionIntervalEnd = 1;
        }
        else
        {
            return CollideType.None;
        }
    }

    // If x interval and y interval intersect:
    if (!(xCollisionIntervalEnd < yCollisionIntervalStart || yCollisionIntervalEnd < xCollisionIntervalStart))
    {
        // - If both interval beginnings are 0, the boxes already overlapped; type is Overlap.
        if (xCollisionIntervalStart == 0 && yCollisionIntervalStart == 0)
        {
            collideTime = 0;
            return CollideType.Overlap;
        }
        // - Otherwise, if x interval beginning > y interval beginning, the x axis came into contact last:
        if (xCollisionIntervalStart > yCollisionIntervalStart)
        {
            collideTime = xCollisionIntervalStart;
            // - If moving right, we hit the target's left side.
            if (movingRight)
            {
                return CollideType.Left;
            }
            // - If moving left, we hit the target's right side.
            else
            {
                return CollideType.Right;
            }
        }
        // - Otherwise (y interval beginning > x interval beginning):
        else
        {
            collideTime = yCollisionIntervalStart;
            // - If moving up, we hit the target's bottom.
            if (movingUp)
            {
                return CollideType.Bottom;
            }
            // - If moving down, we hit the target's top.
            else
            {
                return CollideType.Top;
            }
        }
    }
    // If x interval and y interval do not intersect, no collision.
    else
    {
        return CollideType.None;
    }
}
This function is probably a bit too verbose to serve as a culling operation for the more complicated geometry collision engine I mentioned before (it's far simpler to just expand the bounding box by the velocity if all you want is a general "should I bother to do more complicated collision on the things these AABBs are bounding", or as a parameter into a quadtree query), but it could still readily power some kind of NES-style game.
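Usage would look something like this; the driver loop, Move(), walls, and grounded are all hypothetical, with Block, CollideType, and TryCollide as above:

// Move player by (vx, vy) this frame, stopping at the first thing it hits.
// (Assumes using System.Collections.Generic for List<T>.)
static void Step(Block player, List<Block> walls, float vx, float vy, ref bool grounded)
{
    Block hit = null;
    CollideType hitSide = CollideType.None;
    float earliest = float.PositiveInfinity;

    foreach (Block wall in walls)
    {
        float t;
        CollideType side = player.TryCollide(wall, vx, vy, out t);
        if (side != CollideType.None && t < earliest)
        {
            earliest = t;
            hit = wall;
            hitSide = side;
        }
    }

    if (hit == null)
    {
        player.Move(vx, vy); // clear path: take the full step
    }
    else
    {
        player.Move(vx * earliest, vy * earliest); // advance only to the contact time
        if (hitSide == CollideType.Top)
        {
            grounded = true; // we came down on something's top: we landed
        }
    }
}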
really long ass invisible block for everyone to walk on
base enemy type for all other enemies to inherit code
spawners with alarms set to 30 frames on creation; the alarm counts down to 0 and then fires (checks to make sure boss lady is out of range, drops an enemy, resets the alarm to a random number between 1 and 4)
Note that the license states you can only use them with CryEngine, so don't buy it thinking you can just rip them into your UE4/Unity project or whatever.
You can with the assets from Madison Pike. Unfortunately the motion capture assets are only allowed in CryEngine, but that particular company does have a coupon for half off on all their products until the 31st.
This is relevant to my interests!
How are you doing it?
A greyscale depthmap; here it's a grey Megaman, where black causes lines to pop out the most and white stays flat.
I'm reading the colored texture and generating a voxel mesh for each non-transparent pixel, so this guy's got as many polygons as an Assassin's Creed character. Good enough to test with.
I thought about a vertex shader but I don't think that would create the right boxy look.
Question for folks more acquainted with C# (and Unity) than I:
-I've got a singleton "Manager" object with a public variable exposed to the Inspector. On start-up, the Manager creates an object (let's call it an Actor) and passes the variable's value as a parameter. I want to be able to update the variable's value in the Inspector and see the change reflected in the Actor's behavior.
I was initially under the impression that I could pass the variable in by ref, but I can't hold on to the reference this way: calling "this.value = value" in the Actor's constructor ends up copying the reference by value. What I want to do is hold on to the reference. I can do this with a pointer, but that requires me to mark the Actor as unsafe. While the Manager technically *could* get deleted without the Actor's knowledge, I know that this won't happen.
Solutions I've seen:
- Using the Manager's Update() to reassign the variable to the Actor on every frame. I don't like this b/c A) I'm constantly doing assignments, and B) I want to keep all of an Actor's properties private and use setters/getters for everything inside it. The Manager is the only object that should have reference ties to the Actor.
- Box the variable as an object, pass it by reference to the Actor on creation, and unbox the variable whenever the Actor needs to read its value. This seems like a reasonable approach right now, but I only want to update the variable at runtime for ease of tweaking during development. I don't want to have to box every variable I add to the Manager during development (apparently a costly thing to do), only to rip all the boxing/unboxing code out when I settle on their values.
Anybody have any thoughts? Alternate solutions? Should I just mark the dang thing as unsafe and use a pointer
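One more option along the lines of your second bullet, but without boxing: wrap the value in a tiny reference type and share that object. A sketch with made-up names (SharedFloat, speed); OnValidate is the editor-only callback Unity fires whenever an Inspector value changes, which fits tweak-during-development nicely:

using UnityEngine;

// A mutable cell shared by reference; no boxing, no unsafe.
public class SharedFloat
{
    public float Value;
    public SharedFloat(float value) { Value = value; }
}

public class Manager : MonoBehaviour
{
    public float speed = 5f; // exposed to the Inspector
    private SharedFloat sharedSpeed;
    private Actor actor;

    void Awake()
    {
        sharedSpeed = new SharedFloat(speed);
        actor = new Actor(sharedSpeed); // Actor holds the same object forever
    }

    // Editor-only: runs when a value is tweaked in the Inspector.
    void OnValidate()
    {
        if (sharedSpeed != null) sharedSpeed.Value = speed;
    }
}

public class Actor
{
    private readonly SharedFloat speed;
    public Actor(SharedFloat speed) { this.speed = speed; }
    public float CurrentSpeed { get { return speed.Value; } }
}

The Actor only ever reads through the cell, so its fields stay private, nothing reassigns per frame, and when you're done tweaking you can collapse SharedFloat back to a plain float in one place.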
Speaking of Cryengine they just announced that Cryengine 5 will adopt a pay what you want model.
http://www.gamesindustry.biz/articles/2016-03-15-crytek-announces-cryengine-5-adopts-pay-what-you-want-model
https://www.youtube.com/watch?v=44M7JsKqwow
Granted, while this is rendered in real time at 30fps and 1440p on a standard GTX 980, the camera shots they've chosen are obviously picked carefully to avoid frame drops from excessive detail. I highly doubt it can deliver this kind of quality across a fully realized world. So it's definitely not showcasing the power of the engine for games, but it totally looks like it's breaking ground for film & CG sequences.
What's more interesting imho is that their rendering doesn't look crap for a change. That's a good looking demo!
Close to four hours of research and experiments today to get her limited to that wall and to give her a walking animation of sorts.
All I have in my head is black office lady + final fight.
Is that motion capped?
https://www.youtube.com/watch?v=h4ZFuU_BPOo
code for anyone curious.
line 8 is what shit the bed.
I met so many cool indies and AAA devs alike! I had a brief conversation with Willy Chyr abt shaders, watched Zach Gage demo a sick card game, was blown away by a talk by Katherine Cross
Drank a lot
I'll be transcribing the notes I took for the talks I attended at some point this week. I'll post 'em here!
Also, a big point of TDD is future-proofing - you make sure that new code doesn't break old code, because old tests will start to fail. I see writing a test as a very small time expense considering the automation it gives me.
But in the world of web apps, on a tiny team it's definitely a pretty large time investment. I know the generally-accepted wisdom is that tests are great, but my personal experience is that you end up with two code bases -- a small one, your application, and then another that's huge, your tests. Maintaining tests -- especially fixtures and UI-related tests -- ends up taking a ton of time.
My experience in the last five years of web app programming (half of that for a startup) is that it's a tradeoff. Tests ensure stability. But they cost you time. If you have the time, go for it. But if you don't, don't sweat it -- just work as quick as you can on the product itself and get it done.
I definitely do not buy the argument that it saves you time. That has not been my experience at least. It gives you confidence, it helps make your application more stable, but it does not save you any time, not even in the long run.
Edit: Also, you can decide to have tests for some parts of your application but not others. At my current job, we have pretty good test coverage of our API, because we want it to be ROCK SOLID in terms of stability. And it's much less time consuming to write tests for an API than it is for our website. For our API, we have valued stability over time.
Our website though, used to have tests, but we decided they took too much damn time to maintain, and would rather be coding new features as quick as we could. There, we value time-saved and quick iteration with new features over stability.
I get the feeling you're talking more about testing the gui as an end user would see it, like with selenium or sikuli. Yes, those are horrible to maintain, but they can also be worth it compared to a manual test. I guess it depends what parts of your product are changing, and how frequently you deploy.
edit: FROM HELL'S HEART, I STAB AT THEE NEW SPOILERS. Just quote the post I guess if you want the content? fffffff
Animating Quadrupeds in The Flame in the Flood - Gwen Frey, T. Artist
Biped techniques are no good
Major motions: Forward, Turning, and start/stop transitions
Forward Motion
Boar
Footfalls at same frame for all speeds! Okay since it didn't vary speed much. If the boar sped up/slowed down, speed the anims up/down appropriately.
Linear blending looked awful for just a walk/run. Had more animations for slower speeds. Ex. anim for 10% speed, 25% speed, 50% speed, 100% speed
Wolf
Foot phasing did not line up for all speeds. Solution: find a midpoint between walk/run where the phasing aligns, do a QUICK blend there. Didn't need transition anims as a result.
Turning
Look-ats are super unnatural if you just turn the head. Animals don't just turn their head! They turn their whole body, moving forward. Front paws turn immediately, back paws tend to "catch up" as the animal moves.
Boar
Desire was for the boar to always look at the player. No inter-spine animation really needed (the boar was stubby), so accomplished with blendshapes. Had a forward running shape, strafing-left shape, strafing-right.
Wolf
No special anims to separate front/rear movement. Measured the angle between the wolf's current facing and its desired facing (the player), distributed that angle through each spine bone. Required that the wolf be moving forward to turn, linked turn rate with movement rate to prevent foot slipping.
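In Unity terms I imagine the spine distribution looking something like this (my own rough reconstruction, not her code; assumes Vector3.SignedAngle, which Unity added in 2017.1):
using UnityEngine;

public class SpineTurner : MonoBehaviour {
    public Transform[] spineBones;      // ordered root -> head
    public Transform target;            // e.g. the player
    public float maxDegreesPerBone = 15f;

    void LateUpdate() {
        // Yaw-only angle between current facing and desired facing.
        Vector3 toTarget = target.position - transform.position;
        toTarget.y = 0f;
        float angle = Vector3.SignedAngle(transform.forward, toTarget, Vector3.up);

        // Distribute the turn evenly through the spine, clamped per bone
        // so the pose stays plausible.
        float perBone = Mathf.Clamp(angle / spineBones.Length,
                                    -maxDegreesPerBone, maxDegreesPerBone);
        foreach (var bone in spineBones)
            bone.Rotate(Vector3.up, perBone, Space.World);
    }
}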
Start/Stop Transitions
W/ bipeds, only need to worry about which foot is planted and transition the other foot downward. With quadrupeds, all sorts of foot combinations might be planted. Static anims for each transition were no good!
Boar
Wanted to avoid runtime IK. Solution was to author a "foot-down" version of each movement anim, identical but with the feet vertical position locked to the ground. The anim is played in sync w/ the walk anim, 0% blend until the creature stops. On stopping, the foot-down is blended in and both animations are paused. When movement starts again, the normal walk anim is blended in and sped up to normal.
B/C foot placement when idle is determined by where the walk anim was paused, idle anims were additive and didn't include foot/leg movement. To avoid the legs drifting, the boar's hooves were parented directly to the root and the leg hierarchy was reversed (the "hip"-bone being the last in the chain). An odd solution, not viable across production.
Wolf
Used walk anim freezing like above, but used runtime IK instead of a foot-down anim so they could do better idles than with the boar. Conclusion: runtime IK is probably a necessity for good quadruped anims (or good transition anims)
Animating With Math - Natalie Burke, T. Artist (Destiny)
Driven in Destiny w/ gameplay properties, basic math (sin, time, etc.) and COPIOUS usage of vertex colors. Like, 3+ sets of vertex colors on a model. Dang.
Usually applied a dampening function to gameplay properties to prevent instant value changes.
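Something like this frame-rate-independent exponential smoother is presumably what's meant by a dampening function (a sketch with made-up names, not Bungie's actual code):
using UnityEngine;

public class DampedDriver : MonoBehaviour {
    public float halfLife = 0.15f;   // seconds for the remaining gap to halve
    float smoothed;

    // Call once per frame with the raw gameplay-driven value.
    public void Drive(Material mat, float rawValue) {
        float k = 1f - Mathf.Pow(0.5f, Time.deltaTime / halfLife);
        smoothed = Mathf.Lerp(smoothed, rawValue, k);
        mat.SetFloat("_DrivenValue", smoothed); // hypothetical shader property
    }
}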
Vertex color uses:
Velocity vectors (RGB) or phase offsets (B) to wave functions.
Baked blendshapes! (vertex color acting as a vector displacement map). Limited to linear blends or pivots about a single point, but this worked well for certain things (jiggling bits on belts, dangling chains, etc.)
Very easy to mimic dangling rope/chain deformation w/ vertex colors. (Vex?) architecture had lots of mechanical objects w/ chains that go taut, lift an object, drop it down and go slack again. A bone at each end of a rope to attach it; the slackening was all vertex animation.
Wrote tools! Shader editor. Vertex color editor (ex. for authoring gradients easily, sampling color, managing different color sets)
Art of Destruction in Rainbow 6: Siege - Julien L'Heureux, Technical Lead at Ubisoft
Ubisoft History:
2012 - Procedural glass in primarily decals and basic geometry splitting.
2013 - In-house concept for complex geo destruction, boolean operations on geo. Not synchronous, not optimized.
2015 - Final tech for R6:S!
RealBlast, by Ubisoft's R&D group, consists of a code library for runtime destruction, tools for prebaked destruction, a debugger, and a "properties editor" (for authoring different destructible materials)
Runtime destruction limited to planar surfaces, which are projected to 2D, sliced by "cutters", reprojected to 3D space.
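The projection step is probably something along these lines (my reconstruction, not RealBlast code): build two axes on the surface plane, flatten points into them, slice in 2D, then lift the results back out.
using UnityEngine;

public static class PlanarProjection {
    // Tangent/bitangent axes for a plane with the given (unit) normal.
    public static void BuildBasis(Vector3 normal,
                                  out Vector3 tangent, out Vector3 bitangent) {
        Vector3 helper = Mathf.Abs(normal.y) < 0.99f ? Vector3.up : Vector3.right;
        tangent = Vector3.Normalize(Vector3.Cross(helper, normal));
        bitangent = Vector3.Cross(normal, tangent);
    }

    public static Vector2 To2D(Vector3 p, Vector3 origin,
                               Vector3 tangent, Vector3 bitangent) {
        Vector3 d = p - origin;
        return new Vector2(Vector3.Dot(d, tangent), Vector3.Dot(d, bitangent));
    }

    public static Vector3 To3D(Vector2 p, Vector3 origin,
                               Vector3 tangent, Vector3 bitangent) {
        return origin + tangent * p.x + bitangent * p.y;
    }
}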
Types of destruction in-game categorized as "trapdoors", "barricades", "LOS floors", "breakable walls" (mostly differ in how AI deals with 'em)
Models must be made destruction-ready (airtight geo, interior materials defined). Changed the modelling workflow greatly (how many artists worry about cabinet wall thickness?)
AI needed to be aware of destruction (LOS and pathing changes), sound needed to be aware (a room's acoustics change when half the room is blown away), a mandate for state-based destruction.
An object's state is managed as a tree graph, branches are geo chunks with further chunks as potential leaf nodes. Graph could be pre-calculated (deterministic destruction on props like tables, vases, etc.) or dynamic (wall destruction). (Consistent data structure between baked and dynamic destruction?)
"Cutters" authored by artists. Could combine mathematical shapes (voronoi cells, random ellipsoids, etc.) with custom shapes (ex. glass spider-cracking) on a per-material basis. Limited to 2D projection as above.
"Decals" and "Decorators" authored. Decals we know (projected textures), decorators are models place dynamically at site of destruction (pipes and studs in a wall, curled and burnt wallpaper along the edge of a wall hole, etc.). Bullet holes treated as a type of Decorator with stencil shading. Ability to link placement to geo in the scene (ensure studs placed regularly in world space), or to the local space of the destruction site, or random placement. Artist authored!
Performance considerations:
Big cost to updating collision meshes. Ignore small changes (bullet holes) and aggregate groups of changes when possible (if the player blows a bunch of holes into a wall, combine 'em into one big hole). Lots of triangulation into convex shapes.
Debris were not procedural; fragments were instanced and recycled for performance.
Event-driven. Sleeping object->destroy event->destruction sim-> sleeping object. Used FX to hide the resulting delay. Allowed for pre-calculating destruction in some cases (player plants C4? Pre-destruct the surroundings before they trigger it off).
"Time-sliced" functions. Destruction sim not finished in one cpu cycle? Save its state, postpone the continuation of the function to the next appropriate cpu cycle. (Obtrustive state saving written into eligable functions, state saved as a huge thing in memory. Was hard to do in C++. Mentioned something about "reschedulers" I didn't catch).
Need to limit conditions, avoid degenerate states. Don't let debris subdivide too many times (didn't let it subdivide at all for R6:S). Combine destruction objects when possible (See above comment on collision mesh changes)
Debris physics were asynchronous. Holes consistent across clients, but the debris they spawn were not.
Future plans:
More tools! Support for curved surfaces. More destruction types, ex. plastic deformation.
Comments:
Devote a full feature team to coding destruction! Train the artists to model for destruction. Production must be on-board for the long haul (this isn't a weekend project, impacts more than just the engineering team)
FX Roundtable Day 1
- Large BBs close to the camera are bad. Push them away with an offset; consider tessellating and warping to wrap around the viewer
Lighting FX in HDR pipelines?
- Author particle density in a channel, attenuate light as it passes through
- Search "Black Ops 3 Lighting" on google for their solve
- "Temporal blending" for low-density particles, fails for thick fog/smoke
Why isn't Overdraw a problem in Just Cause 3???
-They profiled overdraw extensively during development.
-Engine was built to handle it
-Profiling overdraw is difficult in Unity, nearly impossible on mobile
Is Substance relevant for FX pipelines yet?
-Generators are cool! (substance designer)
-Gimp and Paint.net have some cool generators, are free
-Filterforge is an awesome node-based generator! It's p. cheap for single users.
Where the heck can FX artists collaborate?
-FB group is hitting its limit, tech-artists.org is kinda dead
-There's a Slack channel
-Visual Effects Society had a rep, mentioned they wanted to make inroads into realtime
-FX isn't taught in schools! This is a problem, discussion continued on day 2/3.
Tech Artist Roundtable Day 1
What did we lose in the transition to PBR?
-Deferred rendering is weird. We get cheap lights, lose shader flexibility. Forward+ is rad, still being used
-What if we want to break lighting? PBR gurus say don't apply arbitrary multipliers, SCREW THOSE GUYS. Sometimes you wanna boost something's specular highlights for the hell of it
-PBR and VR is hard
Good tools for procedural generation?
-Lots of folks were like "HOUDINI IS LORD, LEARN IT". Python support means tooling support!
Shader authorship--should it be gated behind tech art? How much can we put in artist hands?
EA: limited to tech artists, but they confer with artists & engineers constantly
UE4 Dev: artists author freely, TAs get involved when performance becomes an issue
(Another UE4 dev concurred)
(Large Chinese team): 100+ artists, can't afford everybody authorship. TAs only.
Volition: Everybody makes shaders. Even contractors. Great as a prototyping tool?
Crystal Dynamics: Foisted authorship on one guy, who left team. Difficult to ramp somebody else up into his system.
Alim: Give senior artists access to shaders! (I see what you did there, Alim)
???: Visual Studio now has a node based shader editor. (Everybody was like "node based is the light", UE4 proves it baby)
???: Hide lighting and vertex shaders, but expose pixel shaders to artists
Volition: good debugging and costing tools are key. Got those? Be free with shaders.
Monolith: Exposed artists to full shader authorship, but wrote more limited tools which the artists preferred to use.
Certain Affinity: Shaders not exposed to artists! But their PBR material properties were, giving 'em some flexibility.
PBR Material libraries: purchase, or author?
???: In-house photogrammetry, but didn't use raw. Created materials manually in Substance based on photo results.
???: Their hard surface guy became their capture guy. Having a dedicated materials guy was handy.
(It was noted at this point that very few ppl were using NDO, Substance was vouched for widely)
???: Material charts are cool but rigid. Bend to what your art director wants! White-value restrictions per-material were useful, other restrictions were not.
???: Photoshop plugins to enforce PBR standards were handy. Scream at the user when a texture in a material makes no sense (invalid albedo for a metal, or whatever)
Stylized PBR?
-Should still define master materials, but stylize 'em!
No good homebrew photogrammetry systems. I mentioned polarizing a light with one lens of a pair of IMAX 3D glasses and polarizing your camera with the other lens to filter out specular lighting. People laughed. I was being serious
There's a Tech Artist Slack channel! Again, Tech-Artists.org is kinda garbo???
(Continued in the next post, hit the character limit)
Game and coding initiatives targeting women exist! But they didn't until very recently; 10 yrs ago it was very hard to find one.
2011, "Difference Engine Initiative", targeted adult women, 6 wk program. Ran for 2 years, ended due to a lack of funding.
Common criticism of DEI: was more of a 'support group', taught more abt forming social bonds between women in gamedev vs actual coding skills? Participants disagree, point to the many events they participated in and organized, and successful funding raised afterwards as evidence of DEI's success. Short-term "bootcamps" like DEI are useful!
(At this point she showed a diagram of the web of events that came after DEI. Dozens!)
1) Sagan Yee - Hand Eye Society
DEI 1 participant, DEI 2 leader. "Artsy Games Incubator". "Game Curious", outreach to traditional non-gamers
2) Rebecca Cohen-Palacios - Pixelles cofounder, UI artist at Ubisoft
Was a programmer in Silicon Valley before DEI. Formed Pixelles in Montreal in 2014. 65+ participant applications! Game development mentorship programs. Another cool web diagram of things that came after DEI.
3) Zoe Quinn - You know her story
DEI after moving to Toronto. "Dame Game". "Sorting Hat", tool to help new devs find specific resources (here's where to find common use artwork and sounds! etc.). Depression Quest. Antholojam, itch.io work.
- Made point that it's one thing to get women in gamedev, another thing to keep them in gamedev. Women-in-gaming initiatives take time (it takes years to see results! Need constant support along the way). Encourage participants to lead future events, open their own spaces. Go beyond "starting conversations" and maintain those conversations!
Gemma Thompson - Diversi chairperson, xx game jam
(didn't catch if she participated in DEI)
2012 - did first all-women jam in Europe
Cofounder of LadyCade, GameCity (UK, Sweden). LadyCade has had 7 events, 450 participants, 15 games
LOTS of subsequent jams. Showed another sick web diagram. So sick.
Conclusions: WIG initiatives are super duper useful! Targeting adult women is useful, not just young girls. THIS SHOULD BE A NO-BRAINER.
Talked about creating safe spaces. Don't hold meetings at bars if it makes people uncomfortable. Pairing with non-development activities (dev and tea!) fosters community growth and bonding. Allies are good. Don't need to create an event on your own, look for local folks trying to start WIGs and help 'em out.
Tools Development: Build Systems, Asset Optimization Day 2
???: Past, manual management. Now, bundle appropriate tools and pass to outsourcers. Web-based tools can be slow (internet is spottier outside the West), plan accordingly.
CI, Build intervals?
???: Allow individuals to trigger builds freely, distributed building.
???: We distribute build tools with our general toolset
Iteration times? Hot reloading: worth the effort?
Hot reloading (assets, scripts) is good! Not easy for console games.
Build assets as the engine requests them?
Console dev: console requests assets from host PC
???: Support varies by asset type. Shader hotloading was valuable for quick iteration, script hotloading less valuable and tougher to implement so we didn't.
Commit frequency?
???: "Bulk commits" w/ scripts to bisect commits when conflicts occur
Build on every commit? No consensus, some said yes and some said no. The folks who said yes agreed that this is only reasonable with a distributed build system.
(Tangent: Lots of proponents for intermediate file formats for assets)
Serious Games SIG Roundtable
Many definitions given: Govt contracted games, "anything that isn't purely for entertainment", games that attempt to address real-world problems, "games for change", any game that tries to change a prevailing attitude or alter confidence in a held belief/convention
Where's the line? Can AAA be serious? Is Minecraft a serious game?
-Nonserious games can touch serious topics, or treat a topic with varying degrees of seriousness (I prefer the term fidelity)
How can the serious games dev community work together?
Collaboration between devs, but so much variety in what a srs game is (ARMA vs a game created for a university study, how can these devs work together?)
Sharing of data/user research
Clarification to funders of what srs game devs do differently from traditional gamedev (due diligence)
Collaborate with educators on teaching prospective devs that srs gamedev isn't all edutainment and military sims. (I say: teach "seriousness" as a design methodology, to be applied in varying degrees to subject matter depending on the outcome. Teaching orbital mechanics in detail = NASA game, teaching basic orbital mechanics = Kerbal Space Program, blowing spaceships up = Star Citizen)
Hard to find srs game devs and games. Academic papers relating to srs games are locked behind paywalls
Do srs games track their efficacy? Most funders aren't willing to pay for followup studies. We need to teach 'em to.
Building art tools in UE4 for Obduction - Eric Anderson, Cyan art director
Material Editor Tricks:
- Multilayer blend. Can do on individual textures, or use make- and break-material nodes to blend entire materials. Don't try to blend 3+ layers using 1 blend texture; use a diff texture for each set of blends (pack 'em together as one RGB, be wary of compression). Want to be able to control UVs independently per set.
- Rock shader tricks! Combined worldspace and UV'd maps for detail texturing, allowed copious model reuse without it being too obvious. UV'd maps mostly limited to ridge normals, AO map. Authored world maps for diffuse, normal, spec, etc. "Scale-aware tiling" (drive with object scale node) so you can size a rock up/down and maintain texel density. Worldspace-aligned details like wetness (around water level) and geological stratification, w/ vertex color as a multiplier.
- Architectural shader tricks! Same worldspace/UV tricks as above.
- UE4 doesn't like normal maps on UV1+. Generates 1 tangent basis for UV0 on asset import. Epic claims this can't be changed. Cyan modified UE4 to support multiple tangent bases (might push to the UE4 codebase after release). UE4.11 supposedly added multiple tangent bases, but it's probably not as optimized (Eric speculated it's generated every frame, whereas their solution occurs on asset import only)
- Solution to killing worldspace mapping seams/stretching: opposing spherical mappings in UV0 and 1 (pinch at top and bottom), mask the transition between. Basically: a weird biplanar texture implementation.
- Worldspace mapped textures and movement don't mix
- Authoring flowmaps is hard! Engineers modified vertex painter in UE4 to use brush speed (similar to the Unity app everybody uses). Added extra vertex color sets for multiple flows, blending between 'em used in gameplay (player flips switch, water flow changes in a complex way)
- Custom spline tools for pipe, wire placement. Liberal use of snapping points!
FX Roundtable Day 2
Students - Where can they get FX info??? HOW 2 LEARNED FX PLS
- Tutorials exist, but their level of beginner-ness is hard to diagnose
- Contact professionals! They love talking about FX.
- Tool around in all those free engines we've got now! UE4, Unity, Lumberyard
- All education is good! Try doing something you like (stylized, realistic, sim)
Problem: good FX requires knowledge of animation, modelling. Not a "baseline" skill you're gonna find out about in 1st year of your game dev degree or whatever.
Schools lack FX classes. Good FX usually means math, and people shy away from math for some reason??? Gotta contextualize the math as a tool for making real cool art.
FX interns should be a position at studios. Tap students, tap QA for ppl who wanna be artists and have the ~spark~ to be FX artists
- GDC Tech Art bootcamp has caused an uptick in ppl interested in TA positions. Lots of discussion abt starting an FX bootcamp next year.
Shader knowledge is paramount for good FX! MORE SHADER TALKS. More sharing of shader tips/tricks in FX communities desired
Value of marketplace assets/doing work on marketplaces (UE4, Unity)?
- Marketplace asset creation as a structured learning tool (esp. for students)
- Purchase marketplace assets and tear 'em down, see what makes them tick. Good learning tool.
-Keijiro Takahashi mentioned as a good "open source" FX artist on the unity store, bitbucket. Look at what he does! LEARN, YOUNG PADAWAN
Katherine Cross's CRAZY GOOD Talk on writing Immoral Women in Games
I can't do this talk justice. Watch it on the vault when it comes out! IT'S REAL GOOD
"Neither saints, nor whores, but human." (why do we tend to make 'em saints or whores anyway???)
Good example: Kreia from KotOR 2. Driven to questionable actions by a life of experience. We're given a reason for WHY she's immoral, rather than just being told she is. Extensive exposure to her character (she's a mentor, the tutorial, a running commentary on the player's actions).
Not so good example: Madame from Remember Me. A compelling concept (master of memory and trauma, power, psychological control) aesthetically and somewhat mechanically realized, but not in plot or dialogue. Her fights are akin to a "hall of mirrors", emphasizing illusions and the fallibility of perception. But she's too Freudian and, unlike Kreia, there's nothing to like about her. She's evil for evil's sake. No obvious ideology to empathize with that would have driven her to become what she is. We don't know why she's immoral.
Good example: Hyun-Ae from Hate Plus. Similar to Madame in her position in a society, but she has an ethos (her motivation is to prevent social breakdown on a colony ship). We can imagine ourselves in her situation, follow her conclusions.
Characters must "show their work" (we must understand how they become who they are). Often afforded for male characters in media (ex. Walter White's evolution through Breaking Bad) but rarely afforded to female characters. House of Cards does it apparently (never seen it)
Easy to fall into the trap of making a female/minority character JUST their motivation/ethos/etc. Don't just make them symbols! Kreia has a unique personality, demands, reasons for believing what she does which extend beyond the "logic" of her beliefs. Don't say "we need a reason to make this character a woman" because then all they are in the context of the game is their woman-ness, THAT'S LAME MAN. Extends to race, gender, etc.
Almost-Good example: Knight-Commander Meredith from Dragon Age 2. Pragmatic to a fault. Only failure is that she's the final boss and is relieved of responsibility by "oh she was influenced by a cursed sword all along". Totally removes her agency in the story. LET YOUR CHARACTERS OWN THEIR IMMORALITY ON THEIR OWN TERMS (cough cough bad Kerrigan writing COUGH COUGH)
Tech Issues in Tools Development: Usability, Languages, etc. Day 3
Worthwhile to hire a UX specialist for tools?
-Bungie: we have one, huge help for the engineers, provides usability feedback, makes tools better overall
-???: Have one, important that they have some tools dev experience and aren't just UX ppl (other studios concurred, but this is really hard to hire for)
-Might be reasonable to earmark a dev to become the local UX expert
Cross-Platform Scripting?
-Unity: funnel down to the fewest languages possible. Python isn't scripty enough, but Perl is
-???: "Whaat Perl is garbage, Python is great! (Unity guy was like "wat", luckily a nuclear-level argument was averted)
- Murmurs for Lua expressed. Lots of folks expressed support for node.js
Migration Frameworks (moving tools to a new language)
-Embed the old tool in a layer (ex. a Qt interface) and slowly rewrite. Qt support across languages, signals are generally reliable, good for this.
- Lots of people were doing this for old tools written in MFC? Everybody was like "MFC was bad, why'd we ever use it?"
- Overall this is a costly thing to do. Has to be justified by IMPROVING the tool during the rewrite, not just porting the functionality 1:1. Incentivize the transition.
Modular vs. Monolithic tools
-Me: Modular tools, monolithic deployment!
Nickelodeon: old monolithic tool being slowly phased out as its functionality is rewritten into smaller tools
Bungie: actually moving towards more monolithic development, inconvenient for users to need to constantly move between tools
(Lots of studios said this can be mitigated with a single access point to tools (ex. system tray icon), consistent visual semantics across tools, etc.)
Tools for Procedural Generation?
Ubisoft: needed tools for city generation, Houdini is the light. Didn't integrate directly into the engine, too costly.
DICE: tried to generate meshes from photogrammetry (procedural rocks, etc.), didn't get great results. Better for artists to work with the data. Did see some success in tools for art cleanup (ex. capping mesh bottoms)
Writing your own language?
- LISP, ocaml cited as good metaprogramming languages
- After lots of discussion on benefits and drawbacks, agreement not to make your own language unless it's similar to/built off of an existing language. Too costly to train new engineers to your in-house language they've never worked with. You lose access to open source libraries unless you're closely wrapping the appropriate language. Support might be strong for your language initially, but tends to wither as more folks need to work w/ it.
Deployment techniques?
-Tray icon as updater (aw yiss)
-???: Version tracking of tools, prompting user to update manually
-???: Symlinking of repos to odd directories for things like photoshop plugin distribution (b/c you probably don't want to version control the photoshop installation)
-???: Let IT handle installation of external software. Your tools guys shouldn't worry about photoshop beyond the plugins they've written for it.
-DICE: Autosync tools, user sets interval (10 minutes, once per day, etc.). Perforce commands executed in Python.
Deployment as a byproduct of CI reasonable? Not really any thoughts on this
How to express complex data with a simple UI?
-JS frameworks for visualizing data are out there
-Angular.js tool "UI Grid" mentioned
- Edward Tufte's "The Visual Display of Quantitative Information" cited as an excellent resource
Vs. showing the sprite punching outside the collision box and killing the enemy if in range of the punch
Next year?
GDC was amazing! And now I'm sick. Blasted con-flu...
I am trying to play a video file full screen before a level loads, but I can't find the command to do so. I thought the media player would do the trick, but that just seems to be for playing a movie onto a texture. Any ideas?
Heh, speaking of Destiny animating long dangly objects. My friend Mike had a cape that would freak out and start animating as if the end of it were stuck somewhere off the left of screen. The whole thing would get stretched out and distorted. It only happened to him for some reason. Other people in our group had no problem with that same cape.
*tries harder, goofs around*
Anyone want to see what happens when you combine a jackhammer office lady with evil secretaries that spawn every second and have kill code that works?
https://www.youtube.com/watch?v=dboeDNtvhg8
Hoping to do some more focused development shortly. Right now I'm working on some game design planning, particularly the progression pacing with upgrades. I set up a nifty spreadsheet to track how far I expect the player to be able to get with each run, which I can then use to tune the cost and efficacy of the upgrades:
Spreadsheets are pretty neat!
My motivation is some ongoing flabbergast with the way most commercial physics engines (as you find in Unity and Unreal) always come with a caveat somewhere in the documentation that fast-moving objects may fail to detect a collision. It suggests that what they're doing is just advancing time in really small increments and then fine-tuning the results to maintain the illusion of continuity. I'm sure there are very good reasons for taking that approach (possibly performance deriving from mass culling operations, though more likely it's the easy guarantee that collisions happen in the proper order). But my dream-game design (the kind that floats around in the head of people who will never actually accomplish what they set out to do) entails characters at least temporarily moving around at large multiples of the width of potential obstructions, so that caveat is more like a serious prospective source of bugs. I want perfect continuity, hence this system.
After searching around the Internet for a starting point (certainly someone must have attempted this kind of thing before...!) and finding mostly either half-solutions or articles where the first several comments were "the code presented is buggy and doesn't work", I decided to just go ahead and start from first principles. The key insight that this entire function runs on is that when you move an AABB with a 2D velocity, you create a pair of time intervals (one per axis) during which the moving AABB overlaps the target AABB along that axis. If the two time intervals overlap, then the two bounding boxes collide. If the two intervals don't overlap, or their intersection falls out of the range [0, 1], then there is no collision. Then, based on which interval started last and which direction the velocity is in, you can determine and report which side of the target box was hit. This could be useful to determine if, say, wall-jumping should be available, whether you stomped on the enemy or smashed your face into it, any number of things that you could want to ram boxes into each other for.
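To make that concrete, here's a minimal sketch of the interval test as described (a reconstruction of the idea, not the full function from the spoiler below; note this version reports no hit for boxes that already overlap at t=0):
using System;

public enum Side { None, Left, Right, Top, Bottom }

public struct Aabb {
    public float Left, Right, Top, Bottom; // y-up: Bottom < Top

    // True if this box, moving by (vx, vy) over the frame, hits 'target'.
    // 'time' is the normalized impact time in [0, 1]; 'side' is the side
    // of 'target' that was struck.
    public bool Sweep(Aabb target, float vx, float vy,
                      out float time, out Side side) {
        time = 0f; side = Side.None;

        float enterX, exitX, enterY, exitY;
        if (!AxisInterval(Left, Right, target.Left, target.Right, vx,
                          out enterX, out exitX)) return false;
        if (!AxisInterval(Bottom, Top, target.Bottom, target.Top, vy,
                          out enterY, out exitY)) return false;

        float enter = Math.Max(enterX, enterY); // latest entry wins
        float exit  = Math.Min(exitX, exitY);   // earliest exit wins

        // The two per-axis overlap intervals must themselves overlap,
        // and the joint entry must land inside this frame.
        if (enter > exit || enter < 0f || enter > 1f) return false;

        time = enter;
        // The axis whose interval started last is the axis of impact.
        if (enterX > enterY) side = vx > 0f ? Side.Left : Side.Right;
        else                 side = vy > 0f ? Side.Bottom : Side.Top;
        return true;
    }

    // Entry/exit times of the overlap interval on one axis. Returns false
    // when the boxes can never overlap there (zero velocity, no overlap).
    static bool AxisInterval(float minA, float maxA, float minB, float maxB,
                             float v, out float enter, out float exit) {
        if (v == 0f) {
            enter = float.NegativeInfinity;
            exit  = float.PositiveInfinity;
            return maxA > minB && minA < maxB; // must already overlap
        }
        enter = (minB - maxA) / v;
        exit  = (maxB - minA) / v;
        if (enter > exit) { float t = enter; enter = exit; exit = t; }
        return true;
    }
}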
Here's some pictures of the test application in action, showing that it correctly detects collisions when it should and doesn't detect them when it shouldn't:
And if you're curious, the full text of the C# function (as a member function of a class called "Block"; it could be trivially modified to operate on two Blocks passed in as parameters) is beneath this spoiler. The property definitions are left as an exercise to the reader.
This function is probably a bit too verbose to serve as a culling operation for the more complicated geometry collision engine I mentioned before (it's far simpler to just expand the bounding box by the velocity if all you want is a general "should I bother to do more complicated collision on the things these AABBs are bounding" check or a parameter into a quadtree query), but it could still readily power some kind of NES-style game.
I'm "kupiyupaekio" on Discord.
really long ass invisible block for everyone to walk on
base enemy type for all other enemies to inherit code
spawners with alarms set to 30 frames on creation; the alarm counts down to 0 and then fires (checks to make sure the boss lady is out of range, drops an enemy, resets the alarm to a random number between 1 and 4) - see the sketch just below
Multiple spawners.
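For non-GameMaker folks, that alarm pattern translates to something like this (a sketch; the names are mine):
using UnityEngine;

public class EnemySpawner : MonoBehaviour {
    public GameObject enemyPrefab;
    public Transform bossLady;   // the player character
    public float safeRange = 5f;
    int framesLeft = 30;         // initial alarm, as above

    void FixedUpdate() {
        if (--framesLeft > 0) return;

        // Only drop an enemy if the boss lady is out of range.
        if (Vector3.Distance(transform.position, bossLady.position) > safeRange)
            Instantiate(enemyPrefab, transform.position, Quaternion.identity);

        framesLeft = Random.Range(1, 5); // random 1-4, then count down again
    }
}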
https://youtu.be/Fs2oT7EmJ40
https://www.youtube.com/watch?v=B-gyVEqS1Ts
Not bad for a week's work
edit: It occurs to me I forgot to draw shoes and they all look like they are in a Tarantino film... I'll finish it in the morning.
This is relevant to my interests!
How are you doing it?
A greyscale depthmap; here it's a grey Megaman, where black causes lines to pop out the most and white stays flat.
I'm reading the colored texture and generating a voxel mesh for each non-transparent pixel; this guy's got as many polygons as an Assassin's Creed character because of that. Good enough to test with.
I thought about a vertex shader but I don't think that would create the right boxy look.
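The pixel loop is roughly this shape, I'd guess (my own sketch, not their code; one cube per opaque pixel, which a real version would batch into a single mesh):
using UnityEngine;

public class PixelVoxelizer : MonoBehaviour {
    public Texture2D sprite;      // colored texture (must be read/write enabled)
    public Texture2D depthMap;    // greyscale: black pops out, white stays flat
    public GameObject cubePrefab; // unit cube
    public float maxDepth = 4f;

    void Start() {
        for (int y = 0; y < sprite.height; y++) {
            for (int x = 0; x < sprite.width; x++) {
                Color c = sprite.GetPixel(x, y);
                if (c.a < 0.5f) continue; // skip transparent pixels

                // Darker depthmap pixels pop out more.
                float depth = (1f - depthMap.GetPixel(x, y).grayscale) * maxDepth;
                var cube = Instantiate(cubePrefab, transform);
                cube.transform.localPosition = new Vector3(x, y, -depth * 0.5f);
                cube.transform.localScale = new Vector3(1f, 1f, 1f + depth);
                cube.GetComponent<Renderer>().material.color = c;
            }
        }
    }
}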
-I've got a singleton "Manager" object with a public variable exposed to the Inspector. On start-up, the Manager creates an object (let's call it an Actor) and passes the variable's value as a parameter. I want to be able to update the variable's value in the Inspector and see the change reflected in the Actor's behavior.
I was initially under the impression that I could pass the variable in by ref, but I can't hold on to the reference this way -- calling "this.value = value" in the Actor's constructor ends up copying the reference by value. What I want to do is hold on to the reference -- I can do this with a pointer, but that requires me to mark the Actor as unsafe. While the Manager technically *could* get deleted without the Actor's knowledge, I know that this won't happen.
Solutions I've seen:
- Using the Manager's Update() to reassign the variable to the Actor on every frame. I don't like this b/c A) I'm constantly doing assignments, and B) I want to keep all of an Actor's properties private and use setters/getters for everything inside it. The Manager is the only object that should have reference ties to the Actor.
- Box the variable as an object, pass it by reference to the Actor on creation, and unbox the variable whenever the Actor needs to read its value. This seems like a reasonable approach right now, but I only want to update the variable at runtime for ease of tweaking during development. I don't want to have to box every variable I add to the Manager during development (apparently a costly thing to do), only to rip all the boxing/unboxing code out when I settle on their values.
Anybody have any thoughts? Alternate solutions? Should I just mark the dang thing as unsafe and use a pointer?
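For what it's worth, one boxing-free, pointer-free possibility is a small reference-type holder shared between the two (a sketch only; "Shared" and the member names are made up, the Manager/Actor split follows the post):
// A tiny reference-type wrapper: both sides hold the same object, so the
// Actor always sees the Manager's latest value with no boxing involved.
public sealed class Shared<T> {
    public T Value;
    public Shared(T initial) { Value = initial; }
}

public class Actor {
    readonly Shared<float> speed;
    public Actor(Shared<float> speed) { this.speed = speed; }
    public float CurrentSpeed { get { return speed.Value; } }
}
The Manager would mirror its Inspector field into the holder from OnValidate() (an editor callback that fires when Inspector values change) or from Update() if it needs to change during play mode; either way it's a single field write on the Manager side, and the Actor's properties stay private.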