
[Programming] djmitchella travelling through snow to rfind duplicate dates for singletons


Posts

  • SpawnbrokerSpawnbroker Registered User regular
    Hey testing center, thanks for that super informative email you sent before I travelled all the way to your testing site only to find it closed due to snow.

    Who am I kidding, you guys know there was no email.

    Steam: Spawnbroker
  • urahonkyurahonky Registered User regular
    urahonky wrote: »
    I'd make a separate controller for the barcode stuff, make _that_ controller track if it's loaded the library / set the hooks / whatever so it only happens once, and have each controller that needs the barcode stuff call
    this.get("controllers.barcodeLibraryWrapper").send("initializeBarcodeHooks")
    this.get("controllers.barcodeLibraryWrapper").send("setBarcodeScannedCallback", this.barcodeHandlerForThisController.bind(this));
    

    Alternative 2: have itemScan be the only one that loads things, and have itemDetails pass something to itemScan to say "actually, if you get a scan, don't handle it yourself, pass it on to me instead".

    The problem is that the scan action is completely different per application. So in app1 when I scan a barcode it will need to check the item type and if it's not SomeThing then it needs to throw an error. If it is SomeThing then it will need to call the backend API to transition the workflow.

    right, so that's why you'd pass in a different handler for the barcode getting scanned, line 2 of the code above; to complete that example:
    ...in controller 1...
    this.get("controllers.barcodeLibraryWrapper").send("initializeBarcodeHooks")
    this.get("controllers.barcodeLibraryWrapper").send("setBarcodeScannedCallback", this.theMethodThatController1DoesWhenBarcodeIsScanned.bind(this));
    ...
    
    ... in controller 2...
    this.get("controllers.barcodeLibraryWrapper").send("initializeBarcodeHooks")
    this.get("controllers.barcodeLibraryWrapper").send("setBarcodeScannedCallback", this.someOtherBarcodeMethodInController2.bind(this));
    ...
    

    If your controllers need to do different things depending on where they're getting used from, then you'll have to tell the controller where it's getting used from, or make multiple controllers.

    @djmitchella This worked! Thank you so much! I had to learn how to reinitialize a controller within the setupController method but after that it worked as intended! Thanks!
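The pattern djmitchella describes — one shared wrapper that initializes the library exactly once and dispatches each scan to whichever handler the active screen registered — can be sketched without Ember at all. Every name below (`BarcodeLibraryWrapper` and friends) is illustrative, not a real Ember or barcode-library API; in the Ember version these would be the `controllers.barcodeLibraryWrapper` actions quoted above.

```python
# Framework-free sketch of the shared barcode-wrapper pattern.
# All names are made up for illustration.

class BarcodeLibraryWrapper:
    def __init__(self):
        self._initialized = False
        self._callback = None

    def initialize_barcode_hooks(self):
        """Idempotent setup: the expensive library init runs only once."""
        if self._initialized:
            return
        # ... load the barcode library / install its global hooks here ...
        self._initialized = True

    def set_barcode_scanned_callback(self, callback):
        """Whichever screen registered last receives the scans."""
        self._callback = callback

    def on_scan(self, barcode):
        """Called from the library's hook; forwards to the current handler."""
        if self._callback is not None:
            return self._callback(barcode)


# Each "controller" registers its own handler, as in the posts above:
wrapper = BarcodeLibraryWrapper()
wrapper.initialize_barcode_hooks()
wrapper.initialize_barcode_hooks()  # safe: second call is a no-op
wrapper.set_barcode_scanned_callback(lambda code: "itemScan got " + code)
result = wrapper.on_scan("12345")  # -> "itemScan got 12345"
```

The key property is the second `initialize_barcode_hooks()` call being a no-op, which is what lets every controller call it defensively without double-loading the library.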

  • ecco the dolphinecco the dolphin Registered User regular
    edited March 2015
    Nuts.

    Hmm

    Just changed thread title.

    But need to acknowledge djmitchella's contribution.

    I know

    Edit: There. Keep up the good work. =P

    Penny Arcade Developers at PADev.net.
  • DehumanizedDehumanized Registered User regular
    Hearing some secondhand info on Vulkan (aka the next gen OpenGL) from GDC



    oh

  • a5ehrena5ehren AtlantaRegistered User regular
    edited March 2015
    That aligns with my OpenCL experience. That stuff is a god-damn nightmare.

  • DehumanizedDehumanized Registered User regular
    this is supposed to dethrone directx

  • lazegamerlazegamer The magnanimous cyberspaceRegistered User regular
    I think that's just par for the course. I pulled up the first hit from google for a tutorial for drawing a triangle in directx and it really doesn't seem much better.

    http://www.directxtutorial.com/Lesson.aspx?lessonid=11-4-5

    I would download a car.
  • bowenbowen How you doin'? Registered User regular
    lazegamer wrote: »
    I think that's just par for the course. I pulled up the first hit from google for a tutorial for drawing a triangle in directx and it really doesn't seem much better.

    http://www.directxtutorial.com/Lesson.aspx?lessonid=11-4-5

    A lot of that gets abstracted away from you once you've built your core functionality.

    You wouldn't keep setting that stuff up each time you wanted to draw a triangle, you'd just pass the vertices to a class and boom new triangle with all that backup code.

    OGL is def easier than DX though.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • urahonkyurahonky Registered User regular
    Computer Graphics was a nightmare for me.

  • djmitchelladjmitchella Registered User regular
    aw, you shouldn't have.

    That said, it's much appreciated, because today Ember is beating me up by somehow managing to create random broken versions of the global Application object if I run it in the super-elderly embedded version of webkit we're stuck with. That was exciting to diagnose -- our code looks like:
    initializer: function() {
      //... populate 'data' appropriately (it's static so we can do it at initializer time)...
      Application.bigBlobOfData = data;
    }
    ...and then, elsewhere...
     for (var i=0; i < Application.bigBlobOfData.items.length; i++) // roughly speaking
    
    In Chrome/firefox/etc, great. In Webkit v.getOffMyLawnYouDarnedKids, Application.bigBlobOfData is valid the first time when it's getting written -- but it has mysteriously become null when I try and retrieve it later on.

    So, time to change it to window.bigBlobOfData which doesn't get stomped on, and NEVER LOOK AT THAT CODE EVER AGAIN. This sort of thing does not make me optimistic that we'll meet our current schedule, but I guess we'll see.

  • ecco the dolphinecco the dolphin Registered User regular
    We're also celebrating that @DyasAlure has forked and is now a parent process to a second child.

    Yaaaaay Dyas!

  • mightyjongyomightyjongyo Sour Crrm East Bay, CaliforniaRegistered User regular
    okay, so since a bunch of you know database queries like the back of your hands - is there a way to "translate" columns within a sqlite view? so if i have a table with columns like:
    name (varchar) | location (varchar)
    

    and another table like:
    location (varchar) | id (int)
    

    Would it be possible to create a table view such that I have:
    name (varchar) | location (int)
    

    ??

    From what I can see I probably want an inner join? although that would end up with something like
    name (varchar) | location (varchar) | id (int)
    

    ...right? or am I completely misunderstanding joins.

  • DyasAlureDyasAlure SeattleRegistered User regular
    edited March 2015
ecco the dolphin wrote: »
    We're also celebrating that @DyasAlure has forked and is now a parent process to a second child.

    Yaaaaay Dyas!

Hey now, that fork happened in another development environment. I'm not sure how it migrated itself over to here, but thank you. I just finished my calc IV homework (Yes, with everything going on, I still have school) and my wife hasn't called me from the hospital. I'm taking that to mean things are still improving.

    If you don't follow the steam thread, http://forums.penny-arcade.com/discussion/comment/32029659/#Comment_32029659 here is the announcement. I'm sure things are fine, but as a parent to a child object, you worry the fork didn't go well.

  • TofystedethTofystedeth Registered User regular
mightyjongyo wrote: »
    okay, so since a bunch of you know database queries like the back of your hands - is there a way to "translate" columns within a sqlite view? so if i have a table with columns like:
    name (varchar) | location (varchar)
    

    and another table like:
    location (varchar) | id (int)
    

    Would it be possible to create a table view such that I have:
    name (varchar) | location (int)
    

    ??

    From what I can see I probably want an inner join? although that would end up with something like
    name (varchar) | location (varchar) | id (int)
    

    ...right? or am I completely misunderstanding joins.

    Do the inner join then just select name and id as location?
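That inner-join-plus-rename, spelled out against an in-memory sqlite database (sketched with Python's built-in sqlite3; the table names `people`/`places` and the view name are invented, the columns come from the post):

```python
import sqlite3

# Concrete version of the suggested inner join. The join itself produces
# name/location/id; the view's SELECT keeps only name and id, renaming
# id to location -- which is the shape mightyjongyo asked for.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE people (name VARCHAR, location VARCHAR);
    CREATE TABLE places (location VARCHAR, id INT);

    INSERT INTO people VALUES ('alice', 'east bay'), ('bob', 'seattle');
    INSERT INTO places VALUES ('east bay', 1), ('seattle', 2);

    CREATE VIEW names_with_location_ids AS
        SELECT people.name, places.id AS location
        FROM people
        INNER JOIN places ON people.location = places.location;
""")
rows = con.execute("SELECT * FROM names_with_location_ids ORDER BY name").fetchall()
# rows == [('alice', 1), ('bob', 2)]
```

The view only keeps rows whose location exists in both tables; if some names might have locations missing from the lookup table, a LEFT JOIN (with NULL ids) would be the variant to reach for.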

  • mightyjongyomightyjongyo Sour Crrm East Bay, CaliforniaRegistered User regular
    ahhh, okay! thanks!

  • InfidelInfidel Heretic Registered User regular
    I just got home from a late shift.

    I accomplished today what would take someone else on my team at least a week to do if I told them how, and probably would never get it figured out at all if I didn't.

    BALL IS IN YOUR COURT NOW, CLIENT.

    Now we can actually make good on threats of timelines slipping when they don't make time to review the requirements or demos.

  • a5ehrena5ehren AtlantaRegistered User regular
    edited March 2015
    Hey everyone, a quick PSA from the guy who has to port all your code to 64-bit:

    Stop casting pointers to int types! It was a happy accident that it worked on 32-bit systems, but now I've spent all morning typing "intptr_t" and "#include <inttypes.h>" and it's getting old pretty fast.

  • TofystedethTofystedeth Registered User regular
    I think this might be the first time a thing of mine has made it into the buzzword soup of the thread title, and I'm proud that it might be part of the title the thread goes out on.

  • bowenbowen How you doin'? Registered User regular
    Wait... wouldn't that still work on 64bit?

    void* and int are both 4 byte and 8 byte respectively on 32 and 64 bit systems right? Unless the compiler is converting int to int32 for some reason?

  • bowenbowen How you doin'? Registered User regular
    We should just drop int entirely and force int32/int64 naming conventions tbh.

  • RendRend Registered User regular
    bowen wrote: »
    We should just drop int entirely and force int32/int64 naming conventions tbh.

    Why? 32 bit systems are phasing out and will eventually be relics, and there's no other real reason to specify a 32 vs 64 bit integer.

    Enforcing that kind of naming convention is just going to make programmers 20 years from now very angry that they're stuck with this stupid legacy convention from back when they had 32 bit processors.

  • bowenbowen How you doin'? Registered User regular
    Rend wrote: »
    bowen wrote: »
    We should just drop int entirely and force int32/int64 naming conventions tbh.

    Why? 32 bit systems are phasing out and will eventually be relics, and there's no other real reason to specify a 32 vs 64 bit integer.

    Enforcing that kind of naming convention is just going to make programmers 20 years from now very angry that they're stuck with this stupid legacy convention from back when they had 32 bit processors.

    Because platform incongruity is still a thing in 2015.

What does an int mean? Doesn't really mean anything, it's ambiguous. It can mean anywhere from a short to an N-byte monstrosity depending on the system and implementation.

    Dropping "int" for int32 or int64 would be good in general because it makes people think about bytes more frequently than they do, and you avoid situations where void* is getting cast to int. "Well I know I need 4 bytes, so int it is!"

    Not that it necessarily would have resolved this issue, but there's been a lot of times I've run into code that has issues with bounds because no one understands 'int' isn't shorthand for int32 and it's compiler/platform dependent.

  • bowenbowen How you doin'? Registered User regular
    Also, in 20 years, people should still be only using an int32 if they only need a certain range of numbers.

    You guys just slap ints around even if you only really need a uint16 ?

  • a5ehrena5ehren AtlantaRegistered User regular
    edited March 2015
    bowen wrote: »
    Wait... wouldn't that still work on 64bit?

    void* and int are both 4 byte and 8 byte respectively on 32 and 64 bit systems right? Unless the compiler is converting int to int32 for some reason?

    Pointers are 8 bytes on 64-bit machines (think about it), 4 bytes on 32-bit machines.

    Plain int is 4 bytes on both.

    Long int is 4 bytes on 32-bit, but 8 bytes on 64-bit UNIX machines (Windows forces you to use explicit 64-bit types).
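Those sizes are easy to spot-check with Python's ctypes, for anyone following along at home. The comments assume a 64-bit LP64 platform (Linux/macOS); on 32-bit builds pointers drop to 4 bytes, and on 64-bit Windows (LLP64) long stays at 4:

```python
import ctypes

# Spot-checking the sizes listed above.
print(ctypes.sizeof(ctypes.c_int))      # plain int: 4 on both 32- and 64-bit
print(ctypes.sizeof(ctypes.c_long))     # long: 8 on 64-bit UNIX, 4 elsewhere
print(ctypes.sizeof(ctypes.c_void_p))   # pointer: 8 on 64-bit, 4 on 32-bit

# The whole point of intptr_t is that it tracks the pointer size.
# ctypes.c_ssize_t is the nearest stdlib analog: pointer-sized on the
# common platforms, so a round trip through it never truncates.
assert ctypes.sizeof(ctypes.c_void_p) == ctypes.sizeof(ctypes.c_ssize_t)
```

Which is also exactly why `void*`-to-`int` casts break on a 64-bit build: the top 4 bytes of the pointer are simply discarded.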

  • a5ehrena5ehren AtlantaRegistered User regular
    edited March 2015
    But seriously, just use u/intptr_t for pointer conversions, please. It is guaranteed to grow/shrink to the appropriate int size based on machine type.

  • bowenbowen How you doin'? Registered User regular
    a5ehren wrote: »
    bowen wrote: »
    Wait... wouldn't that still work on 64bit?

    void* and int are both 4 byte and 8 byte respectively on 32 and 64 bit systems right? Unless the compiler is converting int to int32 for some reason?

    Pointers are 8 bytes on 64-bit machines (think about it), 4 bytes on 32-bit machines.

    Plain int is 4 bytes on both.

    Long int is 4 bytes on 32-bit, but 8 bytes on 64-bit UNIX machines (Windows forces you to use explicit 64-bit types).

    Ah I was under the assumption that 64bit compilers had moved int into the realm of 8 bytes now. Looks like this is not the case.

    Are they doing it for antiquity still? Obviously recompiling a program with ints thrown around into 64 bit would immediately double its memory footprint eh?

  • RendRend Registered User regular
    bowen wrote: »
    Rend wrote: »
    bowen wrote: »
    We should just drop int entirely and force int32/int64 naming conventions tbh.

    Why? 32 bit systems are phasing out and will eventually be relics, and there's no other real reason to specify a 32 vs 64 bit integer.

    Enforcing that kind of naming convention is just going to make programmers 20 years from now very angry that they're stuck with this stupid legacy convention from back when they had 32 bit processors.

    Because platform incongruity is still a thing in 2015.

    What does an int mean? Doesn't really mean anything, it's ambiguous. It can mean anywhere from a short to a N-byte monstrosity depending on the system and implementation.

    Dropping "int" for int32 or int64 would be good in general because it makes people think about bytes more frequently than they do, and you avoid situations where void* is getting cast to int. "Well I know I need 4 bytes, so int it is!"

    Not that it necessarily would have resolved this issue, but there's been a lot of times I've run into code that has issues with bounds because no one understands 'int' isn't shorthand for int32 and it's compiler/platform dependent.

    Yes, of course it's still a thing in 2015, that's what we're discussing right now. When you are declaring an int, you should not be thinking "I need 4 bytes," you should be thinking "I need a number up to X in magnitude." There are basically three situations where a programmer needs to count bits in their variables.

    1. They have something that might not fit in a smaller variable. Instead of counting them, just use the bigger one. They're cheap. It's fine.
    2. They are counting bits because they need a specific size for a data transfer or storage. In this case, you're already thinking about how big it is, and as such if you're doing it right, you'll declare a size-specific variable like int32 anyway.
    3. You're using it as a container for a bunch of flags, which is not what an int is for. The solution to this problem isn't to force EVERYONE to ALWAYS use int32, it's to implement a bit field if you super need to not use a struct or other data structure.

  • RendRend Registered User regular
    edited March 2015
    bowen wrote: »
    Also, in 20 years, people should still be only using an int32 if they only need a certain range of numbers.

    This is exactly why it's a bad idea to enforce a convention like you're suggesting.

  • bowenbowen How you doin'? Registered User regular
    Pft magnitude, always think in bytes.

  • bowenbowen How you doin'? Registered User regular
    Rend wrote: »
    bowen wrote: »
    Also, in 20 years, people should still be only using an int32 if they only need a certain range of numbers.

    This is exactly why it's a bad idea to enforce a convention like you're suggesting.

    So you use int even though your variable only needs 2 bytes of data, ie the protocol designed will always only have a max value in that range, and if you need to change it, the entire system is changing.

    (I deal with a lot of lower level protocols at the moment where they specify byte lengths)
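The byte-length-specified-protocol case is exactly where fixed-width types earn their keep. A toy field in Python's struct notation (the field layout here is invented for illustration):

```python
import struct

# A toy byte-length-specified protocol field: the spec says "2-byte
# unsigned little-endian", so the code says exactly that, rather than
# whatever 'int' happens to be today. '<H' is struct's uint16.
FIELD = "<H"

packed = struct.pack(FIELD, 513)
print(len(packed))                      # 2 -- on any platform, always
print(struct.unpack(FIELD, packed)[0])  # 513

# And the flip side: a value outside the spec'd range fails loudly at
# pack time instead of silently wrapping on the wire.
try:
    struct.pack(FIELD, 70000)
except struct.error:
    print("70000 doesn't fit in a uint16")
```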

  • RendRend Registered User regular
    bowen wrote: »
    Rend wrote: »
    bowen wrote: »
    Also, in 20 years, people should still be only using an int32 if they only need a certain range of numbers.

    This is exactly why it's a bad idea to enforce a convention like you're suggesting.

    So you use int even though your variable only needs 2 bytes of data, ie the protocol designed will always only have a max value in that range, and if you need to change it, the entire system is changing.

    (I deal with a lot of lower level protocols at the moment where they specify byte lengths)

    A byte-length specified protocol is, like, the exact situation where, yes, you want to use size specific variables. But just because you see idiots who assume all ints are a certain size, that doesn't mean people who just want a number should have to explicitly specify exactly how large a memory footprint their number is going to take up.

    And yes, I use ints even if my number is guaranteed to be between 1 and 10.

  • bowenbowen How you doin'? Registered User regular
    That seems wasteful, though.

    Eh, I'd rather take a minute to think about the kind of footprint the entire system is going to take up.

    In my system if I was using ints where shorts were needed, the size could balloon up to 2-4+ gigs (this is bad). It's critically important to think about this stuff, maybe not all the time, but I think it's at least something that one should consider. Even if you only ever have 5 ints in your program.

    It seems like it wouldn't really impact your workflow and you'd just have to consider what kind of int you want, and you seem salty about the concept of it? Maybe 15 years from now your program is compiled on a specialized system and works great except for occasionally all the data goes into negatives or wraps back to 0.

  • a5ehrena5ehren AtlantaRegistered User regular
    bowen wrote: »
    a5ehren wrote: »
    bowen wrote: »
    Wait... wouldn't that still work on 64bit?

    void* and int are both 4 byte and 8 byte respectively on 32 and 64 bit systems right? Unless the compiler is converting int to int32 for some reason?

    Pointers are 8 bytes on 64-bit machines (think about it), 4 bytes on 32-bit machines.

    Plain int is 4 bytes on both.

    Long int is 4 bytes on 32-bit, but 8 bytes on 64-bit UNIX machines (Windows forces you to use explicit 64-bit types).

    Ah I was under the assumption that 64bit compilers had moved int into the realm of 8 bytes now. Looks like this is not the case.

    Are they doing it for antiquity still? Obviously recompiling a program with ints thrown around into 64 bit would immediately double its memory footprint eh?

    According to wiki, it's just the data model that UNIX-likes run on. I assume it's a decision lost in the mists of time.

  • RendRend Registered User regular
    bowen wrote: »
    That seems wasteful, though.

    Eh, I'd rather take a minute to think about the kind of footprint the entire system is going to take up.

    In my system if I was using ints where shorts were needed, the size could balloon up to 2-4+ gigs (this is bad). It's critically important to think about this stuff, maybe not all the time, but I think it's at least something that one should consider. Even if you only ever have 5 ints in your program.

    It seems like it wouldn't really impact your workflow and you'd just have to consider what kind of int you want, and you seem salty about the concept of it? Maybe 15 years from now your program is compiled on a specialized system and works great except for occasionally all the data goes into negatives or wraps back to 0.

    None of this has anything to do with renaming int to int32 or int64. If the size of your variable is important to you, you use a variable of specific size. If it's not, then you eyeball it to make sure it'll fit for all reasonable values. I'm not salty about having to consider the size of variables, I do that all the time, and I actually quite enjoy the random byte-counting I have to do. But, I certainly don't want to be bothered with it when I'm just writing fizzbuzz.

  • bowenbowen How you doin'? Registered User regular
    Why? What would it impact? You would implicitly know you need an int32 wouldn't you?

  • a5ehrena5ehren AtlantaRegistered User regular
    LOL
    uint32 timeMin = ULONG_MAX;
    

    Now I'm just getting stabby.
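For anyone missing why that line is stab-worthy: on an LP64 build ULONG_MAX is 2**64 - 1, and stuffing it into a uint32 keeps only the low 32 bits. Sketched in plain arithmetic:

```python
# Why `uint32 timeMin = ULONG_MAX;` is stab-worthy: on an LP64 system
# ULONG_MAX is 2**64 - 1, and assigning it to a uint32 silently keeps
# only the low 32 bits. (The result *happens* to equal UINT32_MAX, so
# the code "works" -- right up until it's built where long is 4 bytes,
# or someone reads it and wonders what the author actually meant.)
ULONG_MAX_LP64 = 2**64 - 1
truncated = ULONG_MAX_LP64 & 0xFFFFFFFF  # what actually lands in the uint32
print(truncated == 2**32 - 1)  # True: UINT32_MAX, by accident
```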

  • bowenbowen How you doin'? Registered User regular
    wow

  • bowenbowen How you doin'? Registered User regular
    Does @ecco the dolphin 's coworker work there?

  • RendRend Registered User regular
    bowen wrote: »
    Why? What would it impact? You would implicitly know you need an int32 wouldn't you?

    It's extra cognitive load. How many bits I want in this float is irrelevant most of the time, and at that point it's just extra language for no reason.

  • bowenbowen How you doin'? Registered User regular
We agree to disagree; I don't think it's any more load than if, in the back of your mind, you have to go "well shit, is int going to work in this case?"
