
[Programming] djmitchella travelling through snow to rfind duplicate dates for singletons


Posts

  • Cokebotle (Inside the hole-digging train)
    LD50 wrote: »
    If it's actualfact C then it has to be malloc.

    So even if it's C++ (or whatever we're using), I cannot use a variable to set an array size unless I allocate memory? I can't do this:
    int set[<variable>];
    

  • Ethea
    Cokebotle wrote: »
    LD50 wrote: »
    If it's actualfact C then it has to be malloc.

    So even if it's C++ (or whatever we're using), I cannot use a variable to set an array size unless I allocate memory? I can't do this:
    int set[<variable>];
    

    Correct, that is not allowed by the C++ spec, but it is allowed by C99 (see VLAs, variable-length arrays). GCC by default enables some C99 features as extensions when compiling C++ code, one of which is VLAs.

    So your best option, when writing C++ code that you want to be sure matches the spec, is to compile with at least "-std=c++98 -pedantic".
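
    As a rough sketch of what that flag combination catches (the file name and sizes here are made up):

    // vla_demo.cpp
    //   g++ vla_demo.cpp                       -> accepted: the VLA compiles as a GNU extension
    //   g++ -std=c++98 -pedantic vla_demo.cpp  -> warning: ISO C++ forbids variable length array 'set'
    #include <vector>

    int main() {
        int n = 10;                    // size only known at run time
        int set[n];                    // C99 VLA, not standard C++
        set[0] = 1;

        std::vector<int> portable(n);  // the standard C++ way to size storage at run time
        portable[0] = 1;
        return 0;
    }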

  • Cokebotle (Inside the hole-digging train)
    Hm... Ok. Thanks!

  • ironsizide (You must whip it)
    LD50 wrote: »
    ironsizide wrote: »
    If I only had to deal with XML, yeah. But we also send data to set top boxes (Roku for example) where an errant character isn't just gracefully ignored - it crashes the damn app completely.

    Shouldn't that be handled by whatever app is sending the data though?

    It's going to sound like I'm making excuses, but we have a single API that all the devices use, and Roku was added later on. Since the system was otherwise working perfectly, and I was constantly adding major modules of code to it, I just never got around to doing something like this (having the system substitute something Roku would understand).

  • iTunesIsEvil (Cornfield? Cornfield.)
    I hate working on front-end web stuff. :( It is not my comfort zone.

    I also hate it when my boss thinks we can continue the "fly by the seat of our pants" strategy we've used since the '80s with shit like storing electronic signatures for consenting to receive tax documents online, etc., etc. :(

  • bowen (How you doin'?)
    edited October 2014
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

  • iTunesIsEvil (Cornfield? Cornfield.)
    bowen wrote: »
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

    The only proper response to the bolded is to post dis...

    the_scream_sm.png

  • bowen (How you doin'?)
    I'm just as upset as it is.

  • DyasAlure (Seattle)
    I don't know if it is an option, but in later C++ can't you use vectors? Much more dynamic.

  • DyasAlure (Seattle)
    http://www.cplusplus.com/reference/vector/vector/

    That goes over things I'm not sure I understand, but some of it I remember; it's C++98, I guess, which is what I learned. I'm sure there are people here who know way more about what you need for a dynamic array, though.
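
    Roughly, as a minimal sketch (the input size is made up):

    #include <iostream>
    #include <vector>

    int main() {
        int n = 0;
        std::cin >> n;                // size decided at run time
        std::vector<int> set(n, 0);   // allocates and zero-fills n ints
        set.push_back(42);            // grows on its own when it needs to
        std::cout << set.size() << "\n";
        return 0;
    }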

  • Ethea
    I ignored std::vector because I was expecting that he is currently being taught some bad form of C that happens to use the C++ compiler.

  • Daedalus
    So I spent part of last week using libnfs to write a small NFS client for this embedded thing I'm working on.

    It would have been nice if libnfs came with any documentation whatsoever, but hey, that's an open source library for ya.

    After I get it working, and test it against some random NFSv3 server on our development LAN, I go to test it with the craptastic legacy embedded system that it needs to work with.

    It doesn't work, because the craptastic legacy system is using NFSv2. libnfs has NFSv2 support, but you need to use a completely different API for it, and I didn't realize this until now because of the aforementioned complete lack of documentation. So I need to rewrite half my damn client. Ugh.

  • Drez
    This is a novice question I guess but maybe someone can enlighten me. This is a Swift question.

    I know I need to work with integers larger than 2.15B (I need up to about 84B). So the correct type for me would be Int64. Or just Int. According to Apple:
    Int

    In most cases, you don’t need to pick a specific size of integer to use in your code. Swift provides an additional integer type, Int, which has the same size as the current platform’s native word size:

    On a 32-bit platform, Int is the same size as Int32.
    On a 64-bit platform, Int is the same size as Int64.

    Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability. Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.

    What do they mean by "current platform"? If I use an Int64 (or just Int, knowing that values will exceed 2.15B), does that mean my app will be incompatible with iDevices running 32-bit versions of iOS?

  • bowen (How you doin'?)
    Basically, if it's a 32-bit platform it'll be Int32, and if it's a 64-bit platform it'll be Int64. Never assume anything about how large a plain integer type is. If you need a specific size, specify it in the code.

  • bowen (How you doin'?)
    Basically, even in most languages "int" will switch size based on what compiler you use. It could be 8, 16, 32, 64, or more bits.

    But an Int32 is always an Int32 and an Int64 is always an Int64.
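
    In Swift terms, a rough sketch (the values are made up):

    let exp: Int64 = 100_000_000_000   // explicitly 64-bit, fine on any device
    let small: Int32 = 2_000_000_000   // always exactly 32 bits
    // let platform: Int = 100_000_000_000
    // ^ compiles on a 64-bit target, but the literal overflows Int on a 32-bit one
    print(exp, small)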

  • Ethea
    I am really curious what you are representing that requires values larger than 2.1B.

  • bowen (How you doin'?)
    Ethea wrote: »
    I am really curious in what you are representing that you require values larger than 2.1B.

    Money would be my first guess.

  • Campy
    So to bring the newbie-ness down another level...

    What the hell is 2.1B?

  • Daedalus
    Ethea wrote: »
    I am really curious in what you are representing that you require values larger than 2.1B.

    File sizes? Memory addresses?

  • Incindium
    edited October 2014
    bowen wrote: »
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

    That's when you just write a data sanitizer to run on the file as a preprocessing step.

  • bowen (How you doin'?)
    Incindium wrote: »
    bowen wrote: »
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

    That's when you just write a data sanitizer to run on the file as a preprocessing step.

    Can't!

    <> would flag false positives on XML entities, wouldn't it?

  • Phyphor (Building Planet Busters, Tasting Fruit)
    Campy wrote: »
    So to bring the newbie-ness down another level...

    What the hell is 2.1B?

    Approximately 2.1 billion; specifically, 2^31 - 1.

  • Phyphor (Building Planet Busters, Tasting Fruit)
    bowen wrote: »
    Incindium wrote: »
    bowen wrote: »
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

    That's when you just write a data sanitizer to run on the file as a preprocessing step.

    Can't!

    <> would flag false positives on XML entities wouldn't it?

    Speculatively parse when you encounter stray <>s, then if you hit an error, back up and assume it's text!

  • Daedalus
    Campy wrote: »
    So to bring the newbie-ness down another level...

    What the hell is 2.1B?

    The B stands for billion. A signed 32-bit integer can hold values from -(2^31) through (2^31)-1; an unsigned 32-bit integer can hold values from 0 through (2^32)-1.

    (2^31)-1 = 2,147,483,647.
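
    In Swift those limits are just (quick sketch):

    print(Int32.min)   // -2147483648  = -(2^31)
    print(Int32.max)   //  2147483647  =  (2^31) - 1
    print(UInt32.max)  //  4294967295  =  (2^32) - 1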

  • bowen (How you doin'?)
    Phyphor wrote: »
    bowen wrote: »
    Incindium wrote: »
    bowen wrote: »
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

    That's when you just write a data sanitizer to run on the file as a preprocessing step.

    Can't!

    <> would flag false positives on XML entities wouldn't it?

    Speculatively parse when you encounter stray <>s, then if you hit an error, backup and assume it's text!

    Yeah back to "write my own" :(

    Though I think it's going to be "ignore data sets that have this" at this point and use the default XML reader.

    Read, encounter error, skip to next "row" item. Boom roasted.

  • Drez
    bowen wrote: »
    Ethea wrote: »
    I am really curious in what you are representing that you require values larger than 2.1B.

    Money would be my first guess.

    More or less, yeah.

  • Polaritie (Sleepy)
    Phyphor wrote: »
    bowen wrote: »
    Incindium wrote: »
    bowen wrote: »
    Unsanitised human input. Bane of our existence.

    Destroy the humans. It is the only solution.

    Edit: And now we know why SkyNet chose its actions.

    It gets better.

    It also has < and > in it.

    Something tells me I'm going to need to write my own xml reader.

    That's when you just write a data sanitizer to run on the file as a preprocessing step.

    Can't!

    <> would flag false positives on XML entities wouldn't it?

    Speculatively parse when you encounter stray <>s, then if you hit an error, backup and assume it's text!

    If it's all in the one tag, can you just script CDATA around the text?
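
    Something like this, as a rough sketch in Swift (the tag name and input are made up, and it assumes the field never nests and never contains "]]>"):

    import Foundation

    // Hypothetical preprocessing pass: wrap a known-bad field in CDATA so
    // stray < and > inside it no longer break the real XML parser.
    let raw = "<row><comment>5 < 10 but 10 > 5</comment></row>"
    let fixed = raw
        .replacingOccurrences(of: "<comment>", with: "<comment><![CDATA[")
        .replacingOccurrences(of: "</comment>", with: "]]></comment>")
    print(fixed)  // <row><comment><![CDATA[5 < 10 but 10 > 5]]></comment></row>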

  • Drez
    bowen wrote: »
    Basically, even in most languages "int" will switch based on what compiler you use. It could be 8, 16, 32, 64, ++ bytes.

    But int32 is always an int32 and an int64 is always an int64.

    My question is more: if I compile as Int64, does that make the program incompatible (at runtime, not build time) with 32-bit OSes? I'm assuming yes, it will be incompatible, but I want to be clear.

  • bowen (How you doin'?)
    Drez wrote: »
    bowen wrote: »
    Ethea wrote: »
    I am really curious in what you are representing that you require values larger than 2.1B.

    Money would be my first guess.

    More or less, yeah.

    You can get around this by using Int16s and doing 4-5 subdivisions.

    Like you'd see in RPGs: copper, silver, gold, platinum, unobtanium.

    That is, if you're not making a "real world dollars" program.

  • Drez
    bowen wrote: »
    Drez wrote: »
    bowen wrote: »
    Ethea wrote: »
    I am really curious in what you are representing that you require values larger than 2.1B.

    Money would be my first guess.

    More or less, yeah.

    You can get around this by using int16s and doing 4-5 subdivisions.

    Like you'd see in RPGs : copper, silver, gold, platinum, unobtanium

    That is, if you're not making a "real world dollars" program.

    Actually, money is only one aspect. I've decided to go the crazy Disgaea route with stats. So my EXP table effectively reaches toward 100B.

    I'm currently trying to tackle how best to generate random numbers with a min/max range beyond 4.3B, since arc4random_uniform() seems to be the preferred method, even in Swift, but only accepts UInt32.

  • Phyphor (Building Planet Busters, Tasting Fruit)
    edited October 2014
    I don't think the RPGs actually use that setup; I'm pretty sure internally they just store the number of coppers and convert for display. That's how WoW did it, and for a very long time they had a 214748g 36s 47c limit. Properly doing the math split across multiple values is much harder; you're writing a bigint library at that point.
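
    A rough Swift sketch of the "store coppers, convert for display" approach (100 copper = 1 silver, 100 silver = 1 gold; the names are made up):

    func display(copper: Int64) -> String {
        let gold = copper / 10_000
        let silver = (copper % 10_000) / 100
        let rest = copper % 100
        return "\(gold)g \(silver)s \(rest)c"
    }
    print(display(copper: Int64(Int32.max)))  // "214748g 36s 47c", the old cap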

  • bowen (How you doin'?)
    Drez wrote: »
    bowen wrote: »
    Basically, even in most languages "int" will switch based on what compiler you use. It could be 8, 16, 32, 64, ++ bytes.

    But int32 is always an int32 and an int64 is always an int64.

    My question is more - if I compile as Int64, does that make the program incompatible (at runtime, not build time) on 32-bit OSes. I'm assuming yes it will be incompatible, but I want to be clear.

    What I'm basically talking about is:
    let myInt:UInt64 = 2000000000;
    

    versus
    let myInt = 2000000000;
    

    or
    let myInt:UInt = 2000000000;
    

    I think that's how Swift does variables; can't remember, though, it's been a few months since I've looked at the book.

  • Infidel (Heretic)
    random64bit = random32bitgenerated1 << 32 | random32bitgenerated2
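
    In Swift that might look roughly like this (the function names are made up, and the modulo step has a slight bias for bounds that don't divide 2^64 evenly):

    import Foundation

    // Build a 64-bit random value from two 32-bit arc4random() results,
    // then reduce it to the range you actually want.
    func random64() -> UInt64 {
        return (UInt64(arc4random()) << 32) | UInt64(arc4random())
    }

    func random64(below bound: UInt64) -> UInt64 {
        return random64() % bound
    }

    print(random64(below: 100_000_000_000))  // e.g. a value under 100B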

  • bowen (How you doin'?)
    Phyphor wrote: »
    I don't think the RPGs actually use that setup, I'm pretty sure internally they just have # of coppers and then convert for display. That's how WoW did it, and for a very long time they had a 214748g 36s 47c limit. Properly doing the math split across multiple values is much harder. You're writing a bigint library at that point

    That's pretty lazy!

  • Ethea
    Drez wrote: »
    bowen wrote: »
    Drez wrote: »
    bowen wrote: »
    Ethea wrote: »
    I am really curious in what you are representing that you require values larger than 2.1B.

    Money would be my first guess.

    More or less, yeah.

    You can get around this by using int16s and doing 4-5 subdivisions.

    Like you'd see in RPGs : copper, silver, gold, platinum, unobtanium

    That is, if you're not making a "real world dollars" program.

    Actually, money is only one aspect. I've decided to go the crazy Disgaea route with stats. So my EXP table effectively reaches toward 100B.

    I'm currently trying to tackle how best to generate random numbers with a min/max range beyond 4.3B, since arc4random_uniform() seems to be the preferred method, even in Swift, but only accepts UInt32.


    A custom class of two UInt32s that represent the high/low halves of a UInt64 would work.

  • Phyphor (Building Planet Busters, Tasting Fruit)
    Drez wrote: »
    bowen wrote: »
    Basically, even in most languages "int" will switch based on what compiler you use. It could be 8, 16, 32, 64, ++ bytes.

    But int32 is always an int32 and an int64 is always an int64.

    My question is more - if I compile as Int64, does that make the program incompatible (at runtime, not build time) on 32-bit OSes. I'm assuming yes it will be incompatible, but I want to be clear.

    Nope. You can do 64-bit math on any CPU, even 8-bit ones; it's just slower. You will need separate builds anyway, since you can't run a 64-bit binary on a 32-bit system, but the compiler takes care of the 64-bit math for you.

  • Drez
    edited October 2014
    bowen wrote: »
    Drez wrote: »
    bowen wrote: »
    Basically, even in most languages "int" will switch based on what compiler you use. It could be 8, 16, 32, 64, ++ bytes.

    But int32 is always an int32 and an int64 is always an int64.

    My question is more - if I compile as Int64, does that make the program incompatible (at runtime, not build time) on 32-bit OSes. I'm assuming yes it will be incompatible, but I want to be clear.

    What I'm basically talking about is:
    let myInt:UInt64 = 2000000000;
    

    versus
    let myInt = 2000000000;
    

    or
    let myInt:UInt = 2000000000;
    

    I think that's how swift does variables, can't remember though, it's been a few months since I've looked at the book.

    Sorry maybe I'm not wording the question the right way...

    What I gather from what I'm reading at Apple's site is this:
    var integerDude: Int
    var integerDude64: Int64
    

    Are identical because I'm compiling this on a 64-bit machine. Both variables are declared as Int64. But maybe I'm wrong.

    But even that isn't really my question. My question is (and yes, this is probably exceedingly newbish): if I use 64-bit integers, compile my program, and sell it to someone using an iPod Whatever running a 32-bit version of iOS, is the game going to run or not? How does my use of 64-bit integers relate to the version of iOS running the compiled program?

  • Drez
    Phyphor wrote: »
    Drez wrote: »
    bowen wrote: »
    Basically, even in most languages "int" will switch based on what compiler you use. It could be 8, 16, 32, 64, ++ bytes.

    But int32 is always an int32 and an int64 is always an int64.

    My question is more - if I compile as Int64, does that make the program incompatible (at runtime, not build time) on 32-bit OSes. I'm assuming yes it will be incompatible, but I want to be clear.

    Nope. You can do 64-bit math on any CPU, even 8-bit ones. It's just slower to do so - but you will need to do separate builds since you can't run a 64-bit binary on a 32-bit system, so the compiler will take care of everything

    OK thanks. :)

  • bowen (How you doin'?)
    No, you're right. But if you compiled that as a 32-bit binary, integerDude64 would still be a 64-bit integer, while integerDude would become 32-bit. Int is basically shorthand for whatever word size the compiler is targeting at the time.

    It's all well and good to use Int if you don't care about the range and are likely to never hit it. If you know you're going to need billions, specify it as Int64 like your second declaration.
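
    A quick sketch of that, reusing your variable names (MemoryLayout is the current Swift spelling; the sizes assume a 32-bit vs 64-bit iOS target):

    var integerDude: Int = 0
    var integerDude64: Int64 = 0
    print(MemoryLayout.size(ofValue: integerDude))    // 4 on 32-bit iOS, 8 on 64-bit iOS
    print(MemoryLayout.size(ofValue: integerDude64))  // always 8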
