
#define THREAD_TITLE "PA Programming Thread"


Posts

  • jothki Registered User regular
    edited October 2010
    Perhaps he means declaring them as global variables?

    Seems kind of pointless. Either you're going to have to create and destroy them anyway, or they cost essentially nothing, since C++ locals are just stack allocations. All adding a local variable to a function does is change the stack pointer offset for that function.

  • Infidel Heretic Registered User regular
    edited October 2010
    Globals are the opposite of what I would call temporary. :lol:

    The post is pretty hard for me to decipher this late.

    TwitchTV channel: OrokosPA
    Play D&D 4e? :: Check out Orokos and upload your Character Builder sheet! :: Orokos Dice Roller
    The PhalLounge :: Chat board for Critical Failures IRC! :: #CriticalFailures and #mafia on irc.slashnet.org
  • khain Registered User regular
    edited October 2010
    Infidel wrote: »
    Globals are the opposite of what I would call temporary. :lol:

    The post is pretty hard for me to decipher this late.

    Clarification would help, but I think that's exactly the point: instead of allocating and freeing memory for variables as they're needed, you make them global so they're already instantiated and ready to use, saving some CPU cycles. In my experience, though, memory is a more valuable resource than the processor on an embedded system, and I'd imagine your memory footprint would go through the roof if you implemented this everywhere. I'd also question how much you'd actually save; there have got to be better optimizations to implement first.

  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    edited October 2010
    Well, another thing is that on embedded platforms malloc is usually limited. I do embedded work, and the basic malloc pool we have is, I think, about 512 kB for the entire system. There are other memory pools for buffers, but it's better to declare something statically than to malloc it, unless the system can cope with the allocation failing and/or the allocation is only alive for short periods of time.
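    The static-vs-malloc tradeoff above can be sketched in a few lines of C++; the buffer names and sizes here are made up for illustration, not taken from any actual system in the thread:

    ```cpp
    #include <cstddef>
    #include <cstdint>
    #include <cstdlib>

    // A static declaration is carved out of the image at link time
    // and can never fail at runtime.
    static std::uint8_t rxBuffer[512];

    std::uint8_t *getStaticBuffer() { return rxBuffer; }

    // malloc from a small fixed pool can be exhausted, so on these
    // targets every allocation has to be checked by the caller.
    std::uint8_t *getDynamicBuffer(std::size_t n) {
        return static_cast<std::uint8_t *>(std::malloc(n));
    }
    ```

    The static version trades flexibility for certainty: the memory is always spent, but an out-of-memory path never needs handling.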

  • End Registered User regular
    edited October 2010
    The last time I did embedded programming I don't think we even had a working malloc.

    It was really okay though, since the stack was usually good enough when we didn't need something permanently stored in memory.

    We didn't use C++ though, just regular C.

    everyone I know goes away, in the end
  • bowen Registered User regular
    edited October 2010
    So. Name-based UUIDs. Any idea how to generate them in C++? I know how to do it in C#/Java with the built-in classes, but I'd like to do it in C++, and I'd rather not introduce yet another library to my app. I think there's an open-source one, like OpenSSL or something, right? I'm looking for something platform-independent if possible.

    Anyone have any insight or know how to do it? The RFC documents are like Latin to me.
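    For what it's worth, the RFC's recipe boils down to: hash the namespace UUID's raw bytes concatenated with the name (MD5 for version 3, SHA-1 for version 5, which a library like OpenSSL would supply), then stamp the version and variant bits into the first 16 bytes of the digest. A sketch of that last, library-free step; `stampUuid` is a made-up name:

    ```cpp
    #include <array>
    #include <cstdint>

    // Takes the first 16 bytes of MD5/SHA-1(namespace bytes + name)
    // and turns them into an RFC 4122 UUID by overwriting two fields.
    std::array<std::uint8_t, 16> stampUuid(std::array<std::uint8_t, 16> b,
                                           std::uint8_t version) {
        b[6] = static_cast<std::uint8_t>((b[6] & 0x0F) | (version << 4)); // version nibble
        b[8] = static_cast<std::uint8_t>((b[8] & 0x3F) | 0x80);           // variant bits 10x
        return b;
    }
    ```

    Everything else in the RFC is byte ordering and hex formatting; the hash itself is the only part that needs an external library.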

  • bowen Registered User regular
    edited October 2010
    End wrote: »
    The last time I did embedded programming I don't think we even had a working malloc.

    It was really okay though, since the stack was usually good enough when we didn't need something permanently stored in memory.

    We didn't use C++ though, just regular C.

    No reason to use C++ on embedded systems usually; too much overhead for what amounts to code fluff, in my experience anyway. C was always touted as the de facto language for embedded devices. Unless Sun paid someone off to get Java on there.

  • Joe K Registered User regular
    edited October 2010
    jothki wrote: »
    Perhaps he means declaring them as global variables?

    Seems kind of pointless. Either you're going to have to create and destroy them anyway, or they cost essentially nothing, since C++ locals are just stack allocations. All adding a local variable to a function does is change the stack pointer offset for that function.

    if you're using C++ and need a global variable, please consider using the Singleton Pattern. Same thing. Object-Oriented results.
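    The shape usually meant here is a function-local-static ("Meyers") singleton; a minimal C++ sketch, with made-up class and member names:

    ```cpp
    // Construction happens once, on first use, instead of at an
    // unpredictable point during static initialization.
    class Config {
    public:
        static Config &instance() {
            static Config theOne;  // the single shared instance
            return theOne;
        }
        int verbosity = 0;

    private:
        Config() = default;
        Config(const Config &) = delete;             // no copies
        Config &operator=(const Config &) = delete;  // no assignment
    };
    ```

    Callers write `Config::instance().verbosity` wherever they would have touched the global; the accessor is the only way to reach the object.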

  • Joe K Registered User regular
    edited October 2010
    exis wrote: »
    The guy who ran the course told me that he'd gotten the idea from a SE lecturer at another university, who ran a similar course but would grade at random intervals throughout the year, giving students 24 hours notice that their deliverable was due. Apparently this was deemed a little too harsh, though I can see the merit if it encourages students to actually use an agile process.

    Heh, that's more real-world than you'd think. If the professor stopped by your dorm every 30 minutes and demanded a status report, you'd get a good taste of drive-by management as well :-).

  • bowen Registered User regular
    edited October 2010
    The problem with singletons is that in the places where you'd need them they're almost completely unnecessary, and the people usually doing that "lol open this file and dump some data" stuff in them are terrible coders too. Which makes the code that much more D: to read.

  • Joe K Registered User regular
    edited October 2010
    bowen wrote: »
    End wrote: »
    The last time I did embedded programming I don't think we even had a working malloc.

    It was really okay though, since the stack was usually good enough when we didn't need something permanently stored in memory.

    We didn't use C++ though, just regular C.

    No reason to use C++ on embedded systems usually; too much overhead for what amounts to code fluff, in my experience anyway. C was always touted as the de facto language for embedded devices. Unless Sun paid someone off to get Java on there.

    It depends what you mean by "embedded device". Are you programming modern (strong) ARMs on a Droid, where there's a Java VM? Yeah, that's not so much an embedded device as a slightly underpowered computer.

    Are you working with PIC processors and microcontrollers? C and ASM are probably your doom, then.

    As for C++ overhead: how much C++ bloats the generated machine code has been greatly overblown. Yes, you have more function calls that end up throwing everything on and off the stack, but unless you're programming PICs and are extremely memory-constrained (like 64k), it just doesn't matter *in reality*.

    Basically, unless you're in an EE program, taking a course entitled something along the lines of "Microcontrollers and blah blah blah," and have had to make your first NAND gate by hand, you probably have enough horsepower to not even notice the difference between compiled C and C++.

    The only time in industry that I've been stuck there (on PICs) was a contract for a home security system firm, and that was 10 years ago. Reasonably powerful x86 (and now ARM; there is an incredible shift going on in processor technology due to handhelds) parts are so damn cheap that PICs are falling by the wayside.

  • Joe K Registered User regular
    edited October 2010
    bowen wrote: »
    The problem with singletons is that in the places where you'd need them they're almost completely unnecessary, and the people usually doing that "lol open this file and dump some data" stuff in them are terrible coders too. Which makes the code that much more D: to read.

    Then the problem is the coder, not the design pattern. Your architect should be looking for crap like that and gently correcting it with a sledgehammer.

  • bowen Registered User regular
    edited October 2010
    Tell me about it. Singletons still add way too much fluff for what amounts to a data dumper anyway, at least in my example.

    Sure, you can have a singleton that calls a class that calls a method that dumps the data. Or you can just put all that code in main. Meh.

    I've seen worse though. So has templewulf.

  • grouch993 Registered User regular
    edited October 2010
    Sorry, temporary as in you need a std::string object in a function. The guy pushing the change wants it declared in the class instead of just used as a one-off; he seems to think that initializing some number of objects in the class constructor will avoid a resource hit compared to constructing them at the point of use.

    It seems like the extra memory taken up and copy constructors creating and destroying objects will make it all moot.

    My previous post was vague because the person in question, or some of his proteges, may read this post and trace it back. Unpleasantness would follow.

    The embedded system in question is a full blown PC running an RTOS kernel.

    Steam Profile Origin grouchiy
  • Infidel Heretic Registered User regular
    edited October 2010
    That is the level of optimization you don't bother with unless you're pretty much done and have profiling results. Unless he can show that it makes a smattering of difference (and it probably won't), there is no reason to complicate things like that: you lose some correctness and conciseness with the enlarged scope of the variable.
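    A sketch of the two approaches under discussion, with names invented for illustration; only profiling on the actual RTOS would show whether the member version buys anything:

    ```cpp
    #include <cstddef>
    #include <string>

    // The change being pushed: the scratch string is a class member,
    // so its heap buffer can be reused across calls -- at the cost of
    // living (and taking memory) as long as the object does.
    struct Preallocated {
        std::string scratch;
        std::size_t labelLen(int id) {
            scratch = "sensor-";            // may reuse the old buffer
            scratch += std::to_string(id);
            return scratch.size();
        }
    };

    // The conventional version: a local temporary with the smallest
    // possible scope, constructed and destroyed on every call.
    std::size_t labelLenLocal(int id) {
        std::string scratch = "sensor-" + std::to_string(id);
        return scratch.size();
    }
    ```

    Both compute the same thing; the only question is whether avoiding one construct/destroy pair per call is worth the enlarged scope and the permanent footprint.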

  • Jasconius sword criminal Flo-rida Registered User regular
    edited October 2010
    Can someone help me with a little Obj-C -> C translation?

    How do I write

    for (NSObject *object in nsArrayInstance)
    {
        // Do something to *object
    }

    where nsArrayInstance is instead

    NSObject *objects[5];

    Maybe it's not even possible... but basically I'd like to store an ObjC type in an array as stated and enumerate through it.

  • bowen Registered User regular
    edited October 2010
    Couldn't you just do:
    for (int i = 0; i < ( sizeof(objects)/sizeof(NSObject) ); i++)
    {
        //do something to objects[i]
    }

    That's the only way I can think of to do that. Not sure if it'll work in obj-C, I know it works on primitive types.

  • Jasconius sword criminal Flo-rida Registered User regular
    edited October 2010
    Doesn't crash, but doesn't work either. Yeah, I can't think of anything; I was just fooling around and trying to be fancy. NSArray it is!

  • bowen Registered User regular
    edited October 2010
    Oh, that's right, it's a pointer: sizeof will return the size of the pointer rather than the size of the type. Try sizeof(*objects);
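    The reason this trick is fragile can be shown in plain C++ (hypothetical type names): sizeof counts elements only while the variable still has array type, and silently gives the wrong answer once the array has decayed to a pointer:

    ```cpp
    #include <cstddef>

    struct Obj { int x; };

    // Inside the scope where `objects` is declared it is a true array,
    // so sizeof(objects)/sizeof(objects[0]) really counts elements.
    std::size_t inScopeCount() {
        Obj *objects[5] = {};
        return sizeof(objects) / sizeof(objects[0]);  // 5
    }

    // As a function parameter the array has decayed to Obj**, so the
    // same expression divides one pointer size by another.
    std::size_t decayedCount(Obj *objects[]) {
        return sizeof(objects) / sizeof(objects[0]);  // 1 on typical platforms
    }
    ```

    This is why passing the length alongside the array (or using a named constant, as suggested below in the thread) is the usual practice in C.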

  • Joe K Registered User regular
    edited October 2010
    Jasconius wrote: »
    Can someone help me with a little Obj-C -> C translation?

    How do I write

    for (NSObject *object in nsArrayInstance)
    {
    //Do something to *object
    }

    where nsArrayInstance is instead

    NSObject *objects[5];

    Maybe it's not even possible... but basically I'd like to store an ObjC type in an array as stated and enumerate through it.

    well... i'm not really up on obj-c, but if you have an array of struct (which is (excluding data access levels) all an object is, with some void(*)'s cast to functions in there for methods), you should be able to iterate over it just like an array of that struct. but i'm not sure that i understand your problem enough.

  • bowen Registered User regular
    edited October 2010
    From how I understand it, I guess he's trying to lower his footprint a little by using a plain C array of objects rather than an OOP container.

    The trickiest part is getting the size of the array, which is especially hard when dealing with pointers. Personally, for the sake of my sanity, I'd just do something like:
    const int ARRAYSIZE = 5;
    NSObject *objects[ARRAYSIZE];

    for (int i = 0; i < ARRAYSIZE; i++)
    {
        //do stuff with *objects[i];
    }

  • Jasconius sword criminal Flo-rida Registered User regular
    edited October 2010
    Yeah, that's an option too, but ultimately it's less appealing than just using the array object. There's too much code in this app to start slinging around constants that are likely to change throughout the life of the app.

  • peterdevore Registered User regular
    edited October 2010
    bowen wrote: »
    No reason to use C++ on embedded systems usually, too much overhead for what amounts to code fluff. In my experience anyways. It was always toted that C is the de facto language for embedded devices. Unless Sun paid someone off to get Java on there.

    What if your platform has a really shitty compiler, though? Then it might be better to port a VM over so you don't have to deal with it as often. I don't think Sun paid many people to implement the multitude of embedded Java VMs out there; they exist simply because it's a good VM spec without many strings attached (maybe not anymore, now that Oracle is getting sue-happy).

  • CmdPrompt Registered User regular
    edited October 2010
    Joe K wrote: »
    The only time in industry that I've been stuck there (on PICs) was a contract for a home security system firm, and that was 10 years ago. Reasonably powerful x86 (and now ARM; there is an incredible shift going on in processor technology due to handhelds) parts are so damn cheap that PICs are falling by the wayside.
    This isn't really the case in my experience. 8-bit microcontrollers are still powerful enough for many embedded applications and much cheaper and easier to develop with than 32-bit procs. Obviously the choice is application specific, but I worked on 3 projects last year that used 8-bit uCs.

    also using x86 on embedded devices is so dumb
    so dumb

  • Infidel Heretic Registered User regular
    edited October 2010
    By "embedded devices" we usually mean something other than the mini computers we now cart around in all sorts of gadgets.

    Those devices have grown beyond "embedded devices." There are still a lot of embedded devices at the same level they've been at for years; just because we have phones with Java doesn't mean those all went away.

  • bowen Registered User regular
    edited October 2010
    If I were getting paid, I'd develop my own compiler. Because fuck non-standard compilers.

  • Joe K Registered User regular
    edited October 2010
    bowen wrote: »
    No reason to use C++ on embedded systems usually, too much overhead for what amounts to code fluff. In my experience anyways. It was always toted that C is the de facto language for embedded devices. Unless Sun paid someone off to get Java on there.

    What if your platform has a really shitty compiler, though? Then it might be better to port a VM over so you don't have to deal with it as often. I don't think Sun paid many people to implement the multitude of embedded Java VMs out there; they exist simply because it's a good VM spec without many strings attached (maybe not anymore, now that Oracle is getting sue-happy).

    from what i understand, the licensing issue with the JVM is about the full-blown implementation that loads every class imaginable at startup versus the stripped-down versions that are usable in embedded systems and load only what's needed. The first is free (as in beer), the second is not, and the second is what Oracle is suing Google over.

    watch google create their own runtime, slightly alter the language, and call it something different. i guarantee that google isn't going to be shaken down by Larry and co.

  • Joe K Registered User regular
    edited October 2010
    CmdPrompt wrote: »
    Joe K wrote: »
    The only time in industry that I've been stuck there (on PICs) was a contract for a home security system firm, and that was 10 years ago. Reasonably powerful x86 (and now ARM; there is an incredible shift going on in processor technology due to handhelds) parts are so damn cheap that PICs are falling by the wayside.
    This isn't really the case in my experience. 8-bit microcontrollers are still powerful enough for many embedded applications and much cheaper and easier to develop with than 32-bit procs. Obviously the choice is application specific, but I worked on 3 projects last year that used 8-bit uCs.

    also using x86 on embedded devices is so dumb
    so dumb

    the answer to that is a gigantic "it depends on your application". uCs still have their place, and are still cheaper than the low-end ARMs and x86 boards (think VIA's Pico boards), but what you're doing decides the platform.

  • Joe K Registered User regular
    edited October 2010
    bowen wrote: »
    If I was getting paid I'd develop my own compiler. Because fuck non-standard compilers.

    oh christ, you don't want to develop your own compiler. really, just use gcc with all the ANSI warnings turned on if you want true standards compliance. Intel's C compiler is very optimized for their chipsets, if you're looking for performance.

    but you don't want to write your own compiler... because... ewwww.

  • bowen Registered User regular
    edited October 2010
    Well, I'd only do it if I were getting paid and the platform's embedded compiler was garbage, i.e., not ISO C compliant. Even C89 would be fine.

  • Joe K Registered User regular
    edited October 2010
    Infidel wrote: »
    Embedded devices we usually mean something other than the mini computers we cart around in all sorts of devices now.

    Those devices have grown beyond "embedded devices." There are still a lot of embedded devices that are at the same level as they've been for years. Just because we have phones with java doesn't mean those all went away.

    but i wouldn't call it a growth area, either...

    the right tool for the right job. if all you need is a cheap little uC to do simple tasks as cheaply as possible, then that's what you use.

  • CmdPrompt Registered User regular
    edited October 2010
    It's definitely a growth area in the hobbyist sector, no idea in terms of industry.

  • kedinik Registered User regular
    edited October 2010
    What's a good UML program to cut my teeth on?

    I've got a project that's starting to feel too spaghetti code-ish, so it seems like a good idea to organize everything with diagrams.

  • Joe K Registered User regular
    edited October 2010
    kedinik wrote: »
    What's a good UML program to cut my teeth on?

    I've got a project that's starting to feel too spaghetti code-ish, so it seems like a good idea to organize everything with diagrams.

    ugh, i don't really have a good opinion of UML, so I won't recommend any software for it, although if you want to pay IBM a shitton of money, they'll gladly sell you Rational Rose.

    I'd go for some "mind mapping" software... plenty of open source out there.

  • kedinik Registered User regular
    edited October 2010
    I'm not married to UML, just looking for a good way to visually organize a large-ish system of classes.

  • Alistair Hutton Dr Edinburgh Registered User regular
    edited October 2010
    Joe K wrote: »
    watch google create their own runtime, slightly alter the language and call it something different. i guarantee that google isn't going to be shaken down by Larry and co.

    Google aren't being sued over Java compliance; they're being done over patents on VM tech. They've already implemented their own VM, and even if they were to switch to Not-Java, Oracle would still pursue them over the patents.

    I have a thoughtful and infrequently updated blog about games http://whatithinkaboutwhenithinkaboutgames.wordpress.com/
    I made a game, it has penguins in it. WANG gets you money off.
    Currently Ebaying Nothing at all but I might do in the future.
  • Alistair Hutton Dr Edinburgh Registered User regular
    edited October 2010
    kedinik wrote: »
    I'm not married to UML, just looking for a good way to visually organize a large-ish system of classes.

    I like UMLet - http://www.umlet.com/

    It is highly idiosyncratic but it does what I need it to do in terms of class diagrams and the like. It has a very command line feel to it despite also being drag and drop.

  • Bobble Registered User regular
    edited October 2010
    Alright folks, question for the hive mind here. I've got a few PDF files that I'd like to combine into a single file. Currently, I do this manually with either Adobe 8 Professional, or some software called PDFCreator, depending on which computer I'm using. These PDFs are part of a large report process that I've mostly automated using Excel/Access and VBA.

    Ideally, if possible, I'd like to have some code in an Excel macro (or code from an Access DB, either works) that will take three existing PDFs and simply combine/append them into one. When I've asked others or googled, it seems like this is usually accomplished with some third-party software package, but I'd prefer not to pay for that since it's just automating for convenience.

    Any ideas/assistance would be greatly appreciated :)

  • bowen Registered User regular
    edited October 2010
    Unless you're an expert at PostScript, you'll need an external library to do it for you. It seems a bit beyond the scope of Excel and VBA macros.

  • Tofystedeth veni, veneri, vamoosi Registered User regular
    edited October 2010
    There's a PDF library for Python that can merge and split PDFs and such. I'm not sure if you can call Python scripts from a VB script (you probably can somehow), but at the very least, if you're always combining them the same way, automating it with Python should be pretty simple.

This discussion has been closed.