Well, it's been a while since I touched C. It's all coming back, though -- I never did care for OOP, but I'm starting to see how it could be useful for what I'm trying to do.
At this pace I should finish "Learn Objective-C on the Mac" by the end of the weekend / early next week, and then start trying to tackle Cocos-2d's tutorials.
how can you not care for OOP
are there people who seriously sit around and say "Well we shouldn't use X because it's object oriented"
Monkey Ball Warrior (Seattle, WA), Registered User, edited June 2010
The OOP debate has been going on since before I was born. Just because OOP is winning doesn't mean the debate is over!
I think it might just be that the overhead OOP carries is quickly being overshadowed by the speed of modern computers, and by the fact that how you go about solving a problem (algorithms, design, etc.) has more to do with a program's speed than what language it's written in.
But I can see how people who've been doing procedural programming for a million years might be a bit skeptical.
Monkey Ball Warrior on
"I resent the entire notion of a body as an ante and then raise you a generalized dissatisfaction with physicality itself" -- Tycho
Sometimes I need to open a file, get a piece of data, and then drop it into another file. It'd be a complete waste of time for me, in C++, to write classes for that. It's a complete waste of time to write classes for that in any language, because there's no reason to reuse that code anywhere.
Now if I was writing a file parsing mechanism for all my projects, then yeah, I'd make that a class.
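For what it's worth, the one-off he describes really is only a few lines. A sketch in Python, with hypothetical file names and a made-up CSV layout:

```python
def extract_total(src_path, dst_path):
    # One-off job: pull one field out of a file and append it to
    # another. No class needed; nothing here will ever be reused.
    with open(src_path) as src:
        src.readline()                                # skip the header
        total = src.readline().strip().split(",")[2]  # third column of row 1
    with open(dst_path, "a") as dst:
        dst.write(f"total: {total}\n")
```

If the same parsing ever shows up in a second project, that's the point at which wrapping it up for reuse (as he says) starts paying for itself.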
bowen on
not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
My perspective is that I'm going to need to do that many times over my time using the language, so it's worth spending a little time making it into a reusable object so I never need to write it again.
But it might not be needed depending on what you're doing.
You might be right, but that line of thinking is also what creates those huge gargantuan mega-libraries that do everything you'd ever need it to do.
Just like I don't need an 8-ton tractor to dig a hole for me, I'll just have to deal with digging the hole every time rather than buying an 8-ton tractor when I need a new hole. That's how I approach most of my "shit I need a little program to do X for me."
"shit I need a little program to do x... but a little different for me this time."
Though not doing it on things that need it is just as silly. I don't know how long I waited to make a file parsing system for a file format we used all the god damned time; I think it took me a year to finally hunker down and do it.
bowen on
For solo projects, I write the code quickly and briefly. As pieces of code surface that are obvious cases for OOP or other reuse, I pull them out and give them a nicer framework.
This saves a lot of unnecessary work writing a bunch of classes that will never end up being usable again. The stuff that IS extracted and distilled is guaranteed to be useful again because that is the only reason I did the work on it.
For example, there are about 8 classes that I have for my PHP projects that I copy to every single site now, which evolved from single-file work on earlier sites. They're robust, and I can rapidly develop a site in a flat file without reinventing the wheel: DB, session management, authentication, etc., just the way I like it and set up in a quick fashion.
Not much. I got distracted by a few shitty tech support jobs and my Programming degree turning into a glorified non-transferrable MOUS certification. Trying to get back into the swing of things after 8 years, and, er, yeah. Slow going.
Never did like OOP though, it always seemed like it was over-complicating matters.
I've found the usefulness of OOP is directly proportional to the size of the project. Need to open a file and dick around with its contents? OOP is overkill. Writing an RPG? OOP is a lifesaver.
This is pretty much it.
Except netcode. That's the one fucking exception where you should always have OOP. Fuck netcode.
bowen on
Oh god yes. Between connection setup and teardown, packet / stream conversion, processing of incoming messages, and using the async APIs (if you need more than a handful of connections, or just don't want to dedicate a thread to each), you really need a strong OO structure. I'm actually just about to rewrite mine from scratch, again.
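None of this is anyone's actual netcode, but the shape being described, one object per connection that owns the stream-to-message conversion, can be sketched with Python's asyncio.Protocol. The one-byte length-prefix framing here is a made-up toy protocol:

```python
import asyncio

class LengthPrefixedConnection(asyncio.Protocol):
    """One object per connection: owns the transport, buffers the
    raw byte stream, and emits only complete messages."""

    def __init__(self, on_message):
        self.on_message = on_message  # callback for each complete message
        self.buffer = b""
        self.transport = None

    def connection_made(self, transport):
        self.transport = transport    # connection setup

    def data_received(self, data):
        # Stream-to-message conversion: each message is a one-byte
        # length prefix followed by that many bytes of payload.
        self.buffer += data
        while self.buffer and len(self.buffer) > self.buffer[0]:
            n = self.buffer[0]
            msg, self.buffer = self.buffer[1:1 + n], self.buffer[1 + n:]
            self.on_message(msg)

    def connection_lost(self, exc):
        self.transport = None         # connection teardown
```

The event loop calls these methods for you (via loop.create_connection with a factory), which is exactly the "don't dedicate a thread to each connection" part.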
So anyone have any opinions on SproutCore, or used it in any significant capacity or know anyone who has? I'm trying to figure out whether it's right for a project, and I don't want to get weeks in and find some crippling limitation.
I guess it's what's used for MobileMe, but I don't really want to get roped into a service I won't use at all, ever, and I can't seem to find anything else that uses it for anything complicated and free (so I can try it).
This will be my 5th, and the third time I've completely changed the overall structure.
My first one was a Berkeley sockets implementation with very tight integration of the datapath into socket management; the current one is asynchronous and completion-port based (no Linux version yet) with a more layered structure, partially decoupling the protocol layer from the actual I/O; and I've been playing around with a highly decoupled datapath and connection-management structure for the new one.
I started getting into it. But there's not really much documentation, or community. If you want to do production quality work with it, you'll probably end up contributing to the source code.
Basically, it's a non-flash alternative to Flex.
Yeah, I signed up for the MobileMe 60-day. It's pretty classy, but the photo app seems like the main event and the others have been passed by to some degree (though they are pretty, and quite functional). I can see how using Sproutcore for anything real complicated would be breaking new ground.
So in python... if I've got a float number, how do I change the number of numbers after a decimal point? I can't seem to find any information about this.
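In case anyone still needs this: a float doesn't carry a display precision with it, so there are two different questions hiding in there. You either choose the digits when formatting to a string, or round the value itself:

```python
x = 3.14159

# Choose digits at display time with a format spec:
two_places = "%.2f" % x          # "3.14"
three_places = format(x, ".3f")  # "3.142"

# round() changes the value, not the display. The result is still a
# binary float, so it's for arithmetic, not for guaranteed digits:
y = round(x, 2)                  # 3.14
```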
I'm fairly new to coding, or at least the more advanced portions, and ever since I began reading about C++ and other coding I've been confused as to the actual reason to use pointers, heaping, and stacking. I was in a Pre-AP (blegh blegh) Comp Sci course this past year and none of those three things were brought up at all in the class, so I was wondering if I could get a little more direction as to, well, the point in using those? Any feedback is greatly appreciated!
Also, pointers are a programming primitive, while heaps and stacks are data structures. You can read up on those more on Wikipedia or the link in the OP.
I can't say I've ever heard the terms "heaping" and "stacking", per se, but the terms you want to look up are "heap" and "stack". Both refer to places in memory (what you are concerned with), as well as data structures, which might be confusing.
"The Stack", as it is commonly referred to, is the place in memory where local variables go in C++ (as in most other languages). It is organized as a stack (in the data structure sense), hence the name. When the procedure in which the variable is declared exits, the variable disappears from memory. Objects declared as plain local variables (value semantics, i.e. automatic storage) are allocated on the stack, like so:
std::string myString("foobar");
The "heap" is the area of memory used for dynamic memory allocation. Basically, every time you use the keyword "new" in C++ (or the "malloc" function in C), some memory is set aside on the heap. This memory is not freed unless you explicitly free it (using "delete"). To the best of my knowledge, this heap doesn't have much to do with a heap data structure.
In C++, choosing whether to put a variable on the heap or stack is actually not a simple task, and a description of all the nuances is far beyond what I imagine to be your experience level. Basically, you should use stack variables as much as possible. But sometimes there's no way to get around having to use a "new" somewhere, and in that case it is very important to make sure the object is "deleted" when it is no longer needed. (And sometimes pointers to such an object must be shared by other objects, at which point determining when the heap object is no longer needed becomes a serious problem.)
And Java's automatic garbage collection is the reason you never heard about the heap and the stack in your Java class. They don't tend to cover those concepts in Java courses. I personally think that's a bad call, since there are still lots of ways to leak memory in Java apps if you aren't careful, but introductory courses never cover everything you need to know.
Ah, that's why I was confused. Verbing the terms made me think of the data structures for whatever reason.
Though it's worth noting that "the stack" is still a stack.
Yeah, edited that factoid into my post. Though I always found it a bit of a stretch tbh, when doing assembly programming. It didn't feel like a stack so much as an array that had to be moved back and forth through. In a procedure you end up doing a ton of random accessing of things on the stack, such as arguments, without actually popping or pushing anything. And then when it comes time, you tend to "pop" all the memory used by the procedure in one go.
LoneIgadzra on
They don't tend to cover those concepts in Java courses. I personally think that's a bad call
The garbage collector is nice, but not learning these sorts of things makes Java feel like a black box. Is Java just making educated guesses about how to handle memory?
I'm in my third quarter of programming, and second quarter of Java. We've gone over stacks and heaps as data structures, but everything I know about memory management I found randomly Wikipedia surfing, etc.
They say one shouldn't use global/class-wide variables when local ones will suffice. If I'm understanding it right, in languages that don't hold your hand so much, poor "scoping" (is that the right word?) can quickly lead to horrific memory leaks and buffer overruns and stuff because it gets harder to keep track of memory management details? Or is it just a performance thing?
I'm sure upper level courses will go into some of this stuff. I think there might be a class just on different kinds of languages.
Monkey Ball Warrior on
It's more of a code organization thing. Having lots of global or widely scoped variables makes it much more difficult to figure out what's going on in a program, since it's harder to see and understand exactly how the variables are changing.
That and if you don't keep the scope limited to where it is needed, you're more likely to run into collisions and those can cause all sorts of headaches.
"Surely no one else would have used the variable wang!"
It also makes things much harder to unit test, since nothing is self-contained.
Smug Duckling on
I certainly see the benefits of unit testing....
My latest homework seemed mostly an exercise in optimization; getting the code to simply function took very little time. After I had it passing a given JUnit test, we were supposed to make two given optimizations and told to try to do other things (such as not calculating things more than once if possible, and narrowing the searches). So I'd think about it for a while, change the code a little bit, say putting things in different orders, or throwing a bunch of code in an IF statement. Small changes.
I wrote a little benchmark class, and I'd try it with these changes and, say, a run that took 8 mins to complete was now taking 4.5 mins. That was pretty fun, actually, but half the time I'd do these things and it wouldn't pass the given JUnit test anymore, and I'd know I screwed up somehow.
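A benchmark class like that doesn't need to be fancy. A rough sketch of the same idea in Python (the Benchmark name and runs parameter are made up here, not his code):

```python
import time

class Benchmark:
    """Time a callable over several runs and report the best one,
    so a one-off system hiccup doesn't skew a comparison."""

    def __init__(self, runs=3):
        self.runs = runs

    def measure(self, fn, *args):
        best = float("inf")
        for _ in range(self.runs):
            start = time.perf_counter()
            fn(*args)
            best = min(best, time.perf_counter() - start)
        return best
```

Measuring before and after a tweak gives the kind of 8-minutes-down-to-4.5 comparison described above.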
Monkey Ball Warrior on
They don't tend to cover those concepts in Java courses. I personally think that's a bad call
The garbage collector is nice, but not learning these sorts of things makes Java feel like a black box. Is Java just making educated guesses about how to handle memory?
No, 'cause if it guessed, it would occasionally free objects that are still in use. Garbage collector algorithms work to make sure that objects still in use are never freed. Wikipedia has a decent overview of GC algorithms.
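The "no guessing" part comes down to reachability: a collector only frees what can no longer be reached from the program's roots. A toy mark phase over a made-up object graph (not any real collector):

```python
def reachable(roots, edges):
    """Mark phase of a toy mark-and-sweep collector: walk the object
    graph from the roots; anything never marked is garbage."""
    marked, stack = set(), list(roots)
    while stack:
        obj = stack.pop()
        if obj not in marked:
            marked.add(obj)
            stack.extend(edges.get(obj, ()))
    return marked

# a -> b -> c are reachable from the root; d exists but nothing
# refers to it, so it can be freed with certainty, not by guesswork.
edges = {"a": ["b"], "b": ["c"], "d": []}
live = reachable(["a"], edges)
```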
I'm in my third quarter of programming, and second quarter of Java. We've gone over stacks and heaps as data structures, but everything I know memory management I found randomly wikipedia surfing, etc.
They say one shouldn't use global/class-wide variables when local ones will suffice. If I'm understanding it right, in languages that don't hold your hand so much, poor "scoping" (is that the right word?) can quickly lead to horrific memory leaks and buffer overruns and stuff because it gets harder to keep track of memory management details? Or is it just a performance thing?
I'm sure upper level courses will go into some of this stuff. I think there might be a class just on different kinds of languages.
Not sure where you go to school, but a lot of places will offer a course that uses C to help explain these sorts of things, because it's still high-level but close enough to hardware that you can understand why these abstractions exist in the first place. (I TA'ed such a course for 1.5 years; we always used to say that this is the class that taught you that computers don't run on magic.)
For instance, in Java I'm not sure if the concept of globals even exists because everything is inside of an object. But in a C program, I can do things like this:
// file1.c
#include <stdio.h>

extern int foo;
extern void bar(void);

int main() {
    bar();
    printf("main: %d\n", foo);
    return 0;
}

------------------------

// file2.c
#include <stdio.h>

int foo = 5;

void bar(void) {
    int foo = 4;
    printf("bar: %d\n", foo);
}

-----------

> gcc -c file1.c
> gcc -c file2.c
> gcc -o test file1.o file2.o
> ./test
bar: 4
main: 5
>
This code is riddled with examples of how to abuse scoping and globals. Let's take a look:
In file1.c I use the extern keyword to tell the compiler that "hey, I swear that this variable is defined somewhere else, just take my word for it and keep a look out when you link okay?". I say two things are externed: an integer named foo and a function named bar that doesn't return anything. In the main function I call bar, print the contents of foo, and exit.
In file2.c I declare a variable with global scope (it's global since it's declared outside of any function) and set its value to 5. I declare a function named bar that then declares its own variable called foo. Is this a compilation error? Nope! The local foo shadows the global one and is visible within this function only. (Note: shadowing like this is also legal in Java, but in C there's no `this` keyword to bail you out and let you get at the other foo.)
I then compile both files. You could do this in one gcc command, but I wanted to make it clear that I could compile file1.c even though neither foo nor bar is defined within that file. The -c flag tells gcc that I only want to compile these files, and that I will link later. (In C, a file goes through several stages before an executable is generated. Roughly, the steps are: 1) the C preprocessor, where things like #include and #define statements are resolved; 2) compilation, where object code is generated; and 3) linking, where object files are linked together, any remaining references are resolved, and an executable is generated.) With the object files generated, I link them to form the executable, and now the foo declared at global scope in file2.c is truly global across both files.
I then execute the program, where you can see the effect of scoping and globals.
Nintendo can easily open the Wii to more independent developers to match anything Apple does, and Microsoft just barely made it into the console market with tons of money and a development platform that was already used by quite a few more people than what Apple has.
How is Apple going to disrupt the market? :?
While this is theoretically true, this is Nintendo we're talking about. Try to write another sentence involving the words "Nintendo", "open" and "independent developers" while keeping a straight face.
Independent developers that were very open with their beliefs spoke to Nintendo today about the Wii.
what the fuck kind of software do you build?
Sometimes there's little need for it.
Ok now seriously. Who the fuck are you?
He's not a developer, that happens.
Bespin uses Sproutcore, afaik
Need to get a basic twitter posting functionality in an iPhone app, now that basic authentication is going down for good.
It's dangerous to go alone, take this.
The older versions of this thread have a good short tutorial on pointers:
http://forums.penny-arcade.com/showthread.php?p=3431132#post3431132
Edit: Also, that thread you linked made a lot more sense than what I had been imagining pointers to do, thanks!