EDIT: related, when both operands are falsy, `or` is defined to return not `False` but `b`. This sort of makes sense for `or`, but by symmetry, `and` returns the `a` value in that case. Then you get weird situations like this:
>>> x = []
>>> y = []
>>> z = x and y # Hrm, what is z referring to?
>>> x.append(1)
>>> y.append(2)
>>> z
[1]
The definition of `or` in Python is consistent but not what I would call sensible.
I don't think that is weird, but I also already knew that 'and' and 'or' are short-circuit operators in Python. 'z' couldn't have been 'y' since the latter was never evaluated. In the case of 'or' then 'y' would've been chosen.
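The mirror-image case with `or` can be checked the same way; a quick interpreter sketch:

```python
x = []
y = []
z = x or y      # x is falsy, so `or` evaluates and returns y itself
x.append(1)
y.append(2)
print(z)        # [2] -- z refers to the same list as y
print(z is y)   # True
```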
In practice, the only place I make use of 'or' like that is in initializing variables. Due to how Python treats mutable default arguments (they are evaluated once, when the function is defined, not each time it is called), you'll see this pattern used:
def foo(a, b=None):
    b = b or []
If you used "b=[]" in the argument list, the state of 'b' would persist across all invocations of 'foo'.
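A minimal sketch of both the pitfall and the pattern (the function names here are made up for illustration). One caveat worth knowing: `b = b or []` also replaces an explicitly passed empty list, because an empty list is falsy; `if b is None` is the stricter idiom:

```python
def broken(item, acc=[]):
    # The default list is created once, at def time, and shared by all calls.
    acc.append(item)
    return acc

def with_or(item, acc=None):
    acc = acc or []          # replaces None *and* any other falsy acc
    acc.append(item)
    return acc

def with_is_none(item, acc=None):
    if acc is None:          # replaces only None
        acc = []
    acc.append(item)
    return acc

print(broken(1))             # [1]
print(broken(2))             # [1, 2]  -- state persisted across calls
print(with_or(1))            # [1]
print(with_or(2))            # [2]     -- fresh list each call

shared = []
with_or(3, shared)           # shared is empty (falsy), so it is NOT mutated
print(shared)                # []
with_is_none(3, shared)      # shared IS mutated, as the caller expects
print(shared)                # [3]
```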
It isn't the fact that `or` in Python is short-circuiting --- in most languages && and || are short-circuiting operators. It's the fact that, in Python, `or` (and `and`) produce non-boolean values which is very non-standard, especially from a types perspective as I pointed out above. The reason why that's the case is to have convenient syntax for "defaulting" values like what you described above. However, rather than lumping that into a non-standard definition of `or`, I'd rather see that be placed into some other operator (or function).
non-standard? it's pretty standard behavior in dynamically typed languages, at least among the ones I just checked, with only php not following suit
I wish that someway, somehow, that I could save every one of us
gavindel
Got an offer. Such salary, much benefit package, wow.
Make sure you graduate, instead of accepting a job that requires you to drop out.
Yes, it's for next summer. I didn't spend as much money as I have on college just to forget the last step. Besides, if you don't finish, you lose the ability to use a masters as a midcareer boost. (No idea if that actually works...)
(Next up: whining in this thread about senior design.)
Angels, innovations, and the hubris of tiny things: my book now free on Royal Road! Seraphim
That is awesome. I have heard rumors of companies making interns offers to get them to drop out of school. Cynically, I think it is to make it harder for the person to leave the company.
I graduated with my BS in CompSci seven years ago, and it would require a pretty significant pay bump to offset the loss in pay while I am in school (presuming a company paid for the Masters).
In theory, if I were to go back to school, it would be for a Masters in something like Business or Engineering over CompSci.
That's fair enough. My types bias comes through as usual. ^_^ Although ultimately, the "standardness" of it does not bug me so much as the (in my view) unnecessary trade-off of regularity for convenience.
@TwitchTV, @Youtube: master-level zerg ladder/customs, commentary, and random miscellany.
Also just to add, I should not need to go look up the fine print about how precisely an or statement is implemented in the language in order to make sense of a line of code: knowing what or means in general should be enough. The line "product(a, b or [None])" requires just that of me.
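For what it's worth, assuming the `product` in that line is `itertools.product` (the original line doesn't say), the `or` there substitutes a one-element default when `b` is empty, so the cross product isn't silently empty:

```python
from itertools import product

a = [1, 2]

b = []                                  # falsy, so `b or [None]` yields [None]
print(list(product(a, b)))              # [] -- an empty b makes the product empty
print(list(product(a, b or [None])))    # [(1, None), (2, None)]

b = ['x']                               # truthy, so `b or [None]` is just b
print(list(product(a, b or [None])))    # [(1, 'x'), (2, 'x')]
```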
You don't need to know exactly how it is implemented. It's described here in a truth table with short notes.
Python's implementation of logical OR is not the same as the standard meaning of boolean logical OR, because Python does not stop to check that both operands resolve to boolean values. If you stick to boolean values, the truth tables look the same:
(f or t) returns t, (t or t) returns t, (t or f) returns t, (f or f) returns f
However, since Python doesn't give a crap, the truth tables get ridonkulous:
(false or hat) returns hat
(true or hat) returns true
Why the hell is a logical or operator returning hat? It doesn't make sense from a mathematical point of view, which means you need to fall back on the language rules, which are implementation details, which is the link you posted above.
If I look at a boolean operator and I have to double check documentation to see what's going on or make sense of it, it's not very readable.
The line (false or hat) is a violation of the principle of least surprise.
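The complaint is easy to reproduce at the interpreter; here 'hat' is just a (truthy) string:

```python
hat = "hat"
print(False or hat)    # hat   -- the first operand is falsy, so the second comes back as-is
print(True or hat)     # True  -- short-circuits on the first truthy operand
print(False and hat)   # False -- short-circuits on the first falsy operand
print(hat and False)   # False -- hat is truthy, so `and` returns the second operand
print(hat or False)    # hat   -- hat is truthy, so `or` returns it unchanged
```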
gavindel
The people I've seen get the midcareer masters did it while working. Some companies have tuition reimbursement. At my school, a good chunk of the masters students graduated into working for Lockheed or Boeing, turned around, and enrolled to get a masters on the company bill.
And, yes, a midcareer masters is the "traditional" way to make the jump into management, at least according to my family's lore. Flip an MBA or something and leap over to the suits. Course, this is the same family that pushed me hard to go to law school in 2008, so I'm not sure I actually believe it...
Masters-while-working is certainly not necessary to step into a management role in software development. Most places have their own sorts of management development programs (if anything) to train fresh managers, no extra degree required.
Are there any good studies of language performance in the C++/C#/Java realm that look at usage of std:: in C++ versus "raw" C/C++ usage?
My basic question, does liberal use of std:: negate the performance benefits of C++ over C# and Java?
No studies that I know of. If you ever find one, I'd be interested in reading it!
To be fair to the standard C++ library, it was never designed to be the fastest library out there, so in all honesty, I would expect raw optimised C/C++ usage to be faster. It might end up being 20-100% faster, since an optimised version can be more cut down and only deal with specific use cases, while the standard library is much more generic. For example, there is an interesting discussion on std::list::size() and its computational complexity. So if you go "raw", you'll obviously be able to optimise for your use case and be faster than the standard library, because you'll know which operations are the bottlenecks and which aren't.
However, through personal experience (anecdotes lol), the standard library is typically fast enough, and reliable enough, that performance is not an issue (provided I choose the right container, obviously). In this day and age, I wouldn't worry about liberal use of the STL until you discover that it is an actual bottleneck.
Maybe it's just because Python is the first language I used a lot, but its logical operator behaviour is perfectly intuitive to me.
Pokémon X | 3DS Friend Code: 0490-4897-7688
Friend Safari: Fighting - Machoke, Pancham, Riolu | In game name: Jessica
Official Weather Gym Leader of the G+T Pokémon League. @me to try for the Climate Badge!
It seems like some of the concerns cited in that article may become obsolete as consoles coalesce toward an x86 architecture where all their essential components are either identical or very similar
I totally get how STL could ruin your life if you had to build something for the PS3 AND the Xbox360... but the PS4 and Xbone are practically the same computer
I'm sure the compilers are still different, but it should be much more pleasant now
As far as "computers never having enough RAM and CPU"... makes me wonder when that was written... there are entire classes of games that can and do live without 8 gigs of RAM...
Well, even though it's different platforms, the implementations of the standard library will be different. VC++ uses Dinkumware's standard library, and Sony uses Clang, although it's not clear to me if the Sony SDK leverages any of libc++. Both libraries are at various states of compatibility with the C++11 standard, which introduces different performance and compatibility concerns.
That being said, the EASTL raises some interesting points, some of which are addressed by C++11 (move semantics, TR1 containers) and some that are too specialized to be solved by a standard library (e.g., intrusive containers).
GnomeTank
MS uses a heavily modified version of Dinkumware, and I believe the very latest versions of VC++ (past 12) are mostly new Microsoft extensions.
We are in an odd place right now with C++ though, where in many ways it feels like Microsoft is actually ahead of Apple and GNU in supporting a lot of C++'s edge/esoteric features. No compiler or library set currently supports the entire C++11 (formerly C++0x) standard.
LLVM's libc++ claims 100% C++11 support on Apple platforms, and Clang looks to be C++11-complete based on this table.
Rollers are red, chargers are blue....omae wa mou shindeiru
GnomeTank
That's new. Up until recently LLVM was lagging way behind on a lot of C++11 features. Good to see them getting on the ball with that.
The weirdest logical operator issue I ever ran into was in Perl. Technically it's an operator precedence issue, but it caused weird logical problems until I caught it. Perl has the traditional && and || as well as 'and' and 'or' as syntactic sugar. On a platform I used to do a lot of work on, the traditional and syntactic-sugar versions both got used in the same truth check. The syntactic-sugar versions have lower precedence, though: 'and' has lower precedence than ||, while && has higher precedence than ||. The difference in precedence between && and 'and' caused some hard-to-track-down issues.
Apothe0sis, what version control system do you currently use?
Because, after having used Git for the past 2 years, I would jump ship back to Subversion in a heartbeat. http://www.thegeekstuff.com/2011/04/svn-command-examples/
I try to commit at least once a day.
For major changes, this means splitting up my work into smaller chunks, where sometimes those small chunks aren't even referenced yet, so I can safely commit them without breaking the build or unit tests.
I have tooled around with subversion some time ago to store changes to config files. It was... semi-successful.
But my question is more around... what do you commit? Only working code? Code that sort of works / has a bunch of extra debugging stuff in it? Do you only commit things that you want to include in the final product? If so, is there a strategy for local version control so that you can roll back to previous states of things you are currently working on but haven't committed yet?
Ha. My class this semester uses Deitel and Deitel C How To Program 7th ed. I wonder what the chances are I can get by with using the 2nd ed. of it sitting on my bookshelf.
Ruby has the same thing, stolen from perl specifically. It's made pretty clear in every ruby tutorial ever that you shouldn't use 'and' and 'or' unless you understand why they're different.
Probably 100%.
not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
I decided to try out the Amazon book rental for $25, just to avoid the hassle of trying to match up page numbers (which will almost certainly be different), differences in exercises in the book, etc. I had a teacher for a different class teach out of a different edition than I had (mine was the first semester using the new edition, and he kept teaching out of the previous edition anyway), and it was a pain in the ass at times.
I am curious to see how this class turns out. It's described as focusing on C++, but that book is very much C oriented and just occasionally uses C++ syntax or touches on "you can also do it like this in C++", if I remember right. They have a whole different book, C++: How To Program, for focusing on C++.
Probably could substitute used toilet paper as well while you're at it!
Hah not a fan of Deitel and Deitel eh?
Not a fan of any "learn C/C++" book on the market. ^_^
meh, the book will only be necessary for potential homework assigned from it and for overly specific homework/test questions quoted directly from it.
I don't touch C or C++ often, but I occasionally break it out for something small, or have to at least read someone else's code for reference. I went through the entire 2nd ed. of the book 10+ years ago, and this is just a basic intro-to-algorithms class: basic explanations of linked lists, stacks, queues, etc. It is also supposed to nearly exactly mirror the class I took last semester, except last semester was Java (this is being taken purely for transfer credit reasons).
@Ethea I will keep those books in mind for the future. I'd love to do some side projects with something lower level, I just don't have any projects I want to work on where there's any reason to not use Python.
Can a man ever be truly said to have learned C++?
C/C++ are kind of like Eldritch knowledge. On the surface it looks interesting, benign, though you're not sure why you would ever have reason to interact with it. Then, suddenly, you find a problem has arisen, and none of your normal channels seem to work. Slowly you realize that C++ is your only answer, and so you pick it up.
The weight of it is immediately apparent, before you even open any of the files. The sheer number of pages and chapters alone is daunting. But the real horror only happens when you actually open it.
Even at first glance you can feel your sanity completely slipping away, for when you gaze into C++ you gaze into chaos. You feel yourself in a realm where all data types are pointers, where pointers are simply numbers in a very real, and yet unreal sense. Memory is directly accessible, array[index] is functionally equivalent to index[array], and what's more, the second is legal and will compile.
You lean back in your chair, perhaps you close C++... for now.
But you eventually reopen it, knowing that this dread abyss is the only real solution to your problem.
Some will find what they need from C++ as quickly as they can, perform their dark work, and get out before too much of their humanity is stripped away. Some people, gazing into that dark void, will see the void gaze back, and they will sink only deeper, their mind completely slipping away into the depths of insanity.
But some people...
...some people become systems programmers.
Those poor, unfortunate few
They are the void which gazes back.
gavindel
Logically, that would make C the clustered remnants of the even older gods whom the Old Ones vanquished to birth the current universe as a form of mid-afternoon snack.
C programmers: we were worshiping the unspeakable unknowable abyss of human incomprehension before it was cool.
Sorry but C is the light of god showing us the true and proper way, while C++ is the voice in the darkness calling for us to just try it once, you know you will love it.
And if PHP is not doing it then it must be right.
I made a game, it has penguins in it. It's pay what you like on Gumroad.
Currently Ebaying Nothing at all but I might do in the future.
Some good stuff here.
Interesssttinnnnggggg.
I'm going to have to read this on the plane ride(s) next week.
But for now, it's back to C and bootloaders.
Yaaaay embedded development!
@hsu
Very babby question
Thank you to everyone who has answered so far.
It handles setting up a local repo that you can commit to a la git, then commit the whole thing to svn when you need to sync with others
Here is a snippet of two emails I got today: