Holy shit, that's the same debate that's been drifting around bar associations and law schools in recent years. Particularly since law is so much harder to succeed in these days than in the glory days of the '80s, many schools have been discussing a 4th-year apprenticeship program to match grads with local practitioners so they can pick up the skills they need.
Maybe this is just a problem for professional careers in general? You know, the "entry level" jobs that require 3-4 years of experience...
Doctors do it, engineers do it, and a lot of trades like plumbing and electrical work do it. High-skill work requires both theory and practice in industry, but tech companies are struggling to find qualified candidates because most candidates graduating university have never used Git to maintain a project.
So they resort to glorified IQ tests as their bar, because there are no other indicators that someone knows how to program other than "have you studied how to invert a binary tree"?
Edit: There is also a lot of cargo culting that happens in the tech industry, let's not forget that part. "Google does it, so we should too!"
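For anyone who hasn't actually met the dreaded question: "inverting" a binary tree is about a dozen lines of code. A quick sketch in Python (names made up, obviously):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def invert(root):
    # Recursively swap every node's left and right subtrees.
    if root is None:
        return None
    root.left, root.right = invert(root.right), invert(root.left)
    return root
```

That's the entire interview question. Whether it predicts anything about someone's ability to ship software is another matter.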
Yeah, that's one of the comparisons they've made in law schools: looking at how doctors go through a mandatory residency period. But then again, programming doesn't have any kind of accreditation body that gatekeeps entry to the field, so it's even crazier that I've seen job postings demanding a PhD in CS for a crappy programming job. (Ha, good luck with that.)
Speaking of stupid IQ tests, SMBC made a joke about them today:
https://www.smbc-comics.com/comic/hiring
The number of times I've had to say "We're not Google/Amazon/Netflix/Twitter/<insert huge tech company here> and don't have their resources" in a meeting is entirely too damn high.
I mean, this isn't an issue exclusive to the computer science domain. I received my degree in nuclear engineering, and it's not like I learned everything I needed to hit the ground running in that field. You were just expected to pick things up on the business side; what you knew was "Okay, 1 rem is too much exposure outside this container." All the degree meant was "Hey, this guy knows something about how nuclear reactors work and can learn things, I guess?" Maybe we're expecting too much vocational training from our universities.
This is what I think/wonder about when we're teaching our students code versioning. Make no mistake: we teach them code versioning by talking about it for a couple hours in lecture, then giving them a handout with Git instructions in lab, having them "practice" using Git for an hour in lab, and then making them use Git for their projects.
That's it. Now they've "learned" code versioning. They then promptly fuck their Git repos up repeatedly over the course of the term, and we're forced to stare at their Git logs during marking to see if they've been using it right, and I'm just like... this is a huge fucking waste of time - way more than the 3 hours spent on "teaching" Git - it's distracting them from learning the other software engineering stuff they're supposed to learn in this course, and they come out of it still not knowing shit. I can't imagine any employer could do a crappier job of "teaching" students Git in 3 hours, assuming they're even using Git, which they might not be at all, in which case hahahahahahahaha.
Like, I don't know what it's like in Law and Nuclear Engineering, but in Computer Science, we do kinda actually respond to employer feedback. It's just incredibly feeble, ineffective, and probably comes at the cost of the things that we're actually capable of effectively teaching students. We don't respond to the demand for unit testing experience by creating a whole new course on unit testing - not that any student would take that course anyways - we just jam in a couple hours of unit testing into CS1 and call it a day; does that really satisfy your demand for computer programmers who know (how) to unit test?
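And honestly, the whole "couple hours of unit testing jammed into CS1" usually amounts to something like this (function and test names are invented for illustration):

```python
import unittest

def celsius_to_fahrenheit(c):
    """Convert a temperature from Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

class TestConversion(unittest.TestCase):
    # Each test method checks one known input/output pair.
    def test_freezing_point(self):
        self.assertEqual(celsius_to_fahrenheit(0), 32)

    def test_boiling_point(self):
        self.assertEqual(celsius_to_fahrenheit(100), 212)

# Typically run with: python -m unittest <module>
```

That mechanical part really is all a few lecture hours can cover - none of the judgment about what's worth testing, which is the part employers actually want.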
If I were teaching a comp sci course, it'd be a course where each set of students has a different responsibility completely separated from the others. And each group has to work on it all concurrently, without knowing if any of the other teams have even written a line of code. It must be unit tested, of course. And they have to use git. And the teacher is constantly changing the requirements based on stupid questions that all of the students ask in lecture.
Every student would hate me, and they would all know exactly how it is to work in industry.
Hello yes I had that course. Our lecturer got his jollies by pretending to be the most annoying client ever.
We need to bring back apprenticeship in my opinion. My dad was an electrical engineer and he had to do two years of apprenticeship before he was let near anything important without supervision.
There is absolutely nothing that prevents employers from doing this.
Except themselves.
My offshore development group actually does this!
They assign "shadow" resources to projects - people who don't directly interface with the customer, but pair program, study, sit in on scrums and planning sessions, and truly learn what it means to work on a team (and learn the language)... and it can be up to a year before they're trusted as a standalone resource.
There are Software Engineering degrees available from (some) universities in New Zealand. They still have the comp-sci grounding but the higher level papers are all focused around more industry-relevant stuff (lots of group projects, courses about Agile/processes, etc).
IMO this is a step in the right direction because comp sci should be more theoretical and less focused on what will get you a job. Universities shouldn't be all about teaching vocational skills. They're academic institutions. Filling CS degree programs up with a bunch of <whatever industry wants at the moment> comes at the cost of teaching things that students aren't able to learn outside of a university.
That said, I worked for a manager who refused to read a CV that didn't have a CS degree on it. So despite there being software engineering qualifications available from universities, some employers have a shitty elitist attitude and treat them like second-class citizens.
I just want to say that I learned Git by the seat of my pants and backwards. When I moved my codebase to Linux, I looked for a Git client and decided it was just better to use the command line. Turns out it was easier to do that, since the procedure was pretty much the same: status -> add -> commit -> push (or pull if updating from the Windows copy). Now that I have something very valuable in my repo, I'll be damned if I screw up all my hard work by doing something funky with it. Whoops! I screwed up the function. I'll just undo the changes. Easy. Branch and rebase? F that noise. I have heard too many horror stories where a rebase utterly pooched a repo. I'm happy to work on something, have it work, and push with a note. I am also aware of what dragons lurk, which is good enough, I think.
This is legitimately the point of internships.
You're supposed to follow around senior developers and learn from them.
Internships are supposed to transition to job offers, in an ideal world.
What people do is use internships as a way to get free labor and people put up with it because ???
For the boss: they like having someone bring in their coffee every morning?
For the intern: Student loans/parents are paying my expenses anyway....
If it was unpaid (which I think is rare? at least around here), then the big benefit is that it counts as work experience. When I finished college in 2010, no one would hire me (also a really shitty time, recession 'n all) because I had no work experience. That whole catch-22: you can't get a job without experience, but you need a job to get experience...
I have a paid high school intern. She is completely useless to the business. I don't give her busywork or make her get coffee, it's mostly asking her to do minor bug fixes or read through the code I'm writing and tell me how it works. Our business has a pretty active Pro Bono community though, so my bosses went for it once they understood that the intern isn't here to be useful to us, she's here to learn.
If nothing else, I think I've taught her that she doesn't want to be a programmer. Which is a bummer, but it's better for her to learn now that she doesn't have a passion for it. I think instead of getting an intern next year I'm going to go back to volunteer teaching. Or start a coding club or something, I dunno. It sucks to invest so much time into someone only to find out that they are going to major in something else.
As a PhD student, I'm looking at "internships" right now
But mostly that seems to be because it's a way to signal to companies: "Hey, our lab wants to work with you on some research, and you can essentially get someone to do research with you while only paying an intern-level salary (which is also partially covered by this grant type for connecting academics and companies)." It's a little shitty how it's basically "We'll gladly let you pay less than what we're worth because we don't have the resources to do this on our own," but also, hey, we get a lot out of it as well, and it's not unlikely that it leads to properly-paying positions too.
Hi everyone, I decided to pick up programming as a hobby. I tried to learn it in school a long time ago (more than a decade) but never really got anywhere. I'm using this website https://www.learncpp.com/cpp-tutorial/1-4a-a-first-look-at-function-parameters/ now, since it was recommended online. One thing I would like to ask is: how much mathematical knowledge do I need at later points? I am not a very smart person, so I think I could also try to pick up some mathematics now if I'll really need it down the road.
Most programming jobs require almost no math, it's all just logic.
The math comes in when you're trying to estimate time and space efficiency or for things like statistics or machine learning. None of those would apply to being a web developer making CRUD applications, which is probably 95% of normal programming jobs.
Edit: Forgot - game engine programming is surprisingly math-heavy. Ever tried to calculate the distance between two objects in 3D space? Or whether a line can be drawn between two points in 3D space without hitting something (firing a gun at someone and checking whether the bullet connects)?
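Both of those game-math examples are only a few lines each, for what it's worth. A rough sketch (all names invented, and real engines do this with proper vector math rather than sampling):

```python
import math

def distance_3d(a, b):
    # Straight-line (Euclidean) distance between two points in 3D.
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def line_of_sight(origin, target, obstacles, radius):
    # Crude hitscan: sample points along the segment from origin to
    # target, and check whether any obstacle (modelled as a sphere of
    # the given radius) gets in the way.
    steps = 100
    for i in range(steps + 1):
        t = i / steps
        point = tuple(o + t * (g - o) for o, g in zip(origin, target))
        if any(distance_3d(point, c) < radius for c in obstacles):
            return False  # the bullet would hit something first
    return True  # clear shot
```

A real engine would use an analytic ray-sphere intersection instead of sampling, but the underlying math is exactly this kind of thing.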
If you're interested in just learning programming in general, could I suggest you try Python or a more recent language over C++?
C++ has a lot of... hmmm.... thorny edges that can (and do!) catch a lot of beginners.
Modern languages have learnt from those mistakes, and are generally a lot friendlier.
I've been enjoying Go. It's got some oddities, but overall it's pretty nice. I think it's great for simple/quick APIs (or at least, that's my main use so far).
I guess the only oddity that even slightly bugs me is that unused variables actually cause errors when compiling (though helping keep my code neat is good); that's not much of a complaint. Since I spend about 90% of my time with C#, I still find getting multiple values returned from a function (when it's not a container of objects [array, IEnumerable&lt;wtfever&gt;]) to be a little odd, but the more I use it, the less it seems abnormal in any way. Python's done it since... well, for quite a while. I want to say since it came about, but I honestly don't know if that's accurate. It also kind of gives me a sense of better error checking, since damn near every function call is going to return a result and an error. If the error's nil, you're good to go; otherwise, something done went n blew'd itself up.
I haven't tried Rust yet, but I've heard a few good things about it.
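For comparison, Python's multiple-return is just tuple packing/unpacking, and you can mimic Go's (result, error) convention with it. A tiny sketch (names made up):

```python
def parse_port(raw):
    # Go-style contract: return (value, error); exactly one is None.
    try:
        port = int(raw)
    except ValueError:
        return None, "not a number: " + repr(raw)
    if not 0 < port < 65536:
        return None, "port out of range: " + str(port)
    return port, None

port, err = parse_port("8080")  # tuple unpacking, same shape as Go
if err is not None:
    raise ValueError(err)
```

Idiomatic Python would normally just raise the exception directly rather than thread errors through return values, but the tuple plumbing works the same way.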
I always have to recommend C#. You get the C-like styling with all of .NET, and you don't have to jump into pointers and such. (Aside: I'm embarrassed as heck that I'd have to pretty much re-learn pointers and all the involved fun in their entirety if someone was like "oh god, here's some C/C++, I need help!")
The lesson here is: Ask 15 programmers what the best technology to learn is and you'll get 30 different answers.
C++ is a terrible language for a beginner. Way too many corner cases and too much cruft due to historical reasons. You'll spend more time fighting the compiler than understanding what is actually going on.
Decent second or third language, but don't do it as your first.
As a C++ developer, I suggest learning Rust first, as the language gives you fewer chances to 'hurt' yourself. Plus, the tooling and community packaging are more streamlined and easier for new users to understand.
I started with C...
But I think for most part, Python has a much lower barrier to entry that still gives you a good sense of what programming is about, basic logic/constructs etc without needing to learn a wide array of gotchas (like others have mentioned).
I think the biggest thing I hear from people who move from Python to a statically-typed language like Java/C++ is that they get used to how much Python does for you and are sort of taken aback when those things aren't available by default.
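A few concrete examples of what "Python does for you" that Java or C++ make you spell out explicitly (a trivial sketch):

```python
# Dynamic typing: one variable, no declarations, any type.
x = 42
x = "now a string"

# Arbitrary-precision integers: no overflow to worry about.
big = 2 ** 100

# Batteries-included containers: sorting and summing for free.
nums = [3, 1, 2]
nums.sort()
total = sum(nums)
```

In Java, every one of those lines needs a declared type, and the big-integer one needs a whole separate class (BigInteger), which is exactly the kind of thing that takes newcomers aback.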
C has its own case of stupidities and historical cruft and limitations, but it still has significantly fewer language features upon which to impale yourself. It wouldn't be my choice for a first language, but its lack of features makes it that much easier to learn than C++.
C is just crippled by its lack of features and standard library if you want to actually do something beyond a console app and don't really know what you're doing yet.
Something like Python would be my first choice as a starter language, then follow up with something that does a bit less hand-holding and opens up different possibilities--maybe Rust, or C#, or C++, or Haskell, or Scala, or...
C# is just nice. And you get to use Visual Studio/Visual Studio Code, which are fantastic editors.
But I think all of them will agree that C++ is not the best place to start.
Just keep in mind: don't get discouraged if it isn't working out for you! If C++ ain't a good time, then try something else!
Yes, Perl.
Boo this man!