So I have an opportunity to go back to school, for free, through the WIA program.
Dell closed my call center in November, putting about 600 people out of work, and the WIA worker retraining program apparently entails the following:
About $325 a week unemployment for 2 years
2 years of all expenses paid College
It seems like a no-brainer, right? 2 years of free school, with about $10 an hour in unemployment the entire way through?
There's a catch.
The same day I got laid off, I got a new job doing tech support for a company doing outsource work for AT&T. It's more or less the same as what I was doing for Dell, except it's work from home and consumer internet instead of business hardware. $14 an hour, so it's a tiny bit more than Dell, but it's quite a bit more stressful.
It's also temp work. I worry that it's "permanent temp" work -- i.e., I'll be forever "just almost but not quite" meeting goals to be brought on permanently. I seem to have acclimated to the work but still can't quite hit the goal of 9 calls a day every day -- I've been hovering around 7-8.
There's also the fact that honestly, I can't see me doing tech support forever. This is year 8 of phone tech support for me. I have the distinct feeling I'm wasting my time.
My 2 roommates were both Dell coworkers who got in on WIA, which is why I heard more about it. One of their classmates got hired at the company I work at now but quit after the first week, realizing just how completely backwards AT&T runs things. So I'm fairly certain, but still getting verification, that I could walk away from this job and go back to school.
Which leads me to my question about programming. Back in the day (about 8 years ago), I was going to a community college in my home state for Programming; a huge clusterfuck turned my planned AA in Programming into 2 AASes in "General IT".
I had a general knowledge of C, C++, Delphi Pascal, PHP, Perl, Java, Javascript, Visual Basic, etc. It wasn't expert-level stuff, but I was good enough at it that I was tutoring all the programming classes at YVCC. I could probably have gone into an intern or starting position on a team and gotten better.
Anyway. I was thinking of going back to shore up my Computer Programming skills, burn off the rust, and maybe get an AA with plans of going to a 4 year for a Comp Sci degree. However, every time I look at the computer programming market, I'm driven to find a stiff drink.
Is going into programming a mistake in this age of outsourced jobs?
My other ideas were horticulture (with plans of opening a vineyard / making wine) or graphic design with a minor in Japanese (translating manga) -- the idea being very generic degrees that would give me options going forward.
Or maybe I should just try and find a better IT Support job than... ugh... tech support?
Try going for a mechanical engineering degree. You get to program, get paid $60,000 a year, and can do some cool shit too.
The only catch is, you have to get your degree first, which is serious business.
It depends on the discipline that you get into and how good you are at it. Programming jobs get outsourced, but I've never worked anywhere or with anyone that didn't know and acknowledge that outsourced programming is usually of low quality and often reserved for mundane tasks.
As such, the most commonly outsourced programming work I've seen is either small-to-medium websites or data-heavy enterprise software. Those are things you don't want to do anyway.
The short answer is that it is not a bad idea, but you have to dedicate yourself to it and give yourself an edge against the outsourced competition.
Just know that getting a degree in graphic design is pretty much getting a degree in being a shift manager at Starbucks. There is a gross over-saturation of that market and a gross under-saturation of actual talent. I wouldn't go that route unless you are absolutely confident you can live off it.
It's a tough ladder to climb. But a foundation in computer science or engineering would definitely help you move up the IT ladder.
I will definitely say from my current job search that companies view a comp sci degree as more than just programming qualifications. They (justifiably) assume that if you have that degree you have a lot of technical knowledge in general and can pick up even more. There are a lot of jobs that a CS degree will qualify you for that are not outsourceable like programming mundane stuff can be.
I would agree with this.
Also, tutoring at a community college is good, but it also doesn't mean you're really that close to the level that is required for a lot of true programming jobs.
If you have the opportunity to go back and get a stronger skill set, I'd definitely recommend it.
Like Jasconius said, only the really mundane stuff is outsourced, or it's outsourced by managers who don't understand the business and are just bean counting (which means you really don't want to work for them anyway).
This is the final answer.
A profession in computer programming is only going to be enjoyable and rewarding if you have a personal initiative to research and improve yourself. In this sense, programming is a lot like playing an instrument or playing a sport. You get what you practice. As such, just doing your 9 to 5 isn't going to advance your career, because usually you are just doing repetitive stuff. Programming jobs where the job itself allows you to grow your skills in a meaningful way are rare and coveted. If it's something you want to do for the next 20 years, be prepared to spend more than a few nights and weekends at the keyboard.
But that is no different than graphic design. The difference is you get paid real money as a programmer. Entry level graphic designers get treated like dogs assuming they even get hired in the first place.
1. My Dad was a nuclear engineer, and today he's working as a project manager for an insurance company. Engineering degrees give you a whole lot of general knowledge and technical expertise. Moreover, your degree is basically a signed affidavit saying "I know how to solve problems." That's one of the reasons they're in such high demand.
2. Computer Science is not an easy degree, especially if you're not great at math. At least at my school, Computer Science is about 1/3 electrical engineering, and the EE classes are all killers. No engineering degree is easy, really, but Comp Sci is way more than just programming.
3. How well do you remember your Physics? Did you take Calculus? You may have to retake them, because most engineering schools either assume you already know basic Calc and Physics or have you take them during your Freshman year. I made the mistake of taking AP Calculus my junior year of high school, and I had forgotten so much I had to retake it my first semester anyway.
There's a gag about the gynecologist's wife. She's feeling a bit randy, so she meets him at the door wearing an apron and a smile. His response, without looking up, is "God honey, if I see one more..."
I haven't coded in like 7 years, ever since starting the tech support grind. Maybe I should try to teach myself Python this weekend and see if I still find it enjoyable.
YVCC was CompSci when I started. By the time I finished, it was "Info Tech" and the thing basically was worthless.
Here are the 2 AAS degrees I have. They're not exactly the same, but they're close enough:
http://www2.yvcc.edu/it/support.html
http://www2.yvcc.edu/it/general_degree.html
Basically they shoehorned me into the AAS: IT Software Support and AAS: General Degrees (where my programming credits went to die), and I had 45 credits left over. I think they did it just to get me away from them before I realized how screwed over I was.
I was going to just keep on in CS as a grandfather-clause case, except that when the woman who forcibly took over CS and rebranded it as IT came on, 98% of the students and EVERY teacher quit. This should have been a huge warning, but back then I was in a real fog.
So no, I don't have ANY calc, physics, sciences, English, nothing but computer classes. The IT degree at the school didn't require them. The CS degree did, but not until year 2, and they shifted over to the new program at the end of my year 1.
That was about when I discovered the new "IT" Degrees don't transfer to 4 years...
Last I heard the woman who did all this had fled the state after they almost lost accreditation.
So if you have the money and the drive, I wouldn't say a 4 year degree is something you'll regret.
I'm gonna try to learn Python this week and see if it clicks. Who knows.
Also, long term, a programmer is going to be making more than tech support in most cases. What is tech support, $14 per hour? Programming varies in salary depending on who you ask, but in most cases the pay is going to be greater.
Also, it's easier to outsource tech support than it is programming. Both can be, but programmers are usually in higher demand since the work requires more education.
There are different types of programming. Some people like high-level stuff, like web design/development, and some like nitty-gritty bare-bones stuff like ASM and programming embedded hardware.
I say go to school for 2 years since they are paying for it. Even if you don't go into programming, you'll have a "2+ year education" that'll make your resume worth more.
edit: also, most CS degrees are 4 years. If you go for a degree, make sure it's a BS and not a BA. There is a lot of math, but that is what they are there for. It's not "here, learn calculus by next week". It's "we're going to spend an entire semester going over basic calc before moving on to more advanced topics"
I second the posts about programming being a discipline and a practice. That said, I put in my 9-to-5, got promoted, and am being trusted with the keys to the company because I'm really good at what I do -- during the 4 years I spent at Uni I was working on OSS projects and basically learning how to do things right, and that made me faster and better.
I'm getting out with a BA in CIS, which is not a great degree, but it's enough if your personality and enthusiasm set you apart from the crap programmers whose jobs are getting replaced by outsourcing.
Well, Python first. Will put in 8 hours of uninterrupted work on it tomorrow.
The important thing is that you learn how to effectively structure your code and design programs without resorting to myVar1; myVar2; myVar3; style programming.
Learn about DRY (http://en.wikipedia.org/wiki/DRY) and multi-tier application development (http://en.wikipedia.org/wiki/Multi-tier_architecture) - down that path, greatness lies.
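To make the DRY point concrete, here's a minimal sketch in Python (since that's the language being picked up in this thread; the function name and numbers are just made up for illustration). The anti-pattern is repeating the same calculation once per variable; the DRY version states the rule once and applies it to data:

```python
# Anti-pattern (the myVar1; myVar2; myVar3 style):
#   total = price1 * qty1 + price2 * qty2 + price3 * qty3
# DRY version: one rule, written once, applied to a list of data.
def order_total(items):
    """items is a list of (price, quantity) pairs."""
    return sum(price * qty for price, qty in items)

print(order_total([(9.99, 2), (4.50, 1), (2.00, 3)]))
```

Adding a fourth item now means adding a data row, not copy-pasting another line of arithmetic.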
Seriously, stay away from Java.
Just because the more languages you work with, the more you "get" how computers and algorithms and such really work underneath, and you can apply your ideas to any language as you improve.
Which is a good thing, just don't stick with what you end up considering the bad languages!
I'd recommend having a look at Python, C#, and Java. Java has a lot of stupid shit, but it has some good things, and it's still the main programming language used for large corporate stuff on the Unix side of things. C# (and .NET in general) is sort of the Java of the Windows world, and much of it is what Java should be. I'm primarily a Linux Perl developer, but I still really like C# for some projects. Python seems to be becoming the next Perl -- a high-quality, high-level, interpreted language that can be used for tons of stuff and work pretty well at most of it. I've seen some Perl developers (Perl is what I code in all day at work and in many of my personal projects) say that Python is like Perl without most of Perl's problems.
You'll need to be sure you enjoy it and do a lot of work on your own to make sure you truly understand it, though. There are plenty of programming jobs in the US, but you'll need to stand out from the crowd to be truly successful and a 2 year education in programming, while a great start, is only the tip of the iceberg.
Try _why's poignant guide to ruby.
One thing that I would recommend to anybody learning to program is to stay as far away from GUI programming as you can. Just don't do it. Learning to program by making a GUI will teach you so many bad habits that you'll likely never be a decent programmer.
Start with console applications. If you want to put a GUI on something, make it a web page. The clear separation of layers enforced on you by the server / browser split will teach you more about GUI programming than any actual GUI programming ever will.
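The "separation of layers" idea works even in a console app. A minimal sketch in Python (example entirely made up): the logic is a pure function with no I/O, and the console front end is a thin wrapper over it, so you could later bolt a web page onto the same logic without touching it:

```python
# Logic layer: a pure function, no input/output -- trivial to test and reuse.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

# Presentation layer: a thin console front end over the logic layer.
# A web handler could call fahrenheit_to_celsius() just as easily.
def report(f):
    return f"{f:g}F = {fahrenheit_to_celsius(f):.1f}C"

print(report(212.0))  # 212F = 100.0C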

Oh my god this. Event-driven code (read: GUIs) is basically a way to write glorified GOTO statements, except that the GOTO statements happen sometime in the future.
You can't write good code that way, unless you're writing at 88mph, and it's nearly impossible to test.
The nice thing about Ruby and GUI programming is that you can write a rails app that handles all of your application logic, and then spits out data in a .json format. From there, you can use something like ExtJs and have a rich web interface going with no goddamn trouble.
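The same split (logic layer spits out JSON, a JS front end renders it) can be sketched in any stack. A toy version in Python using the standard library's json module -- the function names and order data here are hypothetical, not from any real framework:

```python
import json

# Application logic layer: plain Python data in, plain Python data out.
def list_orders():
    return [{"id": 1, "total": 19.98}, {"id": 2, "total": 4.50}]

# "API" layer: does nothing but serialize the logic layer's output.
# A rich web client (ExtJS or similar) would fetch this over HTTP.
def orders_endpoint():
    return json.dumps(list_orders())

print(orders_endpoint())
```

The front end never touches the application logic directly, which is exactly the separation being described.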
GUI programming does entice you to cheat, but starting with GUI programming is a lot like starting with an electric guitar as opposed to an acoustic.
You might not be as mechanically disciplined, but at least you'll be interested and engaged in it, and the rest will come.
Console application programming is boring in every way that something can be boring.
You can make interactive applications that still have respectable logic and data layers.
Now, if they are actually referring to actual UI frameworks, then yeah, they are just talking out of their ass about things they don't understand.
I'm not saying that web development is better than desktop development, just that learning in a web environment will help you learn a lot about separation of concerns, modularity of code and communicating between layers that is very easy to ignore in a lot of desktop GUI frameworks.
A multi-threaded event-driven GUI is not a place for beginners.
I have yet to see a programming course that started you at GUI development, other than Visual Basic. Even then, the programmer degree (as opposed to the other CS degrees) required Intro to C++ first and did command-line coding anyway.
From a programmer's perspective, as long as you're not too concerned with pretty GUIs and fancy controls, who cares if your output goes to a text box as opposed to the stdout stream?
But yeah, programmers, pick up a good sockets programming book and learn it. It's way more important than learning how to use the graph data-structure. That's my helpful advice for today.
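For anyone taking that sockets advice and starting from Python, here's about the smallest complete client/server round trip you can write with the standard library's socket module (the echo server here is a made-up toy, one connection and done, not a production pattern):

```python
import socket
import threading

# A one-shot echo server: accept a single connection, echo back
# whatever it receives, then shut down.
def start_echo_server():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _addr = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()     # (host, port) for the client to dial

# Client side: connect, send a message, read the echo back.
def echo_roundtrip(message):
    host, port = start_echo_server()
    with socket.create_connection((host, port)) as client:
        client.sendall(message)
        return client.recv(1024)

print(echo_roundtrip(b"hello"))
```

Once you can do this by hand, every higher-level thing (HTTP, web apps, APIs) stops being magic.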
I would support this.
Web applications are a good place to start. They are easy to make, easy to publish, and they pretty much cover all bases from beginner to the lower echelons of "advanced" unless you make something hella complex/popular.
The thing about web programming though, is that for every good tutorial about web application design and programming, there are about 20 to 25 bad ones.
Web technologies have become so abstracted from their technical roots that a lot of completely unqualified voices have a platform to push concepts that are actually pretty bad, because they simply don't understand them. See: 95% of the Flash community, anyone who has ever used Cold Fusion, about 60% of people who use jQuery.
The key to not sucking at the web is to focus less on the frills and more on getting a true understanding of the shoulders upon which you stand to get an app showing up in someone's browser. That means understanding resource management, application design, etc., rather than just googling for disparate tutorials and slapping code together. Learn to walk before you run.
This is much more to the point that I had in my head. Understand network communications. Once you do that, you'll be in a much better position than most of the programmers that I've ever had to deal with.
What web technology doesn't let you put logic into the view?
You definitely have to learn most everything on your own in software development, both in college and work, but I'm not sure why you say you have to do it all on your own time. If you're using the knowledge in your current job, you can research it on the clock. If you're not then why spend your time learning it when you don't know if you'll ever use it and you'll have no practical experience to get a job in that field? Only a really shitty job will leave you coding mundane crap with no room for advancement ever unless you walk up to your boss and explain to him what you've been reading over the weekend.
I think you would be surprised at how rare being able to bill research time in any meaningful quantity is, unless you are at an in-house shop, which usually means you aren't doing much advancement anyway.
The only companies I know of that actively pay their employees to just learn shit and work on pet projects are start-ups who are trying to gin up talent or extremely wealthy/stable companies that can inject some risk in having a dude sit around for a day out of the week and just fool around with no guaranteed bottom line consequence.
I'm not talking about pet projects, I'm talking about getting better at your job. As an entry level programmer you won't have any say in what technologies are being used, so all you need to do is learn the ones you're using and increase your productivity so you'll be assigned better projects. When you're higher up and the decisions on which technologies to use are up to you, then they better damn well let you research it on the company dime. Learning new technologies on your own time is unlikely to pay off in a job - a competent developer can learn anything relatively quickly, general skills and practical experience are more important when interviewing. You make it sound like every developer in the company is just an entry level code monkey forever until they quit and get hired somewhere better.
You get hired if you can do the job, and you do the job. Period.
There's no switching from Java to Python because it's better, or from PHP to .NET because such and such a feature exists. It is what it is until someone wearing a tie with an MBA says they are switching.
And yeah, let's say a change is made: they won't fire everyone (except maybe most or all of the contractors); the people who are left will get sent to a conference or two, but other than that they are thrown into the fire.
The more time you spend on your own, the better off you will be when that time comes.
I think advising people to just kick back and count on a company to nurse them along in their career is bad advice, even if it's just partially counting on the company. You have to do it yourself, and more often than not on your own time.
So if you're working for a company like that, what are you expecting will happen if you take on a couple pet projects? You might have a wider arsenal of skills but you won't have job experience to demonstrate it to future employers anyway, and odds are they won't be using the same technologies you worked on either.
I went into my first year of college as a CS major with almost zero programming experience, no calculus, and only basic high school physics. School was very challenging and I put in very long hours sometimes, but I made it through and loved it. If you make it through your first 2 or 3 programming classes and you still like programming, I would say that Computer Science is the major for you. As far as calculus, just study your brains out and hopefully get through the classes. You probably won't need to use calculus in any actual programming job unless you specifically seek out jobs like that.
The job market isn't crap for software engineers, it just went from extremely good back down to normal.
Even if CS winds up not being for you, you will still be in college. You will have taken some general required classes like math, English, and physics. You can change majors and get a degree in something else without starting over completely.