People shouldn't learn to program.
....someone had to do it.
Why should people learn to program?
I don't really understand this. People should learn to program if they want to make programs. Otherwise why should they? For making money (ahhahahahaha)? There are much easier ways to do that.
There's no "should" in this equation - only "would" - and I think I can fashion an (not the) answer to that.
Programming is a form of creative power. Yes, it's a highly technical form of creative power, but no matter how you look at it, it's a creative form and it does entail power. Without getting too philosophical, allow me to present a few analogies that I think hold true:
- Desire for power: programming is much akin to any other form of "playing god", such as writing, directing (a film or a play), composing, etc. It's something that is bound by very specific rules in the form used to deliver it, but ultimately what (not how) you program comes from within you, and that, by and large, tends to have VERY few limitations (eg it has the same limitations that, for instance, film has). You'll need to think wide, though - from a CGI plugin to a plasma screensaver to a command-line application that searches for million-digit prime numbers. This is power, and by harnessing programming you harness (a part of) that power. That's pretty tempting.
- Social conditioning: How many (usually short) stories by the likes of Bradbury, Asimov, Dick and even Vonnegut, or films based on those stories, can you think of that deal with the development of artificial intelligence, or that show a mega-computer trying to take over the world or at least control the fate of humanity? Allow me to help you here a little with some of the more popular examples:
- Colossus: The Forbin Project
- Resident Evil
- The Terminator
- The Andromeda Strain
- 2001: A Space Odyssey
- The Matrix
- Dark City
- The Thirteenth Floor
- The Fly
- eXistenZ
- Eagle Eye
- Tron
- etc
Ray Arnold: [trying to bring the system back on-line] Access main program. Access main security. Access main program grid.
[the computer denies him, finally saying, "You didn't say the magic word!"]
Dennis Nedry: [on computer] Uh uh uh! You didn't say the magic word! Uh uh uh!
Ray Arnold: Please! God damn it! I hate this hacker crap!
It's no coincidence that in sci-fi the unsuspecting victim "gets sucked into the computer and his or her brainwaves get re-programmed". Much of the time this kind of media-induced lust for control is what we fear and, in a very sick way, desire the most.
- Alas! To pick up chicks!: Programming carries a certain social taboo - a lot of status-related stuff is attached to it (the most prevalent being the "nerd" status); however, it's a little bit more than that. Have you ever shown a female (or sometimes male) colleague how to do something in Excel and heard them say something along the lines of: "I've no idea what you/he just did - reprogrammed it or something. Anyway, it's working now."? In today's society we rely heavily on programming, but even more than that - like someone mentioned - we rely on being able to use something that has already been programmed for us. Being able to program usually leads to far greater computer savviness, which a) allows us not to rely on others for some menial tasks that aren't even programming-related (eg the above Excel example) and b) allows us to approach any piece of software with much greater intuition and understanding.
Naturally, there are other reasons why someone would program - eg natural aptitude, professional requirement, financial reasons, etc. However, the above three points are, to me, the main reasons why anyone would be inclined to take up programming.
Hope you can use some of this for your presentation.
You shouldn't. People learn to program because they're interested in computers, enrol on a Computer Science degree, and end up being forced to learn. Now, you have to remember that a UK "computer science" degree is four years of programming at basically all but the top two or three elite Universities in the country. Even if you do get a temporary break from the code, it'll come crashing back round at some point.
"Today, we're going to learn about DNS"..."here's some Java code that implements a DNS lookup"
"Here's a red-black tree"..."Your assignment for Thursday; write one from scratch"
"Time to start your final year project"..."lots and lots of perfect code, make sure you're following the proper software development lifecycle rigidly. No, you're not allowed to do a research based project - code code code code code"
Everybody learns to code purely because "computer science" degrees shove it down your throats and this, in case you're wondering, is why so many CS graduates waltz straight into coding jobs from University and stay there forever; any promotion will just be to a software development manager job. Code is all they know, which is fine because it's probably the only practical skill they learned on their degree.
It's a sad fact when I know of more CS graduates than Software Engineering graduates working as software developers. When I did my two internships, I saw people with much better degrees than me from better Universities (and I have a 2.1 Honours from one of the top 10 universities in the UK) - these truly great minds, intelligent people with the world at their feet - wasting their lives hacking Java 1.4 code to form some application that ran on a server and produced no output whatsoever. What did the application do? Wrote financial information to a database or such other non-descript phooey. People with a chance to make a difference, people who could make something of themselves, locked into that. It was really sad and at that point I told myself it's postgraduate study or bust; no way am I going to stick in the mould of a CS graduate going into coding, no way at all.
The developers were generally the only people with no postgraduate qualifications. The infrastructure/sysadmin guys all had MCSEs, CCNAs etc. and one even had a LPT (License to Penetration Test); at internship #2, the lead DBA was Oracle certified. At both places the development team, while all high-level graduates, had no extra qualifications or certifications whatsoever.
Indeed. Not only is software development stale drone work, it's also pretty badly paid. If you're considered to be good at it, your salary will probably cap out at £50k. Considering the security consultants and solutions architects who can earn £85k+ without a sweat, it's pathetic pay for a truly crap job.
CS tuition needs to change. Get away from the endless coding; it's a big field and too many graduates are missing the bigger picture.
TL;DR - you shouldn't learn to program at all. If you enrol on a CS degree it'll be shoved down your throats, if you don't then it's not worth knowing anyway because it's all out of context.
"Today, we're going to learn about DNS"..."here's some Java code that implements a DNS lookup"
"Here's a red-black tree"..."Your assignment for Thursday; write one from scratch"
"Time to start your final year project"..."lots and lots of perfect code, make sure you're following the proper software development lifecycle rigidly. No, you're not allowed to do a research based project - code code code code code"
Everybody learns to code purely because "computer science" degrees shove it down their throats, and this, in case you're wondering, is why so many CS graduates waltz straight into coding jobs from University and stay there forever; any promotion will just be to a software development manager job. Code is all they know, which is fine, because it's probably the only practical skill they learned on their degree.
It's a sad fact that I know of more CS graduates than Software Engineering graduates working as software developers. When I did my two internships, I saw people with much better degrees than me from better Universities (and I have a 2.1 Honours from one of the top 10 universities in the UK) - truly great minds, intelligent people with the world at their feet - wasting their lives hacking Java 1.4 code to form some application that ran on a server and produced no visible output whatsoever. What did the application do? Wrote financial information to a database or such other nondescript phooey. People with a chance to make a difference, people who could make something of themselves, locked into that. It was really sad, and at that point I told myself it's postgraduate study or bust; no way am I going to stick in the mould of a CS graduate going into coding, no way at all.
The developers were generally the only people with no postgraduate qualifications. The infrastructure/sysadmin guys all had MCSEs, CCNAs etc., and one even had an LPT (Licensed Penetration Tester certification); at internship #2, the lead DBA was Oracle certified. At both places the development team, while all high-level graduates, had no extra qualifications or certifications whatsoever.
Quote:
Otherwise why should they? For making money (ahhahahahaha)?
Indeed. Not only is software development stale drone work, it's also pretty badly paid. If you're considered to be good at it, your salary will probably cap out at £50k. Considering that security consultants and solutions architects can earn £85k+ without breaking a sweat, it's pathetic pay for a truly crap job.
CS tuition needs to change. Get away from the endless coding; it's a big field and too many graduates are missing the bigger picture.
TL;DR - you shouldn't learn to program at all. If you enrol on a CS degree it'll be shoved down your throat; if you don't, then it's not worth knowing anyway because it's all out of context.
One should learn to solve problems, be rational, and explain things well. If you've got those, you'll find you can program anyway.
Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse
I think some of you who are saying "learn to program to be able to make programs" are underestimating how pervasive programming actually is in a lot of people's lives. For example, at some point, pretty much everyone working in some professional capacity will have to use a spreadsheet application of some type. Modern spreadsheets are essentially grid-based UIs on top of fairly complex scripting systems. The productivity gains one can find from even a cursory understanding of that system are huge.
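To make that concrete - a rough sketch of the idea, not how any real spreadsheet is implemented - a formula like C1 = A1 + B1 is just a small function defined over other cells, which is the "scripting system" hiding under the grid:

```java
import java.util.Map;
import java.util.function.ToDoubleFunction;

public class TinySheet {
    public static void main(String[] args) {
        // Two "cells" holding literal values.
        Map<String, Double> cells = Map.of("A1", 40.0, "B1", 2.0);

        // The formula "C1 = A1 + B1", modelled as a function over the sheet.
        ToDoubleFunction<Map<String, Double>> c1 =
                sheet -> sheet.get("A1") + sheet.get("B1");

        System.out.println("C1 = " + c1.applyAsDouble(cells)); // prints C1 = 42.0
    }
}
```

Once you see a formula as a small function, ranges, lookups and macros stop looking like magic.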
It even benefits you, the real programmer, for these people to learn how to use MS Excel better. If they knew how to use the tools they already have at their disposal, they wouldn't have to bother you for help as often, leaving you to do more of your own work.
Another example: setting the clock on a microwave oven and entering cooking settings more complex than "high heat, non-stop, for the next 5 minutes" amounts to writing a basic program: you have to enter a sequence of instructions, and they have to be ordered properly. "Programming" is a pretty pervasive task in people's lives.
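As a toy illustration of that point (hypothetical names throughout - no real oven exposes anything like this), a cooking "program" is just an ordered list of steps executed in sequence:

```java
import java.util.List;

public class MicrowaveProgram {
    // A hypothetical instruction: run at some power level for some number of seconds.
    record Step(int powerPercent, int seconds) {}

    public static void main(String[] args) {
        // "Defrost, then cook, then stand" only makes sense in this order.
        List<Step> program = List.of(
                new Step(30, 120),   // defrost at 30% power for 2 minutes
                new Step(100, 90),   // cook at full power for 1.5 minutes
                new Step(0, 60));    // stand (no power) for 1 minute

        for (Step step : program) {
            System.out.printf("Run at %d%% power for %d seconds%n",
                    step.powerPercent(), step.seconds());
        }
    }
}
```

Swap the first two steps and you get a very different dinner - which is exactly the "instructions must be ordered properly" lesson.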
Through my previous work in tutoring math, computer science, and physics, as well as micro-consulting on home PC systems, I've learned that most computer users think A) the computer does what it does by magic, B) the computer is smarter than them, and C) if they "do something wrong", they will make the computer catch on fire. Learning to program exposes each of these as the myths they are.
Knowing that computers aren't magical means that they are predictable. If you go into a situation expecting to be unable to learn the system, you'll be much less likely to pay attention to the system's patterns, and so much less able to anticipate them in future. Then your father doesn't have to ask you how to attach a document to an email for the fifth time this week.
Knowing that the computer is fundamentally dumb gives dignity back to the user. Computers, as they are designed by man, are in a lot of ways incredibly condescending towards their users. This intimidates people and makes them feel stupid. When you see your 10-year-old kid figuring the machine out but you can't, you don't tend to question what is wrong with the machine; you question what is wrong with yourself. Learning to program gives the user confidence in themselves again.
The fear of breaking the machine prevents a lot of people from exploring their systems more. How often have you had someone ask you where they could find a program to do a specific thing, and you've pointed out a program that was already on their computer to do just that? I know I've done it on a number of occasions: basic word processing, basic image editing, sound recording, etc. This is why children pick up computers more easily than adults: they are more apt to explore.
Without the understanding of and comfort with computers that an education in programming provides, today's people are going to find themselves at a serious disadvantage to their computer-literate peers. Computers are pervasive in our lives, from the machines on our desks to the smallest pocket calculator, from our cars to our microwaves.
[Edited by - capn_midnight on April 21, 2010 8:13:21 AM]
I guess I totally misunderstood the stuff.
So the question is really: "Why should people be taught to program?"
Am I right?
In that case, they should be, of course. Absolutely. One reason: otherwise one might never know how great programming can be.
If I hadn't been taught, I simply would never have started programming.
It should be taught just like anything else: art, technical subjects, whatever.
Quote: Through my previous work in tutoring math, computer science, and physics, as well as micro-consulting on home PC systems, I've learned that most computer users think A) the computer does what it does by magic, B) the computer is smarter than them, and C) if they "do something wrong", they will make the computer catch on fire. Learning to program exposes each of these as the myths they are.
Sadly, a lot of (if not most) programmers still think like that. Just look at the threads on gamedev.
Programmers think AI is some kind of voodoo, and GUIs are even bigger voodoo. Not to mention graphics techniques.
Exposing these myths comes with age, I guess, or maybe experience in real life?
I don't know, but a few programming courses won't help by themselves.
I have been thinking about this topic for a long time now (I even wanted to make a thread on it), but I haven't found a solution/explanation yet. I can't even express my thoughts properly.
But in my experience, a lot of people will never get rid of this paralyzing mystification, no matter how hard they work/learn.
EDIT: I'm speaking about creative work, of course; operating a microwave oven or setting the VCR is not an issue.
Sorry, I can't see straight and I can't even speak; I should take a nap. (Who cares?)
[Edited by - szecs on April 21, 2010 9:01:42 AM]
Quote: Original post by ukdeveloper
Not only is software development stale drone work, it's also pretty badly paid. If you're considered to be good at it, your salary will probably cap out at £50k. Considering that security consultants and solutions architects can earn £85k+ without breaking a sweat, it's pathetic pay for a truly crap job.
What is fun about being a security consultant or a solutions architect?
Senior Software Engineer is a great job and you can definitely get more than £50k.
If you're a crap software developer, then it probably is stale drone work.