I think a lot of the time what's missing in this conversation is the need to first get teachers into the classroom who know how to code and want to teach it. Unfortunately, a teacher's salary is much, much lower than what someone who knows how to code can make elsewhere.
I understand that there are a lot of online tools that make learning/teaching coding easier but I do believe the person teaching it still needs to have a decent understanding. Kids ask tons of questions on how to accomplish certain things and if the teacher doesn't really know how to code, they won't be able to help those students who will then lose interest.
This is a really good point. I was thinking a while ago about the role of teachers in this context and why they're so necessary, especially in this day and age where there are online tutorials for everything.
I was taught how to code from early on (6th grade), and most of my learning of programming (which I would distinguish from computer science) happened as we worked on projects/built things.
I can't tell you how many times I saw other students get stuck on bugs during development and get really frustrated with programming (I saw this when I was a student, and later when I taught others how to code).
What always helped was having teachers/instructors who could easily help students work through those roadblocks so they didn't get stuck and hung up on one thing. Not the traditional teacher who just stands up and lectures at students, but one who also spends time getting in there and working alongside them to some extent.
Traditional classrooms/lectures are a holdover from the days when students went to listen to a professor speak and jotted down whatever the lecturer orated. But now, especially with programming, the emphasis is more on building, and that changes what the instructor really does in the classroom (just "lecturing" vs. "facilitating" and "supporting").
Probably a long-winded way of making the same point you just made, but I thought I'd elaborate since it's interesting to me. Some of my friends say that these online courses are 100% independently viable, but I am not convinced they are a suitable replacement for teachers (though I recognize it really depends on the students and how independently motivated or how prone to burnout they are...).
I agree with everything you said. I am also not convinced that "these online courses are 100% independently viable," especially for younger kids. I didn't think this a year ago, but my opinion has really changed since having the opportunity to teach kids in person in a class. I use online tools in the class and they are awesome, but I would have lost 90% of the kids if they didn't have someone there to guide them and explain things in more detail.
I was also pleasantly surprised by how often kids asked "how can I make it do this?" - something outside what the online tutorial taught. The tutorials were great at sparking their interest but can't really accommodate all of the edge cases, especially for the more creative thinkers. I also learned that most of the kids got bored fairly quickly following step-by-step tutorials. They wanted to hack and play around with what they were learning. So I changed the tutorials to be much smaller, with very small steps, and at the end of each mini tutorial I encouraged them to hack and try different things based on what they'd learned.
Maybe with a live online teacher available this would be less of an issue but even with that you lose the personal interaction the kids have with each other and the teacher. Things like showing each other what they created and working directly with the teacher/other kids to expand on that. I think a lot of younger kids really do need that interaction and in-person feedback and I am not sure how that can be re-created online.
To your last point about a live online teacher, I think it would also just be really frustrating for the teacher to be in that position. Debugging code can be difficult at times, and to do so online where you can't just be with the student in person makes it even harder.
To your broader point though, I think that resources like StackOverflow, and even some smaller communities (e.g., subreddits), are a great way to supplement online courses (by getting questions answered and dealing with roadblocks). The only issue is response time: a dedicated instructor who was with you in person could resolve things quickly, whereas some SO answers can take days, weeks, or even longer, depending on how obscure/unclear the question is.
I agree though - programming, after all, is like playing with infinitely many Legos; it can be overwhelming, but ultimately you need to just get your hands dirty and fiddle... otherwise you won't really learn.
And even if you were 100% driven to teach, you could be consulting for a code school or doing something else that's far more lucrative. At this point I'd say the only people actually teaching full time are doing it as charity. Except, instead of being thanked, you have to navigate all the administration and the just plain pains in the ass that come from being a teacher.
The most likely people to go teach how to code are:
1) People who are purely academic and have no desire to join the private sector (unlikely to be the best teachers)
2) People who've been in the private sector and already have their retirement secured (unlikely to put up with administration)
3) People who code as a hobby and have a spouse or other financial support (hard to find at all)
I think I fall mostly into your #2. I have been self-employed for 15 years, and for the past year I have been volunteering at my son's middle school one hour per week. I am not retired, but I have financial stability from my own business.
Your points about dealing with the administration are so true. It is beyond frustrating. I understand their hands are tied on certain things because they have guidelines, etc. to follow, but it has really made me realize I would never consider teaching for a living in the current education system. Not only because the pay is low, but because of all the political crap you have to deal with. My approach has been to not ask first and to do things the way I want until they tell me no. It has caused friction with the administration, but I don't really care, because the kids are getting so much out of it and the classes always book up super fast with a waiting list. I am doing this for free, so what is the worst they can do? Cut me off? That would be a disservice to the kids, not to me.
Your #1 point is what I see the most, and some of what is being taught - or the lack thereof - is mind-blowing.
This could be driven by industry, as many things are - they already do this to promote and recruit trained personnel from universities. IMHO, computer literacy would have to be combined with math classes from an early age. To achieve that, computer literacy - or rather programming in general - would have to become much more common knowledge. That is an ongoing process, I believe. I don't know what's wrong with the American school systems, but the introduction of programming as a standard curriculum can be gradual, until parents and prospective teachers can reasonably be expected to have absorbed enough to foster children in this respect.
I did have three years of computer science in a German A-levels program, by choice, roughly equal to the topics of a first bachelor's semester (at best) - remember, the first semester at uni will not include much coding at all, compared to theory. The theory was applied in Haskell, Prolog, SQL, and mainly Ruby. Anyway, I had already coded in BASIC before I went there.
My teachers were involved with FSF Europe to get FOSS into schools, maintained the school network, and were passionate and old school. It was the only school in the district to offer this class.
Odd that, as a child growing up in the 80s, every kid in my public school (in rural Kansas, no less), as well as in the surrounding schools I encountered, was taught programming in 6th-8th grades and was also required to take at least one high school course. Of course, this was also a time when computers shipped with BASIC, users were confronted with a blinking command-prompt cursor upon startup, and the very idea of a "computer" meant "tool to do things."
Nowadays the modern conception of "computer" is more akin to "consumer appliance," a more interactive television experience. For the majority of people learning to program a computer makes no more sense to them than learning how to make television shows... and the very same tech companies that drove the industry into the consumer-appliance category now bemoan the fact that everyone appears to be a consumer rather than a producer.
> For the majority of people learning to program a computer makes no more sense to them than learning how to make television shows
Why doesn't it make sense for everyone to learn how to make television shows? The three basic areas of active literacy in modern society are writing, coding, and video production. It seems like knowing all three should be a basic requirement for graduating high school.
> the three basic areas of active literacy in modern society are writing, coding, and video production.
Uhm... source? Why writing, coding, and video production, and not writing, coding, and photography? Or podcasting? It's unclear whether you made this up on the spot or whether you're actually quoting an academic taxonomy.
FWIW, Wikipedia says:
"Literacy is traditionally understood as the ability to read, write, and use arithmetic. The modern term's meaning has been expanded to include the ability to use language, numbers, images, computers, and other basic means to understand, communicate, gain useful knowledge and use the dominant symbol systems of a culture. The concept of literacy is expanding in OECD countries to include skills to access knowledge through technology and ability to assess complex contexts."
Video production does not seem to have a prominent position, and while computers/technology are brought up, coding is not.
Those are the three primary mediums for selling ideas in modern society. Podcasting is a niche phenomenon, and even the most popular podcasts have only a few million subscribers. And while photography clearly can be used to create change in society, it doesn't do so anywhere near as often as the other mediums.
Video production is definitely one of the primary mediums for the communication of ideas because of how versatile it is. A podcast is basically a video without the visual aspect. The visuals accompanying video provide another dimension through which an idea may be conveyed. Photographs are just snapshots of a video. A photograph can represent multiple ideas and give a sense of the story's plot. Video, in comparison, provides a running narrative and dynamism versus the static nature of photography.
Sounds like some post hoc rationalization. I could similarly state that a video is just a piece of interactive multimedia, without the interactive aspect - I'm not sure what that brings to the table regarding the definition of literacy.
> Those are the three primary mediums for selling ideas in modern society.
Again, according to whom? Is this some generalization you're making up on the spot, or a statement that is quantifiably provable? It's really not clear to me that video production is used to sell ideas significantly more than radio, or highway billboards, or public speaking.
This. On my 8088 I had to do some (GW-)BASIC programming and command-line hackery just to get games to load. Then the games required some imagination. Now it's just point, click, and pay $big_corp so you can get stuff in the game faster.
Anyone ever read "The inmates are running the asylum" by Cooper? If this trend of making everyone into a programmer continues, I wonder what user interfaces will evolve into.
As a side note, recently I've been having more and more unexpected but amusing conversations with my non-programmer friends about things like 'APIs and SDKs'. The business worker has taken over the technical vernacular, and it's leading to some strange contortions of reality.
>Anyone ever read "The inmates are running the asylum" by Cooper?
Ha, good one - at least the book title. (The full title is: The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity.) I'd read about that book by him (Alan Cooper [1]), but haven't read it yet. But I did buy and read another of his books, About Face: The Essentials of Interaction Design, some years ago. I thought it was good. It's about user interface design that makes sense for humans. I still remember one point where he said that having both arrows at one end of a scrollbar, instead of at opposite ends, makes it easier to use. IIRC, I mentioned this on HN a while ago in some context, and someone replied that such a scrollbar exists in some GUI toolkit or OS.
The story of how he created Visual Basic, and how Microsoft then bought it from him, is also interesting.
A few excerpts from the Wikipedia article about him (linked above):
[After he left college, he founded one of the first microcomputer software companies.]
[In 1975, as the first microcomputers became available, Alan Cooper founded his first company, Structured Systems Group (SSG), in Oakland, California. SSG’s software accounting product, General Ledger, was sold through ads in popular magazines such as Byte and Interface Age. This software was, according to the historical account in Fire in the Valley (by Paul Freiberger and Michael Swaine), “probably the first serious business software for microcomputers.”]
[During the 1980s, Alan Cooper authored several business applications including Microphone II for Windows and an early, critical-path project management program called SuperProject. Cooper sold SuperProject to Computer Associates in 1984, where it achieved success in the business-to-business marketplace.]
[In 1988, Alan Cooper created a visual programming language (code-named “Ruby”) that allowed Windows users to build “Finder”-like shells. He called it “a shell construction set.” After he demonstrated Ruby to Bill Gates, Microsoft purchased it. At the time, Gates commented that the innovation would have a “profound effect” on their entire product line. Microsoft decided not to release the product as a shell for users, but rather to transform it into a professional development tool by combining it with their QuickBASIC programming language, calling it Visual Basic, which was widely used for business application development for Windows computers.]
[Cooper’s dynamically installable control facility, which became famous as the “VBX” interface, was a well-known component of "Ruby". This innovation allowed any 3rd party developer to write a widget (control) as a DLL, put it in the Visual Basic directory, and Visual Basic would find it, communicate with it, and present it to the user as a seamless part of the program. The widget would appear in the tool palette and appropriate menus, and users could incorporate it into their Visual Basic applications. The invention of the “VBX” interface created an entire new marketplace for vendors of these “dynamically installable controls.” As a result of Cooper’s work, many new software companies were able to deliver Windows software to market in the 1990s.]
I'd worked on a few commercial VB projects earlier, and had read about how the VBX innovation (mentioned above) spawned a whole new industry of companies creating third-party controls/widgets. Delphi was probably also made possible because of VB (though there is a chance it could have been invented even if VB had not existed).
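For anyone who didn't live through the VBX era, here's a rough sketch of the discovery pattern described above, translated into modern Node.js terms. To be clear, this is not Cooper's actual mechanism - the { name, render } widget contract and the "plugins" directory are made up for illustration. The idea is just "drop a module into a directory and the host finds, loads, and registers it":

    // Sketch of directory-based plugin discovery (hypothetical contract).
    const { readdirSync } = require("fs");
    const path = require("path");

    function loadWidgets(pluginDir) {
      const widgets = [];
      for (const file of readdirSync(pluginDir)) {
        if (!file.endsWith(".js")) continue;
        // require() plays roughly the role LoadLibrary() played for a VBX DLL.
        const mod = require(path.resolve(pluginDir, file));
        // Assumed contract: each plugin exports { name, render() }.
        if (mod.name && typeof mod.render === "function") widgets.push(mod);
      }
      return widgets;
    }

    // The host would then add each discovered widget to its tool palette:
    // loadWidgets("./plugins").forEach(w => console.log("registered", w.name));

The appeal is the same as with VBX: the host doesn't need to know about any particular widget at build time; anything conforming to the contract simply shows up.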
I do think there's a lot of value in teaching logic and algorithms as a system for thought. Coding might be a more direct way to get to it than, say, calculus, which is what is generally used.
But teaching kids java and databases is probably not the right approach, and is definitely only happening because Oracle wants to stay relevant.
Yes, logic and reasoning from facts/propositions to conclusions should be taught to all, IMO - and this applied even before computers were invented. I see a lot of people these days (even otherwise educated ones) unable to formulate basic logical arguments, and hence they either make illogical arguments and deductions, or just go by emotion alone, neither of which is good for the world. There is a place for both the head and the heart, and neither should rule.
I think kids would be better served by learning how to solve problems and staying curious.
I had given my niece this board game, Robot Turtles, that was posted on here several years back. It's a great game, but it really requires a parent to get involved.
I remember I got started back in the mid 80s with writing maze games in BASIC. When I was a little older, I bought all those little electronics books printed on graph paper that they sold at Radio Shack. Then I would make simple circuits and learn to solder.
At least, I spend an inordinate amount of time making sure data is in sync between objects, databases, remote APIs, etc., and doing similar things that feel ripe for automation.
I think future developers will be focused on UI design and algorithms, and not have to worry too much about passing around data etc. Most of this doesn't even require AI, just like automatic reference counting doesn't.
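For anyone who hasn't felt this pain, here's a hypothetical illustration of the glue work I mean - the same "user" shaped three ways (DB row, in-memory object, API payload), with hand-written mapping between the forms. None of these names come from a real codebase:

    // Hypothetical sync glue between a DB row, a domain object, and an API.
    function userFromDbRow(row) {
      // e.g. row = { id: 7, full_name: "Ada Lovelace", email: "ada@example.com" }
      return { id: row.id, fullName: row.full_name, email: row.email };
    }

    function userToApiPayload(user) {
      return { userId: user.id, name: user.fullName, contactEmail: user.email };
    }

    // Adding one field means touching every mapper (and the reverse mappers,
    // and the tests) - mechanical bookkeeping a tool could plausibly generate.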
What percentage of developers today know anything about machine code? I would guess less than 5%, maybe 1%.
My point is that programming is moving to higher levels all the time, most developers don't have to worry about pointers anymore for example.
The change does proceed very slowly though, compared to the consumer side. I guess developers are very conservative and programming as a whole advances 'one funeral at a time'.
"'Everyone should learn to code' is the new way of saying we need to create compliant factory workers and that the real purpose of school is to make sure that we are training people for the 'factory jobs' of the future."...
"I was amused when President Obama said that everyone should learn to code, since while I am guessing that Cook can code, I am pretty sure Obama cannot, so why does he care? And, what does he actually know about coding?"
School should teach students to read, write, do basic math and think and discuss both logically and rhetorically. Once a student is educated in that sense, (s)he is prepared to learn coding.
Otherwise, coding becomes like all those other courses that the student took but never used. The basics will always be useful.
No, they've been trying for years. Though oftentimes it's just a cynical ploy to push a specific programming language onto a young demographic, hoping to lock them in as a user base later on.
My high school in rural Virginia (Halifax County) taught real programming (Pascal, back in the day). I enjoyed it very much, and my teacher was awesome (she was a former engineer turned teacher).
I can't see why high schools wouldn't have at least an intro to Python or something similar as a requirement, with more advanced classes as optional.
If you can code Python, why would you go into teaching? That's basically the issue schools are facing; anyone who's really qualified can make more money elsewhere.
I was taught VB programming in high school by a teacher who had never coded in her life. We just followed the modules in the book, and the class would band together to solve individual students' problems.
Yeah, it wasn't optimal, but we managed to learn to program and several of us went on to be professional programmers. More than one did so right out of high school.
I desperately need to acquire expertise in a domain of programming that is not accessible via high school courses and YouTube videos, or I fear I may not have a job (or a decent-paying job) in 10 years.
Not really, to be honest. I mean, right now I do what should be "accessible via high school courses"--I do devops/infrastructure development and automation. I call it "e-plumbing" and I'm not even joking. But I have zero worry about competition for a few reasons:
1) There is enough subject matter expertise in the infrastructure/devops sub-field that being able to synthesize approaches out of all of it is really, really hard and is in many ways gated on experience. Understanding all of the products AWS throws at you and when to apply them versus using vendor-independent solutions versus using a different cloud provider entirely is a problem that high school and YouTube won't really prepare you for.
2) Being able to cohesively apply that subject matter expertise requires a framework that takes a lot of theoretical knowledge that takes a while to both learn and to understand the importance of. People shit on college all the time but I would be significantly worse at my job if not for the rigorous algorithmic courses forced upon me at college (it's one of those "do you know what you don't know?" things).
3) The baseline computer science background, programming chops, and breadth of experience that make me good at applying code to #1 and #2 means I have other options if I want to get out of infrastructure work. That's how I ended up here--I went from college to doing bog-standard Java work at TripAdvisor to leading a mobile team two years later because "you're smart, you'll figure out what you don't know" to devops/infrastructure work, where I'd never touched AWS or Ruby and yet picked both up on the fly. A few years later, I consult to pay the bills and work on other stuff that I find fun.
4) The demand of the industry is not going away any time soon; there will be a need for seniors to guide and manage even that theoretical legion of juniors. I'm much more worried about those juniors than I am today's juniors or seniors.
I am a JavaScript developer. I also have 0 worry about competition in the job market. It isn't that I am a rockstar programmer, that JavaScript developers are hard to find, or even that demand is low. Unfortunately, it is that most JavaScript developers suck. Absolutely horrid.
The primary culprit is a lack of education (formal or mentorship) on how web technologies work, and I have almost never encountered somebody who did well with these technologies simply because of their university education. If you are hoping your mastery of Java or C# is good enough to let you quickly jump into JavaScript or some other web technology, you will be sorely disappointed.
It's funny that you say that, because I've been getting into ES6/TS lately (mostly to implement my own stuff) and that's basically been my reaction: "this stuff is pretty straightforward, why is so much code so bad?". Then I realize that I've had to understand, if not implement, those web technologies to do the work I've already been doing (and, years back, I wrote a JavaScript runtime as an experiment, so the language's guts aren't foreign), and I'm just picking up the other end of the rope now.
The amount of implicitly carried junk we all have to know to be good at our jobs, instead of just tolerable at them, is pretty amazing.
You can practice this directly in your web browser (the developer tools console) without any additional code editor. I am teaching my wife this now. If you can master that one skill and slowly build up some JavaScript confidence, you can always find a job... so long as web browsers are a thing.
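For anyone curious what "practicing in the console" looks like, this is the kind of first exercise I mean - paste it into the developer tools console on any page (the specifics are just for illustration):

    // Poke at the live page: no editor, no build step.
    const headings = document.querySelectorAll("h1, h2, h3");
    console.log("this page has " + headings.length + " headings");

    // Change something and get immediate visual feedback.
    document.body.style.backgroundColor = "lightyellow";

    // Then build up: define a tiny function and call it right away.
    function shout(text) {
      return text.toUpperCase() + "!";
    }
    console.log(shout("hello from the console"));

The immediate feedback loop is the whole point: every line does something visible, which keeps a beginner experimenting instead of fighting tooling.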
I wouldn't worry about losing your job. Software engineers will always be in demand, and demand will only increase.
I pretty much do the same job that I did in 1994. Only then, I made 1/5th as much and had 5-times as many people on my team. But, back then, few people wanted to do this type of work.
When software engineering salaries go back to normal (i.e. when they don't encroach on the 'leadership class'), it will be fine. Probably more fun, too (for people like me at least).
I find this analogous to "housing prices will never go down, and demand will only increase over time."
Imagine some big tech companies going down due to some fundamental shift - suddenly there are floods of coders on the market. It's the most common job in many states, and growing. The housing market can crash - the tech sector (as we know it) can crash.
I agree - I think it will crash. But, like a real estate or stock market bubble, it will be a correction. Companies will fire people. Wages will go down. There will be pain. Lower wages will allow more people to be hired, with less stress and more work-life balance. Many people will leave the industry because they were just in it for the money. Companies will have larger teams of more passionate people. Those people will collaborate. The innovators will then have more time to innovate. New companies and tech will be created. And the cycle will continue.
Did you not worry about the same thing at the peak of the outsourcing boom? How is competing with the next generation of students any worse than competing with the whole world?
I wasn't around at the height of the outsourcing hysteria, but I probably would have worried about it just like everyone else. I've actually been laid off before while working for a big company that outsourced to China. So it's hard not to let that reinforce my worries. Granted, it was super easy to find a new job.
Competing with the next generation of students is worrisome because I know some companies are hesitant to outsource due to the overhead of time zones, language, etc. But they'd love to pay their developers less. Nothing drives down salaries like an employer's market.
I don't know, maybe machine learning, data analytics, or game programming... Just something that you can't easily do with a couple months of YouTube studying (iOS, Web, etc.)
The motive of cheaper employees is self-limiting as long as they're skipping over getting kids good at using and administering computers and jumping right to programming.
Yes, kids these days are good at installing apps onto their phones. But until they can figure out how to change their wifi SSID and password without a manual (and then move on to programming), there is never going to be an explosion in the number of qualified software engineers. Unless "coding" really means "shitting out cookie-cutter webapps."
It should be a requirement, just like math is. But don't get upset when we're not pumping out software developers any more than we pump out mathematicians.
Yes - I don't agree with the viewpoint that learning programming (per se) should be a requirement, even though software is so pervasive in modern life and will be more so in the future. Instead, IMO, the concepts, importance, and influence (on the world) of both computer hardware and software should be made known to students via an intro-level course, and then they should be given the option to learn more about those topics if they wish. They should also be told that, as time goes on, being computer-illiterate is more and more likely to be a handicap, even for day-to-day living. But still, programming is not a requirement for living in the world. Instead, computer literacy, which nowadays should involve more topics than it did earlier, should be made a required course, including things like the basics of how hardware, operating systems, and software work (not in too much depth, but enough), plus networking, the Internet, and the importance of things like computer security (password management, URLs, malware-prevention basics, backups, and so on).