While this isn't technically everything covered in CS, in my experience (a PhD in Comp. Eng., 8 years as an AF comm officer, and 3.5 years in the commercial sector as a software engineer), it is vastly more useful than most of what's covered in a CS curriculum.
The CS curriculum probably made more sense back in the day when everyone was essentially an embedded developer. But nowadays, the most useful knowledge I have is the low-level mechanics of how things like the OS and networking protocols work. S/W eng. classes are a bit useful, but mostly it's knowing how to write C++, Java, and now Python that has gotten me most of the way. As it is, I have almost never run into a situation where most of my CS classes have been relevant. And where they are relevant, the material could be covered by a week-long course in the basics.
I feel the CS curriculum would serve students much better if it covered more of the knowledge of how to get things done. And not in a faddish, framework-du-jour manner: there are constant elements running through all the fads that a good developer should learn cold, and they are not covered very well, at least in my 8 years in CS academia.
IMHO the real problem with CS is that it's driven by AI envy: much of what is considered important only makes sense under the assumption that the human mind is basically a computer and that CS is all about how to recreate one. However, almost none of that line of thought matters in the real world, and it is most likely false.
There is no 'the' CS curriculum. You must be thinking of one particular school's CS curriculum, like maybe your own? Another school's CS curriculum is going to be wildly different.
Most CS curricula I've seen have a lot in common, at least in the US and UK, with a significant focus on math, algorithms, data structures, and so on. A lot of it is interesting, but most of it is useless for normal work. I think it should all be kept, but curricula should also be designed around the fact that most students will not become CS researchers and will instead have normal programming jobs.
A curriculum, OTOH, that prepares a student to be a good developer should also place a heavy emphasis on:
- #1: write a lot of code with a focus on good coding practice, preferably in a combination of Python and Java or C++
- Understand Linux and be proficient with command-line tools (a rough sketch of the kind of thing I mean follows below)
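To make that concrete: here is a rough sketch (the tool and its behavior are just made up for illustration) of the kind of small, clean, Unix-friendly program I'd want students writing lots of. It reads from files or stdin, so it composes with other command-line tools in a pipeline.

    #!/usr/bin/env python3
    """linecount: count lines in the given files, or in stdin if none are given."""
    import argparse
    import sys

    def count_lines(stream):
        """Return the number of lines in an open text stream."""
        return sum(1 for _ in stream)

    def main():
        parser = argparse.ArgumentParser(description="Count lines, like a tiny wc -l.")
        parser.add_argument("files", nargs="*", help="input files (default: stdin)")
        args = parser.parse_args()

        if not args.files:
            # No files given: behave like a filter and read from stdin.
            print(count_lines(sys.stdin))
            return
        for name in args.files:
            with open(name, encoding="utf-8") as f:
                print(f"{name}: {count_lines(f)}")

    if __name__ == "__main__":
        main()

Nothing fancy, but it exercises the habits that matter day to day: argument parsing, stdin/stdout, and code someone else can read.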
I'm assuming you have some concrete curriculum in mind, but I'm curious how what you're thinking of compares to something like the learning objectives spelled out in the CS Curricula 2013[0]?
Personally, going through OP's site, I was nodding my head and comparing it to what I learned in my undergraduate CS degree. Some of it is dated, but most of it connects to the Architecture and Operating System classes I took.
Agreed, the typical CS education does not equip students to become useful programmers. I don't think AI has much to do with its failings, though. Universities like to think they are training up scientists.
It is much easier to teach an engineer to make good software than to teach a computer scientist to do engineering. I have seen the latter happen, but the usual results are... well, we see them every day.
That would be an engineering program, not a science program. 40 years ago there wasn't much difference, but the field has moved on since then! A number of schools have split the tracks, or offer only a computer engineering program.
Linear algebra, discrete math, calculus: the mathematical underpinnings of algorithms and data structures. Turing machines will be a lot more useful once you actually know how to use their properties to prove things. A kindergartner can recite the layman's description of an infinitely long tape of ones and zeroes, but such a simplistic understanding has very little practical use if you don't understand how it fits into the context of CS in general.

Machines improve, architectures evolve, frameworks change. But math doesn't. Every few days a new "linear algebra for machine learning" guide pops up on HN, and every now and then there is a "how to learn math" question. The lack of mathematical maturity among software devs and engineers is not a good thing and reflects poorly on the industry.

Too many universities these days focus mostly on practical leet coding rather than on the theoretical underpinnings of CS. An in-depth study of the finer aspects of a networking protocol becomes outdated the moment the next iteration comes out, but a close examination of Shannon and information theory will serve you well for life. There is a persistent myth that undergraduates cannot code FizzBuzz to save their lives and that all focus should therefore be placed on testing that specific skill. This dismissiveness toward math is pernicious to computing as a science, and relegating theory to "stuff I never needed in my n years in industry" creates a harmful echo chamber for software engineering.
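As a tiny illustration of the kind of theory that doesn't go stale, here's a quick sketch (the distribution is just a made-up example) of Shannon entropy for a discrete source:

    import math

    def shannon_entropy(probs):
        """H(X) = -sum(p * log2(p)) over the distribution, in bits per symbol."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A made-up four-symbol source: one common symbol, three rarer ones.
    print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol

That one formula sits behind compression limits, channel capacity, and the cross-entropy losses all over ML, which is the point: protocol details churn, the theory underneath them doesn't.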
I'm not saying these aren't important, but for most developers this stuff is rarely going to be used. Most "developer" jobs are not math-heavy. I'd rather push someone to learn how to write clean code and focus more on the engineering aspect.
Yes, this is what I'm referring to. Math and algorithms rarely, if ever, show up in my work beyond basics that could be covered in a week or two.
Not OP, and not necessarily endorsing these specific topics as the most fundamental to a CS education (I've more often drawn on discrete math/symbolic logic), but 3Blue1Brown has great series on introductory linear algebra and calculus. And Symbols, Signals, and Noise: An Introduction to Information Theory by John Pierce is also good.
My own view echoes this (https://meaningness.com/metablog/how-to-think), which is that it's often useful to know a little about a lot of different kinds of math. That way you set yourself up to notice when and where some specific discipline might apply, and then you can go back and learn the details if you need to.