The Missing Semester of Your CS Education (2020) (csail.mit.edu)
1024 points by saikatsg on Feb 25, 2023 | 336 comments


The content taught here is the highest-payoff thing you can learn, in my opinion. Certainly more important than actually writing code or learning algos.

What this content covers should unlock iteration speed, which is the single greatest lever in learning and growing faster (on a computer). Thus it gives you more cycles to go back to improving your code, experimenting with algos, etc. Probably also highly correlated with upwards mobility in the software job market.

Great seeing this under a common umbrella I can hand to students and new grads.


I have this feeling that the kids who appear better at uni are the ones who happened to pick up certain not-quite-programming skills before they started. Basics of networking, how installation of programs happens, how to use the command line, that kind of thing.

I looked over her shoulder as my wife was doing a CS degree, and I realised there's a bunch of these little things that make life a lot easier if you know them.


There's definitely a thing with CS that, at least at more elite schools, there's an assumption that you more or less know how to program--at least a language like Python--and you also more or less know your way around a computer well enough to use it as a tool for programming. (Or you pick it up quickly on the side along with your full course load.)

This is more or less unique among college majors outside of some arts disciplines like music. Yes, there's a requirement for some secondary school algebra and some basic science but an electrical engineering major could basically have never assembled a circuit before attending college and probably wouldn't be at any particular disadvantage.

And, per the original post, MIT is certainly one of the institutions that does this. The 6.00x MOOCs--basically intro to programming and algorithms--teach a bit of Python on the side but clearly you're intended to mostly learn it on your own.

(By contrast, back in the day, I took a FORTRAN course as part of a non-CS engineering major. The assumption was that you had never touched a computer before.)


I had a guitar tutor once and I asked him if I honestly had a chance to get into the Royal College of Music. He said absolutely, no problem, as long as you practice for at least two hours per day. Every day. For the next ten years... High-level CS is the same and I don't see any reason why it should be different. There is just no, or very little, time to teach introductory programming classes at most universities.


With the performing arts--perhaps music in particular--it's certainly the case that you mostly can't decide for the first time in college that you want to do the music thing, except maybe as a very recreational activity. However, I'm not sure I'm sold that CS--more or less uniquely among technical fields (including electrical engineering)--needs to have the same level of informal prerequisites.


Am I unique in not romanticizing computer science?

I see it more as a trade. There can certainly be some beauty to it. I’m sure coal miners hold some of their own in special regard too.


Software development is a trade, if all you aspire to be is a WordPress or frontend developer then you don't need a fancy degree, you can just as easily go to a 6 to 8 week code camp for that.

Computer science is a pathway for those that want to delve deeper, and that necessitates rigorous fundamentals in applied mathematics and information theory.


> Computer science is a pathway for those that want to delve deeper, and that necessitates rigorous fundamentals in applied mathematics and information theory.

I dare say that one can go through a whole career as a software engineer without spending a single semester learning about algorithms, in the sense that you just start learning about them in a JIT fashion if/when they become an issue, which is rarely the case.

Some of the O(X) trivia so much sought after in some interviewing circles is even fundamentally wrong when real world aspects such as problem size or CPU architecture come into play.
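To make that concrete, here is a rough Python sketch (an illustrative benchmark, not anything from the thread): for small inputs, an O(n) linear scan over a contiguous list is often competitive with an O(log n) binary search, thanks to cache locality and branch prediction. Exact numbers will vary by machine and runtime, so treat the output as indicative only.

    import bisect
    import random
    import timeit

    def linear_contains(xs, target):
        # O(n), but a tight loop over contiguous memory.
        for x in xs:
            if x == target:
                return True
        return False

    def binary_contains(xs, target):
        # O(log n), but with less predictable branching.
        i = bisect.bisect_left(xs, target)
        return i < len(xs) and xs[i] == target

    for n in (8, 64, 1024, 65536):
        xs = sorted(random.sample(range(10 * n), n))
        targets = [random.choice(xs) for _ in range(1000)]
        t_lin = timeit.timeit(lambda: [linear_contains(xs, t) for t in targets], number=20)
        t_bin = timeit.timeit(lambda: [binary_contains(xs, t) for t in targets], number=20)
        print(f"n={n:6d}  linear={t_lin:.4f}s  binary={t_bin:.4f}s")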


This is basically my career. I had a stable gig for my first handful+ of years after college and never much encountered the leetcode grind people talk about until recently. A lot of the O(X) stuff is self-evident when you're writing practical code: lotsa slow loops within loops == bad and should be avoided where possible.
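As a toy illustration of the "loops within loops" point (a made-up example, not anyone's production code): the first version below compares every pair in O(n^2), while tracking seen items in a set does the same job in a single O(n) pass.

    def has_duplicates_quadratic(items):
        # Nested loops: compares every pair, O(n^2).
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def has_duplicates_linear(items):
        # One pass with a set of items seen so far, O(n).
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

    print(has_duplicates_quadratic([1, 2, 3, 2]))  # True
    print(has_duplicates_linear([1, 2, 3, 4]))     # False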

When I started doing job interviews it was a little bewildering how much some companies focused on this. I ended up taking a job whose interviews focused on more practical skills and also stressed a good culture fit and understanding of development practices. It feels like a place I'll be happier.

I think you're right about the JIT aspect to this sort of thing--that's really what being an engineer is about in my experience. When you encounter a problem, figure out how to do it better. If my filter/sort/whatever is too slow for a specific application, I'll do some research and implement something more appropriate.


Computer science != algorithms. There's also the whole systems branch! If you ever find yourself thinking about TCP, virtual memory, system calls, frame buffers, shaders, signatures, interpreters, compilers, query planners, etc. these are also things you could have studied and practiced building in a CS program.


Coal miners also do stuff like massively die in gas explosions and are kind of a poster child for labour exploitation. Wouldn't it be crazy if this line of work were held in the same special regard?


It sounds like you are thinking of software engineering, which is an adjacent field often conflated. CS is about pushing current boundaries and inventing the future.


There is beauty in just about any trade that is well done. The extent to which you romanticize it is up to you. I'm not a big fan of the professionals who feel it is 100% trade. On the flip side, the 100% romantics are not ideal either IMHO--too focused on technology for technology's sake.


Well after they spend 4-8yrs of their life learning CS, damn right they’re going to romanticize and gatekeep it every chance they get.

Meanwhile I agree with you, software engineering jobs are a trade 90% of the time. It's why so many highly educated folks are being laid off right now, while the doers keep-on building and making bank.


> while the doers keep-on building and making bank.

It’s spelled Doerr


I didn't start programming until I was in college and did just fine in my undergrad CS at UW Madison.


My school handled this cleverly by taking people with prior programming experience into a Haskell sequence, which suitably kicked our asses to the same degree as what the first-time programmers were getting.


My CS course at the University of Sussex, UK did this (in 1988) with ML, the functional programming language. Plenty of us had already been coding in C or Pascal before starting at college. None of us had done any functional programming. I loved ML: it seemed all elegant and beautiful and magical.

When I hear people starting their CS degree with C++ or Java, it makes me cringe.


My college's CS program did the same, but with Scheme back in the day.

I can't complain, since I was introduced to Emacs along the way, which I still use heavily to this day. And while I don't really use Scheme anymore, it made Elisp trivial to figure out.


Which school was this? Was this UT Austin by any chance?


UWaterloo does this.


I don’t think that’s quite right. You’re not going to expect someone majoring in French to not have studied any French before college.

It seems completely reasonable to expect someone to be comfortable finding their way around a computer if they’ve chosen to major in computer science.

I’d also expect an entering electrical engineering major has studied more math than just “some secondary school algebra”. They’ve probably already studied some calculus or at the very least are ready to as soon as they start their first semester.

It’s a bit of a red flag to have no foundations in a subject you’ve decided to focus on for the next four years. That said, sufficiently motivated individuals can catch up and overcome their initial lack of preparation.


You have to put yourself back into your 18yo self, right? At that age you're the kind of person who thinks a degree in management will make you a manager, and who assumes that if they let you onto the course, they must know what the requirements are and you must satisfy them.

At that time in your life your entire world has been school, and everything you did in school led to another thing in school. Why wouldn't you assume you'd get taught programming when you applied for a CS course, whose prereq was math that you did in school?


everything they teach you in a CS degree is useful and broadens your knowledge of the space, and all of it might potentially come in handy as your career develops, though much of it might not.

Doesn't mean you'll be a great programmer or architect or planner or teacher. Right? You might drop it and go into something else.

Same is actually true of a management degree, everything they teach has been well thought out and is applicable to managing a company. Doesn't mean you'll be any good at it or work in all those departments. It is not a useless degree, though many useless people might pursue it.


MIT has an expectation that students will cover their programming skills via internships or outside jobs, and thus it focuses on teaching CS, which is quite hard to pick up on your own. Unlike frameworks or trends, which change every 2 years, fundamentals are quite hard to change and can carry you quite far if you know how to apply them correctly.


>(By contrast, back in the day, I took a FORTRAN course as part of a non-CS engineering major. The assumption was that you had never touched a computer before.)

This was my "back in the day" experience as well, though I had a choice between C and Pascal. There was no expectation that anyone had any programming experience. I only had a touch of C64 BASIC knowledge going in.

> I have this feeling that the kids who appear better at uni are the ones who happened to pick up certain not-quite-programming skills before they started. Basics of networking, how installation of programs happens, how to use the command line, that kind of thing.

This one really stood out to me and highlights a change in things since then. That describes ALL of the CS kids during my time. We were the ones who a) had computers and b) did weird things like rebuilding kernels or futzing around forever trying to get remote X sessions running.

A fact that blew my mind a few years ago was that, of the 13 software engineers on my team, only 3-4 actually owned a computer other than the work-provided one.


But is it common for kids to be taught how to code before they come to uni? I would imagine it's not that easy to find someone to teach that sort of thing, so then the only kids who graduate high school with any tech skills are the ones who did it themselves.

Your point about engineering is exactly right. I built a working radio in my first term, having never done anything like it before, save for acing some physics exams with minimal electricity sections. When we came to coding, I had mucked about a bit with a computer, but I also found other students who'd worked at Microsoft. Wide wide range, and a lot of people of course ended up getting that fellow to email them the solution.


Even if it's more like hacking around in Python than being "properly" taught, I assume that it's pretty common for a technically-inclined and motivated high school student to play around with a Raspberry Pi. I imagine I'd be doing so if I were in high school today. In addition to books, there are also some pretty breezy intro to programming MOOCs out there that wouldn't be out of the reach of a smart high-schooler.


In my class 5 years ago, the differentiating factor was whether or not the kids had grown up online--students like myself or my roommate, who'd spent their formative years in front of a computer, not because they had to, but because it seemed like the thing to do at the time. I never realized how much I was learning during the time I spent scripting RuneScape bots and writing toy "viruses"; I did it because it was fun.


I got my start writing RuneScape bots too! It's such a shame because, speaking to the younger generation (I call them the TikTok generation), they don't have the same experience with the web that we did. For them, the internet is closed off: a few big websites like Facebook, YouTube, Twitter. They use their phones primarily to access the web, and a laptop is exclusively used for school work; "A PC? Why would I need that when I can game on my Xbox/PS5??". The internet really did change so much in the last decade, and in my honest experience computer literacy has gone down, not up as we all expected.


This doesn't fit my experience.

I mean, yes, this was me. I came in knowing most/all of this. I knew six (or more?) programming languages, had at least played with CVS/SVN, had installed Linux (read: fought with the Linux bootloader to get my AMD CPU to boot without crashing), had dipped my toe in a few open source communities.

But I was a tutor in college and I interacted with a bunch of people who didn't come in with any of this experience. Many of those people struggled, but I also know a bunch of people who came into CS knowing nothing, loved it, and went from zero-to-sixty faster than even the people who came in knowing a lot.

I'm still not sure I can identify what the ingredient was, but experience alone is not enough to explain it.


I went to college in the late 90s. Didn't even own a computer until I was a freshman (used my loan money to buy one). Math and science had always been my subjects, so I started college pre-med. But, this new computer I just bought kept sucking me in. I'd stay up all night reading how to make it faster, learn all the tricks, and finally make it do what I wanted through programming. Next semester I took a programming class just to see how it went, then dropped everything, and switched majors. I got a part time job writing custom software for local companies as a sophomore/junior (late 90s dotcom was just ramping), and I guess it has all worked out ok.


Yeah, I remember taking this machine architecture course with a friend who was getting a journalism degree and had never written any code or done anything more technical than playing StarCraft brood war before college.

In one of the later assignments they gave us this "bomb" executable and we had to use gdb to pick it apart and modify the instructions or find ROP gadgets or something to make the code not "explode". He was my partner for the assignment and I spent most of the time trying to teach him what I was doing in gdb. And trying to express that, sincerely, he wasn't stupid; gdb is just really hard and he didn't have the background knowledge to make learning it easier.


And in non-English countries, well, knowing English long before uni helps a fuck ton.


>Basics of networking, how installation of programs happens, how to use the command line, that kind of thing.

um yea thats my job dude and i fought for it so other people should too these people waltzing in not knowing shot should not get into CS


There are a lot of students interested in CS who have very limited access to a computer. Most kids these days get a locked down ipad or Chromebook at school, and Windows' footprint in schools has massively shrunk over the years.

It's not like how it was when I was growing up. We spent $2000 on a Gateway PC that I had full admin access to, and unfiltered internet. Now that kind of money is being spent on cellphones in the household. Because I had that kind of access I was bricking the OS (and reinstalling it), mailbombing people's AOL inboxes, and putting Sub7 on Kazaa and Limewire and remotely messing with people's computers.

The computers at school were only networked my senior year. Even then they were basically wide open.

Me and my friends effectively operated a screwdriver shop out of whoever's house we happened to be at. We were trying to host game servers on whatever we could cobble together.

None of our families had cellphones because they were still only used by businessmen. Computing priorities were way different and how you were exposed to them was different as a result.

None of those opportunities are anywhere near as common now. Kids get devices in schools but they're basically bricks that log everything you do. Computer labs are rare and their access is fully locked down. Most students' first personal computing device is a cellphone, and more likely than not it's an iPhone. All phones are super locked down, but Apple's even more so.

The market has changed. The days of open computing are now relegated to the back of the line or for those who can afford it.

So it's no wonder a class like this exists. We see similar issues in middle school where students don't understand what folders are or where their files are stored. Phones and tablets make it so easy you never have to think about it. This extends to college as well.


>that kind of money is being spent on cellphones in the household

Okay, then don't.


One of those cases where some people are so unreasonable you can't tell what's sarcasm.


I sort of get that, in this day and age, it would probably seem a bit unusual for someone to show up at Stanford or MIT and be like "I've only ever used a computer to play video games but I think I'll major in CS." On the other hand, I also don't think programming had to be an all-consuming passion for the past 10 years as some seem to deeply believe.


I think there actually may be some of this. Both of my kids are in college right now, but I knew a lot of high school kids and their parents through the extracurriculars that my kids were involved in. The kids knew what the "hot" college majors were, including CS.

But they also knew the precise formula for college entrance, and were laser focused on getting into an "elite" school. Any activity that didn't contribute to that process was eschewed.


I'm not sarcastic.


I sat with a talented developer whilst we wrestled with a threading issue this past week. I wanted to inspect the value of a variable within a method during execution and asked him to set a breakpoint. He didn't know how to do that in the IDE, which he'd been using for over a year. Debugging is indeed a skill which needs learning.


I agree you should know how to use a debugger; however, I will also note that some companies mandate (or heavily encourage) use of a specific IDE.

I have seen devs scoff at the thought of print debugging, but I recall that in systems programming there are many times you can't use a debugger or need to rely on other tools.

I’d rather schools teach the concept of step debugging vs runtime debugging. Teach students to try to understand the code, make hypotheses, and verify them.

I have seen some people use a debugger solely because they only know how to step debug. Meaning they start from main or another entry point and step through every line of code.

My point being that, you can’t judge a developer by if they use print statements or a debugger. Judge them by the methodology of how they debug.


> I have seen devs scoff at the thought of print debugging

Next time they do, ask them to recommend a better method of debugging that works across essentially all languages, compilers, IDEs, and platforms with next to zero configuration.
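For what it's worth, even "plain" print debugging benefits from a tiny bit of structure. A minimal Python sketch (the dbg helper is made up for illustration): write to stderr so debug output doesn't pollute the program's real output, and flush immediately so nothing is lost if the process crashes.

    import sys
    import time

    def dbg(label, value):
        # Timestamped debug line on stderr; flush so it survives a crash.
        print(f"[{time.strftime('%H:%M:%S')}] {label}={value!r}", file=sys.stderr, flush=True)
        return value  # returning the value lets you wrap expressions in place

    total = dbg("total", sum(range(10)))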


The key thing about print debugging for me, that I've never seen a typical debugger handle well, is what I like to think of as "temporal debugging":

Taking a program trace from the print debug statements, grepping through it repeatedly to filter down to certain events of interest, looking at the interleaving of those events, and figuring out the order that things happened in to cause it to go off the rails. That sort of thing. (To be fair, time-travel debuggers can start to get at this, but those are pretty uncommon. Traces, as you say, work almost everywhere.)

Or better yet, compare the traces between working and non-working runs to see how they differ. I've looked at diffs of traces this way before. (Sometimes I'll first use a small script to renumber pointers in traces by order of appearance.)

I've also used this sort of strategy before for debugging rare threading or other non-deterministic issues. Have the shell run the program in a loop, saving each run's trace and results to a different file, go off and get lunch, come back and see if anything failed. Then look to see if any of the runs failed and look for the structural differences between the working and non-working traces.

I can't imagine sitting and stepping through in a debugger 100+ times in the hopes that maybe this time, it will be the run that's just different enough to trigger the bug and that the debugger itself won't prevent the issue from manifesting. Not to mention trying to remember the steps from all the good runs and spotting where the bad run goes bad before you've stepped too far. No thank you.

I think people really underestimate print debugging. Debuggers are fast and easy for simple bugs, sure, but there's powerful stuff that you can only really do with printed traces.
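A minimal sketch of the run-in-a-loop idea described above, assuming a hypothetical flaky program called flaky_worker.py: each run's trace goes to its own file so that passing and failing runs can be grepped and diffed afterwards.

    import subprocess
    import sys

    RUNS = 100
    failures, successes = [], []

    for i in range(RUNS):
        # Capture stdout and stderr so each run's debug trace lands in one file.
        result = subprocess.run(
            [sys.executable, "flaky_worker.py"],  # hypothetical program under test
            capture_output=True, text=True,
        )
        trace_path = f"trace_{i:03d}.log"
        with open(trace_path, "w") as f:
            f.write(result.stdout)
            f.write(result.stderr)
        (successes if result.returncode == 0 else failures).append(trace_path)

    print(f"{len(failures)} failing runs out of {RUNS}")
    # Then grep/diff a failing trace against a passing one to spot where they diverge.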


> I can't imagine sitting and stepping through in a debugger 100+ times in the hopes that maybe this time, it will be the run that's just different enough to trigger the bug

Isn't that where things like breakpoint conditions, data breakpoints ("break whenever X value/field changes"), and dependent breakpoints work pretty dang well? You just set up the appropriate situation and let it run until it breaks.
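The same idea can also be expressed directly in code when a debugger's conditional breakpoints aren't handy. A small hedged Python sketch (the order-processing function is invented for illustration): breakpoint() fires only when the suspicious condition holds, so you land in pdb at the interesting iteration instead of stepping through every pass.

    def process(orders):
        for i, order in enumerate(orders):
            total = sum(item["price"] for item in order["items"])
            if total < 0:
                # Drops into pdb only for a bad order; harmless otherwise.
                breakpoint()
            print(f"order {i}: total={total}")

    process([{"items": [{"price": 10}, {"price": 5}]}])  # no negative total, so it just runs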


Those help in some situations, sure. I'm talking about the non-deterministic things like unusual race conditions, where the problem only manifests rarely. Conditional breakpoints could certainly help you detect when things have broken, but you'll often be well downstream of the root cause by then. Though I suppose tracepoints could help with that. So I'll concede that you could probably do something similar in a powerful enough debugger with sufficient effort.

But one thing that I have found helpful in the past has been aggregating the information from the traces across runs. Comparing good with good and bad with bad to classify the commonalities, and then compare good with bad to see how they differ.


Yeah. Also for any kind of race condition a debugger is usually pretty useless. Also most memory corruption problems. The debugger has a place but so does printf debugging.


Print debugging is one of the best ways to learn about buffering and the difference between stdout and stderr.
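A quick Python illustration of that point (exact behavior varies a bit by platform and Python version): stdout is typically line-buffered at a terminal but block-buffered when piped to a file, while stderr is unbuffered or line-buffered, which is why crash-adjacent output usually goes there.

    import sys

    print("on stdout: may sit in a buffer when piped to a file")
    print("on stdout, forced out immediately", flush=True)
    print("on stderr: shows up right away", file=sys.stderr)
    # Try piping stdout to a file: the stderr line still appears on the terminal immediately.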


> I agree you should know how to use a debugger, however will also note that some companies mandate use of a specific IDE (or heavily encourage).

The debuggers in most popular IDEs (IntelliJ IDEA/any JetBrains IDE, Visual Studio, VSCode, Eclipse) work the same way. You set breakpoints (typically by clicking somewhere around the line number), step into and out of functions, and look at the memory state. If you learned how debugging works in IDE X, you can easily switch to IDE Y without having to learn much.


The biggest hurdle for beginners when using built-in IDE debuggers in my experience is the requirement to set up the debug run configuration to properly execute their program. Even though this amounts essentially to "how do you run the program on the command line? Write that in the IDE configurator", newbies don't know how to run the program from the command line (and can't debug any startup errors they might get), so can't properly set up the debugger in their IDE, and revert back to print debugging everything.


> I have seen devs scoff at the thought of print debugging, but I recall that in systems programming there are many times you can’t use a debugger or need to rely on other tool.

Not just systems programming, but fixing issues in scaled production systems as well. Have fun attaching a debugger to the process that got killed twenty minutes ago when the spot instance it was running on got reclaimed. If you don't collect telemetry, you're blind. Debuggers are a luxury that you get to enjoy for issues you have before you ship.


A talented developer who doesn't know how to set a breakpoint sounds contradictory to me.


There is a world of programming where debuggers don't serve much purpose. Individual microservices are usually trivial, but push complexity into the interactions between services. Debuggers are not much use there; distributed tracing is more relevant. Functional programming, which is a growing part of the industry, really emphasizes code you can easily reason about. That's arguably the whole point of functional programming. Debuggers don't get much use there either.


Agreed, when I commented earlier I was also thinking about mentioning something similar with regards to classes of bugs due to interactions between systems.

Unless you have a setup where you can easily run one system with a debugger attached while connecting it to everything else, you’re basically restricted to running a debugger for bugs that can be reproduced locally.


I'm not a talented developer, but I did spend my day on an F# piece of code that builds a DataTable based on a record type using reflection (and eventually realized it was going to need to be recursive), and I probably would've just quit if I wasn't allowed to use a debugger.

Now maybe if I was better that wouldn't be the case, but even the cleanest "wish I thought of that" functional code I've seen still looks like it'd be easier to fail fast on with a debugger.

In fairness though, I will admit I use the debugger a lot less when I'm not screwing with reflection on generic types or whatever, because runtime errors just happen a lot less in functional styles. Usually if it compiles, it runs, because the compiler can sanity-check the code better than you can.


> I'm not a talented developer

> [uses F#]

That has to be humblebragging. The average .net developer is terrified of or doesn't even know about F#.


It's really not. I'm quite terrible from any industry perspective.

F# isn't hard, it's just different. Hell, in many ways I'd argue it's much, much easier once you get used to it. It doesn't have the support C# does, so you're often stuck with a library that WILL work but doesn't have documentation for doing things in F#, and that can lead to struggles, but that's not really a sign of whether you're a good coder.

Most dotnet developers could probably code circles around me in F# if they knew it existed/gave it a chance.

Personally I stuck with it because it had the low code look of python with strong typing. It took a bit to wrap my head around some functional stuff (basically map/iter = foreach and if you want to update something on each loop you probably want a fold, or more likely a built in function), but once I got over that hurdle it was pretty smooth sailing.

The irony is that by far the hardest part is the library thing, which your average dotnet dev would handle WAAAAY better than me.


I've encountered the same thing. I'm a terrible programmer but find F# way clearer than most C#/Java. But I work with tons of great developers who would rather cut off a finger than learn F#; it bothers me because it exposes some fundamental difference between us that I don't like to believe exists.


Job vs craft. Or, maybe, they just don’t want to pollute the codebase with a new thing that breaks. Consistency is a virtue in old codebases.


> I'm a terrible programmer

Oh? Did you win some award? I'm curious as to how you're determining this?


Not a humblebrag. I’ve noticed I struggle with tracking lots of variables and states compared to many fellow developers. I’ve sort of tried to turn this weakness into a strength, by making code simpler, but sometimes I make it too terse because I like code golfing.


It's an interesting phenomenon because F# is far easier to learn than C#.


I don't agree, because learning F# is essentially learning two very different languages at the same time: first OCaml, of which it was initially an implementation, then C# for all the object/CLR layer.

It's easy for people with functional programming knowledge, or who had OCaml as their first programming experience like I did at university, but for people without those exposures I can understand the difficulty.


Perhaps if you already know C#, otherwise I doubt it. Of course it depends on one's prior experience, but while F# is functional, it still requires you to understand OO in order to interact with the framework.


This was me - my first .net language was F# (although I'd dabbled a tiny bit in C#); it was hard to learn the .net standard library as well as trying to learn functional idioms...


I think it's harder if you already know C# or some other OO language. I really think it would be easier, or about the same, to teach a raw beginner the basics of F# vs C#.

The only reason F# winds up feeling harder isn't so much how F# works, but simply because the entire dotnet environment was built for C# first, so you do need to know how to handle the C# style. I do think that if you could just pick one and then suddenly have every library support its style, F# would probably be easier overall because it's just got a lot of nice features built in that make updating your code so much easier.


F# has two ways to invoke functions, ‘f a b’ and ‘f(a,b)’. You need to know when to use what. There is no way this is simpler to beginners compared to having one consistent way.

C# is not a simple language, but F# has basically all the complexity of C# with OCaml on top.

Either C# or OCaml would be simpler to learn, although the combination is powerful.


I don't know F# but the reference only mentions the `f a b` syntax.[1]

`f(a, b)` looks like a call to a function that takes a single tuple as its argument. So I would expect `f` to have the type signature `A * B -> C` instead of `A -> B -> C`. Is my intuition wrong? If it is, then what does F# use the parenthetical syntax for?

[1] https://learn.microsoft.com/en-us/dotnet/fsharp/language-ref...


Methods from the .net framework and other libraries are called with a tuple of arguments, since they are not compatible with the native F# way of calling functions.


I mean to be completely bad faith pedantic, yeah that's still f a b, but obviously not the point being discussed here.


This to me is a pretty simple thing to explain and deal with.

F# enforces a lot of good-practice code style stuff (order matters, for example, which pisses off long-time devs who already have styles, but prevents SOOOO much stupid bs from beginners) and basically eliminates runtime errors and chasing variable state, so long as you stay within that style. Yes, it'd be nicer if there was only one way to invoke functions, but if I had to take a tradeoff I think it's a pretty easy one.

It is an issue that, yes, like your example, you're often stuck ALSO learning OO because "oh you want to use X library, well that's OO so...", and even then you can isolate your mutable/OO areas really well, but this is more of an issue with it being a second-fiddle language. If F# got F#-specific libraries for all the C# stuff out there tomorrow, I think it'd take off and most people would never look back.

If we're talking basic business logic/beginner programmer stuff, yeah I think F# offers a lot of stuff that makes it flat out easier to use. And if you want to point out complex issues, I feel the biggest one is that something that's intuitively much easier to understand in OO (create a variable/object, populate it on each iteration of a loop depending on logic), can feel daunting as hell in F# (fold).


Knowing a good bit of Rust helped me considerably, due to the commonalities of being expression-based, pattern matching and sum-types. F# almost feels like a more functional and GC'd Rust.


F# is heavily based on OCaml and Rust was heavily inspired by the ML family of languages, and I've often heard it described as an ML without GC and with memory management and a C++ style syntax.

You're noticing a very real relationship.


Is the average .net developer terrified of F# because they're not talented enough, or because it's different?


It's 100% the latter. I get "ok we don't want to mix codebases" and that's fine, but if you can code in C# you can probably get up and running in F# in a week, maybe a month if you struggle with some of the concepts.

One major issue I do see coming from the C# side is "well how do I do this then?!", to which the answer is often "you don't, because you don't need to". Or "well what if it's more performant to do it mutably!"--well then, thankfully, F# can absolutely do that.

If you keep an open mind it's really a very clean and simple language, but in an age where half of development is importing 8 well known libraries, not being the main supported language is a major weakness.


I normally write C++, but I've also written in Rust, Python, Ruby and a bunch of other languages. I've never had trouble with a programming language until Haskell, and had to accept that I'm probably just not smart enough for it.

However, I wrote my first thing, a port of a small C# tool of a couple hundred lines, to F# in about an hour. It's been about a week now, and things are considerably smoothing out.

For a language considered niche, the F# tooling has been great. Also, having the .NET libraries available adds a lot of built-in capability.


In my experience most developers remember how hard it was to learn their first language, and worry that the second will be just as hard


The distributed tracing point makes sense, but I think debuggers are still quite useful for functional code. Though maybe less commonly needed than just the repl.


Hi! I'm that person! Senior engineer, decade of experience. I've used debuggers in the past, both for running code and looking at core dumps, but I really don't find them to be cost effective for the vast majority of problems. Just write a print statement! So when I switched from C to python and go a couple jobs ago, I never bothered learning how to use the debuggers for those languages. I don't miss them.


I am also this person. I'm a systems programmer (kernel and systems software, often in C, C++, golang, bit of rust, etc)

What I find is that if my code isn't working, I stop what I'm doing. I look at it. I think really hard, I add some print statements and asserts to verify some assumptions, and I iterate a small handful of times to find my faulty assumption and fix it. Many, many times during the 'think hard' and look at the code part, I can fix the bug without any iterations.

This almost always works if I really understand what I'm doing and I'm being thoughtful.

Sometimes though, I don't know what the hell is going on and I'm in deep waters. In those cases I might use a debugger, but I often feel like I've failed. I almost never use them. When I helped undergrads with debuggers it often felt like their time would be more productively spent reasoning about their code instead of watching it.


Your list of programming languages excluded Java. Please ignore this reply if Java is included.

Are you aware of the amazing Java debugger feature of "Drop to Frame"? Combined with "hot injection" (compile new code, then inject it into the currently debugged JVM), it is crazy and amazing. (I love C#, but its hot-injection feature is much worse than Java's -- more than 50% of the time the C# compiler rejects my hot injection, but about 80% of the time the JVM accepts it.) When working on source code where it is very difficult to acquire data for the algorithm, having the ability to inspect in a debugger, make minor changes to the algorithm, re-compile, inject new class/method defs, drop to frame, then re-exec the same code in the same debug session is incredibly powerful.


Yes, that sounds pretty cool and it doesn't take a lot of imagination to see the utility in this. I've done a lot work on lower level software, often enough on platforms where the debuggers are tough to get working well anyway.

The plus side of less capable tooling is it tends to limit how complex software can be--the pain is just too noticeable. I haven't liked java in the past because it seems very difficult without the tooling and I never had to do enough java to learn that stuff. Java's tooling does seem quite excellent once it is mastered.


This part: "I haven't liked java in the past because it seems very difficult without the tooling"

If you are a low level programmer, I understand your sentiment. A piece of advice, when you need to use Java (or other JVM languages), just submit to all the bloat -- use an IDE, like IntelliJ, that needs 4GB+ of RAM. The increase in programmer productivity is a wild ride coming from embedded and kernel programming. (The same can be said for C#.)


I think there's a sort of horseshoe effect where both beginners and some experienced programmers tend to use print statements a lot, only differently.

When you're extremely "fluent" in code and good at mentally modelling code state--understanding exactly what the code does just by looking at it--stepping through it doesn't typically add all that much.

While I do use a debugger sometimes, I'll more often form a hypothesis by just looking at the code, and test it with a print statement. Using a debugger is much too slow.


> Using a debugger is much too slow.

This varies, but in a lot of environments, using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement). Especially when you're trying to look at a complex object/structure/etc where you can't easily print everything.


I think there is a bit of a paradox: debugging can seem heavy, but when you've added enough print statements that you've spent more time and added more things to clean up than if you had just taken the time to debug, well, you should have debugged. But you don't know until you know. The same thing seems to happen with "it would've been faster not to try to automate/code a fuller solution" than to just address whatever you were doing directly.


> using a debugger is much _faster_ than adding a print statement and recompiling (then removing the print statement)

Don't remove the print statement. Leave it in, printing conditionally on the log level you set.
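In Python that might look like the sketch below (names are made up for illustration): the debug print becomes a debug-level log line that is silent in normal runs and costs almost nothing.

    import logging

    logging.basicConfig(level=logging.INFO)  # flip to logging.DEBUG when investigating
    log = logging.getLogger(__name__)

    def transfer(amount, balance):
        # Formerly a print(); kept as a debug-level log line instead of being deleted.
        log.debug("transfer called: amount=%s balance=%s", amount, balance)
        return balance - amount

    transfer(25, 100)  # silent at INFO level, visible at DEBUG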


In what way is using a debugger "slow"? I find that it speeds up iteration time because if my print statements aren't illustrative I have to add new ones and restart, whereas if I'm already sitting in the debugger when my hypothesis is wrong, I can just keep looking elsewhere.


I find I use the stepping debugger less and less as I get more experienced.

Early on it was a godsend. Start program, hit breakpoint, look at values, step a few lines, see values, make conclusions.

Now I rely on print statements. Most of all though, I just don't write code that requires stepping. If it panics it tells me where and looking at it will remind me I forgot some obvious thing. If it gives the wrong answer I place some print statements or asserts to verify assumptions.

Over time I've also created less and less state in my programs. I don't have a zillion variables anymore, intricately dependent on each other. Less spaghetti, more just a bunch of straight tubes or an assembly line.

I think it's possible that over the years I hit problems that couldn't easily be stepped. They got so complicated that even stepping the code didn't help much, it would take ages to really understand. So later programs got simpler, somehow.


I find I use the stepping debugger more and more as I get more experienced. Watching the live control flow and state changes allows me to notice latent defects and fix them before they ever cause actual problems. Developers ought to step through every line of code that they write.


It's because we moved to request-response, and allow for deep linking / saved states. So the state of the session is more easily reproduced.


> I never bothered learning how to use the debuggers for those languages. I don't miss them.

This could be causal.


You'd have to assume that the python and go debuggers do something that C debuggers don't do.


Or assume that python debuggers aren't as nice to use, or that python does not lend itself to inspecting weird memory and pointer dereferences, or a bunch of other possibilities.


Most of what I worked with code-wise growing up was either very niche or set up in such a way that debuggers weren't an option, so I never really used them much either. I don't understand their appeal when print statements can give you more context to debug with anyway. I'm definitely no senior, but I'm used to solving things the "hard way", as one developer told me. He wondered how I could even work with how "bad" my tools were, but I didn't know any better being self-taught, and with certain software it's just not compatible with the tools he mentioned.


It depends what you're doing. Sometimes inserting a print and capturing state works. Sometimes you're not sure what you need to capture, or it's going to take a few iterations. That's where pdb / breakpoint() / more interactive debuggers can be very helpful.
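A tiny hedged example of that workflow (the parse function is invented): breakpoint() drops you into pdb with the live state in scope, and setting PYTHONBREAKPOINT=0 in the environment turns any leftover calls into no-ops.

    def parse(line):
        fields = line.split(",")
        if len(fields) != 3:
            breakpoint()  # inspect `line` and `fields` interactively when this trips
        return fields

    parse("a,b,c")  # well-formed input, so the breakpoint never fires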


What makes you talented?


You don’t need breakpoints all the time though. If you’re familiar with the code (or just “talented”), you might have an intuition for what the problem is and it’s faster to just think through it (and maybe write a few quick prints) instead of interrupting your train of thought setting breakpoints, clicking continue, waiting for the IDE to freaking load the debugging session (cough Visual Studio), rerunning the test, etc.

Besides, every IDE has a different way to debug, so they might just not be familiar with the interface. I can’t tell you exactly how to debug in VSCode even though I’ve used it the most. I’ve had to run a debugger only a handful of times in the past couple of years and it’s always for codebases that are more tangled (e.g. .NET where there’s interfaces everywhere).


+1. i just used a debugger today at my work for the first time in 4 years by coincidence. normally i just throw a couple prints and rerun the test and today i was reminded why. takes like 8 minutes to run the test in debug mode. lots of useful info in there but usually i can guess where the error is without it. it was indeed good at pinpointing the sigsegv though.


As an untalented developer, I used to make heavy use of debuggers, and knew them well. Currently, as a still untalented developer, I've fallen out of the habit of using them and don't know how to for my current toolchain.

Neither situation was at all related to my talent (or lack thereof).


True. A talented developer abuses the hell out of breakpoints by using hit counts, conditions and trigger points.


I read a long time ago, that either Kernighan or Ritchie said they never use a debugger and just sprinkle printf() statements.


Concluding that using printf to debug is superior to using a debugger would be a mistake!

I’ve been programming in C for nearly 20 years and primarily used printf for debugging for the first 12-15 years, and have used debuggers more and more. I use Emacs, and its gud mode is so nicely integrated into everything that using gdb is truly much, much faster than the alternative. I don’t use print debugging at all anymore.

It takes less time and cognitive overhead to just stop the program on the same line you would have inserted your printf, but you can now inspect the entire program state.

Obviously I’m not saying anything revelatory if you use an IDE on a regular basis… I guess this comment is for the folks who eschew IDEs.


This works well mostly, and I use it. I usually implement a logging system where I can enable/disable individual components, but use a debugger to examine functions which are not behaving as intended, or core dumps. Usually in connection with a unit test that fails or started failing because of my changes to the codebase I'm working on.

Logging alone is fine, but it can often be difficult not to drown in information. I do embedded programming in C mostly.
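The per-component enable/disable setup can be sketched with Python's stdlib logging (just an illustration with made-up component names; an embedded C codebase would use its own logging macros): one logger per component, each with its own level.

    import logging

    logging.basicConfig(level=logging.WARNING)
    logging.getLogger("radio").setLevel(logging.DEBUG)    # chatty while debugging this component
    logging.getLogger("storage").setLevel(logging.ERROR)  # keep this one quiet

    logging.getLogger("radio").debug("tx queue depth=%d", 3)     # printed
    logging.getLogger("storage").debug("flushing block %d", 17)  # suppressed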


There are large categories of software engineering where debuggers are not used that much.

I work on compilers and debuggers are more of a hindrance than a help when trying to fix a compiler bug.

I believe there is a similar situation for distributed systems.


I don't usually use breakpoints in part because I use neovim and in part because... I seldom truly need them. Who are these people and what are these problems where stepping through method calls etc is actually necessary? I find it hard to believe. I've been successfully programming and problem solving this way for over 15 years.


Meh. I use the right tool for the job, and most of the time, a simple print statement put in the right place beats any debugger. Certainly the debugger has helped me in the past, but maybe one or two times only. Besides, putting a print statement costs nothing, and one knows exactly how to do so. Debuggers vary wildly: terminal, IDEs, etc.


I haven't used debuggers since I switched from C to Rust. By the way, I switched from Emacs to VSCode and I do not know how to debug there. I never used a debugger with Lisp. Debugging is a language-dependent technique.


I like debugging Lisp because the debugger is built in. A break here, an invoke-debugger there and the REPL takes me where I want to be.


I haven’t been shocked that fellow engineers don’t know how to use a debugger for at least ten years. Most jobs in the industry can be done adequately without getting into tools that low level.


Maybe it's rare, but you may be very skilled at understanding structure and finding solutions while knowing nothing about tools. As long as you're good at one or the other.


It's interesting. I mainly use functional-first languages, and I rarely need breakpoints. Programming is much more compositional with certain languages.


How would a breakpoint help with debugging a concurrency issue?


Well, a big part is curiosity, and apparently he isn't curious about or interested in what his tools do.

I used to read every help file on Windows / Visual C++ and the FreeBSD manual, and plowed through the file system testing things out. Just because I was curious.


There's limited time. One cannot be very curious about all the stuff related to programming, so we must be selective.


Sure.. that’s the story of the lumberjack with a blunt axe.


Nah, debuggers are useful but hardly necessary to do your job well. People have been productive at programming long before they had many useful tools at all, the most important thing will always be how well you understand the program you are writing and not how you use your tools.


I learned this stuff without having people show me.


I went to Purdue over a decade ago and we had a 1 credit hour lab that taught this stuff. Unix command line, git, bash, and finally some python.

Like you said, it's been a complete game changer. I feel these skills continue to differentiate me from my peers in terms of how I can attack arbitrary problems bravely to this day.


They still teach this stuff at Purdue. I graduated a few years ago and by far the most important class was about how unix works, moving around the command line and finally introducing us to vim


<3 Me and a few others created this course in 2014. Glad you found it useful! I got so much joy out of creating the content for the lectures and labs.


Indeed. While I think learning algorithms and data structures is a non-negotiable thing, in 90% of the companies out there, in 90% of the situations, one will never have to write a binary search from scratch or implement a queue from scratch. On the other hand, profiling, debugging, glueing things together via bash, etc., that's what distinguishes you from the colleagues who only write passable code.


I dropped out after a year of college and have since weaseled my way into a dev position, your observation could not be more true in my experience. I've since strongly considered going back to school for a degree, but as interesting and probably useful some of the material may be, I'm not convinced it's worth it as an investment into making me better at my job.

If only I could use 529 funds on some of these online course providers.


But implementing some of those basic data structures is one of the funnest things to do in uni.


Not sure if that fits, but it reminds me of Gary Bernhardt's "Unix Chainsaw" talk

https://www.youtube.com/watch?v=ZQnyApKysg4

It's all about leveraging "simple" CLI tools and combining their powers to cut through work very swiftly. The opposite of many days for many people.


I have tried in vain to get this implemented at our uni. I can say a few things I find interesting:

- Students used to get this stuff but no longer do; for example, all workstations used to be Unix, so when you left, you "knew" "unix" (shell, vim, etc.)
- Due to things like ABET, classes are crammed with need-to-know-for-accreditation info, so, well, some items need to go by the wayside (many are in the MIT list)
- There is a huge push, even by ABET, for security and crypto to be somehow integrated into nearly every class.
- Professors seem aware that we need this "missing class", but it is hard for administrators to implement, because universities were pressured into lowering the credits needed to graduate, so some courses were removed and there is no room left for another course.

I am not pushing one way or another, and I only have the perspective of having worked at two universities, but I think unis need to take a real hard look at their courses from a holistic point of view. I recall stumbling across that MIT course at least 5 years ago. I do not know many others who implemented something like that.


> I do not know many others who implemented something like that.

The school I went to in France, ENSEIRB-MATMECA, started with three weeks where you only learn shell commands, emacs, LaTeX, etc. before doing anything else. Here are the slides (in French, sorry, although they all have useful reference cards at the end):

- intro: http://mfaverge.vvv.enseirb-matmeca.fr/wordpress/wp-content/...

- unix, shell: https://cours-mf.gitlabpages.inria.fr/if104/docs/01-unix.pdf

- emacs: https://cours-mf.gitlabpages.inria.fr/if104/docs/02-emacs.pd...

- latex: https://mfaverge.vvv.enseirb-matmeca.fr/wordpress/wp-content...

- "advanced" shell scripting: https://cours-mf.gitlabpages.inria.fr/if104/docs/05-scripts....


Thank you for the links... I actually read enough French to get by, so those are great to look at for more ideas.


Taking a quick look at the Emacs PDF, I learned that the french word for 'buffer' is tampon!


LaTeX? Is this a dual math-cs program?


I find this comment a bit odd. I teach CS and our students are required to use LaTeX for everything they hand in (internship report, project work, BA thesis). We also publish everything using LaTeX if possible. There's some conferences which use Word templates (shudder) but mostly it's LaTeX.

Citations and bibliography management are easy, it's super easy to switch from IEEE to Harvard or whatever and as soon as you have diagrams, graphs, tables or formulas it's a no brainer.

Pretty curious why you associate LaTeX with "math only". AFAIK it's absolutely standard to use it in CS. It's also quite easy these days. We provide templates for the students and they usually use Overleaf so there's very little hassle with setting up the entire environment.


Ah, that makes sense. The reason is that I was a math major and now doing programming, have lamented my LaTeX being utterly useless; the only thing I use it for is (perhaps somewhat ironically) my resume document, and I've forgotten so much that I don't even list it on my resume (if I even thought it was relevant).

But I had not considered that during the degree program itself, you will use LaTeX quite frequently, which of course does make sense.


> have lamented my LaTeX being utterly useless

ConTeXt integrates Lua. KeenWrite[0] is my text editor that converts Markdown to XHTML then pipes that XML document into ConTeXt for typesetting[1]. ConTeXt does an amazing job of keeping presentation logic separated from the content. Meaning, once you've created a theme template, it's easy to pick it back up to create new ones.

There's a video series showing how to use KeenWrite.[2]

[0]: https://github.com/DaveJarvis/keenwrite

[1]: https://www.youtube.com/watch?v=qNbGSiRzx-0

[2]: https://www.youtube.com/playlist?list=PLB-WIt1cZYLm1MMx2FBG9...


Interesting, we never used LaTeX at all in our CS program. We also didn't have very many papers to write in our CS program anyway, most were coding or proofs which we could use regular Google Docs with their math support, or do it by hand.


Back in the 90's, I'd write all my papers in LaTeX, just to avoid using Windows and Word.


I did this in the noughties, and I was a psychologist ;) Some ideas never die I guess.


Sphinx, asciidoc, mkdocs, etc. are more accessible these days.


LaTeX is also heavily used in Physics.


LaTeX was written by a famous computer scientist to write a computer science book.

Yes, mathematicians love it. But computer scientists were first there.


Maybe a useful skill but it's not a CS skill so it's slightly odd to teach it. Same with Emacs tbh. Students could easily use VSCode (and it would probably be better for a CS class).


I always laugh a bit about this; the purists are just insane. Yes, you can do things in one way forever, but you don't have to.


Interesting, do you think Lamport doesn't think of himself also as a mathematician? How about Knuth?


Both do a lot of mathematics. When they got into computers, mathematics was the standard route in.

All of their most famous work is in computer science.


We had plenty of math classes in CS


Note that this is not part of the CS curriculum at MIT. It's a series of classes during Independent Activities Period in January which is mostly unstructured time when students can do or not do any activities that catch their fancy. It works well for this sort of thing but students also do a ton of stuff that isn't especially academically-related.


I do know that. I was trying to implement it as part of a go-at-your-own-pace online course, but that still requires approval from the uni.


Thank you for making the point about accreditation, it's sort of a pet peeve of mine. I taught "intro to C++" last year at the Harrisburg campus of PSU. The students were a mix of non-CS majors who didn't know what a file was, a handful of students who already knew how to program and a bunch in the middle.

Re: accreditation... the admin is very reluctant to change anything about the courses. Even specific textbooks had to be recommended (I was warned for suggesting in the syllabus that the textbook wasn't needed). Seemed a little more strict than teaching mathematics, which I did in graduate school.

Re: kids these days... a significant portion didn't understand the concept of a file. I blame apps and the cloud (funny because I now work in cloud storage). I ended up writing my own precursor doc to the "missing semester". It was a challenge to get a student from not understanding the filesystem to having some sort of understanding of linear search and pointers. (If you're interested: https://www.dropbox.com/s/jar1r0l5vdgspcl/basics.pdf?dl=0)

I tried to stress, especially to the non-majors, that this "missing" stuff was perhaps the most important thing they could learn. That, and how to properly google/search for things. I would experiment and try to re-word homework questions so that interesting StackOverflow answers appeared in search results.


C++ needs a "missing semester" around tooling. Most material I see focuses on the core language but leaves out setting up build systems, package management, clang-tidy, testing, etc.


Yeah, the whole aspect of teaching C++ to beginners is fraught with tooling issues. That school is moving to Python (years after main campus switched)... but I'm not sure how I feel about that since I spent a lot of time talking about memory layout.

I ended up suggesting those who don't know how to setup a tool chain use VS Code. Not out of any particular affinity for it, but because of the good documentation covering Windows, Linux and macOS.


How do you do package management in C/C++?

I have only used pip, cargo, npm (well yarn and pnpm mostly) and composer.

A big off-putting aspect of learning C/C++ is that I can’t really grok how shared libraries work.


A mix of actual package managers, distro packages and just cloning the library into a 3rd_party folder in your project
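
A rough sketch of what each of those looks like in practice, using the fmt library purely as an example (the exact commands assume vcpkg/conan and a Debian-ish distro, so treat them as illustrative):

    # 1. a dedicated C/C++ package manager
    vcpkg install fmt                  # or the conan equivalent

    # 2. the distro's package manager (system-wide headers + shared library)
    sudo apt-get install libfmt-dev

    # 3. vendoring: drop the source into the tree and build it with your project
    git submodule add https://github.com/fmtlib/fmt 3rd_party/fmt
    g++ -std=c++17 -I 3rd_party/fmt/include main.cpp 3rd_party/fmt/src/format.cc -o app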


> I would experiment and try to re-word homework questions so that interesting StackOverflow answers appeared in search results.

That's a really interesting pedagogical approach, I like it a lot.


I also told them this, to encourage searching.

Though I hate Chegg and the like with a passion, since this sort of thing takes a lot of work (over the course of teaching the same class a few times) and then with Chegg you immediately find the answers.


This “MIT Course” was taught by grad students because they felt the actual coursework left this out.

So when you say it’s a shame your uni doesn’t teach this, well, that’s what these grad students were saying as well. Perhaps the students could seize the initiative at your institution as well?


I might have to get a separate group to do it, like the ACM students. If I suggest it, even as an online sort of thing, it gets cut down. All of a sudden, it can't be "something we support" if it's not "part of the curriculum".


We're trying to implement exactly this. I assign the MIT missing semester materials in my sophomore systems programming course, and we do all of our assignments using a CLI, C, GCC, and Git. Prior to my course, the students know only Java and IDE programming.

One problem we have is that the prereq chain for our courses is very long, so adding another course as a pre-req to all others lengthens that chain.

The MIT course offers probably too much info for our purposes, or at least info students don't need preloaded. Just basic CLI and basic Git clone/fork/pull/push are enough for up to probably Junior year. The problem is that the intro courses are so sanitized that students aren't even getting basic CLI until Sophomore year, which means by the time they graduate, they're behind where they should be imo.
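
For reference, the "basic Git" I mean really is just a handful of commands, roughly like this (repo URL and branch names are made up):

    git clone https://github.com/example/course-project.git
    cd course-project
    git checkout -b assignment-2            # do the work on a branch
    git add -A && git commit -m "finish assignment 2"
    git pull --rebase origin main           # pick up upstream changes
    git push -u origin assignment-2         # publish the branch, then open a PR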


This "class" was run by our CS library computer lab and was something your TA might push you to attend, but not part of the formal curriculum. That worked around some of the admin nonsense but still got help to motivated students.


It would seem this would be the perfect kind of class for online learning. Perhaps even as a wiki to crowd source helpful contributions. Most learning doesn't even have to be directed, but just point students in the right direction. This is a ripe area to find ways to better facilitate individual initiative, because a moderate level of effort yields tremendous reward. Plus, you can easily determine if you are doing it right.


Same in another school (EPITA).

You start with 2 weeks of only Linux C + POSIX shell, and during this you're only allowed to use i3 + vim/emacs.


We had this class, Unix, in freshman year (VT). This isn't normal?


"Universities were pressured into lowering credits to grad"

Unis are requiring more general-education credits, especially humanities. The problem isn't credit requirements; it's accreditation, marketing, and politics.


We're teaching a course at ETH Zurich [1] where --besides the actual payload of solving partial differential equations (PDEs) on GPUs-- we put a lot of emphasis on "tools". Thus students learn how to use git and submit their homework by pushing to a repo of theirs on GitHub, we teach testing and continuous integration, writing documentation, running code on a cluster, etc. In their final project, again submitted as a GitHub repo, they need to make use of all of these skills (and of course solve some PDEs).

Note that excellent work in this space is done by the Software Carpentry project, which has existed since 1998 [2].

[1] https://pde-on-gpu.vaw.ethz.ch/ [2] https://software-carpentry.org/


As an alum, thanks a lot for doing this. Looking back, all the things that I learned in just the first few weeks in the industry made writing code so much more productive - if only someone had shown me some of it during an early semester, even just during some assistant teaching hour, it would have saved so many hours.

I remember specifically when one of the exercises for some compiler lecture contained unit tests the code had to satisfy, and I was like, wow, why didn't I already know about this during the algorithms classes earlier, where I was fumbling around with diff tools to check my output. Let alone proper version control, now that would have been a blessing.

In hindsight, it's a bit embarrassing that I didn't bother to, well, just google for it, but neither did my colleagues - I guess we were so busy with exercises and preparing for exams that we just didn't have the time to think further than that.


Thank you very much for the GPU course. Even though my college taught shell usage to some extent, when I asked about GPU programming it was considered a nerd topic back in 2009.


Lots of laudatory comments here about this being essential but missing teaching but I have a different take. The content looks good, nothing wrong with teaching these things. But these things can be and are learned on the job fairly quickly for anyone interested enough in the field and with enough aptitude. In fact, I would say these things can be learned on your own time as a side effect of being interested in computers.

So I would say it's good content, but not essential for a CS program.


I think it can be useful to distinguish between the "known and unknown unknowns" here. For example, everyone will quickly realise that they need to know git, and they will learn what they need to know (a known unknown). A university course would maybe save them time, but it would not really change what you know after 2 years in industry. Compare that to e.g. awk or shell scripting, which can be incredibly useful but maybe not something people realise by themselves that they need (an unknown unknown). A university should at least make people aware of these latter tools.


Totally agree. Further, these skills actively support rapid iteration on learning other core university outcomes. In CS it’s very easy to waste lots of time doing things the wrong way that could otherwise be spent doing something useful.


Good point. Some things you will learn by necessity, but if you don't know about, say, regexes, you probably won't think of searching specifically for such a tool.
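
To make that concrete, here's the kind of one-liner that stays invisible until someone shows you that regexes and the standard Unix tools exist (the log files are hypothetical):

    # count requests per HTTP status code in a web server access log
    awk '{print $9}' access.log | sort | uniq -c | sort -rn

    # count failed SSH logins with a quick regex
    grep -Ec 'Failed password for (invalid user )?[a-z]+' auth.log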


Whether you think that:

a) Universities are places of higher learning that do not need to cater to industry

b) There is an implicit social contract whereby universities should produce industry-ready graduates

c) Something in-between

… the skills taught in this course are as useful for the PhD candidate as for the junior software engineer.

Why leave these out or up to the student, then?

Furthermore, in many other academic disciplines it's very common to teach applied technique (e.g., on writing essays or structuring research); is this any different?


I don't think you're wrong that you can learn this stuff on the job, but a "primer" kind of class like this, which surveys several useful tools, helps new engineers develop pattern-matching skills around how to continue self-teaching these kinds of things. Shell/scripty/vim-ey/linux-ey stuff can be really challenging to learn how to learn for some people.


> The class is being run during MIT’s “Independent Activities Period” in January 2020 — a one-month semester that features shorter student-run classes. While the lectures themselves are only available to MIT students, we will provide all lecture materials along with video recordings of lectures to the public.

It seems that the course is only 1 month and runs alongside some student-run classes. So, from the get-go, this is not an essential course of a CS program.


There's quite a broad mix of offerings during IAP--a lot of which isn't part of a regular academic program and a lot of which is just for fun/intellectual interest. It's mostly all non-credit. And there are a fair number of sessions dealing with practical details that may not be covered in regular courses.


I think it’s dependent on the person learning the material. My college had a similar course and it was also highly regarded as useful by students. However, while I found the course fun, I already knew most of the basics from my research job, and the advanced stuff hasn’t really come up again even in work (e.g. fancy git or gdb commands), so I’ve entirely forgotten it; my biggest takeaway is probably Ctrl-R for searching shell history. But I can see why a guided intro would be really helpful to someone who had no experience or some trouble getting started learning this kind of material (which is very different from programming or computer science).
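
(For anyone who hasn't met it: at a bash or zsh prompt, Ctrl-R starts an incremental search backwards through your command history; the ssh command below is just an example.)

    # press Ctrl-R, then type a fragment of an old command
    (reverse-i-search)`ssh': ssh deploy@staging.example.com
    # Enter runs it, Ctrl-R again cycles to older matches, Ctrl-G aborts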

There’s probably an aptitude threshold past which the course’s value diminishes - don’t mean any disrespect to anyone, just trying to expand on your point. The top students either have already figured it out or will do so easily, so they might be better off doing something else with the time they would’ve invested in this. But for a lot of students learning about these tools and concepts can be a real force multiplier, more so than a random upper level course.


This is what I thought too. The content is effectively a roadmap for how one could use the computer well to perform tasks (without the algorithmic/programming portion).

If your main usage environment is Windows, none of these are that helpful, but the ideas mostly translate (the Windows shell is similar enough that you can just as easily script things with batch files).


Yeah I pretty much agree with this comment. But I think my take on whether universities should be training for academia or for industry is "why not both?". I think these are just different valid tracks. There is a lot of education that overlaps between "computer science researcher academic" and "software engineering professional with strong fundamentals". So that should be the required set of credits. But then there are different tracks toward where you want to take your degree. Those sound like electives. Academics should learn more about research techniques and publishing and presenting at academic conferences, etc. And those on the professional track should learn more about tools and techniques used in industry.

There's no conflict or contradiction here, just different strokes for different folks.


It's amazing how many CS programs fail to teach you even the basic tools of being a software developer. Yes, yes, CS is not programming, but there is a non-trivial amount of CS that is indeed programming, and that is generally what people do with their CS degrees, so it'd make sense for a CS program to teach the basics. Maybe even more than the basics.


CS programs prepare you first and foremost for a PhD in CS. They're great for learning the theory and fundamentals, but teach practical Software Engineering skills as a side effect.


Ironically, as a CS PhD student I spend an enormous amount of my time on the subject matter listed. There are very few research areas in which PhD students can get away with ignorance of the command line, shell environments, build systems, version control, debugging, etc.

A common scenario is that I want to reproduce and extend some earlier research, and there's a GitHub repository for it that hasn't been touched for the last seven years, which was forked from an even older repository that some grad student hastily pieced together. And I need to run components of it on our high-performance computing cluster, where I don't have sudo privileges. So it's a whole lot of moving things around between VMs and Docker containers, figuring out what to do about ancient versions of packages and libraries that are dependencies (especially everything that's still written for Python 2.7); either refactoring things to update it all, because I want to take advantage of newer functionality, or setting up some isolated environment with older releases of everything built from source.
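
As a simplified sketch of that dance for the Python 2.7 case (the image tag and file names are illustrative, and on the cluster itself you would typically go through Singularity/Apptainer rather than Docker):

    # run the legacy code inside an old official Python image, mounting the checkout
    docker run --rm -it -v "$PWD":/work -w /work python:2.7 bash

    # then, inside the container
    pip install -r requirements.txt
    python reproduce_results.py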


This very much varies by school. In California, undergrad as prep for grad school is the default for the UC system, but not so for the Cal State system.


I think this is mostly a US thing, or countries based on similar education systems.

In Portugal our computing degrees are in Informatics Engineering, to use a literal translation: a mix of CS and engineering, validated by the Engineering Order as fulfilling certain requirements, for anyone who at the end of the degree also wants to take the admission exam for the professional title.

Those that only care about the theory part of CS take a mathematics degree with major in computing.


In India we usually call it "Computer Science & Engineering".

But name is the only thing we get right in CS education.


I was on the CS faculty at a Canadian university in the 1980s. I proposed a course with almost exactly this outline, only to be told it wasn't university-level material. MIT seems not to have got this message; good on them!


I personally don't like the undertone of the class (tho very grateful that this material exists!!!) - this idea that universities are failing their students by not teaching them necessary material. I think a better phrasing is that students are failing themselves by not learning the material. I've personally never considered it the responsibility of my university to educate me - some of the classes are certainly useful for learning, but the ultimate onus falls on me to gain the skills that will lead me to success. I find it kind of distasteful how classes encourage a sort of passive victim mentality when it comes to learning - as if students need to be bribed with credits and cudgeled with a gpa to be forced to learn genuinely useful things.


You don't consider paying tens of thousands of dollars as creating responsibility to educate?

Of course, students need to be active in the learning process. But in my experience, it is more likely that professors and departments are terrible at educating than it is for students to not be motivated to learn.


> You don't consider paying tens of thousands of dollars as creating responsibility to educate?

I get OP's point. It's like getting an English literature degree without ever having read a book on your own.

My guess is that most people needing the missing semester never coded outside of their assigned tasks. Which is fair enough, but it's surprising to me to meet PhD candidates who marvel over the missing semester (I've met 2).


To be clear, I was responding to the commenter's general point and not regarding this specific class or its contents.


There’s absolutely a responsibility to educate on the topics needed for the degree to be granted.

This class is adjacent material for an EE or CS candidate. Are universities also failing their students by not offering/requiring a touch-typing class? I don’t think so, in large part because computer science is not programmer occupational training.


A CS course is "not programmer occupational training" in name only. Practically, there aren't many CS research jobs, and working as a programmer is more often than not the career path for someone with a CS degree.

Universities can choose to be puritans about what CS is as you seem to be advocating for, or they can be realists and fill a very real gap in skills and knowledge.

Your point about "the topics needed for the degree to be granted" is also a very purist view of the role of university. Is the role of university solely to teach a curriculum that aligns with some abstract ideal of what a particular degree title means? Partly it is. But again, that doesn't match the expectation and the practical reasons why students choose a course. There are very few students studying CS for the beauty of it. Those that do probably do end up in academia and don't need this course. The rest are there for jobs, and they certainly could benefit from this.


What occupation are the vast majority of CS students intending to pursue when they enter a computer science program? What occupation did the vast majority of computer science graduates end up pursuing?

I'm willing to bet the answer to both of those questions is a computer programmer.


> Are universities also failing their students by not offering/requiring a touch-typing class?

Universities make assumptions based on the larger student body as to what requirements are needed for admission. Generally, we assume students can read and write and have general computer literacy, but actually the last assumption is starting to fray a bit; more and more, students are coming into school without basic computer desktop literacy. This hasn't been a problem for decades, as students tended just to pick up skills like touch typing. But today, some students are hard-pressed to save a file to the desktop.

I could see universities might actually have to start adding computer literacy as an entrance requirement, the same way we require basic reading and writing and English speaking, so we don't have to teach those things.


A degree already takes 3 or 4 years. In order to incorporate this “missing semester”, universities would have to either a) remove existing material to make space for it, or b) extend the degree by one more semester.

I don’t think universities should remove existing material in general to incorporate “bash 101”. Mainly because learning bash is easy and one can learn it by oneself without a professor. Extending the degree one more semester doesn’t make much sense either.


Half the value of having the material in a course is that it specifically highlights what should be learned.


>a) remove existing material to make space for it, or b) extend the degree one more semester.

It's not literally an entire semester's worth of material. "The Missing Semester" is just a catchy name they gave it.

The site says:

>The class consists of 11 1-hour lectures, each one centering on a particular topic. The lectures are largely independent, though as the semester goes on we will presume that you are familiar with the content from the earlier lectures. We have lecture notes online, but there will be a lot of content covered in class (e.g. in the form of demos) that may not be in the notes. We will be recording lectures and posting the recordings online.

And in that paragraph the word "semester" doesn't mean a normal full-length semester:

>The class is being run during MIT’s “Independent Activities Period” in January 2020 — a one-month semester that features shorter student-run classes.

So, 11 lectures over the course of a month (actually three weeks if you look at the listed dates). And it's an unofficial class taught by grad students, alongside other classes.

If a CS program made this official, it could fit into the first two weeks of the course. And that'd be a great thing, since these tools make you way more productive in everything computer-sciency you do. It's like compound interest: the earlier you get good at the shell, the bigger the returns.

I think they call it the "Missing Semester" because

a) it's as useful as an entire semester

b) when you don't already know this stuff, it seems much bigger and more difficult than it really is. and your fellow students who already do know it seem like they're a semester ahead of you in comparison.

c) it might take you a semester to learn the material if you don't have instruction, feedback, a roadmap, while you're juggling your other academic obligations. people remember the things they succeeded in teaching themselves but forget the immense wasted time of rabbit holes they went down because they didn't have a mentor to guide them.

-----

I didn't study CS, I studied Physics instead. My hands-down favourite course, the one whose material I still use even though my day job has nothing to do with physics, was called something like "Problem Solving for Physicists" (google "university of sheffield PHY340", you can find PDFs of past exam papers to see what I'm talking about). It was this lovely hodge-podge of material, much of which had nothing specifically to do with physics at all. It had stuff like dimensional analysis, how to come up with sensible approximations and Fermi estimates, how to sanity check your calculations, coming up with lower and upper bounds, how to rule out certain classes of solutions even when you can't find the exact answer, that kind of thing. It was in the second or third year of my course, I forget which, but either way nothing in it had more than high-school level mathematics, so it could have been taught in the very first part of the first year, before we even did mechanics 101. That would have been tremendously helpful for everything that came afterwards. That was my "Missing Semester" (or perhaps, "Misplaced Semester").


I'm surprised anyone would object to this. Universities have a responsibility to prepare their students.

When I saw the title I figured it was another "computer science" class. But the curriculum was a significant portion of what I lacked when I graduated, which prevented me from finding work for a year.

Had someone at university told me before I graduated that I'd have no chance of finding work if I didn't know Git, Linux, REST, how to use the command-line, how to use an IDE, how to use an editor on the command line, and bash, I would have prepared myself for those things.


Can you elaborate on how not knowing those things specifically is what led to you not being able to find work?

Did your interviews ask specific questions about Git, Linux things not covered in a standard operating systems course, and command line editing?


Go on Indeed, search for "software engineer", and look at how often those things crop up in the "essentials" section.


I’m just not sure how not knowing these things practically stopped you from getting a job.

Did they specifically ask about them in an interview?


Yes, they were asked about in interviews, when I managed to get interviews.

Also, since I wasn't using git (and hadn't even heard of git or GitHub until after I graduated; mind you, this was in 2012), I didn't have any visible work to highlight when responding to job listings. I also didn't have any demonstrable ability to collaborate, dive into existing projects and find my way around, or do things like make PRs.

I didn't have anything to say about my ability to work with software tooling, using an operating system beyond just opening a web browser, notepad, Borland C++... I think I used netbeans for a group Java project also.

I definitely wouldn't have been able to demonstrate editing a file on the command line. Most tutorials were too dense for me without significant head-desk banging, because they assumed you knew how to compile a file, or run make.

Yes, we did do a lot of these things in a very limited way for our classes, but the teacher would always tell us exactly what we needed to do, and the very small amount of actual programming we did in our program didn't require me to know how to do something like debug the code, enable linting or error highlighting in the IDE (it was mostly simple enough that I could get it right, or close to right, with a few tries anyway).

I know these things contributed to me not getting a job, because I spent almost a year learning them and more, and was able to get jobs after.

All that said, I didn't do an internship, I didn't have a good GPA, and my school wasn't considered good. I was about as unattractive a candidate as I could be while still having a CS degree.


What school did you go to?

Regardless, I have my doubts that not knowing about git or software tooling held you back. I know many recent grads that don’t include anything about Git on resumes and it doesn’t come up in interviews.


> the ultimate onus falls on me to gain the skills that will lead me to success

I see where you're coming from, but sometimes you don't even know what the necessary skills are. Even if you're very self-motivated and enthusiastic, you can still benefit by being pointed in the right direction. That's part of what a good school or teacher should do for you. (And while they're at it, they can provide materials that smooth out the path to get there.)

You should never expect them to cover 100% of that, but if they're aware of a way that they can get closer to 100% than they currently are, then it's a good thing for them to do it.


I think you’re conflating two different things: universities selecting and presenting a syllabus needed to earn a certain degree, and students actually learning the material.

The latter is solely the responsibility of each student, but I don’t understand why the former would be. Some of the content in this course strikes me as unknown unknowns for new programmers. Why would they be to blame if no one told them to learn a particular skill?


Honestly, because it's something that should be a prerequisite for starting the degree program, in the same way basic algebra is a prerequisite. Likewise, not knowing that you need to know this stuff is a sign that you are probably not at the point where you should even have been able to declare the major. The fact that colleges allow this at all is doing a disservice to students, many of whom will go on to permanently damage their academic records.


We don't expect med students to have spent their teenage years doing experimental surgeries on their friends. Or accounting students to have taught themselves by doing accounting for a major business in their after-school time. Nor do we expect a microbiology student to have spent their childhood experimenting with infectious viruses and bacteria in their garage.

I think we expect that in comp sci just because many of us did happen to grow up doing that. But it's a weird and unusual expectation, and probably not a good one.

It also certainly wouldn't have been expected a few decades ago. You just wouldn't assume that a kid had a mainframe in their house to have learned on. Now that PCs have been around for a while, we make that assumption, but again I don't think it's a good assumption. Certainly not for the less-affluent, nor the younger ones who grew up with smartphones and tablets instead of a PC.

I think there's also a bit of disconnect in what the purpose of the major is. Actually having a separate 'Software Engineering' major is relatively quite new and generally, Comp Sci was what everybody took if they wanted to learn to work on software. But now some people think it's a totally academic thing, while others think it's industry training, and that always confuses the discussion. But even in spite of that, it's just a bad assumption/expectation.


So, beyond a fairly standard high school curriculum and not too much distaste for math, what implicit requirements should we add for physics, mechanical engineering, chemistry, chemical engineering, material science, etc.? Because where I went to school, there were no special requirements for those majors--nor for CS/EE. Is CS today unique for some reason among STEM majors?

Different majors have varying degrees of difficulty for different people. By and large schools don't (and shouldn't) get into the business of heavily policing who gets to give a particular major a whirl.


People are going to really dislike what you said but I agree to a certain extent, especially when it comes to the basics of working in the command line. If somebody can't read the manual on that and figure it out then they are going to be so hopeless for so many other things that I don't want anything to do with them.


If someone has literally never opened a terminal with a command line, you're probably being rather dismissive of how unintuitive it will be for a lot of people at first.


"I've personally never considered it the responsibility of my university to educate me" - you're going to have to explain yourself


I started college in 1993 and my school had a mandatory "Introduction to Unix Computing Environment" class for all incoming engineering freshmen.

We learned the basics of the shell, file system, file editing, AFS ACL's for group projects, and more. It looks very similar to this MIT course which makes sense as our school's computing environment was based on MIT's Project Athena and AFS (Andrew File System)

https://en.wikipedia.org/wiki/Project_Athena

I looked and the same course is still mandatory for incoming engineering students.

I'm in the semiconductor industry and everything runs on Unix / Linux. Back in 2000 we would get new grads that knew very little about Unix, command lines, or scripting. That kind of stuff is half my job. These days Linux is so popular that most of the new grads know this stuff.


Even though a lot of incoming students would have had PCs by that time, they'd mostly have been running Windows. As you suggest, as I understand it MIT really focused on Project Athena clusters for engineering work and people used PCs more for word processing, etc.


My university had Dijkstra's quote "Computer science is no more about computers than astronomy is about telescopes". They made us aware of tools and we were free to use them as little or as much as possible to do the science. I always assumed that software engineering degrees focused on tools more than computer science degrees (among other differences).


The following isn't aimed at you in particular, but in HN threads about the Missing Semester there will always be someone who earnestly repeats this stinking turd of a Dijkstra quote, so I'll put my rant here:

Dijkstra was full of it. He wanted CS to be just a branch of abstract mathematics but that's never been the case. That's a retconning of history by people with math envy. Before Alan Turing had ever heard of the Entscheidungsproblem, he had already built simple mechanical computers with his bare hands.

It's a cousin to a stupid mindset you see in software engineering: that you can somehow be a good engineer while not knowing what your hardware is actually doing. That's how you get complicated architecture-astronaut systems with good theoretical big-O characteristics that get crushed by a simple for loop written by the guy who ran a profiler and knows what a cache line is. We live in a world made of atoms, not lemmas.

Research fields go rotten when they don't come into contact with reality enough: quantum computing, string theory, etc.

And as for astronomy: knowing how telescopes are constructed, how they work, their optical characteristics, limitations, failure modes, all of that is essential to observational astronomy. And if you study astronomy, you sure as fuck are taught how to use a telescope!!!

Astronomy as we know it didn't exist until we had good telescopes. Cosmological theories have risen and fallen on the advances in optical theory and engineering. Astronomy is very much about telescopes.

What other field is so ashamed of its own tools? Like, art isn't about pencils, but art students are taught how to hold a pencil! Stop repeating this thought-terminating cliche.


Fwiw, your rant did not convince me.

I expect that astronomers need to know about the tradeoffs of a telescope's settings for the data they are looking at. But I'm unconvinced that they necessarily need to know how to operate one (that would depend on the workplace), and I certainly disagree that knowing how they are constructed is absolutely necessary for all astronomers.

More knowledge is always good, so of course learn what you want. But it's not being "ashamed of tools" to say that a CS degree should "do one thing and do it well".

Additionally, we can simultaneously say that a university should encourage tool mastery while also saying that they don't need to teach entire courses on it.


Thanks for calling this out. I believe Knuth remarked that Dijkstra himself didn’t program the way he thought CS students should.


Completely agree. Things like shell scripting, debugging tools, IDE usage can all be naturally picked up on the job given whatever tools that they recommend you use at their company.

You know what you're not going to be able to pick up at your first software engineering position? Discrete mathematics or linear algebra.


Not trying to dismiss the importance of knowing discrete math etc. in general, but I would posit that the vast majority of entry-level SWE positions require no knowledge of it.

However, knowing the tools of the trade is something that is invaluable. And yes, it can be picked up on the job, but deliberate learning and practice is more effective and less stressful.


> Not trying to dismiss the importance of knowing discrete math etc. in general, but I would posit that vast majority of entry level swe positions require no knowledge of it.

Directly, sure. I do think there is something about the rigor of the math thought process that lends itself to writing software. Thinking through algorithms and proofs is really not much different than writing code or debugging.

Even with tools, I think learning concepts is better. I've used so many IDEs through my career, but they are all roughly the same conceptually. One thing that has helped, though, is embracing vim keystrokes and using them everywhere.


This was my first thought too. The tools talked about in the link are useful but they aren’t really computer science. This was also hit home in my CS courses. I was being taught the science behind computers, not necessarily the practical application of them.


As far as cryptography is concerned, I think one of the best options these days is to teach the libsodium API. It's very rationally structured and well documented, and also available on every platform and in every language. Most importantly of all, there's nothing in there that you shouldn't use, which is one of the biggest problems with real world cryptography.


What has only a single, passing mention in the article, and none here, is how this fits into MIT's curriculum. The content is supremely useful, but does not properly fit into a traditional 10-week college course schedule. MIT has this weird 'half semester' in which to fit this, and similarly shaped content, for all disciplines. It is largely student run, as we see here.

> Independent Activities Period (IAP) is a four-week period in January during which faculty and students are freed from the rigors of regularly scheduled classes for flexible teaching and learning and for independent study and research. IAP is part of the academic program of the Institute—the "1" month in MIT's "4-1-4" academic calendar. Students are encouraged to explore the educational resources of the Institute by taking specially designed subjects, arranging individual projects with faculty members, or organizing and participating in IAP activities. They may also pursue interests independently either on or off campus.

http://catalog.mit.edu/mit/undergraduate-education/academic-...


A thought occurred to me when I read this bit. Two thoughts, actually. But they're related.

We’ve also shared this class beyond MIT in the hopes that others may benefit from these resources.

Thought one: they should make this an official OCW thing

Thought two: OCW class resources should be a two way street, like open source development, not just a "throw it over the wall" model.

Not that I'm criticizing MIT for making any content freely available, mind you. Any free, high quality, educational content makes the world an overall better place IMO. No, it's just that I saw this bit:

Editors (Vim)

and couldn't help but think "Great, but what about Emacs?" Which got me thinking something like "Well, why couldn't I, or somebody else (preferably somebody actually qualified) create a corresponding 'Editors (Emacs)' section and contribute it back?"

I dunno, maybe it's a nutty idea. And certainly for the stuff that is released by OCW under corresponding open licenses, I suppose one could "fork" the class somewhere else and run it on a model where outside contributions are accepted. Anyway, this just got me thinking about this concept.

EDIT: never mind, I actually just noticed that this particular course actually is on Github[1], and they do accept pull requests! Very cool.

Would still be cool to see that approach become even more widespread for OCW courses (both from MIT and elsewhere).

[1]: https://github.com/missing-semester/missing-semester/pulls


I love vim, it's great to know how to navigate it, but I have met very few people who use it professionally. I'd love to hear others' experiences, there.


Lots of people at my work use it, but I don't know if I'd consider it essential. If you can use any editor/IDE very fluently that's good; probably doesn't hurt to learn with vim if you don't already have that.


if you ever find yourself needing to modify files at the shell on any sort of remotely modern unix/linux system, you're probably going to have vi or vim available.

it's worth knowing an editor that doesn't require 6GB of RAM and a graphical environment, but it's probably not going to be most folks' primary editor. ;d


<6 GB RAM?

Sooo many choices. I started[1] programming on a 5 MB machine with ~4 MB free.[2]

But I'm not allowed a GUI? Ok… then, I'll choose an editor with a keyboard-driven menubar across the top (keyboard shortcuts appearing in the menus), a status line on bottom, and as many keycombos that match my preferred GUI as possible.

Failing that, I can muddle along with Nano.

Until some once-in-20-years task actually requires using an arcane 'editor' that assumes I live in a PDP-11. (^:

[1] Wellll… I sometimes forget typing BASIC from magazines into an Apple II and making small changes.

[2] And it was luxurious…until I tried to fit 8 MB of a CD-ROM app into compressed RAM.


I use (Neo)vim as my daily driver, professionally (for the past 4 years, coming from GoLand). I've found every other editor to be too heavy/slow (VSCode/JetBrains), or too noisy (regarding features).

Neovim allows me to specify the precise minimum I desire to have a fully functional IDE-like experience (basically treesitter + LSP + DAP + minimal extra plugins).

It's super fast, I'm already in my shell, and my memory usage gets to stay super low :)


Once I learned how to use VS Code productively, it really changed my mind. The other day I needed to do a quick base64 encode and VS Code had that. Same goes for formatting some JSON or doing a diff. It's all available in a single tool with no learning curve. Hell, it even has git and git blame in there.


I think VS Code is just extremely convenient and can address most people's needs out of the box. It’s therefore also very beginner-friendly. You don’t need to constantly fight the tool or spend hours researching how to configure some trivial feature. That being said, VS Code is quite heavy and has an enormous number of GUI elements. I mean, they even have buttons for things like git fetch and pull, for heaven's sake. They could release a VSC Light that strips most of the GUI out.


True, but you can customise it a lot. So the number of UI elements is not a problem.


Yeah - to each their own. I have all these tools in my terminal too. I'm just a lot faster in the terminal, and I don't really think it's possible to be faster in a GUI (even if there are shortcuts - there will surely be shortcomings before you eventually just re-invent a terminal via VSCode).

Plus - modal editing :)
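
For what it's worth, those particular examples are one-liners in the terminal too (assuming jq is installed; the file names are made up):

    printf 'hello' | base64           # quick base64 encode
    jq . response.json                # pretty-print / format some JSON
    diff -u old.txt new.txt           # diff two files
    git blame src/main.c              # blame, right in the repo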


VS Code is the gateway drug of real IDEs. Last time I checked, it still can't do things like navigate to definition, or rename a class along with all mentions of it in comments. Or count TODO: comments.


navigate to definition -> f12

rename all instances of class/method/variable -> f2

count TODOs: too many vs code plugins that will do that for you

Not sure when you last checked, but VS Code is practically a full-fledged IDE with the right plugins.


I don't think this is accurate - at least not anymore. VS Code supports LSP configuration which means all those features should be available.

That said - I haven't used it in years. Way too heavy and Vim does all of those + modal editing.


I'm not fond of Vim, have been an Emacs user for going on 40 years, starting with Gosmacs. That said, I'd rather that students learn one editor really well, rather than the smattering of editor/IDE usage most of them seem to acquire. So, fine, teach them enough Vim that they can use it well, afterwards, they can use whatever editor they choose.


I use Vim infrequently. I know it, and if it were the only option I'd be efficient with it. I've even started with vi as the only option. However, I prefer Emacs because I feel way more efficient and comfortable with it. As a bonus, the main key bindings are consistent with Bash and other GNU tools. The course should introduce Emacs as well so that students can choose on their own. Whatever your preferred editor is, IMHO it is up to the IDE to integrate with it.


I use neovim with the ALE plugin for LSP and linters. Somehow it maps onto my brain better than anything else. It's fast, runs in the CLI, is feature-rich, and is usable without a mouse...

The only thing where vim/neovim really sucks compared to other tools is project-specific configuration.


I'm 30 and have been using vim since college. Usually I'm not actually using the vim editor, but some IDE that has vim keybindings. Good enough for me since I don't customize it much anyway. And then I get the best of both worlds: IDE features for navigating around files (jump to definition, display references), and vim for navigating within a file. Professionally I never paid attention to how many people are using it. Off hand I know a couple people that use emacs, and zero that know vim, but this is probably because emacs users talk about emacs :)


I use it (well neovim) exclusively now because I like it. I have in the past used Emacs, IntelliJ, and VSCode. All work, the choice between them doesn't matter a jot, and that choice has never had the slightest influence on my competence or so-called 'productivity'.

More useless words have been spent over editor choice than .. "how good is Linux?" or "how terrible is Electron!" or any one of the handful of sometimes entertaining and always vacuous areas of argument software people frequent over their peccadilloes.


I switched to vim in college, never went back. That was 20 years ago.


Using vim itself? I mean, yeah picking up ninja edits here and there.

Using vim motions? Everywhere. All the time.


Yep! Evil for Emacs, IntelliJ, ranger, ksh93 shell history editing (this one includes search!), DuckDuckGo, Fastmail, Gmail....


70% of my coworkers use vi but most of us are in our 40's. About 20% use emacs and the other 10% use some other usually GUI text editor.


I gave up on Vim a long time ago. However, in every IDE, editor, or notebook tool I use that has Vim keystrokes, I enable them and use them. You have to let the Vim purist in you die; it's not a fight that can be won. Settle for like 80% of what you used to be able to do in Vim and call it a day.


I don't understand. Why do you need to do this? What am I settling for? I use vim daily and want for nothing.

Vim becomes a state of mind, a model for editing that is really empowering. I can translate any thought to code by doing a little dance on my keyboard. My mind is freed by this expression.


I've been using it professionally since 2011.


Of the 10 developers I work with the most, 9 are vim users and one is VSCode. The VSCode guy won’t shut up about it but then he is also the proselytizing type.

Honest, I’m not that attached to vim and mainly keep using it because it’s what my colleagues use. If everyone suddenly switched to VSCode I would follow the herd. The only thing about vim that I love, and would not give up, is the keybindings and I am sure I could find a decent plug-in for that.


Every time I'm working on a personal project I boot up neovim and have some fun. I have an extensive neovim (and emacs) configuration.

If I need to do anything in a professional environment, I boot up intellij/vscode/whatever they use and just install a vim emulation plugin. Vim is fun and all, but when time is money the last thing I want to worry about is configuring my editor.


I can't edit text files without vim. I use modern IDEs (IntelliJ) and the first thing I do when I install them is to install the Vim plugin.


I have used neovim for a few years now. With LSP support and other additions, using a vim variant has never been better!


I use it professionally and personally for all text editing, even just taking plain notes. I feel handicapped without it.


I don't use vim much since nano seems to be on every installation I've seen recently.

But I don't think anyone can avoid learning how to quit vim. We've all been there, you think you know the keys, you don't, now you are trapped.
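
(For the trapped: the survival subset is tiny, it's just not discoverable from inside the editor.)

    vim notes.txt    # open a file
    #  i      -> enter insert mode and type
    #  Esc    -> back to normal mode
    #  :wq    -> save and quit
    #  :q!    -> quit without saving (the escape hatch everyone searches for)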


It depends on what kind of programming you're doing. I think it's fairly common in sysadmin stuff. But I can't imagine doing, say, Android development in vim. (Of course it's physically possible, but...)


been using it my entire career. works well, never had much reason to switch, especially these days when I do most of my work sshed into another machine (we aren't allowed to have work code on our laptops)


I switched from TextMate to Vim as my main driver in 2010. Never looked back.


I always think we should do this at uni, and even for our PhDs. Some students are actually much better in control of those tools than their advisors, but they are still a minority. We are always lacking the time to teach this. The German system lacks incentives for actually sensible courses IMHO (applied universities might be a bit different).

BTW: this seemingly pops up every year, e.g. 3 years ago with more than 1000 comments:

https://news.ycombinator.com/item?id=22226380


I would add jq to the data wrangling section https://stedolan.github.io/jq/

Being moderately competent at jq on a team that doesn't know jq is a superpower.
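
A small taste, with an invented endpoint and field names:

    # pull a couple of fields out of a JSON array returned by some API
    curl -s https://api.example.com/users \
      | jq -r '.[] | select(.active) | "\(.name)\t\(.email)"'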


Any reason why you would use jq over Python? Certainly this can be attributed to my lack of knowledge in jq, but anything beyond a simple query is going to be done with Python for me. Beyond not having to look up query syntax, doing it in Python (or any script) is easier to read (both during and after writing it) due to auto formatting in an editor.


> Any reason why you would use jq over Python?

To give an example from another context: This is a bit like asking "why would you use document.querySelector() instead of just looping over childNodes directly?"


Readability is important if you're writing actual code, but for things like debugging a giant JSON blob from an API request, jq is far faster IMHO.


Incidentally, I found that ChatGPT is excellent for asking for complex CSV/JSON transformation queries and having it spit out the correct script.


Kind of ironic that this course is from MIT and it teaches Vim without even mentioning Emacs.

I get that they chose Vim because it's more popular, but it feels as weird as taking a course from Apple that teaches you how to use Windows.


I think it's because it's a course focused on Linux and not Emacs. Different operating systems.


I think they should add a lecture on test-driven development and CI/CD pipelines (e.g. GitHub Actions). And I think after such a course people should be capable of using GitHub, but for that it also lacks a discussion of issues (or ticket systems in general) and pull requests.
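
Even without tying it to one CI product, the core of such a pipeline is just a script the service runs on every push; for a Python project it might boil down to something like this (project name and tools are only illustrative):

    # roughly what a minimal CI job automates on each push
    git clone https://github.com/example/project.git && cd project
    pip install -r requirements.txt
    flake8 .        # fail the build on lint errors
    pytest          # fail the build if any test fails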

Instead of a whole lecture on Vim, they should rather teach a modern editor like VS Code or how to use a real IDE like IntelliJ. With these modern editors you also get refactoring and a GUI for Git, which makes it much less painful to use.

So all in all it's a good start, but already outdated on several topics.


Content similar to this has been taught at the Indian Institute of Technology Kanpur for well over a decade. It runs as 3 different courses, separated into different semesters depending on complexity and prerequisites:

* CS251 - Computing Laboratory - 1 - https://www.cse.iitk.ac.in/pages/CS251.html

* CS252 - Computing Laboratory - 2 - https://www.cse.iitk.ac.in/pages/CS252.html , and

* CS253 - Software Development And Operations - https://www.cse.iitk.ac.in/pages/CS253.html


Instead of smashing tools into a single class, they should be incorporated early and throughout the curriculum so they can aid students as they progress.

For example we have a sophomore fall class called Linux/Unix Programming that covers about half of these topics and a sophomore spring class called Open Source Software Development that covers the rest (and philosophy, history, etc.). We have an advantage for this sort of work, though, in that we’re not a prestigious research university, but instead a small teaching college that calls ourselves “professionally focused.” Meaning while we have a CS degree, we acknowledge that 90% of our graduates will become software developers. That approach means that while theory is still important, tools for software development are very intentionally incorporated as part of the core of what we teach. Think of it as a blend of CS and software engineering (which is also one of our concentrations for a deeper dive).


Excellent guide! Lol at title though, I just realized how spoiled I was with my CS degree from Drexel.

We had a UNIX tools and an Advanced Programming Techniques course. Both gold. Taught from the sausage dog book and the Unix Programming Environment book iirc. Totally standard texts I thought?

Link to sausage dog (ha!): https://en.wikipedia.org/wiki/The_Practice_of_Programming

Advanced Unix: https://en.wikipedia.org/wiki/Advanced_Programming_in_the_Un...

Some of it is old hat, but TPOP ages particularly well due to its lack of language specificity. Something we could all do with :).


I'm torn on this. As somebody who is too stupid to go to MIT, I chuckle thinking of spoon-feeding how to use Git or a terminal to MIT students. It's useful information, but not everything useful needs a sit-down, by-the-numbers, "this is how you branch" discussion. I also wonder if it does more harm than good teaching these topics so matter-of-factly, since it's only the current programming zeitgeist anyway. It only further entrenches the "way things are". Like depriving children of that impactful exploratory phase and sticking them into a rigid box. In any case, won't these topics propagate naturally from good professors using them themselves? What do I know, I'm too stupid to attend MIT.


If MIT students are really so smart then they can probably learn what to learn on their own. If they can't, then they aren't so smart.


The field has grown to the point that if you don't teach it top down no one will have the skills to be dangerous until the end of their degree. And projects that shorten the size of that stack still come out of MIT.


By a happy accident I did a 3-year master's program in India called the Research Assistant program. I was a sysadmin at the CS department for 3 years. I had to do a ton of stuff, like setting up and maintaining backups, installing tools such as DNS and mail servers, handling server failures, provisioning new servers, debugging networking failures by inspecting router connections, and so on and so forth. It was a lot of work but super fun, with lots of stories. This was ~20 years ago, so there were no SaaS tools etc.; one had to do almost everything in-house, and we had to use free/open-source tools to save money.

That experience taught me so much and I still use most of those learnings on a day to day basis.


Vim? Really? No text editor will make you a good programmer. I barely use/know Vim with 20 years of programming experience.


I think the point is to not leave yourself in a position where you're not proficient with any text editor at all.

This is an actual issue. Some students show up at college without ever having written a line of code and without knowing how to open a plain text file, make changes to it, and save it.

It doesn't matter that much which one you learn. If you already have an opinion about which one you like better, you're already where you need to be and can skip the section on editors. If you learn vim and don't like it, you can learn a different editor.


Having only very rarely needed to use vi/vim or emacs, I'd say it does very much matter which editor(s) you learn. If you learned BBEdit, DOS Edit, TeachText, or WordPerfect 1.0 for UNIX, you'll be lost in vi/vim or emacs.

What you think you know about "how editors work" will be actively harmful. I've seen and heard of people resorting to pressing the computer's reset switch when they can't figure out how to get out of a 'real' textmode editor whose behaviour defies all commonalities. And good luck getting to the man page if you can't get out of the editor.

When I've needed to use one of those, I fortunately had been provided exact steps or had a 'real user' who could help me. I've had a few goes at the man-pages ages ago (which were not installed on one or two systems!), but paging through them just made 'real' editors seem even more like "not a usable editor" to me. Too dainbramaged by having grown up with 'intuitive' editors that give more affordances.

[Edit: missing word]


The first thing I do on any Linux box is remove Vim in favour of nano.

Vim is like the Dvorak keyboard layout: consistency and standardisation are more important than a 1% performance improvement.


Wait - you actively remove vim? Or you alias nano to vim? Neither of those make sense to me. Why not leave it there and just not use it?

Every system I've ever been on has vi and nano and usually vim.


There’s a struggling developer on my team and this course covers everything he is having trouble with.

Writing performant code? He does a fine job.

Setting up an environment, navigating a couple terminals and getting code into revision control? He is completely lost. Someone basically has to do it for him.

I’m going to send this course material to his manager* and hopefully a few weeks from now the team stress levels will be greatly reduced.

*yes, company culture dictates that it would not be appropriate for me to flat out say to him “dude you suck and here’s a way to be better”.


That's funny. This is much of what I teach my own IT Product Development students in the first year course The Web of Things at my CS Department, Aarhus University. My goal for the course was precisely to introduce tooling and techniques, and to habituate the students to their use through the construction of systems integrating Web and IoT. It's a busy seven weeks, but students further along often tell me that they learned a lot and are still using the same tools.


At my university they taught me: tons of mathematics, electronics, automata theory, compilers, and a bunch of stuff that “I don’t use on my day to day”. They didn’t teach me debuggers, profiling, or bash.

I’m glad they did that, because I ended up learning bash, profiling, debugging, etc. all by myself. I would never have learned the pumping lemma by myself in my free time, for instance. So, all in all, I’m grateful my uni didn’t teach me “useful” stuff.


The course seems to advocate for vi, but when I Google: "vi", the response is: "Did you mean: emacs"

I don't know what to make of this. :-)


Purdue also had a CS190 tools course when I was enrolled that was pretty similar, surprisingly the course website is still up! https://courses.cs.purdue.edu/cs19000:fall13:lecture_1 (see lectures in sidebar). This course is actually how I found out about Hacker News!


Hope you liked the course! I was the one who first made and taught it. To my knowledge, it’s still a required course for all CS freshmen and is still taught by students.


I studied motion graphics and learning Adobe AfterEffects was predominantly left up to the students.

Nothing like fighting the software for hours the night before to animate something and, the next day, have my professor show me how it could be done in 15 minutes using a property or tool I didn't know existed.

It was not only assumed but explicitly stated that AfterEffects and Illustrator (which is commonly used alongside it) were too time-consuming to teach and we would just have to dive in on our own.

In the interest of saving others from that pain: https://youtube.com/playlist?list=PLjNI3J96cKVKmujglFYJEkHzZ...

A YouTuber (Jake in Motion) made a series of videos demonstrating every effect and tool in AfterEffects. They aren't deep but it's a fantastic resource to get you started.

I just wish this had been around when I needed it. Hats off to MIT for providing the same to others.


At university, my co-founders and I took over teaching a few modules. We were eager to find a larger pool of students that had useful skills for a company, rather than those versed in just theoretical knowledge and Java.

To provide the students with the skills they'd need in their career, we taught them how to use real world tooling: how to make a website from a design, how to use Git, how to write backend code, identifying security risks, how to use editors that weren't JEdit or Netbeans, how to use PhoneGap, how to deploy to a server, how to use Unix.

As a result of our training, we managed to get some great students on-board. No longer were we surrounded by students who could make some ServiceFactoryBean, but instead ones who were fully capable of making real things in a real company.

It's awesome to see that MIT has a similar programme - covering all the skills that we actually did teach. Too much of university is spent theorising and not spent making students employable.


I posted already in the thread but when I started college in 1993 we had a mandatory intro to computing environment class. At the time everything on campus was Unix based with a mix of Sun, DEC, HP, and IBM workstations.

The class is still called E115 and I found the online textbook. It has definitely changed in the last 30 years, and the MIT course is probably a lot more in-depth, but the concept is the same.

You have students coming from high school who may have only used Windows or Macs. I've heard that a lot of students who have got used to iPads and cloud storage don't even understand the concepts of files or file systems.

https://e115.engr.ncsu.edu/online-textbook/


When I was at University of Denver, this was exactly the kind of thing covered in a course called “Unix Tools” and it was hands-down the most valuable quarter I spent there. I had assumed it was a standard thing across CS programs.


Same thing at my state uni. We used https://www.oreilly.com/library/view/unix-power-tools/059600... as our textbook, and I still keep it on my shelf 20 years later.


My university (ANU in Australia) added a course very similar to this alongside their first-year courses. I didn’t get to see the results, but having tutored in the years before the course’s introduction, it was certainly… necessary.


The course videos: "Missing Semester IAP 2020" https://youtube.com/playlist?list=PLyzOVJj3bHQuloKGG59rS43e2...

Software Carpentry has some similar, OER lessons: https://software-carpentry.org/lessons/


The metaprogramming lecture [0] perpetuates the common misunderstanding of what “mocking” is [1], which is unfortunate.

[0] https://missing.csail.mit.edu/2020/metaprogramming/#a-brief-...

[1] https://martinfowler.com/articles/mocksArentStubs.html


At some point, we might need to accept that "mock" has become a generalised term which refers to many kinds of test-doubles. If it's being taught that way in universities and embodied that way in tooling, then the term has a new meaning.


This is great! I can remember how painful it was to figure this stuff out when I was a student. Even with what I know now, I still learned a few things from browsing through the material.


My education was at an EET school (a technical school for redneck, dirty-hands technicians). A trade school.

Very practical, and vocational, so they had stuff like this (but for E Techs).

Universities tend to be less "vocational," so they may skip some of the more practical stuff.

Also, developing a curriculum is a huge pain in the butt, and keeping it updated is even worse. Stuff like this tends to be highly topical, so it needs to be kept up to date with almost no lag.

But this is great.


There was a course, Software Development Environments, on the CS degree programme I studied that taught a lot of these things. I was very grateful for it.


I love the content, and I'd like to add another semester in the same tone about people and communication:

- how our brains work (in particular, emotions!)

- group behavior theory

- non-violent communication

- the scientific method

- design/ergonomics 101

- written communication (how to reach an understanding with someone through Slack?)


The data wrangling lesson was super enlightening! It's taught by Jon Gjengset, who is great at explaining stuff. I was super impressed with how much you can do directly from the command line.

I remember that after watching that lesson I straight up wrote two big bash scripts that let users download recorded lessons directly from WebEx and MS Teams.


At my uni this is a first-semester course called Introduction to Operating Systems, covering the command line, git, encryption, bash, and all that. It's also good experience for the second-year students who TA it, because quite a lot of infrastructure is needed to teach the labs (containers, configuring VMs, custom autograders).


Looking through the links, I noticed most of the course was video, so I immediately turned it off. Compared to text, videos are an inefficient learning tool, and I think a lot of people aren't aware of that. Actually, I hate video learning materials. Though it's better than going to a classroom.


Reminds me of CMU's 15-131 GREAT PRACTICAL IDEAS FOR COMPUTER SCIENTISTS. Reminiscent of the old CMU days.


Could have done with a small guide to how the filesystem works and what XDG dirs are, and stuff like that. A lot of newbies trip on these while figuring out how configuring anything on linux works, because it's a very alien concept if you're moving from windows.
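
For the unfamiliar, the convention boils down to a few environment variables with well-known fallbacks. A rough sketch (the defaults below are the XDG Base Directory spec's fallbacks; individual tools vary):

    echo "config: ${XDG_CONFIG_HOME:-$HOME/.config}"     # per-user configuration
    echo "data:   ${XDG_DATA_HOME:-$HOME/.local/share}"  # per-user application data
    echo "cache:  ${XDG_CACHE_HOME:-$HOME/.cache}"       # disposable cached files
    ls "${XDG_CONFIG_HOME:-$HOME/.config}"               # most CLI tools keep their config here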


We learned this exact stuff (okay, maybe not git, for obvious reasons) when I got my undergrad CS degree in the late 90s. I agree it's invaluable knowledge, but I don't think any reputable undergrad CS program is omitting this from their year 1 curriculum.


I've learned most of this stuff at VUSEC.

VUSEC: learn about systems security!

In your own time figure out:

- git

- vim (you wanna be cool right? Btw, use vimtutor in the commandline; see the sketch after this list ;-) )

- C (okay okay, it was a prerequisite, I didn't have it, haha, so learned some basic C in a week to do the assignments. I knew Java only at that point)

- ssh

- Many other things
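
A couple of zero-setup starting points for the list above (the host name is made up):

    vimtutor                      # interactive ~30-minute vim tutorial; ships with vim
    ssh-keygen -t ed25519         # generate a key pair once...
    ssh-copy-id user@lab-server   # ...copy the public key to a (hypothetical) course machine...
    ssh user@lab-server           # ...and log in without retyping your password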


This feels like the things I learned from the Gentoo Wiki in the early 2000s. (The wiki crashed without backups and all was lost.) It was a very meaningful foundation to my career and this page similarly will help people a ton.


Man, I forever wanted to have a class like this... but I also know it would have gone way over my head; I learn by solving problems I actually have. But knowing version control would have saved me so much time. I got by with judicious use of tar -czf.
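
For anyone in the same spot today, a minimal sketch of the jump from dated tarballs to git (paths and messages made up):

    tar -czf project-2023-02-25.tar.gz project/   # the tarball habit: one opaque snapshot per date

    cd project
    git init
    git add .
    git commit -m "first snapshot"   # from here on, snapshots are cheap and diffable
    git log --oneline                # lists every snapshot taken so far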



I've wanted to write a "what your comp sci program didn't teach you" book for so long, based on what I've learned and found useful only after graduating and working professionally.


If your first job is at a Windows shop, good luck with that Powershell prompt.


Does it really matter nowadays with Windows Terminal and WSL? With WSL2 you can even run GUI apps!
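
Roughly the setup, for anyone curious; a sketch assuming a recent Windows 11 build (the GUI app is just an arbitrary example):

    wsl --install                                # from an elevated Windows prompt: WSL2 plus a default Ubuntu

    # then, inside the WSL shell (WSLg puts Linux GUI apps in ordinary Windows windows):
    sudo apt update && sudo apt install -y gedit
    gedit &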


I teach various CS classes in college and have been trying to get this into the curriculum for a few years. It seems to be happening after the summer.

Software engineers are way more productive if they know their tools.


Odd. We were taught how to use either TECO or SOS on TOPS-10, and it was in our first term (British universities are term-based, not semester-based) in 1979.

They taught command scripting and code management later.

Debugging was a black art.


Although I commend the idea of practical learning, I wouldn't want to be graded on learning about bash or makefiles, sorry. Those things should be self-taught.


Caltech's mechanical engineering program did not teach you how to use machine tools. The courses were all math, more math, lotsa math, a heapin helpin of math, with a side dish of math.


One of my favorite projects. I was most receptive to Athalye's[1] manner of teaching.

1. https://github.com/anishathalye


The "data wrangling" chapter goes off the rails around the time it starts massaging the regex to handle the edge case "What if the user's username is 'Disconnected from'?" In my experience if you are using sed or awk on the logs you should give in and write some Python, or start using an actual log tool like Splunk. You'll never remember the awk syntax anyway.

On the other hand, the idea of piping your log data to a statistics package or gnuplot is fascinating. It could have benefited from showing some output from those programs to illustrate the kind of things you can get from it.
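
For readers who skipped the video, this is roughly the shape of the pipeline under discussion; a rough sketch, not the lecture's exact command (the journald unit name and the awk field offset below are assumptions, and their fragility is rather the point):

    journalctl --no-pager -u ssh \
      | grep 'Disconnected from' \
      | awk '{print $(NF-5)}' \
      | sort | uniq -c | sort -rn | head -n 10
    # the unit name (ssh vs sshd) and the field offset both vary by system --
    # exactly the kind of edge case the lecture ends up patching with sed

The counts that come out the end of that pipeline are what you would then feed to a statistics package or gnuplot.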


Wow I wish I had been taught any of that before my first job.


Went through the syllabus. The content is really useful since I spent most of my post-graduate years reading man pages for various debugging tools.


It's actually kinda horrifying how many professional software engineers have almost zero working knowledge of these tools.


This is great content. There is an entire generation of "Windows developers" that could use a course like this tailored to the MS stack.


From reading the comments here, I get the sense that most people on HN assume that a CS degree is just for entering the software industry.

Not to discount the initiative itself, but a bachelor's in CS is more about the core CS theory that will still be valid decades from now, unless the entire paradigm changes.


I was blessed with a badass AIX teacher. Thank you Coleman University for allowing hackers and AIX in your facility!


Is there anything about AIX that differentiates it or makes it exceptional over other UNIXes?

Not saying you're insinuating that. Just curious about AIX.


Yes, in many ways it is closer to Windows and IBM mainframe/micros semantics, even though it looks like UNIX on the surface.

For example, it uses COFF (although it also does ELF), the shared libraries also use export files, are private by default, and allow lazy loading on demand when symbols are touched for the first time, just like on Windows.

The TUI tooling to manage the OS is similar to what the other IBM platforms use.

There's not much more I can remember; the last time I used AIX was in 2002.


This is such a great resource. Definitely useful as a newish SWE.


I like how make gets just half a day.

make can be a career in and of itself.
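
Even half a day covers the one idea that matters, though. A minimal sketch (file names invented):

    printf 'int main(void) { return 0; }\n' > hello.c
    printf 'hello: hello.c\n\tcc -o hello hello.c\n' > Makefile   # recipe lines start with a literal tab
    make                     # first run: compiles hello
    make                     # second run: "make: 'hello' is up to date." -- nothing to rebuild
    touch hello.c && make    # a newer prerequisite triggers a rebuild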


All of mine are missing bar the first one


Hmm, my education must have been better than I give it credit: I definitely was taught all of these concepts at uni.


Which uni was this? Presumably you're a proud alum; even if you don't want to donate money to them, you might at least consider repping them in this context for having taught you this stuff. FWIW, many CS programs focus on the science end of things, like big-O notation, and utterly fail to discuss any day-to-day, practical applications.


I try to minimize the amount of personal info I share. The Hacker News community is full of smart people, and my thread history is triangulation data.

Wouldn’t say I’m a particularly proud alum. They dropped the ball on a wide swath of other topics that I would consider integral. For instance, distributed computing should be a more common undergrad elective, not something you get blindsided by when you get out of school. “Sorry, you said Who-bernetes?” Now, I get that Kubernetes is a relatively young project, as are Docker and other containerization tech, but it is nonetheless staggering how school felt like it was 10 years behind the ‘cutting edge’ when I was there. They seemingly had no interest in educating engineers to go out and join the web community unless that individual sought self-guided study.


Some programs manage to sneak in some of this material in an "operating systems" or "network programming" or other such class where you do more hands-on programming. More so in technical universities and CS Engineering degrees (at least this is the case in Europe).


I don't think it's so much a case of better or worse--but rather that some schools see a place in the CS curriculum for a Computers, Programming, and Tools 101 course and others do not--figuring you know a lot of this stuff or can pick it up.


Ah yes, using tmux so you can monitor your <endless search for twin primes>.
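
Joking aside, that is pretty much the pitch. A minimal sketch (the session name is invented):

    tmux new -s primes       # start a named session and kick off the long-running job
    # press Ctrl-b then d to detach; the job keeps running even if the ssh connection drops
    tmux ls                  # later, list the sessions that are still alive
    tmux attach -t primes    # reattach and check on those twin primes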


Except this is just software engineering and has nothing to do with computer science.


In approximately the same way that knowing how to type has nothing to do with computer science.


True, that also has nothing to do with computer science.


Nor does knowing how to read, yet all of those things are useful in studying computer science.


This was posted here a few weeks back.


Well, I’m glad I didn’t have a full semester dedicated to trivial tools and boomer software.



