In the good old days, you had to learn assembly/machine language, C, and OS-level programming to get anything done. Even if you mostly worked on applications, you'd drop down and do something useful; at the time, that meant writing machine-language routines to call from BASIC. It's still a practical skill: I mostly work in Scheme, but use the C FFI to hook into native functionality, and debug in lldb.
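The exact incantation depends on which Scheme you use; as a rough sketch, here's what hooking into libm looks like with GNU Guile's (system foreign) module (Chez, Chicken, etc. each spell their FFI differently):

```scheme
;; Sketch: calling a C function from Scheme via the FFI (GNU Guile).
(use-modules (system foreign))

;; Open the C math library (the name is platform-dependent; "libm" works on GNU/Linux).
(define libm (dynamic-link "libm"))

;; Wrap the C function  double cos(double)  as an ordinary Scheme procedure.
(define c-cos
  (pointer->procedure double                    ; return type
                      (dynamic-func "cos" libm) ; function pointer
                      (list double)))           ; argument types

(display (c-cos 0.0))  ; => 1.0
(newline)
```

Other implementations look different, but the idea is the same everywhere: declare the C signature and you get back a plain Scheme procedure you can call like any other.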
Computer Science is supposed to be more math than practice, though when I took it we also did low-level graphics (BIOS calls and framebuffers), OS implementation, and other useful skills. These days almost all CS courses are job training, with no theory and no implementation work.
Younger programmers typically have no experience below the application language (Java, C#, Python, PHP) they work in, and only those with extensive CS degrees will ever see a C compiler. Even the shell, filesystems, and simple toolchains like Make are lost arts.
The MIT Missing Semester covers some of the middle and upper levels of that stack, but there's no real training at the digital-logic-to-OS levels.
I will totally check this out, thanks for the reference!