Hacker News

I'm surprised and a bit dismayed to read Sussman's reasoning:

"...Sussman said that in the 80s and 90s, engineers built complex systems by combining simple and well-understood parts. The goal of SICP was to provide the abstraction language for reasoning about such systems.

Today, this is no longer the case. Sussman pointed out that engineers now routinely write code for complicated hardware that they don’t fully understand (and often can’t understand because of trade secrecy.) The same is true at the software level, since programming environments consist of gigantic libraries with enormous functionality. According to Sussman, his students spend most of their time reading manuals for these libraries to figure out how to stitch them together to get a job done. He said that programming today is 'More like science. You grab this piece of library and you poke at it. You write programs that poke it and see what it does. And you say, ‘Can I tweak it to do the thing I want?''. ... "

Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions? I'm absolutely baffled.



>Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions? What's going on over there?

Ordinary programmers everywhere are building those libraries, just like your assumed wunderkind are building programs out of other people's libraries. The nature of programming has changed for >everyone<.


Yes, but the reason it has changed for everyone is that no one understands the fundamentals any more, because they aren't being taught to anyone. The ecosystem is thus becoming infested by horrible hacks which kinda-sorta work, and which everyone uses, because they kinda sorta work, and there is nothing else. The idea that programmers need to understand how to "program by poking" thus becomes a self-fulfilling prophecy.

[UPDATE:] One of the symptoms of no one understanding the fundamentals is how excited people get about things like XML and JSON, both of which are just (bad) re-inventions of S-expressions.


The ecosystem is thus becoming infested by horrible hacks which kinda-sorta work, and which everyone uses, because they kinda sorta work, and there is nothing else.

Yes, but the reason it has changed for everyone is that no one understands the fundamentals any more, because they aren't being taught to anyone.

I mostly agree with both of these statements, but with a slight twist. To some extent, I feel like the reason that "no one understands the fundamentals any more" is simply because the stack has gotten too deep (or the field has gotten too large if you'd rather visualize it that way). That is, no one has time to learn everything all the way from NAND and NOR gates, up to 7400 series IC's, to microprocessors, to assembly language, to C (portable assembly), to operating systems internals, to TCP/IP, HTTP internals, to HTML, and ultimately to Javascript, and also including a side order of databases, filesystems, machine learning/AI, etc.

At some point people just have to start treating some lower layers as black-box abstractions so they can actually work. Of course it is always beneficial to try to learn some of the lower-level fundamentals, but I just don't see any way for everybody to have full knowledge of an entire application stack from end to end.

To me, the best you can do is include fundamentals in an "always keep learning" mindset. That is why, for example, I am still working on learning assembly language (x86/x64) in idle bits of spare time here and there, at the age of 42 and after 20+ years of programming. And why I still go to the hackerspace and build circuits from discrete components and low-level ICs for fun. But as it happens, for most of my career, not knowing assembly or how to assemble a microcomputer from individual ICs has not prevented me from getting useful stuff done.


> no one has time to learn everything

Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.

Here's a single, small, very accessible book that takes you all the way from switches to CPUs:

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.


Yes, learning the fundamentals is a huge lever. I absolutely agree. But I still stand by the assertion that "no one has time to learn everything" - especially at the beginning of their career.

As the old saying goes "if I had 3 days to cut down a tree, I'd spend the first 2.5 days sharpening my axe". Sure, but at some point you have to actually start chopping. By analogy, at some point you have to quit worrying about fundamentals and learn the "stuff" you need to actually build systems in the real world.

By and large I'm in favor of spending a lot of time on fundamentals, and being able to reason things out from first principles. And when I was younger, I thought that was enough. But the longer I do this stuff, and the larger the field grows, the more I have to concede that, for some people, some of the time, it's a smart tradeoff to spend more of their time on the "get stuff done" stuff.


especially at the beginning of their career.

That's why you spend four years in a university before you start your career. If you're not going to learn all the fundamentals, you might as well go to an 8 week code camp and save your time.


But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful. So what do we do, have people do a 4 year degree, and then go spend 8 weeks, or 16 weeks, or a year, learning to actually build systems?

I don't know, maybe that is the answer. But I suspect the MIT guys have a point in terms of making some small concessions in favor of pragmatism. Of course, one acknowledges that college isn't meant to be trade school... Hmmm... maybe there is no perfect answer.


> even 4 whole years isn't enough to learn "all the fundamentals"

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.


I disagree. The field has exploded. It's becoming more and more difficult to take vertical slices of every sub-field. What should we consider fundamental?

Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating system, parallel and distributed programming, natural language processing, web design, network architecture, databases.

There are plenty of core fields that I've missed. Which are fundamental? Which do we teach? It simply isn't possible to adequately teach everyone the core fundamentals of all of these fields during the course of an undergraduate degree while also conveying the basic design fundamentals that a software developer needs to know. There is just too much in the field to expect every software developer to have a complete understanding of all of the fundamentals in every sub-field. Our field is getting a lot wider and a lot deeper, and with that, people's expertise is forced to narrow and deepen.


There is the actual complexity, and then there is the accidental complexity lamented by the poster to whom you responded. I would claim both are a thing. Especially in projects where the true complexity is not that great and the theoretical basis of the solution is not that well documented, people have a tendency to create these onion-layered monstrosities (the mudball effect).

If we look at just traditional CRUD-type (i.e. database, view, domain logic) programs, these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.


> If we look at just traditional CRUD-type (i.e. database, view, domain logic) programs, these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

Perhaps because this type of program is so old, it had so much time to stick lots of mud on it. :-)


> What should we consider fundamental?

A fair question, and a full answer would be too long for a comment (though it would fit in a blog post, which I'll go ahead and write now since this seems to be an issue). But I'll take a whack at the TL;DR version here.

AI, ML, NLP, and web design are application areas, not fundamentals. (You didn't list computer graphics, computer vision, robotics, embedded systems -- all applications, not fundamentals.)

You can cover all the set theory and graph theory you need in a day. Most people get this in high school. The stuff you need is just not that complicated. You can safely skip category theory.

What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first. You need to understand what a fixed point is and why it matters.
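To make the fixed-point idea concrete, here is a minimal sketch in Python (the function names are mine, not from SICP): a fixed point of f is an x where f(x) = x, and for many functions you can find it simply by iterating f from a guess. The classic example is that sqrt(a) is a fixed point of x ↦ (x + a/x)/2.

```python
# A fixed point of f is an x where f(x) == x. For many functions,
# repeatedly applying f from an initial guess converges to one.

def fixed_point(f, guess, tol=1e-10):
    while True:
        nxt = f(guess)
        if abs(nxt - guess) < tol:
            return nxt
        guess = nxt

def sqrt(a):
    # sqrt(a) is a fixed point of x -> (x + a/x) / 2; averaging with
    # a/x damps the oscillation so the iteration converges.
    return fixed_point(lambda x: (x + a / x) / 2, 1.0)

print(sqrt(2))  # ~1.4142135623730951
```

The same iteration scheme underlies SICP's treatment of square roots, and fixed points over function spaces are how the meaning of recursive definitions is pinned down.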

You need automata theory, but again, the basics are really not that complicated. You need to know about Turing-completeness, and that in addition to Turing machines there are PDAs and FSAs. You need to know that TMs can do things that PDAs can't (like parse context-sensitive grammars), that PDAs can do things that FSAs can't (like parse context-free grammars), and that FSAs can parse regular expressions, and that's all they can do.
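The bottom of that hierarchy is easy to make concrete. A rough Python sketch (toy examples of mine, not from any curriculum): a regular property like "even number of a's" needs only a fixed set of states, while balanced parentheses need a stack, which is exactly the extra power a PDA has over an FSA.

```python
# FSA: "even number of a's" needs just two states (0 = even, 1 = odd).
def even_as(s):
    state = 0
    for ch in s:
        if ch == 'a':
            state ^= 1
    return state == 0

# PDA-style: balanced parentheses need a stack, because no fixed
# number of states can count arbitrarily deep nesting.
def balanced(s):
    stack = []
    for ch in s:
        if ch == '(':
            stack.append(ch)
        elif ch == ')':
            if not stack:
                return False
            stack.pop()
    return not stack

print(even_as("abab"), balanced("(()())"))  # True True
```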

You need some programming language theory. You need to know what a binding is, and that there are two types of bindings that matter: lexical and dynamic. You need to know what an environment is (a mapping between names and bindings) and how environments are chained. You need to know how evaluation and compilation are related, and the role that environments play in both processes. You need to know the difference between static and dynamic typing. You need to know how to compile a high-level language down to an RTL.
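A small Python sketch (my own example) of what lexical binding and environment chaining mean in practice: a free variable in a function is resolved through the environment where the function was defined, not the one at the call site.

```python
# Lexical binding: the free variable `n` inside add_n is looked up in
# the environment where add_n was *defined* (make_adder's frame),
# chained to the global environment -- never the caller's frame.

def make_adder(n):
    def add_n(x):
        return x + n      # `n` found via the defining environment chain
    return add_n

add5 = make_adder(5)
n = 100                   # a different `n` at the call site is ignored
print(add5(1))            # 6, not 101
```

Under dynamic binding, by contrast, the lookup would walk the callers' frames and `add5(1)` would see `n = 100`.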

For operating systems, you need to know what a process is, what a thread is, some of the ways in which parallel processes lead to problems, and some of the mechanisms for dealing with those problems, including the fact that some of those mechanisms require hardware support (e.g. atomic test-and-set instructions).

You need a few basic data structures. Mainly what you need is to understand that what data structures are really all about is making associative maps that are more efficient for certain operations under certain circumstances.
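A toy Python illustration of that framing: the same associative-map interface backed by two different structures, with different cost profiles for lookup.

```python
# Two implementations of the same map abstraction: a list of pairs
# costs O(n) per lookup; a hash table costs O(1) amortized.

pairs = [("a", 1), ("b", 2), ("c", 3)]

def alist_get(pairs, key):
    for k, v in pairs:        # linear scan
        if k == key:
            return v
    raise KeyError(key)

table = dict(pairs)           # hash-based map over the same data
print(alist_get(pairs, "b"), table["b"])  # 2 2
```

Balanced trees, tries, and so on are further points on the same trade-off curve (ordered iteration, prefix queries, worst-case bounds).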

You need a little bit of database knowledge. Again, what you really need to know is that what databases are really all about is dealing with the architectural differences between RAM and non-volatile storage, and that a lot of these are going away now that these architectural differences are starting to disappear.

That's really about it.


> You need automata theory... Turing-completeness... PDAs and FSAs...

Why? I know that stuff inside and out, and across multiple jobs I have used none of it ever. What chance to use this favourite bit of my education did I miss? (Or rather, might I have missed, so that you might speak to the general case)

> You need to know how to compile a high-level language down to an RTL.

Why? Same comment as above.

> You need to understand what a fixed point is and why it matters.

Well, I don't, and I don't. I request a pointer to suitable study material, noting that googling this mostly points me to a mathematical definition which I suspect is related to, but distinct from, the definition you had in mind.

Otherwise... as I read down this thread I was all ready to disagree with you, but it turns out I'd jumped to a false conclusion, based on the SICP context of this thread. Literacy, what a pain in the butt.

In particular, the idea that ((the notion of an environment) is a fundamental/foundational concept) is a new idea to me, and immediately compelling. I did not learn this in my academic training, learned it since, and have found it be very fruitful. Likewise with lexical vs dynamic binding, actually.


>> You need automata theory... Turing-completeness... PDAs and FSAs...

> Why?

So you can write parsers. To give a real-world example, so you can know immediately why trying to parse HTML with a regular expression can't possibly work.

>> You need to know how to compile a high-level language down to an RTL.

>Why?

So that when your compiler screws up you can look at its output, figure out what it did wrong, and fix it.

>> You need to understand what a fixed point is and why it matters.

> Well, I don't, and I don't. I request a pointer to suitable study material

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

Particularly lectures 2A and 7A.


I'm of two minds about this. Everything you mention is great background to have. (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I think this deep background is a great goal. But in another way programming is a craft. You can learn as you go. There are millions of bright people who could do useful, quality work without an MIT-level CS education. They just need some guidance, structure, and encouragement.


> (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I wouldn't be so sure. I once applied to a company whose business was proving various safety properties of control and signalling applications (for trains). Sizeable applications. They have problems with the halting problem and combinatorial explosions, but they get around those and do it anyway.


A very vaguely related question: are bindings lexical or dynamic in R? Or would it be fair to say that it's actually both at the same time? Or do we need a new term altogether?

For those unfamiliar with it: in R, environments are first-class objects, and identifier lookup in them is dynamic. But the way those environments are chained for lookup purposes is defined by the lexical structure of the program (unless modified at runtime, which is possible but unusual): if a function's environment doesn't have a binding with a given name, the next environment inspected is the one from the enclosing function, not from the caller. So R functions have closure semantics, despite the dynamic nature of bindings.

It would appear that this arrangement satisfies the requirements for both dynamic scoping and for lexical scoping.
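Python behaves analogously here, so the same lookup discipline can be sketched in it (my toy example): name resolution happens at call time, but the chain of environments searched is fixed by where the function was defined.

```python
# Names are resolved when the function runs (dynamically in time),
# but the chain of environments searched is fixed by where the
# function was defined (lexically in space).

def outer():
    x = 1
    def inner():
        return x          # resolved when inner() is called...
    x = 2                 # ...so it sees the later value
    return inner

f = outer()
x = 99                    # the caller's `x` is never consulted
print(f())                # 2
```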


> AI, ML, and NLP and webdesign are application areas

On first thought, I do agree. However, they are fundamental applications. Category Theory is categorizing the fundamentals. It uses a lot of the fundamentals on itself, I guess, but that doesn't mean to me that it's inaccessible or useless.


Don't confuse "important" with "fundamental". He probably meant foundational to begin with.

The web for instance is an unholy mess. We can do better for mass electronic publishing. We don't because that huge pile of crap is already there, and it's so damn useful.


Web design, IMHO, is an extreme example of formatted output. I/O is a fundamental concept.


>What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically.

Yes please.


I don't know about your university, but mine at least had some coverage of all those categories.

At a minimum an education should give you a strong enough base that you can teach yourself those other things should you so desire.


> Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating system, parallel and distributed programming, natural language processing, web design, network architecture, databases.

That was exactly the curriculum of my CS degree, minus the web design, and I didn't even go to a first rate CS program like MIT (PennStateU 20 yrs ago, no idea what the curriculum is now.).


" It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn."

Thanks, you succeeded in verbalizing succinctly the agony of so many software projects. However, I would claim it's just not ignorance of the fundamentals. It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems. I call it cargo-cult progress.


> It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems.

Spolsky calls these 'architecture astronauts' (http://www.joelonsoftware.com/articles/fog0000000018.html). (As a mathematician, I have a certain leaning to this mindset, so I clarify that I am not criticising it, just reporting on existing terminology.)


It is absolutely essential to have a theory of the system while implementing it. The software system itself, however, should exhibit only a simple subset of the whole theory, in as concise a form as possible. Because usually the full theory becomes obvious only while writing the software. And, in practice, one uses only a tiny part for the implementation at hand.

I think one facet of this thing is that people have an ambition to build huge systems that encompass the entire universe, while in fact most software systems only need to utilize the already existing capabilities in the ecosystem.

It's like, since people are eager tinkerers, they approach software projects with the mindset of miniature railroad engineers, while in fact the proper way to attack the task should be as brutally simple as possible.

The reason huge theoretical systems are a problem in software engineering is that a) they tend to be isomorphic with things already existing, b) while not implementing anything new, and c) they obfuscate the software system through the introduction of a system-specific interface (e.g. they invent a new notation of algebra that is isomorphic in all aspects to the well-known one). And the worst sin is, this method destroys profitability and value.


Amen.

How could they ever understand "simple and easy"? Their concept of simple is not based in reality.

There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.

And of course producing 100's or 1000's of pages of "documentation". This is disrespectful toward the reader's time. The best documentation is and will always be the source code. If I cannot read that, then for my purposes there is no documentation.

This is not to say some of these old new thing higher-level of abstraction solutions do not work well or are not easy to use or do not look great. Many people love them, I'm sure. But until I see more common sense being applied it is just not worth my time to learn. I would rather focus on more basic skills that I know will always be useful.


No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental."

I think what we actually disagree about is just the exact definition of "fundamentals". I personally would not say "the fundamentals are very simple and easy to learn". At least not if you're learning them to the level of depth that I would want to learn them.

Now if we're just saying that people only need a very high level overview of "the fundamentals", then that might swing the equation back the other way. But that, to me, is exactly the knob that the MIT guys were twiddling... moving away from SICP and using Python doesn't mean they aren't still teaching fundamentals, it just sounds like they aren't going quite as deep.

Anyway, it's an analog continuum, not a binary / discrete thing. We (the collective we) will probably always be arguing about exactly where on that scale to be.


> I think what we actually disagree about is just the exact definition of "fundamentals".

That may well be, but as I am the one who first introduced the word into this conversation (https://news.ycombinator.com/item?id=11630205), my definition (https://news.ycombinator.com/item?id=11632368) is the one that applies here.


Fair enough... from that point of view, I think we agree more than we disagree.

And please don't take anything I'm saying here as an argument against learning fundamentals.. all I'm really saying is that I can understand, even appreciate, the decision MIT made. Whether or not I entirely agree with it is a slightly different issue. But I will say that I don't think they should be excoriated over the decision.


> But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful

A minor point to make here: college isn't about learning practical skills; that's what trade schools and on-the-job training are for. College is about learning the fundamentals. However, computer science is not software engineering, and you don't need to learn computer science to be a good software engineer, because software engineers don't do much in the way of computer science anyway.


A minor point to make here: college isn't about learning practical skills;

Yes, on paper that is true. And I alluded to that point in my answer above (or at least I meant to). But from a more pragmatic, real-world point of view, colleges are at least partially about teaching practical skills. I don't know the exact percentage, but I'm pretty sure a large percentage of college students go in expecting to learn something that will directly enable them to get a job. Of course one can argue whether that's a good idea or not, and I don't really have a position on that. But I don't think we can ignore the phenomenon.


>> "no one has time to learn everything" - especially at the beginning of their career.

I wish I had this book at the beginning of my career. http://www.amazon.com/Elements-Computing-Systems-Building-Pr.... Makes you design the hardware, then the software for that hardware.

Should not take more than 8 - 12 weeks with school work/day job.


"no one has time to learn everything"

right. No one has time to learn endless js frameworks, injections and reinventions of the wheel. So (1) read the fucking SICP book, (2) learn about the business problem you are trying to solve, put 1 and 2 together and "get stuff done".


This is what co-op programs address. 5 year degree, learn all the fundamentals from silicon to applications, with 6 co-op placements of 4 months each interspersed throughout. That was the Waterloo formula when I went through their CS program and it works tremendously well. Sure, it's a lot to learn in 5 years, and there's always more to learn, but it does give you a very solid foundation to build on.


> SICP gets you from CPUs to Scheme

I don't recall anything about CPUs in SICP. It's more about data-driven programming and the writing of interpreters.

What I liked about SICP and Scheme programming was that it is a pretty good environment for tinkering: the REPL makes it easy to combine functions and to work in a bottom-up manner. (BTW, you had less of that in Common Lisp, and most other environments teach you to work in a top-down way; however, you can still work with the Python REPL.)

Maybe this bottom-up approach is what Sussman really has in mind when he is talking about first principles, because SICP is really not about working up from the atoms to the scheme interpreter/compiler.


> I don't recall anything about CPUs in SICP.

https://mitpress.mit.edu/sicp/full-text/book/book-Z-H-30.htm...


yes, the compiler target. I stand corrected.


I bought this book for my son who will be starting a CS program in the fall. He seems to have enjoyed it, and I'm hopeful it will give him a good grasp of the fundamentals that you might miss by starting out with Java.


> learn everything all the way from NAND and NOR gates, up to 7400 series IC's, to microprocessors, to assembly language, to C (portable assembly), to operating systems internals, to TCP/IP, HTTP internals, to HTML, and ultimately to Javascript, and also including a side order of databases, filesystems, machine learning/AI, etc.

It's called a Computer Engineering degree, sometimes called EECS (as at MIT). I did it and you can too. The Javascript and HTML were self-taught, admittedly, because they're the easiest parts of that list.


It's called a Computer Engineering degree, sometimes called EECS (as at MIT). I did it and you can too.

And by your own admission you didn't learn all the stack elements I listed above. And that was not in any way, shape, form, or fashion an exhaustive list!

Sure, if you finish a four year degree in CS/EE/EECS you learn a lot of stuff... and if you spend a big chunk of that four years on the really low level stuff, you have to tradeoff time spent on higher levels of the stack. You can only pour so much water in a 1 gallon bucket.

And even then, you only get the fundamentals at a certain level of depth. At some point, one has to ask "how important is it that I be able to go out, buy an 8088 chip, a memory controller chip, a floppy drive controller chip, etc., solder a motherboard together, code an OS in assembly, write my own text editor, etc., etc., etc."

Don't get me wrong. I'm not against teaching the fundamentals, and I'm not even sure I agree with MIT's decision on this. But I will say I can understand it and cede that it has some merit.

And all of that said, I'll go back to what I said earlier... to me, the important thing is to continue learning even after college, including going back to fundamentals like building circuits from individual transistors and what-not. That stuff has value, but there's no reason to think you can't be productive even without it.

I mean, if you think about it, every field eventually segments into layers where certain practitioners treat some things as a black box. Does an engineer building a car also need to be a metallurgist or materials scientist? No, he/she just grabs a book, and looks up the parameters for the correct material for the application at hand. Etc.


For those that want to learn that information in a structured way, but don't want to accumulate debt doing it, there are some online resources that are meant to be pretty good. I've heard good things about NAND2Tetris, for example, would be interested to hear if anyone here has given it a go.

http://nand2tetris.org/


It's not because no-one understands the fundamentals, it's that we require so much functionality to be built in such a short space of time and can afford so much processor time for it, that there's no time to build everything from fundamentals. Thirty years ago you'd build a text editor or modem control interface into your program. Today you need to embed an entire web browser, SQL database, AAA-level game engine etc. Most 'trivial' stitch-some-libraries-together software built today would take decades to create from scratch.


People were excited about JSON because there was a desperate need for a data format that wasn't overly complex and/or unsafe to parse.

S-expressions just weren't up to snuff because there's no standard way to do key-value mapping.


Sigh. This is exactly the kind of lack of understanding that I'm talking about.

> S-expressions just weren't up to snuff because there's no standard way to do key-value mapping.

That's not true. There are two standard ways to do this: a-lists and p-lists.
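For the unfamiliar, a sketch of the two encodings, mirrored in Python lists (the helper names follow the traditional Lisp operators, but the code is purely illustrative):

```python
# Two classic s-expression encodings of a key-value map:
# an a-list is ((k . v) ...); a p-list is (k v k v ...).

alist = [("name", "sicp"), ("year", 1985)]   # ((name . sicp) (year . 1985))
plist = ["name", "sicp", "year", 1985]       # (name sicp year 1985)

def assq(key, alist):
    # a-list lookup, like Lisp's assoc/assq
    for k, v in alist:
        if k == key:
            return v
    return None

def getf(plist, key):
    # p-list lookup, like Common Lisp's getf
    for i in range(0, len(plist), 2):
        if plist[i] == key:
            return plist[i + 1]
    return None

print(assq("year", alist), getf(plist, "year"))  # 1985 1985
```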


"Two standards" isn't much different from "no standard" in this context. JSON was successful because it had a simple spec that almost anyone could implement for any language with virtually no ambiguity. It's not ubiquitous because it's good, it's good because it's ubiquitous. It's another episode in the long history of "worse-is-better".


Formally, I think this to be a good list of fundamentals: programming paradigms, algorithms, data structures, compilers, operating systems, networking, math for CS.

What else would you recommend adding to the fundamentals list?


Digital logic and computer architecture.


Heh, it is also a consequence of the web's limitations (you can't easily replace JavaScript, or anything else, without breaking backwards compatibility) and its associated technologies. You can only poke the browser, not understand it.

I find it funny that only last year, with MVC scaffolding generators, I could do forms to access a database just as easily as I was doing them in FoxPro 20 years ago.

Yes, now it is client server, safe against common hacking attacks, responsive, etc. But it's been 20 years !

And the productivity valley between both points is abysmal.


I'm not sure that it's because no one understands the fundamentals any more. (Which, BTW, the latter part I agree with.) I think it's because everything's gotten too large. You need to use too many frameworks and too many libraries just to get anything done today, and there's simply too much code there (assuming you're even allowed to look at the source!) to comprehend it.

I don't like it, but I think it's an unavoidable consequence of computing's evolution into ubiquity.


That's because at some point the industry lost the plot.

The fundamentals are fundamental. They don't change very fast.

Applications change all the time. So do frameworks. But there's very little genuinely new in most frameworks, or in most languages for that matter. They're mostly repackagings of the same few ideas.

Which is why it's a lot easier to pick up applications and do a good job with them if you know the fundamentals than if you're hacking away without any contextual or historical understanding.

Meanwhile someone is going to have to do the next generation of pure research, and it's a lot harder to do something creative and interesting in CS if all you've ever known is js, Python, and Ruby.

The reality is that software quality is decreasing. Never mind maintainability or even documentation - applications are becoming increasingly buggy and unreliable.

It's common in the UK now for bank systems to crash. Ten years ago it was incredibly rare, and twenty years ago it was practically unthinkable.

Software is too important to be left to improvisation and busking. So "just learn to make applications from other applications" is not a welcome move.


I wonder how much of this is due to the idea: 1) non-technical users need to be able to use our software. How much code today is about preventing users from doing something they should know better? 2) we're less and less able to say "no, that goes way beyond the project's scope" to our bosses. Our bosses will quickly reply "yeah well Google's ______ does it, so why can't we?".


This is due to the idea that modern software developers don't care what they do, but care a lot about how they do it. They pay no attention to the business problem at hand (in my experience they get very upset when I try to draw their attention to the business requirements) but spend all their time debating which *-pad framework is the best candidate.


A few years ago, when MIT switched from SICP/Scheme to Python, Sussman had this to say:

"I asked him whether he thought that the shift in the nature of a typical programmer’s world minimizes the relevancy of the themes and principles embodied in scheme. His response was an emphatic ‘no’; in the general case, those core ideas and principles that scheme and SICP have helped to spread for so many years are just as important as they ever were"

From: https://cemerick.com/2009/03/24/why-mit-now-uses-python-inst...

If anything, I would think Sussman is being practical, and understands what the world needs and expects now.

Literally any programmer who hasn't read SICP before will benefit from it. I think the principles still apply.


In the context of the class (The Structure and Interpretation of Computer Programs), I feel like this is not such an outlandish view. It sounds to me like the field of software engineering has simply evolved since the 80s.

Don't get me wrong, if you are going for post-graduate studies such a course will always be relevant, but it sounds like he is talking within the context of undergraduates. And in that context, I too would be skeptical of how useful it would be for preparing you for your first job as a software engineer.

Their choice to go toward a Python-based course at the undergraduate level would also seem to reaffirm this view from afar...


> It sounds to me like the field of software engineering has simply evolved since the 80s.

What is ridiculous in the face of this "programming by experimentation" fantasy is that programming has evolved since the 1980s... to be even more about composable abstractions with provable semantics. Hindley-Milner-Damas types and monads are now everywhere.


> Hindley-Milner-Damas types and monads are now everywhere.

Haven't run into those. Perhaps I know them by a different name?


Most likely you have. Or simply something very heavily inspired by either.


Can you expand on the last sentence. I'm not sure I understand what you were trying to express. (not trolling, genuinely curious)


The application of mathematical type theory (https://en.wikipedia.org/wiki/System_F) to popular programming languages goes back to 1998, when Philip Wadler and colleagues designed GJ, the basis for Java generics.

Local type inference is now used in Visual Basic, Scala, Rust, probably a lot of other new languages I am missing. Gradual types are coming to Clojure and probably Python and Lua.
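To make "local type inference" concrete, here's a minimal sketch using Java's `var` (which arrived in Java 10; class and variable names here are invented for illustration). The compiler infers each static type from the initializer, so the code stays fully statically typed without the annotations:

```java
import java.util.HashMap;
import java.util.List;

public class InferenceDemo {
    public static void main(String[] args) {
        var words = List.of("monad", "functor");      // inferred: List<String>
        var lengths = new HashMap<String, Integer>(); // inferred: HashMap<String, Integer>
        for (var w : words) {                         // inferred: String
            lengths.put(w, w.length());
        }
        System.out.println(lengths.get("monad")); // prints 5
    }
}
```

Scala, Rust, and C++ (`auto`) all do essentially the same thing: the type is resolved at compile time, not at runtime.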

Erik Meijer did a lot of work on bringing monads and FRP as patterns to .NET programmers. Java 8 has monads (the Optional and Stream types). Bartosz Milewski has been getting a lot of attention in C++ circles (see his blog: https://bartoszmilewski.com/).
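As a sketch of the monad claim: Java 8's `Optional.flatMap` plays the role of monadic bind, so lookups that may fail compose without any null checks. The maps and method names below are invented for illustration, and `Map.of` is Java 9+ syntax used for brevity:

```java
import java.util.Map;
import java.util.Optional;

public class OptionalDemo {
    static final Map<String, String> CAPITALS = Map.of("France", "Paris");
    static final Map<String, Integer> POPULATION = Map.of("Paris", 2_100_000);

    // Optional.ofNullable lifts a possibly-null Map lookup into the Optional monad.
    static Optional<String> capitalOf(String country) {
        return Optional.ofNullable(CAPITALS.get(country));
    }

    static Optional<Integer> populationOf(String city) {
        return Optional.ofNullable(POPULATION.get(city));
    }

    public static void main(String[] args) {
        // flatMap chains the two partial lookups; a missing key anywhere
        // short-circuits the whole chain to Optional.empty().
        int pop = capitalOf("France").flatMap(OptionalDemo::populationOf).orElse(0);
        System.out.println(pop); // prints 2100000

        int none = capitalOf("Atlantis").flatMap(OptionalDemo::populationOf).orElse(0);
        System.out.println(none); // prints 0
    }
}
```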


Its application to unpopular languages goes back farther, of course.


Great, thanks for the clarification.


MIT grad here: Nope. I've never had a workday that didn't involve reading through the docs or source of a library I didn't write. I am genuinely grateful to have taken 6.01, the course that replaced 6.001.

I am also grateful to have taken the condensed version of 6.001. You do need the ability to understand those abstractions in order to be an informed shopper though.


Your assumption about who writes libraries and who uses whose code is broken.


I think he meant to say "should".


The point still stands if you add a couple of shoulds to my response as well.


>Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions?

"MIT" doesn't necessarily mean "super-elite programmer". I work in an office that's half MIT grads, and the non-MIT half are pretty much equivalently good (though with much worse connections around Cambridge). That's not to say MIT sucks or anything, but more to say that with luck, a really solid CS or EECS degree gets the student up to being able to build important components from scratch at all, which isn't necessarily the level needed to build those components from scratch for public release or for profit. That latter goal requires a good education followed by professional training and experience.


>libraries that ordinary programmers are stitching together

I watched the SICP videos, and I remember Abelson specifically endorsing just that.


Calling it his "reasoning" with all the connotations that come with that word goes way too far.

This was his polite, implicit criticism of the new core, which among other things also teaches much less in the way of EE fundamentals, a topic he's cared about very much since at least the late '70s (e.g., 6.002 is no longer a required course for all EECS majors).

The bottom line is that in the post-dot-com-crash panic, which saw MIT EECS enrollment drop by more than half after it had been steady at 40% of undergraduates for decades, one result was "MIT" deciding that what an MIT EECS degree meant would be substantially different.


It may be that it just took seven years to actually get a new course in place, but it wasn't until fall 2007 that MIT officially dropped 6.001 as a required course, well after the dot-com crash.

There were a TON of changes to the MIT EECS curriculum at that time, so perhaps it was a holistic response to the dot-com crash that went beyond just 6.001.


I think "panicked spasm" is more accurate than "holistic response", at least in connotation, but as I recall the only real changes were in what was required, terminating the use of Scheme in the entire required curriculum with extreme prejudice, and adding, what, 6.005? Where they claimed to teach most of what was in 6.001/SICP, but using Java, a language which is "not even wrong" for that purpose.

Yeah, MIT has become a Javaschool, plus Python....

And just when the failure of Dennard scaling was making functional programming a lot more important.


No, there were many changes to the curriculum, and for a variety for reasons.

They created 6.01, which serves as an "intro to EECS", so it covers both EE and CS, as opposed to the old intro, 6.001, which was really an introduction to CS. Some argue that this course is easier (I never took it, so I can't say), which could be seen as making EECS a little more gentle and open in response to the dot-com crash.

They also broke up two very difficult courses, 6.046 (introduction to algorithms) and 6.170 (lab for computer science), and moved at least some of their coursework into new courses. Again, this could be seen as making the entire major a little gentler.

They also changed requirements. In the past, a lot of CS-focused students who were uninterested in doing any EE were choosing to major in 18.c (math with computer science) to avoid an EE course load. The department rightly thought it was a little crazy that people were leaving the CS department in order to focus more on CS, so they lightened the required EE courses for 6.3 (the major focused on CS) and vice versa.

This is all my recollection from the 07-09 era and from talking to some students since. There could be some errors in details.


> And just when the failure of Dennard scaling was making functional programming a lot more important.

Yep, there's the real irony. Having functional programming skills and experience is a real asset in today's job market, I've found.


I took 6.002x when MITx was first launched. I don't know how similar it is to 6.002, but if it's still available, I highly recommend it.



