
especially at the beginning of their career.

That's why you spend four years in a university before you start your career. If you're not going to learn all the fundamentals, you might as well go to an 8 week code camp and save your time.



But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful. So what do we do, have people do a 4 year degree, and then go spend 8 weeks, or 16 weeks, or a year, learning to actually build systems?

I don't know, maybe that is the answer. But I suspect the MIT guys have a point in terms of making some small concessions in favor of pragmatism. Of course, one acknowledges that college isn't meant to be a trade school... Hmmm... maybe there is no perfect answer.


> even 4 whole years isn't enough to learn "all the fundamentals"

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.


I disagree. The field has exploded. It's becoming more and more difficult to take vertical slices of every sub-field. What should we consider fundamental?

Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating systems, parallel and distributed programming, natural language processing, web design, network architecture, databases.

There are plenty of core fields that I've missed. Which are fundamental? Which do we teach? It simply isn't possible to adequately teach everyone the core fundamentals of all of these fields during the course of an undergraduate degree while also conveying the basic design fundamentals that a software developer needs to know. There is just too much in the field to expect every software developer to have a complete understanding of all of the fundamentals in every sub-field. Our field is getting a lot wider and a lot deeper, and with that, people's expertise is forced to narrow and deepen.


There is the actual complexity, and then there is the accidental complexity lamented by the poster to whom you responded. I would claim both are a thing. Especially in projects where the true complexity is not that great and the theoretical basis of the solution is not that well documented, people have a tendency to create these onion-layered monstrosities (the mudball effect).

If we look at just traditional CRUD-kind (i.e. database, view, domain logic) programs, these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.


> If we look at just traditional CRUD-kind (i.e. database, view, domain logic) programs, these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

Perhaps because this type of program is so old, it has had so much time for mud to stick to it. :-)


> What should we consider fundamental?

A fair question, and a full answer would be too long for a comment (though it would fit in a blog post, which I'll go ahead and write now since this seems to be an issue). But I'll take a whack at the TL;DR version here.

AI, ML, NLP, and web design are application areas, not fundamentals. (You didn't list computer graphics, computer vision, robotics, embedded systems -- all applications, not fundamentals.)

You can cover all the set theory and graph theory you need in a day. Most people get this in high school. The stuff you need is just not that complicated. You can safely skip category theory.

What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first. You need to understand what a fixed point is and why it matters.

You need automata theory, but again, the basics are really not that complicated. You need to know about Turing-completeness, and that in addition to Turing machines there are PDAs and FSAs. You need to know that TMs can do things that PDAs can't (like parse context-sensitive grammars), and that PDAs can do things that FSAs can't (like parse context-free grammars), and that FSAs can parse regular expressions, and that's all they can do.
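
To make the gap concrete, here is a tiny sketch (a toy illustration of mine, not anyone's curriculum) of the difference between FSA-level and PDA-level power:

    # A finite automaton has no memory beyond its current state, so a regex
    # handles a regular language like (ab)*, but counting nesting needs a
    # stack (here collapsed to a counter), i.e. PDA-level power.
    import re

    def is_regular_ab(s):
        # FSA-level: (ab)* is regular, so a regex recognizes it directly.
        return re.fullmatch(r"(ab)*", s) is not None

    def is_balanced(s):
        # PDA-level: balanced parentheses require tracking unbounded depth.
        depth = 0
        for ch in s:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
            if depth < 0:
                return False
        return depth == 0

    assert is_regular_ab("ababab")
    assert is_balanced("(()())") and not is_balanced("(()")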

You need some programming language theory. You need to know what a binding is, and that there are two types of bindings that matter: lexical and dynamic. You need to know what an environment is (a mapping between names and bindings) and how environments are chained. You need to know how evaluation and compilation are related, and the role that environments play in both processes. You need to know the difference between static and dynamic typing. You need to know how to compile a high-level language down to an RTL.
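
A toy sketch of the environment idea (my own illustration, not how any particular interpreter is implemented): an environment is just a frame of name->value bindings plus a pointer to the enclosing frame, and variable lookup walks that chain.

    class Env:
        def __init__(self, bindings, parent=None):
            self.bindings = dict(bindings)
            self.parent = parent              # enclosing (lexical) environment

        def lookup(self, name):
            env = self
            while env is not None:            # walk the chain of frames
                if name in env.bindings:
                    return env.bindings[name]
                env = env.parent
            raise NameError(name)

    global_env = Env({"x": 1})
    inner_env = Env({"y": 2}, parent=global_env)

    print(inner_env.lookup("y"))  # 2, found in the local frame
    print(inner_env.lookup("x"))  # 1, found by following the parent link

With lexical binding the parent link follows where the code was written; with dynamic binding, lookup would follow the call chain instead.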

For operating systems, you need to know what a process is, what a thread is, some of the ways in which parallel processes lead to problems, and some of the mechanisms for dealing with those problems, including the fact that some of those mechanisms require hardware support (e.g. atomic test-and-set instructions).
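
For instance (a hypothetical toy example using CPython's threading module), an unsynchronized read-modify-write on a shared counter can lose updates, and a lock -- which ultimately rests on those hardware atomic instructions -- restores correctness:

    import threading

    counter = 0
    lock = threading.Lock()

    def safe_increment(n):
        global counter
        for _ in range(n):
            with lock:            # critical section: one thread at a time
                counter += 1      # without the lock this read-modify-write can race

    threads = [threading.Thread(target=safe_increment, args=(100000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)  # 400000; drop the lock and the total may come up short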

You need a few basic data structures. Mainly what you need is to understand that what data structures are really all about is making associative maps that are more efficient for certain operations under certain circumstances.
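
To illustrate (a toy example of mine): the same name->value map backed by three structures, each cheap for a different operation.

    from bisect import bisect_left

    pairs = [("carol", 3), ("alice", 1), ("bob", 2)]

    # Unsorted list of pairs: O(n) lookup, but insertion is trivial.
    def assoc(key):
        return next(v for k, v in pairs if k == key)

    # Hash table: O(1) expected lookup, no useful ordering of keys.
    table = dict(pairs)

    # Sorted pairs + binary search: O(log n) lookup, plus ordered traversal.
    sorted_pairs = sorted(pairs)
    sorted_keys = [k for k, _ in sorted_pairs]
    def sorted_lookup(key):
        return sorted_pairs[bisect_left(sorted_keys, key)][1]

    print(assoc("bob"), table["bob"], sorted_lookup("bob"))  # 2 2 2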

You need a little bit of database knowledge. Again, what you really need to know is that what databases are really all about is dealing with the architectural differences between RAM and non-volatile storage, and that a lot of this is going away now that those architectural differences are starting to disappear.

That's really about it.


> You need automata theory... Turing-completeness... PDAs and FSAs...

Why? I know that stuff inside and out, and across multiple jobs I have used none of it, ever. What chance to use this favourite bit of my education did I miss? (Or rather, what chance might I have missed, so that you can speak to the general case?)

> You need to know how to compile a high-level language down to an RTL.

Why? Same comment as above.

> You need to understand what a fixed point is and why it matters.

Well, I don't, and I don't. I request a pointer to suitable study material, noting that googling this mostly points me to a mathematical definition which I suspect is related to, but distinct from, the definition you had in mind.

Otherwise... as I read down this thread I was all ready to disagree with you, but it turns out I'd jumped to a false conclusion, based on the SICP context of this thread. Literacy, what a pain in the butt.

In particular, the idea that ((the notion of an environment) is a fundamental/foundational concept) is a new idea to me, and immediately compelling. I did not learn this in my academic training, learned it since, and have found it to be very fruitful. Likewise with lexical vs dynamic binding, actually.


>> You need automata theory... Turing-completeness... PDAs and FSAs...

> Why?

So you can write parsers. To give a real-world example, so you can know immediately why trying to parse HTML with a regular expression can't possibly work.
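
A minimal illustration (my own toy example) of what goes wrong: a regex has no notion of nesting depth, so the "obvious" pattern pairs the outer open tag with the first close tag it sees.

    import re

    html = "<div>outer <div>inner</div> tail</div>"
    naive = re.search(r"<div>(.*?)</div>", html)
    print(naive.group(1))   # "outer <div>inner" -- the nesting is mangled

Matching tags at the right depth needs a stack, i.e. at least PDA-level power, which is exactly what the automata hierarchy tells you up front.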

>> You need to know how to compile a high-level language down to an RTL.

> Why?

So that when your compiler screws up you can look at its output, figure out what it did wrong, and fix it.
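
Not RTL exactly, but for a quick taste of reading a compiler's output (assuming CPython), the standard dis module will show you the bytecode a function compiles down to:

    import dis

    def clamp(x, lo, hi):
        return max(lo, min(x, hi))

    dis.dis(clamp)   # prints the LOAD/CALL/RETURN bytecode the compiler emitted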

>> You need to understand what a fixed point is and why it matters.

> Well, I don't, and I don't. I request a pointer to suitable study material

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

Particularly lectures 2A and 7A.
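
If it helps, here is the fixed-point idea from lecture 2A translated into rough Python (a sketch of mine, not the lecture's code): sqrt(x) is a fixed point of y -> x/y, and average damping makes the iteration converge.

    def fixed_point(f, guess=1.0, tolerance=1e-9):
        # Iterate f until the output stops moving, i.e. f(p) is (nearly) p.
        while True:
            nxt = f(guess)
            if abs(nxt - guess) < tolerance:
                return nxt
            guess = nxt

    def sqrt(x):
        # y -> x/y oscillates; averaging it with y (damping) makes it converge.
        return fixed_point(lambda y: (y + x / y) / 2)

    print(sqrt(2))  # ~1.4142135623730951

Lecture 7A then applies the same idea to the evaluator itself, which (as I recall) is where the "why it matters" part really lands.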


I'm of two minds about this. Everything you mention is great background to have. (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I think this deep background is a great goal. But in another way programming is a craft. You can learn as you go. There are millions of bright people who could do useful, quality work without an MIT-level CS education. They just need some guidance, structure, and encouragement.


> (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I wouldn't be so sure. I once applied to a company whose business was proving various safety properties of control and signalling applications (for trains). Sizeable applications. They have problems with the halting problem and combinatorial explosions, but they get around those and do it anyway.


A very vaguely related question: are bindings lexical or dynamic in R? Or would it be fair to say that it's actually both at the same time? Or do we need a new term altogether?

For those unfamiliar with it: in R, environments are first-class objects, and identifier lookup in them is dynamic. But the way those environments are chained for lookup purposes is defined by the lexical structure of the program (unless modified at runtime - which is possible, but unusual) - i.e. if a function's environment doesn't have a binding with a given name, the next environment that is inspected is the one from the enclosing function, and not from the caller. So R functions have closure semantics, despite the dynamic nature of bindings.

It would appear that this arrangement satisfies the requirements for both dynamic scoping and for lexical scoping.


> AI, ML, NLP, and web design are application areas

On first thought, I do agree. However, they are fundamental applications. Category Theory is categorizing the fundamentals. It uses a lot of the fundamentals on itself, I guess, but that doesn't mean to me it's inaccessible or useless.


Don't confuse "important" with "fundamental". He probably meant foundational to begin with.

The web for instance is an unholy mess. We can do better for mass electronic publishing. We don't because that huge pile of crap is already there, and it's so damn useful.


Web design IMHO is an extreme example of formatted output. I/O is a fundamental concept.


> What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically.

Yes please.


I don't know about your university, but mine had at least some coverage of all those categories.

At a minimum an education should give you a strong enough base that you can teach yourself those other things should you so desire.


> Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating systems, parallel and distributed programming, natural language processing, web design, network architecture, databases.

That was exactly the curriculum of my CS degree, minus the web design, and I didn't even go to a first-rate CS program like MIT (PennStateU, 20 yrs ago; no idea what the curriculum is now).


" It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn."

Thanks, you succeeded in verbalizing succinctly the agony of so many software projects. However, I would claim it's not just ignorance of the fundamentals. It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems. I call it cargo-cult progress.


> It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems.

Spolsky calls these 'architecture astronauts' (http://www.joelonsoftware.com/articles/fog0000000018.html). (As a mathematician, I have a certain leaning to this mindset, so I clarify that I am not criticising it, just reporting on existing terminology.)


It is absolutely essential to have a theory of the system while implementing it. The software system itself, however, should exhibit only a simple subset of the whole theory, in as concise a form as possible. That's because the full theory usually becomes obvious only while writing the software, and, in practice, one uses only a tiny part of it for the implementation at hand.

I think one facet of this thing is that people have an ambition to build huge systems that encompass the entire universe - while in fact, most software systems only need to utilize the already existing capabilities in the ecosystem.

It's as if, because people are eager tinkerers, they approach software projects with the mindset of miniature-railroad engineers - while in fact, the proper way to attack the task is to be as brutally simple as possible.

The reason huge theoretical systems are a problem in software engineering is that a) they tend to be isomorphic to things that already exist, b) while not implementing anything new, and c) they obfuscate the software system through the introduction of system-specific interfaces (e.g. they invent a new algebraic notation that is isomorphic in all respects to the well-known one). And the worst sin is that this method destroys profitability and value.


Amen.

How could they ever understand "simple and easy"? Their concept of simple is not based in reality.

There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.

And of course they produce hundreds or thousands of pages of "documentation". This is disrespectful toward the reader's time. The best documentation is and will always be the source code. If I cannot read that, then for my purposes there is no documentation.

This is not to say some of these old new thing higher-level of abstraction solutions do not work well or are not easy to use or do not look great. Many people love them, I'm sure. But until I see more common sense being applied it is just not worth my time to learn. I would rather focus on more basic skills that I know will always be useful.


No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental."

I think what we actually disagree about is just the exact definition of "fundamentals". I personally would not say "the fundamentals are very simple and easy to learn". At least not if you're learning them to the level of depth that I would want to learn them.

Now if we're just saying that people only need a very high level overview of "the fundamentals", then that might swing the equation back the other way. But that, to me, is exactly the knob that the MIT guys were twiddling... moving away from SICP and using Python doesn't mean they aren't still teaching fundamentals, it just sounds like they aren't going quite as deep.

Anyway, it's an analog continuum, not a binary / discrete thing. We (the collective we) will probably always be arguing about exactly where on that scale to be.


> I think what we actually disagree about is just the exact definition of "fundamentals".

That may well be, but as I am the one who first introduced the word into this conversation (https://news.ycombinator.com/item?id=11630205), my definition (https://news.ycombinator.com/item?id=11632368) is the one that applies here.


Fair enough... from that point of view, I think we agree more than we disagree.

And please don't take anything I'm saying here as an argument against learning fundamentals... all I'm really saying is that I can understand, even appreciate, the decision MIT made. Whether or not I entirely agree with it is a slightly different issue. But I will say that I don't think they should be excoriated over the decision.


> But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful

A minor point to make here: college isn't about learning practical skills; that's what trade schools and on-the-job training are for. College is about learning the fundamentals. However, computer science is not software engineering, and you don't need to learn computer science to be a good software engineer, because software engineers don't do much in the way of computer science anyway.


A minor point to make here: college isn't about learning practical skills;

Yes, on paper that is true. And I alluded to that point in my answer above (or at least I meant to). But from a more pragmatic and real-world point of view, colleges are at least partially about teaching practical skills. I don't know the exact percentage, but I'm pretty sure a large percentage of college students go in expecting to learn something that will directly enable them to get a job. Of course one can argue whether that's a good idea or not, and I don't really have a position on that. But I don't think we can ignore the phenomenon.



