Why I like Java (plover.com)
339 points by snupples on March 25, 2014 | hide | past | favorite | 425 comments


This is one of those criticisms that sounds right if you know a little bit about the topic, and is certainly clever enough in and of itself to make the reader think that it's making a good point.

But it's not making a good point. It's not even close. I wish people would stop voting it up.

- "In Java, you can forget about doing it in the cleanest or the best way, because that is impossible." -- cleanest in any language? Or cleanest for Java? Of course the "cleanest" way for Java is possible, and it does matter -- there is a huge difference between "clean" Java and unclean, and anyone who says otherwise is a Java pretender. And arguing about "cleanest for any language" is just a proxy for a language flamewar.

- "And nobody can glare at you and demand to know why you used 576 classes when you should have used 50, because in Java doing it with only 50 classes is probably impossible." -- The "Har-har-so-many-classes-and-don't-even-get-me-started-on-variable-names" argument. In Java, as in any language, you can over-design, or you can under-design, or you can design just what you need. The number of classes may end up higher than in other languages, but this argument is silly and tired, and yes, length is one of the tradeoffs of Java. An IDE (gasp!) should make that largely irrelevant.

- "The code was ridiculously verbose, of course, but that was not my fault. It was all out of my hands." -- This word "verbose" keeps being thrown around as an insult, but nobody brings up the tradeoffs. Verbosity has a purpose, just like brevity and terseness do. And of course it's never out of the programmers' hands. Once given a language, it's all relative.

There's more, but I'm arguing with a person who believes that a question about stdin and stdout is a proper gateway for measuring skill toward any programming problem ever.


I took his post as a fairly standard "worse is better" argument. Many newer or more fashionable languages enable some very elegant programming styles, but this comes with a concern about the elegance of one's code, which can easily result in the programmer spending more time thinking about elegance than about functionality. For example... should I use map or reduce here, or maybe an iterator, or... oh fuck it, I'll use a for loop. It turns out that the old-school for loop works just as well despite earning you precisely zero style points.

There's actually something quite liberating about languages that deny you any clever solutions. Just write code that works and don't worry about whether you could have used <fashionable programming concept X> instead. I found Go to be quite refreshing for the same reason - the standard library is pretty small and the language lacks anything particularly magical, even some stuff that Java has like generics. The end result, however, is code that is very easy to read and write for anyone who learned imperative programming in the last 30 years.
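For what it's worth, Java 8 (which shipped the same month as this thread) offers both styles; a toy sketch of the same transformation written both ways, with method names of my own invention:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class Doubler {
    // The "elegant" route: map over a stream (Java 8+).
    static List<Integer> viaStream(List<Integer> xs) {
        return xs.stream().map(x -> x * 2).collect(Collectors.toList());
    }

    // The zero-style-points route: a plain for loop.
    static List<Integer> viaLoop(List<Integer> xs) {
        List<Integer> out = new ArrayList<>();
        for (int x : xs) {
            out.add(x * 2);
        }
        return out;
    }

    public static void main(String[] args) {
        List<Integer> xs = Arrays.asList(1, 2, 3);
        System.out.println(viaStream(xs).equals(viaLoop(xs))); // true
    }
}
```

Both produce the same list; the argument in this thread is only about which one you should feel obliged to reach for.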


> Many newer or more fashionable languages enable some very elegant programming styles, but this comes with a concern about the elegance of one's code, which can easily result in the programmer spending more time thinking about elegance than about functionality.

That's a common anti-intellectual fallacy about why people code in higher level languages. High level languages are attractive because they let you write code faster and make your code easier to maintain, and not because they're "fashionable".

>There's actually something quite liberating about languages that deny you any clever solutions

To me, "deny[ing] clever solutions" translates to "this code is going to be a pain in the ass to write because I'm going to waste time fighting the language to get what I want." Programmers learn to avoid unnecessarily complicated code as they gain more experience. Mature programmers should have the freedom to exercise their judgement about what kind of code is appropriate.


> That's a common anti-intellectual fallacy about why people code in higher level languages. High level languages are attractive because they let you write code faster and make your code easier to maintain, and not because they're "fashionable".

Writing code faster is not my problem. The problem is building systems (that work) faster, and this sometimes requires higher-level abstractions that maybe allow for more modularity, that maybe are more composable, or that maybe are safer to use (I'm specifically thinking here of concurrency, parallelism, asynchronous I/O, etc.). Higher-level abstractions are my goal when learning a new language (that, or gaining access to another platform).

"Clever solutions" is indeed an unwarranted euphemism that people use. But we end up having such discussions precisely because we aren't defining well what we are talking about. Writing less code is a subjective problem. Not being able to build and/or work with a certain abstraction is an objective problem.


>High level languages are attractive because they let you write code faster and make your code easier to maintain, and not because they're "fashionable".

Maybe, maybe not. But at least 2 of the 3 points that you made are not true of JavaScript. Also, the whole "write code faster" thing has always been perplexing to me. The speed at which you write code rarely differs enough between two mainstream languages to really matter in the end, especially since the lifetime of a piece of software is dominated by maintenance.

And I don't know what you mean by "High level language". Java is a high level language.

>To me, "deny[ing] clever solutions" translates to "this code is going to be a pain in the ass to write because I'm going to waste time fighting the language to get what I want."

There's clever and then there's too clever.


"Also, the whole "write code faster" has always been perplexing to me. The speed at which you write code is never significantly different between any two mainstream languages to really matter in the end, especially since the lifetime of a piece of software is dominated by maintenance."

One of the very few bits of relatively solid software engineering that we have is that line count for a given task does matter. Fewer lines written by the programmer to do the same thing strongly tends to yield higher productivity. (Note the "by the programmer" clause; lines autogenerated... well... correctly autogenerated tend not to count against the programmer, which is an important part of doing Java, or so I hear.)

And remember, if this were not true, we'd be programming entirely differently; why do anything but assembler if line count doesn't matter? You might be tempted to think that's some sort of rhetorical exaggeration, since it sort of sounds like one, but it's not; it's a very serious point. If line counts were actually irrelevant, we'd never have bothered with high-level languages, which until fairly recently have had the primary purpose of doing a whole bunch of things that, in the end, reduce line count.

(Slowly but surely we're starting to see the mainstream introduction and use of languages that also focus on helping you maintain invariants, but that has still historically speaking been a sideline task and niche products.)


> Note the "by the programmer" clause; lines autogenerated... well... correctly autogenerated tend not to count against the programmer, which is an important part of doing Java, or so I hear.

Sorry, but unless you have an architect dissect the problem to exhaustion and freeze the architecture afterward, no piece of software creates the correct autogenerated code, and those lines still have to be changed by the programmer. Several times.

And if you do have an architect dissect the problem to exhaustion and freeze the architecture afterward, that's already a bigger problem than dealing with all that autogenerated code. No win.


If your autogenerated files have to be changed by the programmer, they're not autogenerated files, they're templates.


Do you have a citation for that?

Because I'd totally buy that the number of expressions matters.

But I really doubt actual lines matters much.


In theory, I have a citation. There have been actual studies done that show roughly equal productivity in several languages as measured by lines of those languages. However, I can't google them up through all the noise of people complaining about line counts being used for various things. And I phrased it as "one of the very few bits of relatively solid software engineering" on purpose... that phrase isn't really high praise. You can quibble all day about the precise details, not least of which is the age of the studies in question.

Still, I do stick by my original point... if you think lines of code are irrelevant, it becomes very difficult to understand the current landscape of language popularity. A language in which simply reading a line from a file is a multi-dozen line travesty is harder to use than a language in which it's two or three, and that extends on through the rest of the language. I know when I go from a language where certain patterns are easy into a higher B&D language where the right thing is a lot more work, I have to fight the urge to skip the lot-more-work, and this higher level "how expensive is it to use the correct pattern?" is a far more important, if harder to describe, consideration across non-trivial code bases.
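The gap shows up even within Java's own history. A sketch of reading the first line of a file, written in the classic manual-cleanup idiom and then with try-with-resources (Java 7+); the method names are mine:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class FirstLine {
    // The classic idiom: wrap, read, and close by hand in a finally block.
    static String verbose(String path) throws IOException {
        BufferedReader reader = null;
        try {
            reader = new BufferedReader(new FileReader(path));
            return reader.readLine();
        } finally {
            if (reader != null) {
                reader.close();
            }
        }
    }

    // Java 7+: try-with-resources closes the reader automatically.
    static String terse(String path) throws IOException {
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            return reader.readLine();
        }
    }
}
```

Same behavior, fewer lines, and less opportunity to get the cleanup path wrong.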


I wasn't attacking your comment. Just curious about the citation since it doesn't intuitively sit right with me I guess.


That's a distinction without a difference.

1) How do you count "expressions"? is (b+sqrt(b * b - 4 * a * c))/(2 * a) one expression or 14?

2) Assuming reasonable coding style and reasonable definition for what an "expression" is, the variance of the measurement "expressions per line" will be very small - thus, "number of expressions" and "number of lines" are statistically equivalent as far as descriptive power goes.

I don't have a citation, although I do remember this conclusion mentioned in PeopleWare - specifically, that "number of bugs per line" tends to be a low variance statistic per person, with the programming language playing a minor role. I might be wrong though.

But I can offer my personal related experience ("anecdata"?) - when you ask multiple people to estimate project complexity using a "time to complete" measure, you get widely varying results that are hard to reason about. However, when you ask them to estimate "lines of code", you get much more consistent results, and meaningful arguments when two people try to reach an agreement. YMMV.


I feel like you probably haven't coded IO in a language like C# (in earlier versions, anyway) or Java if you think I'm playing semantics.

Expressions are distinct from Compositions, and both influence LOC. I wouldn't suspect that Java software is of generally lower quality than Ruby code on average for example even though in Java you might see a Reader around a Buffer around a Stream instead of Ruby's `open`.
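The layering in question, sketched with the standard java.io decorators (the helper method and its name are mine):

```java
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class DecoratorChain {
    // A Reader around a buffer around a byte stream: each layer adds one
    // concern (character decoding, buffering, raw bytes) at the cost of
    // one more constructor call.
    static String firstLine(String path) throws IOException {
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(
                    new FileInputStream(path), StandardCharsets.UTF_8))) {
            return in.readLine();
        }
    }
}
```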

I guess what I'm getting at is what you might loosely call boiler-plate. Java has a lot more boiler plate. Which could easily result in 2X higher LOC. Having worked with more Ruby than the average bear I feel very confident being skeptical of the assertion that Ruby libraries are generally of higher quality/fewer bugs.

I think your last anecdote is more getting into Five Whys territory, and it's probably reasonable to expect a greater degree of consensus then.

Final note: Scala is typically less verbose than Ruby by a fair margin (at least if you leave off imports). Idiomatic usage is also functional to a significant degree, in a way that no Ruby library I've ever seen comes close to. So does that automatically mean that Scala is the superior language? (Well of course it is ;-D, but is that the reason?)


The question is simple, and it's about math and statistics.

How do you count lines? On unix, "wc -l"; if you insist, sloccount, but "wc -l" is a good approximation.

How do you count expressions? The fact it will take you a few paragraphs to answer (you haven't, btw) indicates that it's a poor thing to measure and try to reason about.

I've done some IO code in C# (mostly WCF, but not just), and I still think you are playing with semantics as far as statistics is concerned.

Figure out an objective, automatable way to count your "expressions" or "compositions" or "code points" or "functional points" or whatever you want to call it. Run it on a code base, and compute the Pearson r coefficient of correlation. It's likely to be >95%, which means one is an excellent approximation of the other.

And I have no idea what you were trying to say about Scala. I wasn't saying "terser is automatically better". I was saying (and I'm quoting myself here) that "number of bugs per line" tends to be a low variance statistic per person, with the programming language playing a minor role. Note "per person"?


So backwards first I guess. "per person". Ok. But given the range of programmers I guess that's not an incredible surprise. Yes the person is more important than the language. I'd buy that.

I guess "expression" seems semi-obvious to me since it's a standard rule in SBT. Variable assignments, return values and function bodies might get close.

  val a = 1 + 1
That would be an expression. Instantiating a DTO with a dozen fields, using keyword arguments and newlines between for clarity would be a single expression to me.

An if/else with a simple switch for the return value would be an expression for example. A more complex else case might have nested expressions though.

It takes some charity I suppose; one of those "I know it when I see it" things. I don't do a lot of Math based programming though. It's all business rules, DTOs, serialization, etc. So maybe not something that could be formalized too easily.

I guess where I'd intuitively disagree (and would be interested in further reading) is that LOC as a measure just doesn't feel like it works for me.

Considering only LOC to implement a task it's likely: Java, Ruby and Scala in that order (from most to fewest). But in my personal experience bugs are probably more like: Ruby, Java, Scala from most to fewest.

Hopefully that helps clarify and not just muddy what I'm trying to express further.

What confuses me is that you appear to be claiming that fewer LOC should correlate strongly with fewer bugs, but then go on to say that terser is not automatically better (in this context (sic?)). Maybe I'm reading more into it than you intend, but I'm left a bit confused.


> one of those "I know it when I see it" things.

Which is a confusing use of the term "expression", since it is very well defined when talking about languages - in fact, most formal grammars have a nonterminal called "expr" or "expression" when describing the language.

Your description, though, more closely correlates with what most languages consider a statement.

Regardless, it's just pure statistics - if you calculate it, you'll notice that you have e.g. 1.3 expressions per line, with a standard deviation of 1 expression per line - which means that over 1000 lines, you'll have, with 95% confidence, 1200-1400 expressions -- it wouldn't matter if you measure LOC or "expressions".
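Taking those illustrative numbers at face value (the 1.3 mean and the standard deviation are assumptions from the comment, not measurements), the arithmetic is just the square-root-of-n rule for a sum of independent lines:

```java
public class ExprPerLine {
    // 95% confidence interval for total expressions over n lines,
    // assuming independent lines with the given mean and sd per line.
    static double[] interval95(double meanPerLine, double sdPerLine, int n) {
        double mean = meanPerLine * n;          // the mean scales with n
        double sd = sdPerLine * Math.sqrt(n);   // the sd scales with sqrt(n)
        return new double[] { mean - 1.96 * sd, mean + 1.96 * sd };
    }

    public static void main(String[] args) {
        double[] ci = interval95(1.3, 1.0, 1000);
        // Roughly 1238 to 1362: in the 1200-1400 ballpark quoted above.
        System.out.printf("%.0f - %.0f%n", ci[0], ci[1]);
    }
}
```

Either way, over any sizable code base the two counts track each other closely, which is the point being made.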

> What confuses me is that you appear to be claiming that fewer LOC should correlate strongly with fewer bugs, but then go on to say that terser is not automatically better (in this context (sic?)). Maybe I'm reading more into it than you intend, but I'm left a bit confused.

What I'm claiming is that, when people actually measured this, they found out that a given programmer tends to have a nearly constant number of bugs per line, regardless of language - that is, person X tends to have (on average) one bug per 100 lines, whether those lines are C, Fortran or Basic - the variance per programmer is way larger than the variance of that programmer per language.

Now, PeopleWare which references those studies (where I read about that) was written 20 years ago or so - so the Java or C++ considered wasn't today's Java/C++, things like Scala and Ruby were not considered. However, I'd be surprised if they significantly change the results - because those studies DID include Lisp, which -- even 20 years ago -- had everything to offer that you can get from Scala today.

So, in a sense - yes, you should write terse programs, regardless of which language you do that in. If you wrote assembly code using Scala syntax, and compiled with a Scala compiler - Scala is not helping you one bit.


No, his point is that you don't have to fight the language. The course in the language is obvious, it's just long and kind of ugly. It's wading, not fighting.

My problem is that the solution in a slightly more expressive language is equally obvious, so I have to fight my own rage at java for its stuttering and clumsy excuse for closures. Fighting the language, on the other hand, is when I try to write a conditional or loop in sh.


> Many newer or more fashionable languages enable some very elegant programming styles, but this comes with a concern about the elegance of one's code, which can easily result in the programmer spending more time thinking about elegance than about functionality.

This isn't a bad tradeoff, though. Elegance generally means gains in maintainability, possibly with lesser costs in actual development. And thinking about elegance in code is the first step to writing better, more maintainable code.

> For example... should I use map or reduce here, or maybe an iterator, or... oh fuck it, I'll use a for loop. It turns out that the old-school for loop works just as well despite earning you precisely zero style points.

I usually stick with for loops because they are clear. Remember: elegance is about simplicity and clarity. If you sacrifice either of these, you are reducing the elegance of your code.


The "worse is better" argument is in the context of Unix and C and cannot be separated from that context, otherwise it is meaningless.

And a lot of thought went into Unix, as evidenced by its longevity and the long-lasting tradition of its philosophy. To date it's the oldest family of operating systems and, at the same time, the most popular. Anybody who thinks the "worse" in the "worse is better" argument is about not caring is in for a surprise: http://en.wikipedia.org/wiki/Unix_philosophy

Even in the original comparison to CLOS/Lisp Machines outlined by Richard Gabriel, he mentions this important difference (versus the MIT/Stanford style): it is slightly better to be simple than correct.

But again, simplicity is not about not caring about design or the implementation; in fact the "worse is better" approach strongly emphasizes readable/understandable implementations. And simplicity is actually freaking hard to achieve, because simplicity doesn't mean "easy"; it is the opposite of entanglement/interweaving: http://www.infoq.com/presentations/Simple-Made-Easy


"Worse is better" can easily be separated from that context, though I would admit that most people do it incorrectly.

"Worse is better" is, ultimately, an argument against perfectionism. Many of the features of Unix could have been implemented in a "better" way, and these ways were known to people working at the time. But it turns out that those "better" options are much more difficult to implement, harder to get right and are ultimately counter-productive to the goal of delivering software that works. We can set up clear, logical arguments as to why doing things the Unix way is worse than doing things another way (e.g. how Lisp Machines would do it), but it turns out that the Unix approach is just more effective. Basically, although we can invent aesthetic or philosophical standards of correctness for programs, actually trying to follow these in the real world is dangerous (beyond a certain point, anyway).

I think that's pretty similar to the OP's argument that, whilst Haskell is clearly a superior language to Java in many respects, writing code properly in Haskell is much harder than doing so in Java because, probably for entirely cultural reasons, a programmer working with Haskell feels a greater need to write the "correct" program rather than the one that just works. Java gives the programmer an excuse to abandon perfectionism, producing code that is "worse" but an outcome that is "better".

I think I know what you're getting at, which is that a comparison between Unix and the monstrous IDE-generated Java bloatware described in the OP is insulting to Unix. On this you are correct. But for "worse is better" to be meaningful, there still has to be some recognition that, yes, Unix really is worse than the ideal. Unix isn't the best thing that could ever possibly exist, it's just the best thing that the people at the time could build, and nobody has ever come up with a better alternative.


I do not agree. "Worse is better" emphasizes simplicity, for example the emphasis on separation of concerns: building components that do one thing and do it well. It's actually easier to design monolithic systems than it is to build independent components that are interconnected. Unix itself suffered because in places it made compromises to its philosophy; it's a good thing that Plan9 exists, with some of its concepts ending up in Unix anyway (e.g. the procfs comes from Plan9). And again, simplicity is not the same thing as easiness.

> Haskell is clearly a superior language to Java in many respects, writing code properly in Haskell is much harder than doing so in Java

I do not agree on your assessment. Haskell is harder to write because ALL the concepts involved are extremely unfamiliar to everybody. Java is learned in school. Java is everywhere. Developers are exposed to Java or Java-like languages.

OOP and class-based design, including all the design patterns in the gang of four, seem easy to you or to most people, because we've been exposed to them ever since we started to learn programming.

Haskell is also great, but it is not clearly superior to Java. That's another point I disagree on, the jury is still out on that one - as language choice is important, but it's less important than everything else combined (libraries, tools, ecosystem and so on).


I think Worse is Better can be used by either side. You seem to be on the "Worse" side, ie. the UNIX/C/Java side, and claim the moral of WIB to be that perfect is the enemy of good. That's a perfectly fair argument.

However, on the "Better" side, ie. the LISP/Haskell side, the moral of WIB is that time-to-market is hugely important. It's not that the "Better" side was bogged-down in philosophical nuance and was chasing an unattainable perfectionism; it's that their solutions took a bit longer to implement. For example, according to Wikipedia C came out in '72 and Scheme came out in '75. Scheme is clearly influenced by philosophy and perfectionism, but it's also a solid language with clear goals.

The problem is that Scheme and C were both trying to solve the 'decent high-level language' problem, but since C came out first, fewer people cared about Scheme when it eventually came out. In the mean time they'd moved on to tackling the 'null pointer dereference in C problem', the 'buffer overflow in C' problem, the 'unterminated strings in C' problem, and so on. Even though Scheme doesn't have these problems, it also doesn't solve them "in C", so it was too difficult to switch to.

Of course, this is a massive simplification and there have been many other high level languages before and since, but it illustrates the other side of the argument: if your system solves a problem, people will work around far more crappiness than you might think.

More modern examples are Web apps (especially in the early days), Flash, Silverlight, etc. and possibly the Web itself.


> The problem is that Scheme and C were both trying to solve the 'decent high-level language' problem, but since C came out first, fewer people cared about Scheme when it eventually came out. In the mean time they'd moved on to tackling the 'null pointer dereference in C problem', the 'buffer overflow in C' problem, the 'unterminated strings in C' problem, and so on. Even though Scheme doesn't have these problems, it also doesn't solve them "in C", so it was too difficult to switch to.

C is quite odd in that the programmer is expected to pay dearly for their mistakes, rather than be protected from them. BTW it wouldn't be as much fun if they were protected.

Regarding Scheme, it has withstood the test of nearly forty years very well.


C is unique because it's really easy to mentally compile C code into assembler. Scheme is more "magical".

The more I learn about assembler, the more I appreciate how C deals with dirty work like calling conventions, register allocation, and computing struct member offsets, while still giving you control of the machine.

On the other hand, some processor primitives like carry bits are annoyingly absent from the C language.


My understanding was that C did not have tremendous adoption by '75.


> For example... should I use map or reduce here, or maybe an iterator, or... oh fuck it, I'll use a for loop.

Use the least powerful tool that solves the problem. A map is less powerful than a fold (reduce), so use a map if it suffices. A fold is in turn probably less powerful than a for loop, so prefer a fold when it suffices. Likewise, an iterator is less powerful than an indexed for loop, so use an iterator when you only need streamlined access to the elements rather than a fine-grained, indexed loop.
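In Java 8 stream terms the ladder looks roughly like this (method names are mine; the point is only that reduce threads an accumulator through the elements while map cannot):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class PowerLadder {
    // map: one output per input, no state carried between elements.
    static List<Integer> squares(List<Integer> xs) {
        return xs.stream().map(x -> x * x).collect(Collectors.toList());
    }

    // fold (reduce): strictly more powerful, since it threads an
    // accumulator through the elements and can express map, filter, etc.
    static int sum(List<Integer> xs) {
        return xs.stream().reduce(0, Integer::sum);
    }

    public static void main(String[] args) {
        List<Integer> xs = Arrays.asList(1, 2, 3, 4);
        System.out.println(squares(xs) + " " + sum(xs)); // [1, 4, 9, 16] 10
    }
}
```

Reading `squares`, you know immediately that no element can affect another; reading a reduce or a for loop, you have to check.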


But I know how to write a for loop in 15 different languages, and all of those other solutions work differently! If I have a problem that I can solve with a for loop, I should probably just use a for loop and move on to bigger problems, rather than remind myself exactly how iterators work in $LANGUAGE.


But I know how to write a GOTO in 15 different machine code dialects, all of those other solutions work differently! If I can solve with a GOTO, I should probably just use a GOTO and move on to bigger problems, rather than remind myself exactly how loops work in $LANGUAGE.


And the point is?

Goto has a problem: it breaks the structure of the code. For does not have this problem. Thus, you should avoid goto, and there is no reason to avoid for. The fact that you know how to write both changes what?
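Incidentally, Java reserves `goto` as a keyword but never implemented it; the closest thing it offers is the constrained, structured jump of a labeled break. A small sketch:

```java
public class LabeledBreak {
    // Find the first negative cell, escaping both loops at once.
    static int firstNegative(int[][] grid) {
        int found = 0;
        search:
        for (int[] row : grid) {
            for (int cell : row) {
                if (cell < 0) {
                    found = cell;
                    break search; // a single, named, forward-only jump target
                }
            }
        }
        return found;
    }

    public static void main(String[] args) {
        System.out.println(firstNegative(new int[][] {{1, 2}, {3, -7}, {5, 6}})); // -7
    }
}
```

Unlike a raw goto, the jump target here is tied to a specific enclosing loop, so the control flow stays readable.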


Actually goto doesn't have a problem, and it wouldn't matter in modern languages that deny access to that low-level instruction.

In BASIC you can use your labels and goto statements to create a hierarchy of dispatching routines. And if you're very disciplined about which global variables you use as flags to control the execution flow, i.e. which goto statement gets selected next, and which variables hold return values, you can write a decent program.

The danger lies in that it allows the programmer too much range to produce low-quality code. The negative effect includes too much attention to the implementation of abstractions instead of their use. The existence of goto statements can undermine other abstractions offered by the programming language.

Goto doesn't break the structure of the code, programmers do. (I guess that's the whole reason we stopped using it: reducing the risk of making crappy software.)


Certainly. Use of goto to implement structured programming is structured programming, but if you're implementing control flow structures that are provided by your language anyway why are you bothering to use goto? The result will be slightly less readable and slightly less maintainable. There remain a few places where a commonly used language doesn't implement the control structure that we'd want to use and goto can be a reasonable choice - the most common example is using goto to implement (limited) exception handling in C.


The point Dijkstra was trying to make is that humans are inherently incapable of dealing with that kind of detailed complexity, and still reliably make useful programs. That's why he proposed that goto should be excluded from all higher-level programming languages.

In my comment structured programming refers to using structured syntax to generate goto statements, so you don't have to see them or implement them yourself. It should free the programmer of considering those alternative ways of controlling program flow. Presence of goto statement points to a flaw in the language design.

To answer your question: Because the basic language I'm referring to, was on my TI84-plus calculator, and it only had an if-statement (no if-else!).


"The point Dijkstra was trying to make is that humans are inherently incapable of dealing with that kind of detailed complexity, and still reliably make useful programs. That's why he proposed that goto should be excluded from all higher-level programming languages."

There is a very simple isomorphism between each of the typical control structures (sequencing, choice, and iteration) and its implementation with gotos. It's an easy mechanical translation, in either direction. I don't think Dijkstra was making any claim that spelling these control structures with goto radically increased the difficulty of programming. The important thing was using reasonable control structures (and only reasonable control structures) in the design of your program. Obviously, having the language do it for you is preferred much like any other mechanical translation - but that's not the key point.

"It should free the programmer of considering those alternative ways of controlling program flow. Presence of goto statement points to a flaw in the language design."

I don't disagree with any of that.

"Because the basic language I'm referring to, was on my TI84-plus calculator, and it only had an if-statement (no if-else!)."

That's still an example of using goto to implement missing control structures, not using goto when the control structure you want is present.


The point is that different languages have different features and even those with similar features may place different emphasis on which to use. By writing code for the 'lowest common denominator' like this, you're missing the advantages of whichever language you're using.

The most obvious symptom would be with libraries; even though GOTOs, loops, recursion and morphisms are equivalent, if most libraries expose APIs in a different style than your code, you'll have to sprinkle translation code all over the place.

It also makes a difference for static analysis, eg. in IDEs and linters. For example, Iterators might give you more precise type hints or unhandled-exception warnings, which makes that style safer (all else being equal).

Of course, the other point is that there are no such 'lowest common denominators'. GOTO certainly isn't, since Python, Java, etc. don't support it. For loops aren't, since languages like Prolog, Haskell, Erlang and Maude don't support them. Recursion isn't supported in COBOL or OpenCL, and took several decades to appear in FORTRAN. Morphisms require first-class functions, which rules out FORTRAN and C, and until very recently C++ and Java. It may or may not be possible to build such features from the other ones, but even so that's clearly working against the language and causing massive incompatibility with everyone else.

Clinging to particular features like this will only blinker us to the possibilities which are out there. In this case, clinging to for loops implies avoidance of at least Erlang, Haskell and Prolog. These languages have a lot to offer, and are/look-to-become the go-to solutions* in the domains of concurrent, high-confidence and logical/deductive code respectively. Clinging to inappropriate concepts dooms us to 'reinventing the square wheel'.

* No pun intended ;)

PS: I actually think recursion and morphisms are safer, more structured and conceptually simpler than for-loops. See http://lambda-the-ultimate.org/node/4683#comment-74319


You have a point about least powerful tools (simplicity) but there is also clarity.

For loops have the advantage, particularly in languages like Perl, of being quite clear.


> In Java, as in any language, you can over-design, or you can under-design, or you can design just what you need

One important thing to note with any language is that the culture matters - a lot. For instance, Tcl and Tk let people easily create GUIs, but there was no UI culture. Most of the main books didn't dedicate even a page to how to make a good looking, functional GUI. You can produce some godawful messy code in Ruby if you put your mind to it, but there's a culture of not trying to be too clever, and also of writing stuff that's easy(ish) to read later.

Since I'm not really a Java guy at all, I don't know enough to comment in an authoritative way, but I sneak a glance from time to time, and my guess is that there may be a bit of a culture of over-designing things.


I almost made a similar point but didn't want to be too long-winded. But you're exactly right on. Certain idioms develop in every language. In Java there's a culture of favoring verbosity over cleverness. In fact, I think you'll find some Java programmers who hate the cleverness that is often common in other more terse languages. And they'll throw "clever" around as an insult much the way "verbose" is used against Java.

I think overall these cultures don't indicate too much of an underlying problem and aren't too harmful, except when assimilating developers from other languages. But to your point, these barriers to entry likely exist for nearly every language (and if not, other barriers do), so on balance, it's largely a wash.


It's also quite the mystery how this culture happened, given that it's the norm to use big libraries for dependency injection, AOP, transaction management, protocol wiring, data binding and so on - big and complex libraries that often rely on bytecode manipulation at runtime to work around the verbosity problem or other deficiencies in the language itself.

I think that the Java ecosystem being so massive and Java being so popular, it also attracted a lot of mediocrity, since the usual bell curve applies. That said, I know Java developers that definitely don't like verbosity and that don't follow the usual "best practices" as some kind of dogma. In the end, it all depends on the work environment you end up in.


> It's also quite the mystery how this culture happened,

(I assume that you are not being sarcastic - my sarcasm detector did not go off. In case I missed the sarcasm, apologies).

It is not at all a mystery, if you had been there in the late '90s / early 2000s: Java was actively marketed to management as a language that's usable by mediocre programmers to produce usable results; "See, unlike C/C++, there are no buffer overflows, no dangling pointers, no undefined behavior, no platform dependencies, no memory leaks - your projects are in bad shape now because of these, but when you switch to Java, all of these problems will be a thing of the past". And those same people who bought the marketing hype were also mostly believers in the "programmer is a replaceable cog" religion, which is still prevalent today in many places (not just Java), and during the .com days was actually somewhat truthy, in the sense that people switched jobs so fast that you had to build your culture in such a way that people were replaceable, because programmer turnover at many companies was exceeding 20%/year.

Today's Java culture is rooted in those days, and it has an underlying "make it painful to not be stupid-simple" directive, and "make it easy to replace the programmer" requirement. Though these are often satisfied, my impression is that overall, it does not make projects more likely to deliver (on time or at all), or higher quality. But it is what it is.


Your story is about how Java culture developed its anti-intellectualism. I think there's some truth in it.

But I think there's another significant factor. Java came as a successor, of sorts, to C++. Early Java programmers were former C++ programmers, and its designers had C++ experience (as well as Smalltalk, Lisp, etc. experience). As a result, many features of Java are responses to pain felt in C++. Feeling the pain of memory leaks and use-after-frees? Garbage collection. Feeling the pain of segfaults and incomprehensible pointer chains? No pointer arithmetic (the party line for years was that Java had no pointers, only references). Feeling the pain of baffling symbol use in APIs? No operator overloading. Feeling the pain of aeons-long build times? Compilation to bytecode, with linking left to runtime.

I believe this pain response carried over into the culture. The pain in C++ didn't just come from the language, but from the fact that people did clever things with it (template metaprogramming had been invented - discovered? summoned? - in 1994). There was a strong association between cleverness and pain. Cleverness was therefore taboo in the brave new pain-free world of Java.

There was another point in the grandparent, and in the original post, about Java culture's tradition of verbosity. I think this is also a post-C++ thing. In C++, it takes a lot of writing to get anything done. In Java, it takes a bit less. You feel like you're moving faster when you write Java. So, as an ex-C++ programmer who has the endurance to write a lot of code, and is feeling the exhilaration of moving fast, what do you do? Write as much code as possible.

There has certainly been a huge shift back towards concision in Java, but it's not something that's completely permeated the community. There are people today still learning Java from books written by people who learned in the '90s and who internalised that logorrheic style. We'll probably never move entirely beyond it.


> It's also quite the mystery how this culture happened

Maybe the language didn't have powerful enough tools in its core? If getters/setters weren't mandatory (as they aren't in Scala), wouldn't we write more tiny expressive classes to hold intermediate values?

Think about this example of Dependency Injection: Why did we have to use DI? Because using statics as a dependency injector (like in Play) prevents unit testing. How could we have solved it? By allowing inheritance override on statics. We don't have it, therefore we have Spring.
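A minimal sketch of that testability point, using invented names (Mailer, Greeter): a dependency reached through a static call can't be overridden in a unit test, because Java has no override for statics, but passing the dependency through the constructor gives the test a seam without any framework at all:

```java
// Hypothetical example: the dependency expressed as an interface,
// so a test can substitute a fake without bytecode tricks.
interface Mailer {
    void send(String to, String body);
}

// "Real" implementation, stubbed out here.
class SmtpMailer implements Mailer {
    public void send(String to, String body) { /* talk to SMTP... */ }
}

class Greeter {
    private final Mailer mailer;

    Greeter(Mailer mailer) {   // constructor injection: the whole "framework"
        this.mailer = mailer;
    }

    String greet(String name) {
        String msg = "Hello, " + name;
        mailer.send(name, msg); // the side effect goes through the seam
        return msg;
    }
}

public class DiSketch {
    public static void main(String[] args) {
        // In a test, hand in a fake that just records the call:
        final StringBuilder log = new StringBuilder();
        Mailer fake = new Mailer() {
            public void send(String to, String body) { log.append(body); }
        };
        Greeter g = new Greeter(fake);
        System.out.println(g.greet("world"));
        System.out.println(log);
    }
}
```

Had Greeter called SmtpMailer.send(...) statically, the test above would have no way to intercept the call - which is the gap frameworks like Spring fill.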

Even the Servlet API is criticized for being inexpressive [1]. They've done it so wrong that every single framework reimplements URL routing (Struts, Spring MVC...), resulting in those horrible /myserviceaction.do patterns. In turn, those patterns preclude RESTful APIs, prompting yet another framework for REST resources. We just became Enterprise [2].

I personally don't wonder why we became verbose. It's understandable, since Java was built using community processes.

[1] http://blog.lunatech.com/2011/12/08/wrong-with-servlet-api [2] https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris...


Because it's hard to add layers of abstraction later in Java, experienced developers learn to add a bunch of extra ones early so they'll be there if you need them?


No. It is inexperienced programmers that add abstraction everywhere.

Abstraction has a huge mental cost (I ignore the execution speed cost, because it is mostly irrelevant these days for most projects).

You need more than one case to abstract over. Abstracting "just in case" is, much more often than not, very leaky - that is, requiring you to keep in mind the details of both the abstraction and the abstracted - and requires revision when you actually need it.

Abstraction is not magic, not cost free, although Java culture is to assume that it is both. (Same criticism for C++, BTW; C culture tends to err the other way -- which, in my experience, is the better error to make)


>In Java there's a culture of favoring verbosity over cleverness.

I've been programming in Java (gulp) all my professional life of about a decade. I can totally attest to this.

Just a couple of days ago in one of the code reviews I gave a lengthy explanation for using StringUtils.equals("foo", someVariable) over "foo".equals(someVariable). I could have instead just said what you've stated; prefer verbosity over cleverness :)


I am a Java programmer and I like the Java culture. And I see nothing wrong with "foo".equals(someVariable).

If a code review ends with a lengthy explanation of why not to use "foo".equals(someVariable), or some other issue that doesn't affect maintenance, readability, effectiveness, or any other important consideration, I see something wrong with the code review.


I agree... StringUtils is awesome, but "foo".equals(someVariable) is not a case where its use is important.


In this case the code review is actually the reviewer trying to boost their ego and stamp their authority on the reviewee.



And there's the culture I was talking about! I'm not a fan of stuff that's too weird and magical, but "foo".equals(bar) is shorter and just as clear to me, if not more so.


"foo".equals(bar) is definitely clear and shorter. However, it is "clever" in the sense that it circumvents an NPE with some sort of trick. Those who are new to Java don't get why and how it avoids the NPE. To make matters worse, they may even reverse the arguments for better readability, bar.equals("foo"), which is a ticking NPE time bomb!

StringUtils.equals(a, b) on the other hand clearly states in its interface document that it is null safe and there's no NPE trap.

There, I said it :). That was my lengthy argument for using StringUtils.equals!
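A small sketch of the trap being described. Here nullSafeEquals is a local stand-in mimicking the null-handling that Apache Commons' StringUtils.equals documents (both null is true, exactly one null is false), so the example needs no external jar:

```java
public class NullSafeEquals {
    // Stand-in for StringUtils.equals' documented null behaviour:
    // both-null -> true, one-null -> false, otherwise delegate to equals.
    static boolean nullSafeEquals(String a, String b) {
        return a == null ? b == null : a.equals(b);
    }

    public static void main(String[] args) {
        String bar = null;

        // Safe: a String literal is never null, so no NPE.
        System.out.println("foo".equals(bar));           // false

        // Also null-safe, and says so explicitly:
        System.out.println(nullSafeEquals("foo", bar));  // false
        System.out.println(nullSafeEquals(null, null));  // true

        // The "more readable" reversal is the time bomb:
        try {
            System.out.println(bar.equals("foo"));
        } catch (NullPointerException e) {
            System.out.println("NPE: bar was null");
        }
    }
}
```

The point of the argument above is that the third form fails only at runtime, and only when bar happens to be null.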


Of course, StringUtils is not in the core API; the first idiom "foo".equals(bar) works in Java everywhere, StringUtils.equals("foo", bar) will only work in projects that pull in org.apache.commons. Personally I prefer using facilities in the core API over third party APIs, even if the third party API is included anyway for some essential function.


I've been coding in Java for over 10 years. If I ran across pretty much any project that wasn't including Apache commons I'd be startled. I get your point, but some libraries are in such common use that for all practical purposes I consider them part of the language API.


Understanding that there is the possibility of an NPE, and reading enough of the StringUtils contract to understand that it deals with nulls, requires approximately the same level of "cleverness" as understanding the "foo".equals(bar) behavior.

Code reviews should focus on more important things than this, IMO.


But the problem with your use of StringUtils, IMHO, is that it hides intent. If bar is expected to potentially be null, test for null. If bar should not be null, let it throw an NPE, and then fix the calling code.

Making code that shouldn't need to be "null safe" masks errors, and ultimately makes them harder to track down.


As you may have guessed from my previous comment, I hate it when people write "foo".equals(bar). This kind of a clever trick is an accident waiting to happen unless you're really, really ok with null being a valid value for bar (and more often than not, you're not ok with that). If null is not a valid value for bar, then it's better to catch it and deal with it as early as possible.


Wow, sounds like a productive code review.


By the way, nobody should ever use "foo".equals(someVariable) anyway. If anything, it should be someVariable.equals("foo"), unless you really know what you're doing and are ok with null being a valid value of someVariable.


Either style would be fine as long as you check your preconditions.


> but there's a culture of not trying to be too clever

I love Ruby dearly, but I can't agree with this. There's a lot of being "too clever" in Ruby land. Not in terms of writing fiendishly convoluted but brilliant code (though we do have a nice subset of "code as art", like _Why's Camping micro-framework), but in terms of caring too much about elegance over things like speed and memory usage, until it comes back to bite us.

My pet peeve at the moment is Bundler and Rubygems, which, while they are fantastic productivity tools when you start out, have the serious issue that the way they grow the load path means that the time to require a file grows exponentially with the number of gems. 100k+ failed stat calls on startup, just from trying to be "too clever" about not making people specify more clearly exactly what they are loading, is no fun.

Many of these "too clever" things then start to dictate app design. E.g. if you want to make a command line app out of a large tool that depends on a long list of gems and Bundler, you better think long and hard about making sure you defer all loading as much as you possibly can, or alternatively start up the app in the background and make a small command line app that talks to it over a socket...

For my part, I have a longer list of other Ruby practices I've come to detest than most, probably, as I'm working on a "as static and ahead of time as possible" Ruby compiler, and there's a lot of Ruby idioms that massively complicates things because Ruby developers are used to the malleability of the language, and so will often alter fairly central parts of the language even when there may be conceptually simpler ways of achieving the same that just doesn't look quite as clean.

Again, Rubygems is a great example: it replaces "require". That's one of the means it uses to munge load paths. On the surface it was a great thing: it let people pretend Rubygems wasn't there, and yank out gems in favour of manually installing the files if they wanted to. But it came at the cost of contributing to the load path/stat mess I mentioned above.

Some of these things are in fact great no matter what, and exactly why we love our languages. But some of them come with ugly warts that we grow blind to because we understand the underlying reasons for why things have been done the way they have.

I think all languages grow these kind of warts. To me, Java has more of them than most... Java from beginning seems to have grown a culture of ceremony and abstraction that due to the language easily (but not necessarily) ends up with very verbose code.


Guy works on Java projects for 3 years. In this time wasn't able to produce a high quality project, blames language.


Well, the Guy is pretty clever. I think, given the tools, he could produce high quality work.

http://hop.perl.plover.com/ - seriously clever.


Yes, the guy is very clever at putting down things that he didn't really like. The phrase "damning with faint praise" comes to mind, but upon reflection it doesn't really apply in this case. He's really damning it with criticism badly disguised as praise.

Yes, I know that he says that he is negative and his praise comes across badly (he gives an example of him damning his blogging software). I suggest, if he is being sincere, that the problem is his attitude. If he isn't being sincere, then I'd say that he's not as clever as he thinks he is, or he just never had any passion for Java and so he actually hates it as he never mastered it.

In terms of boilerplate code, I'd love to see his exact examples. With annotations, I've found that plenty of Java code is actually not verbose at all. If he churned out thousands of classes when actually he could have reduced the number of classes to 50, then that's an example of poor management and poor workmanship, certainly not a problem with the language.

Here's the real issue: the article conflates a corporate environment that likes to measure productivity in klocs (or from the number of classes you create) with the tool that is being used. He even admits that he could have actually produced better code, just that nobody cared to correct him when he produced crap quality code.

However, when it comes down to it, the real phrase that comes to mind is:

YHBT. YHL. HAND.

The guy is trolling you all.

(Different languages have different failure modes. With Perl, the project might fail because you designed and implemented a pile of shit, but there is a clever workaround for any problem, so you might be able to keep it going long enough to hand it off to someone else, and then when it fails it will be their fault, not yours. With Haskell someone probably should have been fired in the first month for choosing to do it in Haskell.)

The languages he praises so highly are now put down. Correction: in each case, he's dissing the programmers who wrote the crappy code and caused the project to fail, and he's damning the terrible management that didn't have good enough leadership and oversight to prevent the project failure.


>>In terms of boilerplate code, I'd love to see his exact examples.

Seriously, this is like asking for proof of the sun's presence in broad daylight.

>>With annotations, I've found that plenty of Java code is actually not verbose at all.

Again. Are you really serious?

>>If he churned out thousands of classes when actually he could have reduced the number of classes to 50, then that's an example of poor management and poor workmanship, certainly not a problem with the language.

You certainly haven't read his book 'Higher Order Perl'.


You certainly haven't read his book 'Higher Order Perl'.

Indeed. And?


I find most annotations in Java are workarounds for missing language features. If you need a piece of functionality you're better off working in a language that provides it natively, so that all your tools will understand it.


"Seriously clever" for a 10-year-old book on Perl basics is probably pushing it.

No disrespect to the guy though; I'm sure he is, and he has my general respect for publishing a book like that.

I guess he just has limited exposure to Java.


> Seriously Clever for A 10 year old book on perl basics is probably pushing it.

Why is the age of this book relevant? It's not an overview of the basics. It's a 500+ page exploration of Perl's functional programming features, a topic rarely (if ever) covered in this depth.


The other day I surprisingly found more praise for HOP from a coder I admire: http://books.google.com.br/books?id=2kMIqdfyT8kC&pg=PT89&lpg...

Time to re-read it, because when I first read it, I had to skim some parts. Probably applies to Python.


That's like calling SICP a book on Scheme basics.


Sometimes it is the tool that's to blame, you know.

Even the best master carpenter wouldn't be able to build something as simple as an end table if he were given a pickle to use as a hammer, and an onion to use as a saw.


The tool seemed to be the right tool or at least good enough for the master carpenters of Hadoop, Elasticsearch, Cassandra, Zookeeper, Neo4J etc etc... and Google, Netflix, Twitter etc etc...


That's a terrible analogy. Something along the lines of "using a sledgehammer to crack a nut" is probably more appropriate here. Java is very powerful, but sometimes using it for very simple programs can be overkill.

Also, have you heard the saying: A bad workman blames his tools


Yes, I have heard that saying. It's a rather stupid one, because sometimes the tools are to blame, even when in the hands of the best expert there is.


> There's more, but I'm arguing with a person who believes that a question about stdin and stdout is a proper gateway for measuring skill toward any programming problem ever.

Well, it sure did a good job of identifying a lack of skill of folks on HN, so maybe it isn't quite so crazy after all.


I think the writer of the article is clearly suffering from Amdahl's Condition. Many of the problems he states are clearly personal experiences, and cannot be applied to a broader environment. Just like saying a whole sport is bad because one team is not performing well.


> There's more, but I'm arguing with a person who believes that a question about stdin and stdout is a proper gateway for measuring skill toward any programming problem ever.

Apparently you missed his second sentence:

> The first question on the quiz is a triviality, just to let the candidate get familiar with the submission and testing system.


The "Har-har-so-many-classes-and-don't-even-get-me-started-on-variable-names" argument is true.

Sure, you can overdesign in any language, but some inner characteristics of the Java language lead to it more than other languages do. Copying input to output is a statement, if you think about it in a logical, human way: "I want you, program, to perform the action of copying this to that". But Java, unlike some other languages, forces you to say: "Make a class, make a main function because YOU need it, Java, and only then please do what I really want and copy that input to the output".
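For reference, here is roughly what that minimal "copy input to output" looks like in (pre-Java-8) Java; the actual statement of intent is one loop, and everything around it is the ceremony being complained about:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class Copy {
    // The statement of intent: copy one stream to the other, byte by byte.
    static void copy(InputStream in, OutputStream out) throws IOException {
        int c;
        while ((c = in.read()) != -1) {
            out.write(c);
        }
        out.flush();
    }

    // The ceremony Java requires before you may say it.
    public static void main(String[] args) throws IOException {
        copy(System.in, System.out);
    }
}
```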

In Python, for instance, I'm in control of whether to make a StreamCopier class, a copy_stream() function, or just get to the core of what I want and write "print(input())".

Generally speaking, in most dynamic languages, if I want to access the to_string() method of something passed to a function, I can pass a list, an int, or a Duck object, and as long as it has a to_string() method I'm good to go; or I can even monkey patch to_string() and call it a day.

In (too) many cases Java the language forces you to make instances upon instances and implementations of interfaces and the like, just to access the data you need in the way a framework or a method wants, not you, and often this is frustrating because you see your data "just there" and the language fights you, preventing you from accessing it in an easy way. I need to call to_string() of SomeClass but I have an instance of SlightlyDifferentClassStream? Good luck with that; maybe the only possible way is to create a DifferentClassConverterProxy just to have a DifferentClassTranslator, extend TranslatorStream, feed it with my SlightlyDifferentClassStream, and have something compatible with SomeClass.

This is not only true of Java; some of these "problems" arise from it being a statically typed pure OO language (a very good thing in my book when it is not implemented in a dumb way) that is put to shame by a cleaner implementation of the same principles, like the one seen in C#, which, on top of it all, also offers powerful dynamic programming, lambdas and so on.

Also, Java suffers from an enterprisey background. I consider a language's environment to be a very relevant part of the language itself, and Java has promoted the proliferation of ridiculous behemoth frameworks with a freaking large number of classes and instances "just in case someone needs to extend it".

These are facts, not "Har-har-so-many-classes".

Take a sane implementation of a web framework like Django. It is an open-source project born from the needs of a small group of developers. You don't have "so many classes" if you only write those that you really need to solve your problems, and the project progresses from there to embrace the everyday, real-world needs of a larger group of contributors. Most Java frameworks I've worked with really give me the tangible perception of a large group of monkey developers programming them, following line by line a technical specification bible of thousands of pages written by a council of architects, with business people yelling at them "mooooore, mooooore, we need more of all of this so that we can sell to ANYONE!"


> So yes, I enjoyed programming in Java, and being relieved of the responsibility for producing a quality product.

Wow. What a pompous thing to say.

How does this have 91 points and is on the front page of HN?

Per the "most horrible answers were in Java", personally, I think he's missing selection bias, in that many newbie programmers who wouldn't know what they were doing in any language, think they "know" Java because they took a class on it.

I have no problem reading critiques of Java, not concise/etc., and like Scala/etc. as much as the next guy, but "relieved of the responsibility for producing a quality product"? Wow.

I suppose I'll add the disclaimer that if his 3 years of putting in time writing Java were doing J2EE, then I'll understand where he's coming from. But that's not the language.


>> So yes, I enjoyed programming in Java, and being relieved of the responsibility for producing a quality product.

> Wow. What a pompous thing to say.

I don't think it's pompous, I think it's actually self-deprecating. I think he's confessing to his perfectionism - to the fact that, if given a tool like Haskell he'll pursue some perfect, idealised program, because if you're not going to produce more beautiful code in Haskell then why are you using it? Java serves as the antidote here because nobody ever judges Java programs on their terseness, expressiveness or subjective 'elegance'.

When he says 'quality product' he's talking about the aesthetic qualities of the code, not the end result. When he says he's relieved of having to think about this, he means he can focus on shipping rather than perfection.


On the other hand, some people who argue for their use of strict typing at Haskell's level say that it is because they are not good enough programmers to not waste time on runtime type errors, which is another form of self-deprecation.


Self-deprecation or realism...


> How does this have 91 points and is on the front page of HN?

Because HN is full of Java hate, where all cool guys would never touch the JVM.

Meanwhile the world moves along and JVM based products get deployed every day.


hey there.

i hate java. i hate java as much as i can. and i love hating java. java must be hated. java must be estranged. java must die. but definitely not the jvm. that's a beast i love. i use clojure and fantom on the jvm.


And Scala! don't forget Scala.


Forget the 'cool guys', the best software engineers love Java.


My internal me was screeeaming just to put "citation needed" here (I've never done it before; this is not a good time to start). And while I agree with you on a certain level, I don't think "Java engineers" are the best software engineers per se, nor that the best software engineers love Java 'without exception'.

I agree [with you] that Java invites you (nay, urges you) to use good engineering practices.


But you love Java for its exceptions.


It's sad I can't edit my message anymore. Thanks for lightening it up.

At least exceptions ensure (...) - at least in Java - that errors are forced to be dealt with in a certain way.

bulte-rs - orkoden: 0 - 1; I love Java for its exceptions (and "the IDE's" telling me to catch them all).


You're assuming facts not in evidence.


Agreed. His background is Perl after all, so the idea of 'one way to do something' and 'readable code' is extremely foreign to him. I'd take his advice on programming with a grain of salt.


"It is a Perl guy, so ignore his opinion" :-)

I learned more about you than about Dominus with that comment.

Here is a link to his well received book, by the way: http://hop.perl.plover.com/book/

(But the book you should probably look at to understand the world better, is "Perl best practices".)

Edit: fuzzix -- you are certainly right about most useful modern books, I was making a point to what I comment on. The idi.. cough.. Which insinuated that readable code is an unknown concept to typical Perl programmers.


I would look at Modern Perl before Perl Best Practices. While Perl::Critic bases much of its criticism on PBP, quite a bit of it turned out to be not so useful.


To me I think he's clearly qualified to make all sorts of judgements about Perl. For opinions on Java, I'll look to people who have written well reviewed 500+ page books on Java.


No - my statement is pretty clear. Someone who likes Perl is not going to be appreciative of 'one way to do something' and 'readable code'. Those properties are the basis for modern language design and they are properties that most people want in a language. Taking his opinion means also taking his assumption that the properties of Perl ('interesting code') are good properties for a language to have. Hence my advice to take any opinion from a Perl guy as coming from a Perl guy - don't disregard it, but make sure you don't inadvertently put Perl in your language design. Because Perl in your language design is going to give you unreadable code.

If you're going to argue that Perl is readable code then I'm going to have to disagree with you extremely after having been in a position to maintain some Perl code before.


I wrote a blog post which entirely disagrees with you that got around 11000 hits to date (many from here on HN).[1]

It is true that one way to do something is foreign to Perl culture. It is also true that the best Perl code is pretty unreadable if you expect to read it as a C/C++/Java programmer. Perl is a very different language, and good Perl code reads very differently.

I am not going to get in a language war of Perl vs Java. Both languages have their places. I prefer Perl on the server. I prefer Java on clients I have no control over because at least I can hope it has a working JVM.

Readable Perl code is just different from Java. Let's take an example:

     package Foo;
     use Moose;
     use PGObject::Util::DBMethod;
     with 'PGObject::Simple::Role', 'Baz';
 
     has id => (is => 'ro', isa => 'Int');
     has name => (is => 'ro', isa => 'Str');
     has description => (is => 'ro', isa => 'Str');

     dbmethod int => (funcname => 'foo_to_int');
     dbmethod get => (funcname => 'get_foo');
     dbmethod save => (funcname => 'save_foo');
There is nothing unreadable about that. It provides a package with declarative specifications for properties and accessors (all of which are read-only) and methods which delegate to PostgreSQL stored procedures (more information about which is probably further clarified in the Baz interface).

There is nothing inherently unreadable about Perl.

[1]http://ledgersmbdev.blogspot.com/2014/02/in-praise-of-perl-5...


Thanks for the reply - I haven't touched Perl in years and it's definitely much more readable now. It looks like a lot of work has gone into making Perl more readable, especially Moose.

The perl code I've seen in the wild generally doesn't look like that though. Is that more an issue of the age of most Perl code? Or are some people still writing 'old fashioned perl' even if it isn't the recommended way anymore?


Part of the problem is that one-off scripts are different than applications. One off scripts don't need to be big or particularly maintainable, but the problem is people get in the habit of writing Perl that way.

So it's complicated.

Also keep in mind that a lot of CPAN modules out there began back in the 1990's and are still being developed today. I know. I am now a maintainer of several, and the code isn't often that pretty.

But this is the benefit of being encouraged to think about elegance of code: one improves.

But one can write amazingly beautiful, clear, and elegant Perl or one can write rubbish (after all good Perl doesn't take much work to maintain). Most of my work is either maintaining rubbish or (I hope) writing nice, maintainable code.


Just to clarify, re-reading this it looks a little confused. The rubbish I maintain is not stuff on CPAN but legacy code inherited from another non-CPAN project. The legacy modules are sometimes annoying in some ways but they are serviceable.

Some of the non-CPAN code I have (the rubbish) is sufficiently unmaintainable that the only way of dealing with it is to refactor with a chain saw and avoid touching it otherwise.


There was a testing revolution over a decade ago. Then came Moose (2007ish?). You have syntax extensions now. And so on.

See e.g.

http://search.cpan.org/~mauke/Keyword-Simple-0.02/lib/Keywor...

http://search.cpan.org/~mauke/Function-Parameters-1.0401/lib...

What seems likely to happen in Perl 5 in the coming years (unless the Python language trolls manage to discredit it enough to kill it :-) ) is gelling around language [syntax] extensions and Moose going into the Perl core.

But it will still keep backwards compatibility.

(And I write horrible Perl code most days myself, as one line liners. :-) Best damn shell functionality on this planet imho, along with Emacs command line editing.)


I love Perl. I also love Smalltalk. I have written professional code in both. Clean, lovely OO code that is easy on the eye.

Your opinion is a complete generalisation and makes you sound profoundly ignorant. Language choice != coding ability.


Your statement is pretty clear, which is useful as far as it goes. But it's also wrong, and that's pretty significant too.

1. I like Perl.

2. Imo, if someone doesn't see advantages to TSBOAPOOOWTDI then they don't understand it -- and the same is true of TIMTOWTDI, TIMTOWTDIBSCINABTE, and many other useful generalizations.

3. I love readable code.

I think the above three statements also apply to Larry Wall (the creator of Perl) and many others in the Perl community.

Imo "Someone who likes Perl is not going to be appreciative of 'one way to do something' and 'readable code'." is not a useful generalization.

If the universe applied different rules to me than everyone else I might advise folk to take any opinion from someone who generalizes without sufficient respect for the dangers of generalization as coming from someone who generalizes without sufficient respect for the dangers of generalization. But it doesn't, so I won't. :)


>>[Perl is not readable]

>>after having been in a position to maintain some Perl code before.

Sigh... You "supported some code" and know what modern Perl looks like?! :-)

Could you carefully explain e.g. the problems with Moose and the latest [syntax] extensions you find on CPAN?

To start with.

Making wild claims like that without support makes you seem like a language war troll or a complete asshat. To be taken seriously, show that you know what you make such big claims about.

Edit: And since you have nothing to say about the subject matter -- what is it in the Python culture that brews such fanatic language war trolls? (I assume you're a Python guy?)


You do realize you're making this in defense of someone starting a language war, right? You may be in the wrong thread. Or is it only "bashing" when you're bashing Perl and not a modern language?


>>You do realize you're making this in defense of someone starting a language war, right?

No, I am questioning your specific claims about things you obviously have no clue about.

That was a pathetic attempt to change the subject. I'll stop waiting for a serious answer from you now. Bye.

Edit: And if you're a high schooler, sorry if I'm brusque. There is too much of this language war garbage on HN, I want to keep the quality up.


If you want to keep the quality up, then please be more civil.


You're totally correct. I have just been trolled too much. The first few dozens of times I was polite.


The concept of choosing to be relieved of the responsibility from doing "the right thing" is not really new:

http://www.jwz.org/doc/worse-is-better.html


I don't feel that users of Java are "relieved of the responsibility from doing 'the right thing'" for the following reasons:

1) Java has the paradox of choice problem. Unlike say Python, there are so many ways of structuring code based on the extreme flexibility of available frameworks and so many different libraries that it gets complicated fast due to sheer number of choices, unless you're really experienced and you already know what you want ahead of time.

2) Java's verbosity necessitates heavy abstraction through many patterns. Which patterns will you choose for your given problems? It's still somewhat easy to write spaghetti unless you have both the experience and discipline to prevent it.

Python isn't perfect, but its one way to do something philosophy leads to "relieved of the responsibility from doing 'the right thing'" unlike Java. People who use Ruby kind of experience this as well if they use Rails.


When writing Python you're constantly making a lot of choices that you simply don't get in Java. e.g. "do I use a tuple/dictionary for this thing or make a class" - in Java there are no tuples and no dictionary literals, so you always make a class. "Do I group this bunch of functions into a class or leave them freestanding?" - in Java you don't have freestanding functions, so you always put them in a class. "Loop, map, or list comprehension?" - Java only has one option (at least until recently). "Several functions or one function with a lot of keyword arguments?" Even the tools seem a lot more wide open - with Java these days maven is pretty much standard, your only real choice is eclipse vs intellij. In Python I still couldn't tell you what's the right tool to package with (though pip/virtualenv is more-or-less becoming the standard for dependencies), and there seem to be dozens of IDE/editor options.


"When writing Python you're constantly making a lot of choices that you simply don't get in Java. e.g. "do I use a tuple/dictionary for this thing or make a class""

In Java you're going to make similar choices: which collection implementation am I going to use? Which hash implementation am I going to use? And so on. But that's not really the big problem.

Java's strong type system also comes into play. Yes, it's easier to read and debug, and performance gets a boost, but you have to think more about designing it (abstract classes, interfaces) and refactoring is harder. Yes, things like generics have made this a little easier.

The best comparison you can make about the languages comes down to IDEs. You don't really need an IDE for Python, while an IDE is pretty much a must for Java.

The hardest part about Java isn't the language itself; it's the philosophy. In my 10 years of experience with Java outside of academia, in places like telecoms, banks or e-commerce sites, you're likely going to be using some dependency injection framework. Do you use Spring or Guice (most likely)? If you use Spring, do you use annotations or XML schema? If you use annotations, how do you structure it? What's the scope of the bean classes? The list goes on and on.

Java's (meaning the ecosystem as a whole) biggest strength and weakness is its extreme flexibility and choice that it gives its users.

Conversely, Python (meaning its ecosystem as a whole) doesn't give you much of a choice, so there are fewer decisions that have to be made.


> In Java you're going to make similar choices: which collection implementation am I going to use? Which hash implementation am I going to use? And so on

I've never seen a case where that choice was important; you can just use ArrayList and LinkedHashSet everywhere. Whereas the python examples are actual tradeoffs, and your code will be worse if you pick them wrong.

> you have to think more about designing it (abstract classes, interfaces) and refactoring is harder.

Disagree. Refactoring is easier in Java because your IDE can tell you what's broken (and even make a lot of changes for you); in Python you have to hope your test coverage is good enough.

> you're likely going to be using some dependency injection framework for Java. Do you use Spring or Guice? ( Most likely)

Choice of libraries is important in any language, and IME it's much more possible to get it wrong in Python. I have a perfectly good website backend from a few years ago, only it's built on TurboGears which is now defunct. PIL was about as big and popular a Python library as they come, but I understand it's now unmaintained. There are about 5 different XML parsers, in Python as it is in Java.

> If you use spring, do you use annotations or XML schema? If you use annotations how do you structure it? What's the scope of the bean classes? The list goes on and on.

Those are choices you make, but you make them once at the start of the project, and again it doesn't matter so much if you get them wrong because you can trust that all the options will be maintained for a while. In Python with no explicit IoC container you still have to solve the same problems, so you end up making the same kind of decision again and again.


> I've never seen a case where that choice was important; you can just use ArrayList and LinkedHashSet everywhere

You shouldn't. Sometimes you should be using CopyOnWriteArrayList; that's one example.
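To make the difference concrete: an ArrayList's iterator fails fast if the list is modified mid-iteration, while CopyOnWriteArrayList iterates over a snapshot. A small sketch (the class and printed labels are mine, purely for illustration):

```java
import java.util.ArrayList;
import java.util.ConcurrentModificationException;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class CowDemo {
    public static void main(String[] args) {
        // Plain ArrayList: mutating while iterating fails fast.
        List<String> plain = new ArrayList<String>();
        plain.add("a"); plain.add("b");
        try {
            for (String s : plain) plain.add(s + "!");
            System.out.println("ArrayList: no exception");
        } catch (ConcurrentModificationException e) {
            System.out.println("ArrayList: ConcurrentModificationException");
        }

        // CopyOnWriteArrayList: the iterator sees a snapshot, so this is safe.
        List<String> cow = new CopyOnWriteArrayList<String>();
        cow.add("a"); cow.add("b");
        for (String s : cow) cow.add(s + "!");
        System.out.println("CopyOnWriteArrayList grew to " + cow.size());
    }
}
```

Whether that behavior (plus the copy-on-every-write cost) is what you want is exactly the kind of choice the parent is talking about.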

> Refactoring is easier in Java because your IDE can tell you what's broken

It's harder because there's more code to sift through and it's easier to have bad design.

> Choice of libraries is important in any language

It's not really the choices of libraries that's the big killer in Java. It's the freedom of usage of those libraries. Spring is a good example of this.

> Those are choices you make, but you make them once at the start of the project

This is easy to say for veterans. Not so much for newbies.


Read that article, as it's not talking about the same thing. A lot of thought and design went into both C and Unix. "The right thing" is relative to the tradeoffs you find acceptable or not.

TFA is talking about simply not caring. Which I find an awful opinion. There's a really big difference between a well designed, well built Java product/library and a badly designed, poorly built one. The Java ecosystem being so massive, you can notice examples of both everywhere and the difference is night and day.

Java in no way relieves you of the responsibility of producing quality. At times it makes it harder, as you need a lot of knowledge and extremely good taste to do the right thing, instead of succumbing to "best practices" that are perpetuated as myths or adopting poor libraries just because they are popular.

Example: java.util.Date / java.util.Calendar versus Joda Time.
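To give a taste of why Calendar is the standard example (a JDK-only sketch; among other pitfalls, months are zero-based, one of the things Joda-Time fixed):

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class CalendarPitfall {
    public static void main(String[] args) {
        // Months are zero-based: Calendar.MARCH == 2, so this is March 25, 2014.
        Calendar c = new GregorianCalendar(2014, 2, 25);
        System.out.println("month constant for March: " + Calendar.MARCH);
        System.out.println("get(MONTH) returns: " + c.get(Calendar.MONTH));

        // Passing the "obvious" 3 silently gives you April instead of March.
        Calendar wrong = new GregorianCalendar(2014, 3, 25);
        System.out.println("month 3 is actually: "
                + (wrong.get(Calendar.MONTH) == Calendar.APRIL ? "April" : "March"));
    }
}
```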


The best thing the author ever wrote was the proof the avocado has an extra-terrestrial origin: http://www.plover.com/misc/avocado


To be honest, reading the article, he sounds like he is depressed. He seems very bitter about a programming language that he doesn't even have to use every day.


Now Mark Jason Dominus is often wrong when it comes to his political opinions, but he's rarely wrong about programming, and your tone is just way too negative for seemingly no reason. Of course he's missing selection bias. That kind of goes with being MJD. So?


How does this have 91 points and get to the front page of HN? I suspect that you know the answer to this already. Hacker News is home to a bunch of elitist language bigots who think that if you don't code in their own coolest-language-ever you are some kind of second class citizen.

There's no surprise that this is the culture here. Need I mention the term "blub programmer"? Could anything be more elitist and offensive?


So I was a Java developer from 2006-2012. I still occasionally touch Java at work.

Off the top of my head, right now, in this text box, without google, I can't write the code to copy standard input to standard output.

System.out.println(System.in)? That's wrong. You need some kind of InputReader, or BufferedInputReader, or Buffered...something. I think System.in is an InputStream, so presumably I need to do new InputReader(System.in)? Or BufferedInputReader? Should I buffer the output?

Ok, let's do some API googling (ie, not StackOverflow).

So I was... pretty wrong. I think what you want is:

public class CopyInput { public static void main(String[] args) { while (true) System.out.write(System.in.read); } }

I haven't tried to run this though.

In my defence, I have never, in all my years of working, had to ever read standard input, let alone write it out again.


I was about to say the exact same thing: I'd probably walk out of an interview if remembering how to read stdin was part of it, because it comes up maybe once every couple of years in real work, better to just Google it.

Don't tailor from-memory standard library questions to bits of code that almost nobody ever has to use.


I think that is sort of the point of the article though (or at least one of the points...). Dealing with stdin/stdout is not uncommon in itself, though it is probably rare in java. Choosing the right tool for the job is kind of important in most aspects of software, so doing a simple script in say python or a few lines of C or whatever is important - it shows a familiarity with the environment, not just the language(s).

For example:

Whenever I'm dealing with a data processing pipeline I tend to design things as if they would be used from a unix command line. I tell the team to make their classes etc as if they would be wrapped with a simple chunk of code to parse a text stream then serialize to it later. During the course of testing, I almost always will then write those tools to allow simple command line pipes to push test data around. It helps prevent reliance on certain toolsets and certainly helps stop us from conflating bugs in our code with bugs or operator error about a pipeline framework. Later we can go back and more closely integrate with the chosen operating environment, if performance demands it. On the other hand going from a situation where code is written to the environment and migrating to another environment can be very daunting, difficult and error prone.
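A filter in that style stays small even in Java. A sketch (the class name and the canned demo input are my own; in real use you'd pass `new InputStreamReader(System.in)` so the tool sits in a shell pipeline):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintStream;
import java.io.Reader;
import java.io.StringReader;

public class Upcase {
    // The filter itself: line in, upper-cased line out. Taking a Reader
    // rather than hard-coding System.in keeps it testable off the pipeline.
    static void filter(Reader src, PrintStream out) throws IOException {
        BufferedReader in = new BufferedReader(src);
        String line;
        while ((line = in.readLine()) != null) {
            out.println(line.toUpperCase());
        }
    }

    public static void main(String[] args) throws IOException {
        // Real use: filter(new InputStreamReader(System.in), System.out);
        // Demo with canned input so the behavior is visible without a pipe:
        filter(new StringReader("hello\nworld"), System.out);
    }
}
```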


It depends. If the job is Unix/Linux based then I'd say you definitely should know how to do stdio.


> Don't tailor from-memory standard library questions to bits of code that almost nobody ever has to use.

So seriously? You almost never need to read & write to streams?


No. I almost never need to read from stdin, as I said.

Again, most of us write real apps, not command line utilities.


Yeah, I've spent 10 years as a Java developer and I'd have to think about how to do it in Java. It just isn't a problem Java was designed to fix - sledgehammers, nuts and all that.


Seriously? You've never read from or written to a stream?


I've never copied one stream to another by hand, byte by byte. There's a method in one of the apache commons libraries for that. But most of the time the only reason I'd ever read an input stream is using a CSV parser or a JSON parser or the like, in which case the idiom is usually to just hand the stream to the parser.


> I've never copied one stream to another by hand, byte by byte.

Yeah, but you've probably done a read and write independently. This is just about joining the two together.

> There's a method in one of the apache commons libraries for that.

Yup, and that's great to use that, and you'll no doubt end up with more efficient code most of the time if you do. But... you still should be able to write correct code for reading bytes from one stream, writing to another, without losing data in between.


> you still should be able to write correct code for reading bytes from one stream, writing to another, without losing data in between.

With access to Google and the library documentation sure, but I don't use those APIs often enough to be worth keeping in memory.


> With access to Google

Why the heck would you need Google

> and the library documentation

It's an online quiz.

> sure, but I don't use those APIs often enough to be worth keeping in memory.

You don't worry about it, but it tends to accumulate in one's mind anyway.

Here's the thing though. People are posting solutions on HN... and getting it wrong.


Of course. These days though I don't tend to get much closer than something like Properties.load(InputStream). I really would have had to think about it.

I guess I've been stuck in webapps and NoSQL libraries for a while now.


Where does HN find these guys... It's amazing how many long term "java" developers there are here.


I honestly am at a loss for this. Reading & writing data from a stream without losing any information seems like a pretty basic thing you'd want to know how to do in your chosen programming language.


The reason for this is that if you're doing IO in Java you'll usually use some kind of buffered input reader rather than stdio. I've written a command line in Java that reads commands, runs native commands, streams output to the console and even that doesn't need to read stdio as a stream. It's not that Java can't do that, it's that there are better idioms for most problems.


> The reason for this is that if you're doing IO in Java you'll usually use some kind of buffered input reader rather than stdio.

The methods are the same with a buffered input reader.

> I've written a command line in Java that reads commands, runs native commands, streams output to the console and even that doesn't need to read stdio as a stream.

So, the thing that you are struggling with is how to get to System.in? But no problem doing System.out?

That's not even where the various code samples that have been posted have failed.

> It's not that Java can't do that, it's that there are better idioms for most problems.

Java's actually very good at doing IO efficiently, and as has been demonstrated, the code for this is quite simple. Sure, you wouldn't implement "cat" in Java, but you would read and write to sockets & files with it, which provides all the knowledge one might need to figure out how to solve this. As has been pointed out, the Apache Commons libraries make it so this is a two line program with no branching or looping. That's about as simple as it gets.


I'm not struggling with it at all. I'm an ex-C/C++ programmer; I'd do a while-not-EOF read loop.


Which seems pretty amazing to me. Granted I'm not a Java developer, but it seems like such a basic feature of any language.


But how often do you read stdin in Java? It's a language that is used heavily to build web apps and (though less so) desktop GUI apps. I also programmed in Java for nearly a decade in the late nineties till about 2004 and have never used System.in for anything.


I'm writing desktop GUI apps in C# these days. Not the same as Java, but still many people would probably ask the same thing about C# as you've asked about Java: how often would you read stdin in C#?

I use it at various prototyping stages of my applications. Oh, I need to handle a new file format. Ok, write the parser and classes to contain the file contents. Then create a quick console app that lets me read in a file and pose questions to it (give me the contents of data block 0x0840, what's its time tag, etc.). Now I need to run analyses on the file, and there's another file format that contains the queries. Create a quick console app that lets me combine those two pieces and a simple text interface to explore it. Then, once the pieces work, I plug them into the GUI.

Now, do I do this every week? No, because after a certain point it's all about the GUI and other interactions. But it's a great way (for me) to prototype, and I'd use this approach regardless of the language for most programming tasks.


> But how often do you read stdin in Java?

Doesn't matter. For this problem it is just a conveniently available InputStream. How often do you read from an InputStream? I'd hope the answer to that isn't 0.

> I also programmed in Java for nearly a decade in the late nineties till about 2004 and have never used System.in for anything.

Okay, then change the problem to implementing this interface:

    public interface Copier {
        public void copy(java.io.InputStream in, java.io.PrintStream out);
    }
That doesn't change the problem.
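For reference, here's a sketch of one correct implementation of that interface (the class name and the in-memory demo in main are mine; since the interface doesn't declare IOException, the implementation has to wrap it):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.PrintStream;

interface Copier {
    public void copy(InputStream in, PrintStream out);
}

public class StreamCopier implements Copier {
    public void copy(InputStream in, PrintStream out) {
        byte[] buf = new byte[4096];
        try {
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);   // write exactly the bytes read
            }
            out.flush();                // don't lose buffered output at EOF
        } catch (IOException e) {
            // The interface doesn't declare IOException, so wrap it.
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // Exercise the copier with in-memory streams instead of System.in.
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        new StreamCopier().copy(
                new ByteArrayInputStream("some bytes\n".getBytes()),
                new PrintStream(sink));
        System.out.print(sink.toString());
    }
}
```

The points people keep getting wrong are all in there: use the return value of read, treat -1 as EOF, and flush before you're done.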


Sure, I am familiar with input streams, but we are talking about doing something from memory. One quick Google search leads to an "oh, of course that is what you would do" moment, but I am just noting that even in my Java heyday I would have needed that Google search before writing the requested code.

Your interface does not change the problem, but it does phrase the problem in a more familiar way and would have been much easier for me to respond to (in my Java heyday, that is).


> Sure I am familiar with input streams, but we are talking about doing something from memory.

It's an online quiz. You don't have to do it from memory.


And it IS a basic feature of Java. Just not one that is used much, and therefore not one that has a particularly clear or succinct idiom.

Java didn't grow up in the world of unix where scripts that read from stdin and write to stdout get chained to produce pipelines. It grew up in the world of long-running stand-alone applications that communicate over sockets (like web applications). Java, because of the virtual machine, has a particularly slow start-up time and would be a poor choice for implementing mini pipeline components like this anyway.


It's definitely a basic feature of any language, including Java. I have no idea what the people upthread are talking about. From memory (may not compile) (edit thanks to cbsmith)

    import java.io.IOException;

    public class CopyStdin {
        public static void main(String[] args) throws IOException {
            byte[] buf = new byte[4096];
            while (true) {
                int numRead = System.in.read(buf);
                if (numRead < 0) {
                    // hit EOF, terminate
                    System.out.flush();
                    System.exit(0);
                }
                System.out.write(buf, 0, numRead);
            }
        }
    }


just for the lolz, here is another way to do it without a buffer and using some java 8 features (lambdas) to avoid having to wrap the code in a try catch (technically you still are but it looks prettier IMHO):

    public class SimpleJavaTest 
    {
        public interface RunnableEx
        {
            // can't use Callable<Void> because that run method needs to return a value
            public abstract void run() throws Exception;
        }

        public static void main(String[] args) 
        {
            // write standard in to standard out:
            uncheck( () -> {
                int c;
                while( (c = System.in.read()) > -1)
                {
                    System.out.write(c);
                }
            } ).run();
        }

        public static Runnable uncheck(RunnableEx r)
        {
            return () -> {
                try
                {
                    r.run();
                }
                catch(Exception e)
                {
                    throw new RuntimeException(e.getMessage(), e);
                }
            };
        }
    }


> it looks prettier IMHO

... And here's a fine example of Java culture.

2 lines do stuff, everything else is fluff, and it is considered prettier.

BTW: I did not run this specific code, but dropping the buffer is likely to make this code take much, much more CPU (unless HotSpot is much better these days than it was in 2010 when I last used it). That's another pillar of Java culture - care not about performance.

Disclaimer: I didn't test this, and any mention of performance requires testing, rather than reasoning. I don't have a Java compiler handy anymore, or I would test it.


dropping the buffer makes it perform substantially slower - there are some benchmarks listed elsewhere on this thread


I'm a big believer in benchmarking before saying things, but I think we could skip the benchmarks when asking the question "does 1 4096-byte read or 4096 1-byte reads complete faster".
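The call-count difference, at least, can be made concrete without timing anything. A sketch using a counting wrapper (all class names here are mine):

```java
import java.io.ByteArrayInputStream;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadCounts {
    // Wrapper that counts how many read() calls reach the underlying stream.
    static class CountingIn extends FilterInputStream {
        int calls = 0;
        CountingIn(InputStream in) { super(in); }
        @Override public int read() throws IOException {
            calls++; return super.read();
        }
        @Override public int read(byte[] b, int off, int len) throws IOException {
            calls++; return super.read(b, off, len);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[4096];

        // One byte at a time: one call per byte, plus one more for EOF.
        CountingIn oneByOne = new CountingIn(new ByteArrayInputStream(data));
        while (oneByOne.read() != -1) { }
        System.out.println("single-byte reads: " + oneByOne.calls);

        // 4096-byte buffer: one call for the data, one for EOF.
        CountingIn buffered = new CountingIn(new ByteArrayInputStream(data));
        byte[] buf = new byte[4096];
        while (buffered.read(buf) != -1) { }
        System.out.println("buffered reads: " + buffered.calls);
    }
}
```

Whether 4097 in-memory method calls instead of 2 actually dominates your runtime is the part that needs a real benchmark, which is the sibling comment's point.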


I would agree ... except that I've seen cases in which it didn't make a difference.

e.g. if the File implementation had an internal buffer (C stdio's "FILE *" does), and the read from that buffer was inlined (from my past experience, up to date as of early 2011, HotSpot doesn't, but LuaJIT does), it might not make any difference.

Seriously, LuaJIT does things I've never thought I'd see a compiler (JIT or AOT) for any language (dynamic or statically typed) do. I used to reply to "sufficiently smart compiler" with "one hasn't appeared yet, despite at least 3 decades of waiting". But LuaJIT has appeared.


You need a test for the exit condition (read returning back -1) and you need to flush stdout.


Right, thanks. Either way, it's not rocket surgery.


Agreed. Then again, you did mess it up the first time, so I guess it is an effective test. Everyone uses the buffer, which I find amusing.


> In my defence, I have never, in all my years of working, had to ever read standard input, let alone write it out again.

Yeah, but there is nothing different from this and having to copy data from an input stream to an output stream. If you can't remember how to do that, it kind of raises the question of what you were doing in Java.

Not knowing if you need to buffer is actually fine in my book. That's an optimization thing. Not knowing how to make the code correct though...


CRUD apps, my man, ones where the framework did all the IO for you.


You probably never use the command line. People like me who use pipelines on *nix [1] every day write these sorts of filters all the time.

[1] http://en.wikipedia.org/wiki/Pipeline_%28Unix%29#Pipelines_i...


I don't know why so many cool programmers hate Java. Sure, the library is too big and sometimes it's hard to find the proper stream class. Or the Date class is antiquated.

Ignoring that, what I like about Java is that my programs are clear, and that once I compile my programs, they just work. I can refactor my code, and the compiler makes sure that I don't forget anything. I don't love it the way I loved Perl or C. But when I write Java, I feel confident that my program is not going to break; and even if it breaks, a stack trace is going to point me directly to the offending line.

Of course you don't have to use 25 Design Patterns at the same time. But let's not blame Gosling for that; that is the fault of the "Enterprise Architects". I still think you can write good, tight Java code.


It's largely because the solutions come out too verbose in Java. It takes far too much code even for simple tasks. After a while it gets on your nerves.

I have no problems with Java. But I have routine fits of frustration during P1 issues when I have to dig through piles and piles of code to analyze simple things. Often it's like there's a hierarchy of 9-10 classes, each doing something very simple and passing the burden of implementation to things below, and down there you see more and more verbosity, boilerplate, getters/setters and pages of exception handling.

In some way Java too is a write only language.


> Often it's like there's a hierarchy of 9-10 classes, each doing something very simple and passing the burden of implementation to things below, and down there you see more and more verbosity, boilerplate, getters/setters and pages of exception handling.

Totally agree.

Dijkstra wrote "A Case Against the GO TO Statement" (which was renamed "Go To Statement Considered Harmful" by the editor of the journal) when he saw these things happening to flow control because of GOTO.

These days Java (and C++, to a lesser but similar extent) has just as much "class spaghetti" as the "goto spaghetti" of the old days - and many programmers have no idea that it can actually be better.


> I don't know why so many cool programmers hate Java. Sure the library is too big and sometimes is hard to find the proper stream class. Or the Date class is antiquated.

The library is too verbose and over-architected. My idea of a good match would be the Java language with Python's libraries.


Java hate is overblown, but compared to say Scala (I have professional experience with both) it makes a lot of relatively trivial things (like basic data-holding classes and operations upon them) more painful and verbose than they need to be.


On the flip side (and I too have professional experience with both), Scala can very easily make simple tasks really hard to read later.

There's a balance. Neither is optimal. (I think C# comes very close to what I'd call "perfect", but it doesn't run anywhere I'd want to use it.)


I agree - in particular, for advanced Scala libraries it can be hard to understand the source, which can make them hard to use without thorough documentation which is often lacking.


I also have professional experience with both Java and Scala, and I always pick Scala when I have the choice.


Forget the Enterprise Architects, it's things like having to build a pipeline of 3 objects to just append a string to a file that gave Java its bad name.


That's the whole point. Pipelines are made up of interchangeable components. When a code base has to survive what it was supposed to do today to what it has to do 5, 10 or 15 years from now, you want those pipelines.


I disagree. For that I only need the possibility of replacing something simple with a pipeline, not the necessity. The pipeline I am forced to build today may not only turn out to be unnecessary - it runs the risk of having split the components along the wrong lines.


Using that logic, an oil company shouldn't bother building a pipeline; they should just load the oil into barrels and walk it out on mules' backs.


During elementary school, did you ever have to do math using blocks?[0]

Sure, it's a fine way to do math, and you get all the same answers, and it's not really much slower. But I didn't like doing it then and I wouldn't like doing it now.

Programming in Java feels like doing math with blocks. I have all this code that needs to be there, and I understand why it's there, but I still find its inclusion frustrating. Java is a fine tool, and can be used to build great stuff, but it's not to my taste.

[0] http://i.imgur.com/PSd5k7q.jpg


Do you feel it is the strong typing system that gets in your way? I feel that sometimes. But I know that later it is going to save me from making mistakes when I have to refactor my code.


I also really like strong typing, for the reason you stated, though there are also lots of things I like about a "more dynamic" approach (duck typing is great, for instance).


Java has a laughable type system. Type erasure is everywhere.
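The standard demonstration: after erasure, differently-parameterized lists share a single runtime class, so the type parameters simply aren't there to inspect.

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<String>();
        List<Integer> ints = new ArrayList<Integer>();

        // Both are plain ArrayList at runtime; the parameters were erased.
        System.out.println(strings.getClass() == ints.getClass());
        System.out.println(strings.getClass().getName());
    }
}
```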


"Look, he couldn't even copy stdin to stdout, LOLOL".

The most probable reason for the failure is because Java is not used in industry that much as a shell programming environment. The most common use case for Java is as a server-side, long-running server, with C++-like performance and without C++-like headaches. I work with algorithmic trading engines in my job. Though I trade personally these days and the orders are small, the engines I implemented for banks typically handle $250m worth orders per stock exchange. I have NEVER, not in a single instance in the last 5 years, have had to read from stdin in Java. Why : in my particular niche in my industry, input comes via a socket that is abstracted away by an engine that implements the FIX (Financial Information eXchange) protocol. For all practical purposes, in the finance industry, the FIX library / server IS stdin and stdout. This doesn't make a colleague of mine who forgot how to access stdin in Java a nincompoop. It's a failure to remember what is rarely used, not a failure to perform basic job duties. See the difference?

I believe the author is / was a Perl expert? That's why he thinks being able to read stdin is "so basic". I challenge the author to confront his prejudices: hire a few candidates who fail the stdin / stdout test. If they are allowed to do the damn work instead of jumping through your Perl-shaped hoops, they might actually do quite well on the job.

Similarly, many Perl programmers would fail a test devised by someone who is used to Java.


There seem to be a couple questions here which could cast things in a decidedly better/worse light.

If the test is open book, then "I can't manage to look up how to deal with stdin and stdout in Java" is decidedly more damning than "I can't remember".

Further, if there is a requirement that you use the same language for all of the answers, choosing Java when you know Java best makes sense. If not, choosing for that problem Java (when you don't firmly remember how to do normal IO!) is itself a bad decision, when you likely would have got full marks for

    #!/bin/sh

    cat


The JVM is like a Swiss Army knife for creating a good superficial impression.

Want to show off to the web CRUD hipsters, you can go Groovy, Grails and Play framework.

Want to show off to the Enterprise architect astronauts, you can go Hibernate, Spring, ElasticSearch and JBoss.

Want to show off to the Big Data ivory tower academics, you can go Weka, Lucene, Hadoop and Scala.

Want to show off to the esoteric language hipsters, you can go Clojure, JRuby, Jython, JScheme and Rhino.

Sun Microsystems and applets on IE6 live on in my big heart, warmed by a hot cup of cappuccino beans.


Some of the things you mentioned I would categorize as non-essential, but putting Lucene and Hadoop into the same category seems unreasonable to me. These tools do solve real world problems and they are absolutely non-superficial. If anything, such lists show how versatile the JVM is.


Don't forget Armed Bear Common Lisp (Lisp on the JVM)


Never understood the hate towards Java on HN. Java is not perfect by any means, but every one of the other languages he mentions in that article (Perl, Python, Ruby, JavaScript) also has its shortcomings. People focus way too much on technology and not enough on the results.

It reminds me of a great post by Coda Hale a couple of years back where he was explaining why he switched back to Java from Scala (which was hot shit on HN at the time):

http://codahale.com/the-rest-of-the-story/

"The world has yet to take me aside and ask me for my opinion of it, and in the past few years I’ve found that it’s far more profitable to build things rather than tilt at windmills."


I think part of the hate may come from people trying to code Java in a text editor rather than an IDE. Java's verbosity/class-heaviness really doesn't bother me at all when I'm in an IDE. OTOH, coding java in a text editor is pretty frustrating.


That's a fair point. Programming in Java was what made me move from Emacs to Netbeans. Well, at least for Java, HTML, Javascript etc.

It's an environment that needs an IDE because of the large number of libraries and the large number of features delegated to libraries rather than the core language (which is an architectural decision that I approve of, BTW)


I really respect Coda Hale for the work he has done on Dropwizard, and I share his views on Java vs. Scala.

As for the java-hate - in my 10 years of java-programming it has always been there. They used to complain about it not being portable enough, not performant enough, GC-pauses, start-up times, and lately that it's too verbose and non-functional. But Java has persisted, and only improved over the years. I'm happy I've been able to (mainly) program in the same language for so many years. It makes me realize just how long it takes to truly master a programming language (or perhaps programming in general). I guess that's why they say "..it takes 2 weeks to learn a programming language, and 10 years to master it". It's an art, and the language is your tool.


"lately that it's too verbose and non-functional"

I was complaining that Java was verbose in 1999.


I never used to understand it either until I got more exposure to other languages. Now I can't understand how anyone would voluntarily pick Java over other languages.

The codahale post is old news referencing issues which have already been fixed.


> Never understood the hate towards Java on HN

... it's not on HN. It's among the users of languages that aren't Java or C#.


I wouldn't put C# and Java together. The former is significantly better - nowadays it's more often compared to Scala than to Java. Java is still a great language though.


A language is a tool.

Some people obsess over the tool itself. My tool has these 10 features, and it was hand crafted and it can solve the 8 Queens problem in 2 lines and it has monads or a cute puppy for a mascot and so on. People who use tools obsess over them, and the tool kind of becomes the end, not just a means to an end. They would read about the new features in the language, they would run through tutorials in their free time and so on. There is a good amount of pleasure derived from fawning over the tool itself.

Others don't care about the language or the framework; they just want the job to be done and to get paid. If it turns out that some part of the code they are working on is written in FORTRAN, COBOL, or (oh the horror!) MUMPS, that won't bother them much. They'll just figure it out and fix the issue. The pleasure comes from getting paid and solving the business problem at hand.

Both are extremes and there is a continuum between the two on which most people fall. Neither extreme, I think, is healthy, and at the same time there are good parts to each one --

Someone who cares about the tools they are using will probably also use the tool better and more effectively. Someone who is picky about the type of hammers and their design and knows the history is also probably pretty good at hammering nails in. They might spend $1000 for a titanium hammer but hey they will be good at it.

Also someone who cares to get the job done and derives pleasure from solving a business problem, might just be the one keeping the product/company/startup afloat. Making money and delivering products is the top goal of most companies. This often means what seem to be boring, old, broken, unfashionable frameworks or tools. But so what, look we are still getting a paycheck!

There is a validity in both.


I'm sooo sick of the tools analogy when it comes to programming languages. A programming language is not simply a tool like a hammer. A hammer does one thing well, a programming language is a means of expression. It's more akin to a natural language like Spanish than a "tool" as we generally think of the word.

Yes, in the broadest sense of the word, Spanish is a tool. So is a programming language, but the analogy is stretched and stale. The image it puts in people's minds is not super helpful. And if we must use such a broad word as tool - let's acknowledge that it's really a collection of tools (every API, bit of syntax, and language feature is a tool).


> It's more akin to a natural language like Spanish than a "tool" as we generally think of the word.

I still say it is a tool, just a tool with many buttons and settings. English is a natural language that I learnt (it was my 3rd natural language) primarily to read and understand computer-related materials (books, keywords, instructions). It was absolutely a tool, just like a hammer is a tool to drive a nail in. It turned out to be a very beneficial tool, but a tool nevertheless. I would never bother spending time learning the etymology of words or finding all the tenses or declensions (but some do and derive great pleasure from that).


I don't think you read what I said - I said it is a tool in the broad sense of the word (one as you put it with many buttons and settings - I pointed out those APIs, syntax, etc) - it's just a shitty analogy because the broad sense of the word means something different than the common use of the term "tool".


It’s an optimisation problem. Both classes of people, at a fundamental level, want shit to get done. It’s a question of whether you want it done right (for some definition of “right”) or whether you want it done now.

Most of the time, there’s a balance to be struck in the tool you choose—if we did it in Haskell or OCaml or Scala, it’d probably be more correct and easier to maintain in the long run; but we are a startup that is likely going to run out of funding in a month or two, and if we hack it together in Ruby or Python or Node then no one will fault us and we can rewrite it in a good language later if we have the budget of time and money.

So while Haskell might be my primary language at work and play—and I see the tremendous value it holds for experienced developers—when beginners ask me what language they should learn, I invariably say: “Java”. At this point, it’s the essence of mediocrity, the dead middle of the bell curve; you can go better or worse from there.


Just curious if you have any actual proof that Haskell or OCaml or Scala would really be more correct or easier to maintain than the other languages you mentioned? It feels like the kind of statement that really does need hard proof and not just hand waving. For Haskell in particular, if you change assumptions in your type system you are going to have a lot of work cut out for you which seems to go against 'easier to maintain'. Not that I've ever seen any real numbers which is a big problem when making statements like that.


> Just curious if you have any actual proof that Haskell or OCaml or Scala would really be more correct or easier to maintain than the other languages you mentioned? It feels like the kind of statement that really does need hard proof and not just hand waving.

While I don't have actual proof of anything (and I doubt parent has either), removing classes of issues is a big win. Not to say that any of the three languages you list is a silver bullet, they all come with a long list of drawbacks. You just have to pick which drawbacks you dislike the least.

> For Haskell in particular, if you change assumptions in your type system you are going to have a lot of work cut out for you which seems to go against 'easier to maintain'.

What do you mean by this? If your types are fine-grained enough (which you can absolutely do in Java, at the cost of a lot more code), and you change your specification, you will need to change your types. But if you did not have fine-grained types (in any language), you're going to be fixing errors at runtime instead. If you're not going to leverage the type system, you may as well work with Python, you'll get the same effect with a lot less code.


If you're adding in new information into the type system, you'd have to go through each part of the system that uses that type and update it to use the new information correctly. If it was something like javascript, you'd just add the new information somewhere and it would be passed through without changes to where you'd need it. Making a statement like 'Haskell is easier to maintain' doesn't seem to ring true in this case. If you have unit tests for both your Haskell and javascript code then they should both be equally correct after the change.

Now maybe Haskell really would be easier to maintain - but it's something that needs proof and hard evidence. To use a car analogy for the GP's post: "I could buy a Ford because it is safer and easier to repair, but sometimes I'd just buy a Chrysler because it's cheaper and easier to purchase". There's nothing wrong with this statement if you have real numbers on the two different cars - but making the statement based on assumptions about what you think the two languages will or won't do is pure folly.

If I start a project tomorrow in Haskell or Python, the GP's post is saying my Python project will be completed quicker but be less correct. He is also saying that the amount of time I spend maintaining my Python project over the next few years will be higher than the time spent maintaining a same-feature Haskell project. There is absolutely zero evidence I have seen anywhere that this is correct, and from experience it depends far more on the project itself and where the strengths of the language (and libraries!) lie than on the language itself.


> If you're adding in new information into the type system, you'd have to go through each part of the system that uses that type and update it to use the new information correctly.

It depends on what you do. If you (say) add a new field to an existing type, this will screw up pattern matching. It's a bit inconvenient, but typically not a big deal. On the other hand, if your previously non-nullable String field becomes a Maybe String, you're going to have to update every location where you access it... and that's exactly what a reasonable programmer wants to do. Unfortunately, you can't encode this in Javascript, so you're going to resort to 'grep' (good luck if you have the same field name for several objects), or forget about it and get runtime errors. Sure, in theory you can rely on your Javascript unit tests to detect the error, and you may be right in practice, or you may be out of luck.
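A rough Java 8 analogue of that Maybe String point (a sketch; the `User` class and its field are hypothetical, and `Optional` is a weaker stand-in for Haskell's Maybe): changing a field's type from `String` to `Optional<String>` turns every stale access site into a compile error instead of a runtime surprise.

```java
import java.util.Optional;

public class User {
    // Was: private final String nickname;
    // Changing the type to Optional<String> makes every old call site
    // (e.g. user.getNickname().length()) a compile error, forcing each
    // one to handle absence explicitly rather than failing at runtime.
    private final Optional<String> nickname;

    public User(Optional<String> nickname) {
        this.nickname = nickname;
    }

    public String displayName() {
        // The compiler forces a decision here; no silent null slips through.
        return nickname.orElse("anonymous");
    }
}
```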

Generally speaking, Haskell has its fair share of problems, but it's not one of them.

> If I start a project tomorrow in Haskell or Python, the GP's post is saying my Python project will be completed quicker but be less correct.

It definitely matches my own experience (it obviously depends on your proficiency in both languages, and Haskell will have only marginal advantages if you use String everywhere because types are hard).

> He is also saying that the amount of time I spend maintaining my Python project over the next few years will be higher than the time spent maintaining a same-feature Haskell project. There is absolutely zero evidence I have seen anywhere that this is correct, and from experience it depends far more on the project itself and where the strengths of the language (and libraries!) lie than on the language itself.

It's the kind of thing for which it is extremely hard to get any evidence for (and not only in Haskell vs Javascript debates), for obvious reasons. The best you can say is that the stronger guarantees of correctness offered by a strongly-typed program ensure that it makes it much easier to detect regressions. On the other hand, if, say, there are no third-party libraries for the problem domain at hand, strong typing will not magically fix that.


You are absolutely right about the technology fetish, I think most of us here have been guilty of it. On the other hand, you do have languages which help with writing more correct programs, and others which hinder you (I'm looking at you, PHP and Javascript).


> In Java, you can forget about doing it in the cleanest or the best way, because that is impossible. Whatever you do, however hard you try, the code will come out mediocre, verbose, redundant, and bloated

(reason for liking Java)

Well, here are some other reason for liking Java: https://medium.com/i-m-h-o/da3b2c180e9c

But I really believe the future of software is affected more by having better programmers than by having better languages.

Yes, it helps to have cool languages like Julia, Nimrod, Rust, even Scala (which I think is the best statically typed JVM language out there). But I tend to think that a good developer will be good in any language, Haskell, C, Java or LOLCODE.

I think that a good developer "forced" to use Java will find tools that bring out the good parts of the language: Dropwizard, Play framework / Akka, even latest versions of Spring and Java EE are much more convention over configuration and lightweight than before. They will use Guava and Apache commons to streamline common Java verbosity and bloat. They'll use Java 8 which allows much more robust and concise syntax to common problems.

But I don't think that the language is the problem. I think that a great developer will be able to tackle any problem with the language itself easily, because the real problem is not how to implement X, but more what is X. Designing the software correctly is the main problem at hand, not how to move input to output. This is what stackoverflow is for.

One last point about this all "I hate/like Java" type of post

1) If you write Java for your day job and hate it, try to introduce something else (Clojure / Scala / Java 8 / Kotlin). If you can't introduce any of these, then either accept your fate or just quit and find another job. If you can't find another job, then perhaps you deserve to stay and write Java, so stop whining; it's not that bad (and one day you'll get to use Java 8).

2) If you don't write Java for your day job but as a freelancer - start getting other, non-Java gigs. Not so many non-Java jobs? Well, continue to write Java-bashing blog posts until people stop using it, I guess...

3) If you don't write Java for your day job or as a freelancer, then just stop complaining about it. There are millions of other things (not just programming languages) out there worth complaining about. They could use your writing talent for a better cause.


But I really believe the future of software is affected more by having better programmers than by having better languages.

On average, I believe the quality of programmers will decline, simply because as demand rises, more and more people will come to the field who don't do it out of interest/passion but because it's a profitable profession. Not that this is fully the case today - I think it's a trend that's already begun and will probably continue this way.


This was already happening during the first dot.com boom. The time I studied CS. In Belgium banks would provide cars and a paycheck while you were still graduating in your last year.

This attracted a lot of people who weren't really passionate about programming.

Now, years later, I was hiring during the bank crisis and I interviewed CS students who got their degree but couldn't find a job. I couldn't understand how, if you have your degree and have been out of a job for 2 years, you haven't written a single line of code on a hobby project. But these people exist.

These people will be always there in any field.


> one day you'll get to use Java 8

Only new Java projects would actually use the Java 8 features anytime soon. Existing Java codebases will stick to their Java 6 installation for a long while. It could be years before lambdas are used to any significant degree. Look at how long it took for many Java 1.4 shops to actually upgrade to Java 5, let alone encourage developers to use its features. Even today a lot of places still allow and even encourage use of collections classes without the generics specified.
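For illustration, here is roughly what "collections without the generics specified" costs in practice (a sketch; the class and method names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypes {
    public static int totalLength(List<String> words) {
        int total = 0;
        for (String w : words) {
            total += w.length();
        }
        return total;
    }

    public static void main(String[] args) {
        // Raw type, as in pre-generics (1.4-era) code: the compiler cannot
        // check element types, so a stray Integer added here would only
        // fail later, at the point of use, with a ClassCastException.
        List raw = new ArrayList();
        raw.add("ok");

        // With generics (encouraged since Java 5) the same mistake is a
        // compile error: typed.add(42) would not compile at all.
        List<String> typed = new ArrayList<>();
        typed.add("ok");
        System.out.println(totalLength(typed));
    }
}
```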


You can start using Java 8 as soon as you are willing to move. I don't think there are many breaking changes in the new version. Just give it a few months to be really stable. Last week, it took me around ten minutes to move one application from 6 to 7.


"You can start using Java 8 as soon as you are willing to move."

As long as you aren't writing Java code on Android, in which case you're still stuck at Java 6 (with a very small subset of Java 7 features that were added 3 years after Java 7 came out).

Not the fault of the Java language, really, but it is incredibly annoying how unclear Google has been on the future of their dialect of Java post the Oracle lawsuit.


I have been waiting for servlet 3.0 on App Engine since 2010

https://code.google.com/p/googleappengine/issues/detail?id=3...

The funny thing is that google's own GWT requires it now.


It's worth noting that Android 4.4 does support pretty much all JDK7 features (multicatch / try with resources / etc), finally. I've run into the occasional missing method from some of the newer class libraries, but nothing too serious.

Of course, if this is any indication it'll be several years before it's safe to use even JDK7 features considering Android's update trends, let alone JDK8 ...


This has made me use C++ for my hobby graphics programming on Android instead.

The NDK has all the support I need when coupled with something like Qt/SDL/openFrameworks, and C++11 feels better than Google's Java fork.


you should consider using Scala on android. works really well for my projects.


Which scala android project are you using?


There are quite a few breaking changes in Java 8, here's the official compatibility guide: http://www.oracle.com/technetwork/java/javase/8-compatibilit...

One worth noting: If existing code is using Proxy.getProxyClass and the Constructor.newInstance method to create a proxy instance, it will fail with IllegalAccessException. RFE 4487672


Why would you ever do that..? If your production code is using those kinds of proxies, it must be one terrible mess. The first order of business would be to fix the code, at the very least.


This used to be the right way to create dynamic proxies. Why one would do this is another discussion. For completeness here's a link to the usage in Spring: https://github.com/spring-projects/spring-framework/search?q...


The Spring code is still fine. You didn't quote the entire advisory:

> If existing code is using Proxy.getProxyClass and the Constructor.newInstance method to create a proxy instance, it will fail with IllegalAccessException if the caller is not in the same runtime package as the non-public proxy interface. For such code, it requires a source change to either (1) call Constructor.setAccessible to set the accessible flag to true, or (2) use the Proxy.newProxyInstance convenience method.

To end up in that specific condition is fairly difficult and usually means something is wrong with the code to begin with. It's still a very easy fix in any event (adding in 1 extra line of code before the call).
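For illustration, fix (2) from the advisory looks roughly like this (a sketch; the `Runnable` target and the no-op handler are just for demonstration):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxyDemo {
    public static Runnable makeProxy() {
        // Trivial handler: every method call is a no-op returning null.
        InvocationHandler handler = (proxy, method, args) -> null;

        // The convenience method replaces the old two-step dance of
        // Proxy.getProxyClass(...) followed by Constructor.newInstance(...),
        // doing the class lookup and construction in one call with the
        // correct access semantics under Java 8.
        return (Runnable) Proxy.newProxyInstance(
                Runnable.class.getClassLoader(),
                new Class<?>[] { Runnable.class },
                handler);
    }
}
```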


"java.lang.UnsupportedClassVersionError: Test : Unsupported major.minor version 51.0"

(yeah, shouldn't happen in the best of all possible worlds, but may still happen anyway)


Java had major shifts in the language in 1.2 and 1.5. Java 8 isn't one of those cases. I think you'll find adoption won't be terribly slow.


I would put lambdas in the "major shifts in the language" category.


> I would put lambdas in the "major shifts in the language" category.

Recompile away, stuff just works. Add lambdas to your code, compile. Still works. This isn't like 1.2 where you were adding collections classes (arguably for the first time) or 1.5 where you encounter generics.
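A minimal sketch of that point (names hypothetical): a lambda is just terser syntax for implementing an existing single-method interface, so surrounding call sites compile and behave exactly as before.

```java
import java.util.Comparator;
import java.util.List;

public class LambdaDemo {
    public static List<String> sortByLength(List<String> words) {
        // Pre-Java-8 style: an anonymous class implementing the
        // single-method Comparator interface.
        Comparator<String> oldStyle = new Comparator<String>() {
            public int compare(String a, String b) {
                return Integer.compare(a.length(), b.length());
            }
        };
        // Java 8 style: the same interface and the same call site,
        // just terser syntax; existing code around it is unaffected.
        Comparator<String> newStyle = (a, b) -> Integer.compare(a.length(), b.length());
        words.sort(newStyle);
        return words;
    }
}
```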


Java's input/output API is cumbersome, and so is its date API. That is why you use Apache Commons and Joda-Time to deal with them.

He sounds like someone who never learned Java in the first place, nor what it is good for and what it is bad for. I think that if you interview in some language, you should understand what kinds of problems are easy to solve in that language and which are hard or never done.

If I have to write a library-less input/output example in an interview, my code will be ugly and long. I'm not saying that interviews should not ask these questions; they test the candidate's ability to find his way around an API he uses rarely. It is a needed skill, so no problem there.

If I have to design a complicated system, adjust it to changing requirements and maintain it for years, I feel most comfortable in Java. Type safety, the tools and the language itself allow me to focus on the system and how its parts cooperate with each other.

He also sounds like somebody who is biased against whoever uses Java, because he once worked in a bad Java shop. It sounds like it really does not matter what those people do; they used Java, so they are incapable in his eyes.

"In Java, you can forget about doing it in the cleanest or the best way, because that is impossible. Whatever you do, however hard you try, the code will come out mediocre, verbose, redundant, and bloated."

Guess I was right about learning Java.


The date API is refreshed in Java 8; the spec lead was the creator of Joda-Time.
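For example, the new java.time API (JSR-310) is immutable and fluent, unlike java.util.Date and Calendar; a small sketch (the class and method names here are hypothetical):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateDemo {
    // Parse an ISO date, add a week, and render it back: each call
    // returns a new immutable value, so there is no Calendar-style
    // in-place mutation to reason about.
    public static String nextWeek(String isoDate) {
        LocalDate d = LocalDate.parse(isoDate);
        return d.plusWeeks(1).format(DateTimeFormatter.ISO_LOCAL_DATE);
    }
}
```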


> But I really believe the future of software is affected more by having better programmers rather than having better languages.

Better programmers will likely make better languages. Undoubtedly a good programmer can learn to program in any language, because its the same process, but languages do certainly have their limitations.


> LOLCODE

Sorry, no. That's just too far. Imagine being handed a codebase of LOLCODE.


LOL

I hope to God no one ever was or will be handed a codebase of LOLCODE for a "real" commercial system.

But if anyone ever does, it will sure be fun to hear about it in a blog post.


To be fair, it probably wouldn't be too hard to write a parser/translator to turn it into a more readable language. Unlike something like Perl or C++, where you'd pretty much have to be superhuman to write a parser single-handedly.



Then in this case we should write a production system in LOLCODE, hitting a PostgreSQL database full of user defined functions written in plbf....


You may well be thinking, "Can it really be that hard to figure out how to open stdin?" Having just attempted it, I can confirm that it's hard. Shockingly hard. I could have easily failed that test.

I consider myself a fairly competent programmer, and I have a little knowledge of Java, including building GUIs and so on. Here's the Hello World program off the top of my head:

    public class HelloWorld {
      public static void main(String[] args) {
        System.out.println("Hello World.");
      }
    }
So to answer the interview question, I just need to figure out how to open stdin. At a guess, maybe something like System.in.readln will do it? Let's look in the class libraries.

Bingo! I've found System.in, and it says it's an InputStream. So now I just need to find ... ah ... there are three of them. Is System.in a java.io.InputStream, org.omg.CORBA_2_3.portable.InputStream, or org.omg.CORBA.portable.InputStream? The documentation doesn't say.

Well, I have a vague idea of what CORBA is, and I don't think I'm using CORBA here, so it must be java.io. Good first problem solved. Now, what methods can I use?

    abstract int System.in.read()
    int System.in.read(byte [] b)
    int System.in.read(byte [] b, int off, int len)
 
These methods just read bytes! I want a String, or at least a char. And Java uses multibyte characters. Can I get away with casting bytes to chars? I don't know. Maybe it was one of the CORBA methods after all? Or maybe ... I'm stuck!

I figured it out in the end, but only through pure luck: I need to wrap System.in in a java.io.InputStreamReader.
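Spelled out, the solution arrived at above looks roughly like this (a sketch, assuming a line-at-a-time echo is acceptable; note that readLine strips the line terminator and println re-adds the platform one, so it is not a byte-exact copy):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class Echo {
    public static void main(String[] args) throws IOException {
        // Wrap the raw byte stream in a Reader to decode bytes to chars,
        // then buffer it so whole lines can be read at a time.
        BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
    }
}
```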


> "Can it really be that hard to figure out how to open stdin?"

? In most runtimes stdin is opened before you start. Certainly is true in the case of Java.

> So to answer the interview question, all I need to do is open stdin.

No, you also need to "open" stdout. More importantly, you have to figure out how to read and write to streams without losing data (which you'd think would be easy but evidence indicates otherwise).


Sorry, I'm getting the terminology wrong. I mean "read from".


I think the author takes it a bit too far: It's possible that the Java programmers have valuable skills that we could use, despite their inability to produce even a trivial working program in a short amount of time.

The fact that I/O management is cumbersome in Java doesn't relate to the ability of Java programmers to provide value.


But doing a simple copy of stdin to stdout is actually quite easy in Java.


Yes, but it requires knowing library functions that many Java programmers wouldn't have used for years and would need to look up; you just don't interact with the console in Java-world nearly as much as, say, in Perl-world.

Doing it through, say, docs.oracle.com to check what's the stdin interface - that is quite easy, but writing it on a whiteboard from scratch would fail many Java developers.


> Yes, but it requires knowing library functions that many Java programmers wouldn't have used for years and would need to look up; you just don't interact with console in Java-world not nearly as much as, say, in Perl-world.

People keep thinking that there are special functions involved in this solution for interacting with the console (which by itself is telling). Let's pretend it was just copying from an InputStream to a PrintStream. Those are both interfaces that most Java developers should be quite familiar with.
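A sketch of that copy, written against OutputStream (which PrintStream extends; the class and method names here are hypothetical): read into a buffer until end-of-stream and write back exactly the bytes read, so no data is lost.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copy every byte from in to out: loop until read() signals
    // end-of-stream (-1), writing exactly the number of bytes read
    // on each iteration rather than the whole buffer.
    public static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        out.flush();
    }
}
```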


Where would your code see an InputStream? "Interacting with console" is overly specific, yes. However - the other two use cases are handling files and network sockets, and in the java world, you generally don't touch streams of bytes or lines even in those cases.

The unix approach of treating most things as streams of bytes or lines tends to be abstracted - if I'm writing socket-based communications, then an InputStream exists there; but if I'm making web requests, then it's through a library where no streams are exposed. If I'm writing a logging library, then a PrintStream exists there; but if I'm doing proper logging, then again, it uses a library that doesn't expose any streams to a public interface, just the appropriate methods. Custom file formats would involve handling streams directly, but general practice in line-of-business apps nowadays is to never do that - you just [de]serialize to json/xml/something else standardized; and again that's done by a library that uses streams but doesn't expose them.

These things are trivial to do in java, but what I'm saying is that java is very often used in large codebases that don't touch raw streams at all. Students in java lessons use InputStream&PrintStream, but after starting work, promptly forget what the methods are called as they're used rarely.


> However - the other two usecases are handling files and network sockets, and in the java world, you generally don't touch streams of bytes or lines even in those cases.

I think you live in a pretty rarified Java world.

> The unix approach of treating most things as streams of bytes or lines tends to be abstracted - if I'm writing socket-based communications, then an InputStream exists there; but if I'm making web requests, then it's through a library where no streams are exposed.

Except if you want to handle large amounts of data efficiently, you end up using streams to read and or write those web requests.

> Custom file formats would involve handling streams directly, but general practice in line-of-business apps nowadays is to never do that - you just [de]serialize to json/xml/something else standartized; and again that's done by a library that uses streams but doesn't expose them.

...and I guess you never have to debug anything or understand what is happening underneath?

> Students in java lessons use InputStream&PrintStream, but after starting work, promptly forget what the methods are called as they're used rarely.

Then why do you think they teach 'em? Aren't all subsequent concepts built on top of them?


> Except if you want to handle large amounts of data efficiently, you end up using streams to read and or write those web requests.

No you wouldn't, unless you're working on some tiny project where using raw servlets might be justified (though I'd be highly suspicious of anyone doing raw servlets). Most any other framework will provide higher level abstractions such as Play's Iteratees or variations of sendfile to handle the streaming for you. Your job is usually just to configure the proper marshaling setup so that the higher level object(s) delivered by your controller methods are properly serialized by the lower levels of the framework.

Occasionally you may need to actually write marshalling code but you should definitely be wary that you're doing it wrong if so.


Why? What on earth would I ever do with an InputStream or a PrintStream, other than passing them from (say) a http library to a json-parsing library?


I dunnoh... reading data, writing data. Even if you only ever talk to a database, JDBC has APIs for accessing data through a stream. It's kind of the default way to handle networking and file data...


Sure, but what am I going to do with that data? It's very rare that I'd want to twiddle the individual bits; 99% of the time a stream is an opaque handle that I pass through to an ObjectInputStream or an XML parser or write to an http response.


If you're writing mediocre code and are happy with it, the problem might not lie with the language.


Personally, I like my current mediocre code taskings. There's nothing innovative at all about the programs I'm writing for work, and it's fantastic. It's my mental vacation for a few months (they're not small programs, but they aren't difficult either), staving off the inevitable burnout.


If mediocre code gets the job done faster and it's reasonably maintainable, but you avoid it due to code-idealism, the problem might not lie with the language.


If your code is "reasonably maintainable", I don't think you can call it mediocre code.


Calling out a well-known and highly intelligent polymath author is a good way to appear a little silly.


> I was a professional Java programmer for three years

3 years is about the amount of time it took me to get a false sense of security in my understanding of a language, imo. After 3 years I thought I knew Java. After 3 years I thought I knew Ruby. I was wrong.

BTW, if I were to receive a question on my application asking how to read stdin and write stdout, that would be a bad sign.

> So yes, I enjoyed programming in Java, and being relieved of the responsibility for producing a quality product.

Many quality products I use have been written in Java.

> It was pleasant to not have to worry about whether I was doing a good job, or whether I might be writing something hard to understand or to maintain.

Though I understand the intended meaning of "Java is too verbose", you can write code in Java that is easier to maintain. Java is not an excuse for poorly written code.

> The code was ridiculously verbose, of course, but that was not my fault. It was all out of my hands.

You get paid to write maintainable code that does what is intended. It was never "out of your hands".

I no longer like Java as much, but I don't think Java ever was a reason to sit back and let bad code happen.


Java is a likable language. It's not a lovable language.

It occupies a weird middle ground where you give up the capability to code to bare metal or easily interface with C functions in exchange for nearly being able to quickly and succinctly create powerful abstractions.

There's not a whole lot wrong with it, but I don't think I've ever been using any other language and missed something from Java. When using Java, though, that experience is common, and I find that the amount of effort you have to put in to avoid writing repetitive code is annoying and results in overly complex architectures.


> It occupies a weird middle ground where you give up the capability to code to bare metal or easily interface with C functions in exchange for nearly being able to quickly and succinctly create powerful abstractions.

I'd say this ground is well and truly occupied by Common Lisp. Java occupies some other ground which is neither quick nor succinct.


Another story to go with the author's story about people who fail a code test because of Java:

At MIT, in 2003, there was a class called 6.171 Software Engineering for Web Applications. The final project was, in fact, to design and make a web application. It was very open-ended and it could be in any language you wanted.

This particular course soon stopped being taught, because a large number of people failed the final project. They could not make an application that functioned in any way by the time the course was over. (Keep in mind: 2003. There weren't really any easy answers to web applications.)

Many people had chosen to do their project in Java, because that's what they had learned and used in the prerequisite course. And in particular, over half of the people who chose Java failed.

(By the way, the course on web applications is now the main project-based course on software engineering at MIT, and it is now taught in Ruby.)


> (By the way, the course on web applications is now the main project-based course on software engineering at MIT, and it is now taught in Ruby.)

Which one is that? As far as I can tell, the "new" 6.170 (6.S197) uses Python (or at least used it in the fall of 2011).

(I gather by its new subject number that it is no longer a part of the core requirements, being largely supplanted by 6.005, which ironically appears to use Java. Of course, when I took 6.170, we used CLU. Boy, am I old.)


I haven't actually interacted with the MIT curriculum in years, so I might be missing the point, but I was looking at the syllabus for 6.170 Spring 2013: http://ocw.mit.edu/courses/electrical-engineering-and-comput...

I guess it would be true that 6.005 is the "main" software engineering course now, not 6.170.


What was wrong with using JSP? I don't see anything in it that would stop a whole class full of MIT students (aren't they supposed to be smart?) from finishing their project.


If I interviewed for a company where a serious interview question was to copy stdin to stdout I'd probably laugh and walk out. Stating the obvious: this company obsesses over the most efficient way to re-invent the wheel... Good luck getting any real work done or learning any career portable skills!


Have you ever interviewed programmers? Have you ever attended a job interview?

(1) This is an interview question, he's not asking you to be a professional stdin->stdout copier

(2) this isn't a micro optimisation problem:

    > The first question on the quiz is a triviality, just to let the candidate get familiar with the submission and testing system
(3) A surprising number of programmers can't program. Basic warmup questions are a good thing and can save a lot of your time http://blog.codinghorror.com/why-cant-programmers-program/

(4) Even if this isn't his attempt, there's genuine depth to even this problem. If you do the obvious

    a = get_byte()
    while a != EOF:
        send_byte(a)
        a = get_byte()
There's a surprising amount of work going on behind the scenes that you can dive into: discuss how I/O works, when you would do this using blocking vs. non-blocking I/O, when you should do your own buffering or let the language/library do it for you. But it's short enough to get started, so you can get to those points very quickly.

(5) I wouldn't hire/not hire someone based on style alone, but you can get a pretty good feel for it from short programs that actually work. If the tasks or snippets on which you interview are too short to be complete programs, you miss a lot of organisation and "shape of the code" kind of stuff from their style.

(6) Interviews are tough as an interviewee. Professional programmers aren't professional job interviewers and often it can be hard to read their mind to see what they are looking for in a particular question. "It's open ended, take it where you like" usually doesn't mean what it says on the tin. As far as going from "here's a question" to "here's an answer" with little ambiguity this is one of the better FizzBuzz type questions I've seen.


> (3) A surprising number of programmers can't program.

This reminded me of the interview for my first job. I was basically between a rock and a hard place, so part of what follows was clearly, in retrospect, a bad sign, but given the choice between a job and no job with $20k of debt to cover, I took the job.

The shop was a C shop, embedded systems. When I went in to interview I was virtually a shoo-in (just learned I'd misspelled this for years as shoe-in) for the job due to a college classmate already working there and recommending me. The code segment was straightforward. It presented several "implementations" in C of functions to swap the values in two variables. That is:

  int x = 10;
  int y = 20;
  swap(x,y); // after this point x == 20 and y == 10 should be true

  void swap(int x, int y) { int t = x; x = y; y = x; }
Now, that obviously won't work in C. One of the implementations was correct, 3 were wrong (like the above snippet). I was asked which were wrong, why they were wrong, and which was correct. I answered them all (note: this was a week 2 or 3 programming assignment when I took a course using C as an undergrad; it really seemed obvious to me), and the interviewer was surprised. I was told, "You're the first person to get all these questions correct."
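(For what it's worth, the same trap exists in Java, which also passes primitives strictly by value. A quick sketch, with a common workaround via a shared container:)

```java
public class SwapDemo {
    // Broken: Java passes primitives by value, so this only swaps the copies.
    static void swap(int x, int y) {
        int t = x; x = y; y = t;
    }

    // One workaround: swap inside a container the caller shares with us.
    static void swap(int[] pair) {
        int t = pair[0]; pair[0] = pair[1]; pair[1] = t;
    }

    public static void main(String[] args) {
        int x = 10, y = 20;
        swap(x, y);
        System.out.println(x + " " + y); // still "10 20"

        int[] pair = {10, 20};
        swap(pair);
        System.out.println(pair[0] + " " + pair[1]); // "20 10"
    }
}
```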

Like I said, warning signs. Alarm bells should have been going off. If the interviewer tells you that, that means that your colleagues couldn't answer those questions. That only occurred to me about 4 weeks later once I'd finished all my reading and project familiarization and was getting into really working with my new peers.


My initial impulse would've been

#!/bin/sh

cat


You can go one better: just symlink cat and avoid an unnecessary exec.


Per the article, this didn't sound like a serious question, but a "get used to the system we use" question. Like a test system implementing a multiple choice exam asking a ridiculously stupid first question to get the user familiar with the interface, like: "Is your name: A. Jtsummers; B. Bob Summers; C. John Doe; D. The Queen of England". And expecting the user to correctly pick their own name.


Have you conducted interviews yourself? It is harder than it seems.

There are a lot of nervous and shy people. Easing them into the process with an 'icebreaker' question gives them a chance to open up for the real questions.


The author made it clear that the question's primary purpose was to see if the applicant could use the system. They assumed, incorrectly as it turned out, that the problem was about as trivial as hello world.


Wrong. The reason FizzBuzz (as another example) is a widely used interview problem is that it is a middle ground: a problem that won't be a time-waster on competent candidates, but will completely stump idiots.


Maybe not reinvent, but you'd damn well better understand how the wheel works, or at least be able to deduce it; maybe even be curious about it as an engineering problem. Forget even that somebody's still got to build the wheels of tomorrow (last I checked, my car, and the car I am likely to buy decades from now, still has wheels, so I imagine their proper function is still an ongoing engineering issue for somebody). Say you work on a problem and it turns out the root cause of an issue lies in the proverbial wheel: are you going to hem and haw and say that this piece is beneath you? Or are you going to fix the problem?


Given the solutions that people have posted on here which have demonstrable problems, I think it's a pretty good question as an opener. And as someone said already, it gives you a lot of scope for drilling down into the nitty-gritty.


A corollary: we use Java and Python mostly, and when interviewing candidates, the vast majority of them choose to give their answers in Java. I don't ask particularly hard questions, but I find that I usually get a good sense of the candidate by just talking to him as he's working out the answer.

This one guy chose Python, which delighted me since it's so rare. He then proceeded to produce answers much more easily than the average candidate, and he was able to make changes as I suggested them without too much effort.

We hired him but had to let him go after a month, when it became clear that he wasn't as good as I'd thought he was. Turns out it was Python that just made him look smart.

I like Java for actually getting things done, but it's such a vast ocean of TMTOWTDI that interviewing in it seems harder than in smaller languages.


"I am reminded of Dijkstra's remark that the teaching of BASIC should be rated as a criminal offense. Seeing the hapless candidate get bowled over by a question that should be a mere formality makes me wonder if the same might be said of Java."

Well, actually, it has been said, again by Dijkstra: www.cs.utexas.edu/users/EWD/OtherDocs/To%20the%20Budget%20Council%20concerning%20Haskell.pdf


Hum. If that's a positive spin on Java, you can count me unimpressed.

I have a side project I'm maintaining (all in) Java that I spent a weekend on about two months ago. Frankly, it was ridiculously easy to splat out code with Eclipse (and not get anything done, but that's mostly my fault).

Several fundamental issues showed up:

- lack of first-class functions

- lack of extensibility on extant classes

- grossly mutating state

- Loads and loads of code that obscured what the logic of the program was

Project Lombok is a heinous hack to try to get around these issues (somewhat). So is, afaict, AOP.

I think for the right project - and with ruthlessly non-generic design - Java is the right tool. Otherwise you wind up with reams of wasted paper and mental overhead grappling with the sheaf of low-entropy code you've got.


> Hum. If that's a positive spin on Java, you can count me unimpressed.

I think you missed the point of the article.


> Project Lombok is a heinous hack

I couldn't agree more. Lombok was the worst thing to ever happen to our codebase.


I used to feel Java was a pain to write when working with Java 6. Many 250-character lines and everything.

Java 7 kinda divided the line length by a factor of 2 with the diamond operator, try-with-resources, and so on.

I've programmed some Java 8 since yesterday and it actually feels like a pretty decent language to express ideas. Nothing like Java 6.

I know most of the Java haters have been struggling using it day to day in settings where it was awful (over-engineering + Java 6: hello, here is the gun).

But I feel like those haters should really try to think fresh on this one. Try Java 8 and its new features and decide for yourself _afterwards_.


Is this what he's looking for?

  public static void main(String[] args) throws IOException {
      int ch;
      while ((ch = System.in.read()) != -1)
          System.out.print((char) ch);
  }


You know, I kind of thought he was joking that Java programmers messed this stuff up.

I was so wrong.

Three things are wrong. The first is pedantic: you need to import IOException. The second, though, is one of those "scream out loud at the universe" questions: why the heck are you calling print() instead of write()?

Finally, do you think maybe you might need to worry about buffering of System.out's stream?


> Finally, do you think maybe you might need to worry about buffering of System.out's stream?

print() won't flush to the console unless you pass '\n' and auto-flushing is on. It will flush its internal buffer, but that's okay! The default System.out uses a BufferedOutputStream anyway:

    FileOutputStream fdOut = new FileOutputStream(FileDescriptor.out);
    ...
    new PrintStream(new BufferedOutputStream(fdOut, 128), true)
So using print() instead of write() shouldn't cause any extra system calls, although there may be a small CPU cost.

Obvious, right? Don't you love java.io? :)


> The default System.out uses a BufferedOutputStream anyway

Yes, which is the _problem_, not the _solution_.

To clarify: the problem isn't that the code makes too many syscalls.


So what are you saying is the problem then? That writing to the BufferedOutputStream for each character is CPU-inefficient and he should be doing this?

    public static void main(String[] args) throws IOException {
        byte[] buffer = new byte[1 << 12];
        for (;;) {
            int nRead = System.in.read(buffer);
            if (nRead == -1) return;
            System.out.write(buffer, 0, nRead);
        }
    }
If so, fair enough, but it's reasonable to go with the simpler solution if you're not given any particular performance requirements.


No, I'm saying you are already using a BufferedInputStream and a BufferedOutputStream. That is the problem: you aren't guaranteeing your output matches your input.

Try doing this:

    dd if=/dev/random of=test_file bs=4000 count=100
    java YourClass < test_file > test_output
    diff test_file test_output


Here's the simpler, likely slower, but nonetheless actually correct solution: https://gist.github.com/cbsmith/9755809


Oh, gotcha. The code above with "byte[] buffer" also works since write(byte[], int, int) flushes.


I mean, if you really want to be pedantic...

> you need to import IOException

    try {
        ...
    } catch(Exception ex){ ... }
;-)

Edit: Or if you want to be just as pedantic, and not write an obnoxious catch-all

    try { 
        ... 
    } catch (java.io.IOException ex) { ... }


I'm not sure why you'd want to use a catch, but I agree there is more than one way to solve the problem. I was merely pointing out one thing that would prevent the code from passing the test framework.


Sorry, I didn't think I was telling you something which you didn't already know. I just saw a hole in the pedantry and felt a strange need to point it out. Now that I've typed that, I realize what a horrible person I am.

As to why you'd want to use a catch, I think it's kind of sloppy to let the user see an uncaught exception. You might as well just do

    public static void main(String args[]){ 
        try {
            ...
        } catch (Exception e) {
            System.out.println("This is horrible software, and you shouldn't use it.");
        }
    }
... as that's what many users already see when an exception goes uncaught.


I could see a try/catch with a System.exit(-1) maybe, but arguably the default handler does the right thing (returns an error code and writes diagnostics to stderr).


He wrote the code to show you that you don't need to import IOException; you can use a catch-all to avoid a throws clause in the method header.

This is bad code as he said.


> He wrote the code to show you that you don't need to import IOException; you can use a catch-all to avoid a throws clause in the method header.

You're conflating two things. I could just as easily have caught Throwable or Exception and not needed the import. The question is: what would you actually put in the handler that would be so much better? The actually provided sample code is much worse, as it prints to System.out instead of System.err, and it doesn't report an error to the parent process, so you muddy up the output (imagine if the file you were copying actually ended in "This is horrible software, and you shouldn't use it.", how would you even know) and there is not a terribly easy way to detect an error happened.

> This is bad code as he said.

It's not bad code to declare a checked exception as escaping your method, particularly if you don't have any logic for handling it. It's bad code to have an exception handler that doesn't actually handle anything.


Isn't there a Stream.copyTo(Stream) method somewhere? Or was that .NET?


Yes, Apache Commons has IOUtils.copy and Guava has a similar ByteStreams.copy.
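Neither helper is hard to write by hand, though. A rough sketch of what such a copy utility boils down to (the buffer size here is arbitrary):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

public class CopyDemo {
    // Minimal analogue of IOUtils.copy / ByteStreams.copy: pump one stream
    // into another through a fixed-size buffer until EOF.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        out.flush(); // don't leave the tail sitting in a downstream buffer
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied + " " + Arrays.equals(data, sink.toByteArray()));
    }
}
```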


Looks like I don't get the job.


Here's the thing though... Those mistakes you made aren't language specific. What languages don't expose you to making similar mistakes?

There are some, but they aren't the ones people usually think of when they say, "I'd always use X instead of Java".


It mostly is language specific though, isn't it? Take Python: IOError is a global, there's one main way to print, and you don't have to worry about the output buffer.


Oh no. Python has lots of ways to print. You can actually pretty easily hose that part up in Python, and Python doesn't require the flush because it automatically closes/flushes the stream. In general (as in, if you didn't know you were using sys.stdin and sys.stdout), in Python you'd want to use a "with" statement to ensure that really happens.

So, for example, if you are in Python 3, sys.stdin and sys.stdout are by default in text mode, which is not going to end well for anyone.


Well, if I were making the test framework for this test, I'd only send ASCII text to the programs.

It's a trivial test, so I'll give you trivial input and not have you worry about "what if someone gives me random bytes".

The test isn't to demonstrate you know off the top of your head how to deal with an edge case. But if it were and I were allowed to pick my language, I'd pick bash and just write

  cat


> The test isn't to demonstrate you know off the top of your head how to deal with an edge case.

The only edge case is knowing you need to flush the output, which is relevant whether you send ASCII text or not. The other edge cases around ASCII are just whether you use the correct APIs or not.


Question: I see all this discussion about buffering and encoding and such.

In python, this seems to work:

  import sys
  sys.stdout.write(sys.stdin.read())

What is being abstracted away/handled by python that I'm not seeing?


.read() will read the entire contents of the file object into memory. That's fine if you're just doing `io.py < smallfile.txt`, but if the file is huge you'll run out of memory. And if stdin is an infinite stream (e.g. `io.py < /dev/urandom`), your Python code will never print anything. If you write `for line in sys.stdin: sys.stdout.write(line)` instead, you'll solve these problems by buffering line-by-line.


> If you write `for line in sys.stdin: sys.stdout.write(line)` instead, you'll solve these problems by buffering line-by-line.

Except no. Then you run in to problems with a binary file that might not have a line break.


Nothing beyond an automatic flush/close of stdout on exit. The Python code above is also reading the entire file into memory before writing it out, which isn't exactly wise. Aside from the addition of a flush at the end, the correct Java code for this is simply a loop with an exit test over a read() and write(), which is really no different from the proper Python code.


There seems to be confusion on how to write this properly, so don't feel bad about missing this (or alternatively feel bad for all of humanity). I believe this Java code I posted elsewhere handles things just fine: https://gist.github.com/cbsmith/9755809


Any language, right? Here's classic C++.

  #include <iostream>

  int main ()
  {
    std::cout << std::cin.rdbuf();
    return 0;
  }


In perl:

while (<>){ print };


    perl -pe ""


I have a healthy love/hate relationship with Java - but that's the first thing I thought of too. Alternatively you could read into a byte[] buf and not have any casting.

This is valid JDK 1 code. You only need to read the Javadoc for InputStream and PrintStream. I think the author needs to check his biases.

Now Javascript... (just kidding).


> I have a healthy love/hate relationship with Java - but that's the first thing I thought of too. Alternatively you could read into a byte[] buf and not have any casting.

Okay, that kind of points to the problem. You don't see anything wrong with that code beyond the casting (and avoiding casting is the wrong reason to read into a byte[], which opens up some additional complexity too).


Sigh. So half of an interview is to probe the candidate's knowledge and the other half is to understand their thought process.

So my ideal impl in java would be:

  public class F {
    public static void main(String [] args) throws Exception {
      byte[] buf = new byte[256];
      int len = -1;
      while ( (len = System.in.read(buf)) != -1) {
        System.out.write(buf, 0, len);
      }
      System.out.flush();
    }
  }
In python

  #!/usr/bin/env python -u
  import sys

  while True:
    buf = sys.stdin.read(256)
    if buf:
      sys.stdout.write(buf)
    else:
      break
Which both feel roughly the same. Although I kind of wish python would let me assign within the expression - just in this case. The similarity boils down to the fact that they're sitting on top of posix read and write calls. There's not much variation you can get from that.

The differences come down to fd stream flushing. In Python we enable unbuffered I/O with -u; in Java we must flush. In C++ we can count on all streams being flushed via atexit.

You certainly can muck this problem up for binary files and different charsets if you're not careful. But the OP's solution still works in many cases. A fine first answer. Any time a candidate writes code for you during an interview, the first answer will most likely have issues (stress, trying to finish quickly to impress, etc). I care very little about that stuff if their answer is roughly functional. After they have their first stab out, then you go through the 'why did you do it this way?', 'what corner cases might you be concerned about?', and so on. The answers to those questions will let you make a good estimate of their knowledge and competence.


So, my first follow up would be: why are you bothering to read in to a byte array, and why a 256 byte array?

And to be clear... the correct answer is that in Java it basically buys you nothing but some extra overhead initializing an array and in Python it merely saves you trips through the repl loop. Either way the choice of 256 bytes is arbitrary.


I'd phrase it in a nicer way in an interview, but on a forum: You're wrong. The explicit buffering afforded by read(byte[]) & write(byte[], off, len) vs. read() & write(int) saves demonstrable time as file size increases. 256 - yeah arbitrary :) tweak as necessary.

on my junky MBP running java 6 -

read(byte[]) & write(byte[], int, int):

  real	0m6.588s
  user	0m0.876s
  sys	0m1.406s
vs. read() & write(int)

  real	0m16.836s 
  user	0m10.624s
  sys	0m4.245s
vs the mighty cat - < in > out:

  real	0m4.609s
  user	0m0.015s
  sys	0m0.458s
For a 150 mb file.

I'd say read(byte[]) / write(byte[], int, int) performed pretty well.

Now some points.

System.in is an InputStream, System.out is a PrintStream. Assuming anything more about their buffering nature is wrong. AFAIK there is nothing in the Java spec that spells out any other explicit behaviour. So if you want to ensure your code works properly, code against the interfaces you're given, not the implementations you expect are sitting behind them.

This question (and stdin/stdout/stderr questions in general) is heavily biased in favour of devs comfortable working in a Unix-y environment. If you're fine with short-changing people that use that other OS, that's cool. But if you want a 'trivial' demonstration of programming skill, a FizzBuzz type of question is a much less biased way of achieving that.
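For anyone who wants to poke at the comparison themselves, here's a rough, self-contained sketch of such a harness. It uses in-memory streams rather than real file descriptors, so it understates the JNI/syscall cost I measured above; the timings are illustrative only:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

public class BufferBench {
    // One-byte-at-a-time copy: a method call (and, on real streams, possibly
    // a JNI crossing) per byte.
    static void copySingle(InputStream in, OutputStream out) throws IOException {
        int b;
        while ((b = in.read()) != -1) out.write(b);
    }

    // Explicitly buffered copy: one call per 256-byte chunk.
    static void copyChunked(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[256];
        int n;
        while ((n = in.read(buf)) != -1) out.write(buf, 0, n);
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[1 << 20]; // 1 MB of test data
        Arrays.fill(data, (byte) 'x');

        ByteArrayOutputStream a = new ByteArrayOutputStream();
        long t0 = System.nanoTime();
        copySingle(new ByteArrayInputStream(data), a);
        long single = System.nanoTime() - t0;

        ByteArrayOutputStream b = new ByteArrayOutputStream();
        t0 = System.nanoTime();
        copyChunked(new ByteArrayInputStream(data), b);
        long chunked = System.nanoTime() - t0;

        // Both must produce identical output; timings vary by machine.
        System.out.println(Arrays.equals(a.toByteArray(), b.toByteArray()));
        System.err.println("single=" + single + "ns chunked=" + chunked + "ns");
    }
}
```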


Okay, I stand corrected that you can achieve some win due to doing buffering yourself. I guess it does depend on your implementation (though I'd think the JIT would eventually get around to eliminating most of that overhead you see), though you should still have buffering as per the platform's standard runtime, which I'd expect in most cases would be not insignificant.

> So if you want to ensure your code works properly, code against the interfaces you're given, not the implementations you expect are sitting behind them.

A test which the original code, and your initial review, failed on multiple fronts (failed to handle character encoding issues and failed to flush the output stream before exiting).

> This question (and stdin/stdout/stderr questions in general) are heavily biased in favour of devs comfortable of working in a Unix-y environment.

I honestly find it hard to believe that Java devs' skill sets would be terribly different regardless of the platform they run on, and I really would think people would learn the basics of working with streams regardless of the platform they work with, but I grok the gist of what you are saying.

> But if you want a 'trivial' demonstration of programming skill - a FizzBuzz type of question is much less biased way of achieving that.

FizzBuzz doesn't really test for knowledge of the language's libraries though. I always thought the purpose of these tests was more to validate that the programmer had successfully coded in the past, rather than any demonstration of skill.


>(though I'd think the JIT would eventually get around to eliminating most of that overhead you see)

150000000 iterations of a loop would warm the coldest JIT. JIT speeds up bytecode execution - but if you trace the execution of this code you'll see we often end up crossing between JVM land and native land via JNI to push output to write(2) or pull input from read(2). Crossing the JNI boundary is not free and cannot be optimized with JIT (there's lots of data copying back and forth - maybe some memory pinning etc). Using explicit buffers, you pay this overhead less often. Increasing the buffer size reduces the overhead - but of course with diminishing returns. Using read()/write(int) you pay this penalty more frequently. In general it's almost always a bad idea to use single element methods when the interface also exposes bulk methods and you want to do bulk work.

> A test which the original code, and your initial review, failed on multiple fronts (failed to handle character encoding issues and failed to flush the output stream before exiting).

Mea culpa - OP & I had similar idea and a quick scan of the code seemed generally correct. If I was interviewing him / her I would spend more than a second reading the provided code.

> I honestly find it hard to believe that Java devs would have their skills sets be terribly different regardless of platform they run on, and I really would think people would learn the basics of working with streams regardless of the platform they work with, but I grok the gist of what you are saying.

I'd still argue that stdin/stdout/stderr are instances of IO streams. I'd hope all programmers are relatively confident with manipulating IO streams. But these particular instances are really familiar to people working in Unix and uncommon to Windows people. I'd rather have the candidate thinking about streams and not fumbling in the back of their mind kind of remembering what stdout is.

> FizzBuzz doesn't really test for knowledge of the language's libraries though.

Granted - but I think this thread illustrates that there are a ton of other aspects besides standard library knowledge in play when you dig into this question. If you want to know if the person sitting in front of you can program, FizzBuzz. If you want to know if they grasp the language's standard libraries, ask them a question that requires using collections. If you want to see if they understand the platform they're running on, do something with file I/O... and so on.


> I'd still argue that stdin/stdout/stderr are instances of IO streams. I'd hope all programmers are relatively confident with manipulating IO streams. But these particular instances are really familiar to people working in Unix and uncommon to Windows people.

So, if I reframed the question as "copy the contents of one file to another" and asked them to write a class that implements:

    public void copy(java.io.InputStream in, java.io.PrintStream out);
Would that change the solution?


Slightly off topic: but I just wanted to thank you for continuing the thread in this fashion. I found it educational (in a 'how to teach/how to interview' way)


I think so. Or something like that. Might be better to use "write" instead of "print", to avoid character encoding issues.


> Might be better to use "write" instead of "print", to avoid character encoding issues.

Ya think!?


  public static void main(String[] args) throws IOException {
      Scanner sc = new Scanner(System.in);
      String line;
      while ((line = sc.readLine()) != null)
          System.out.println(line);
  }

What's so verbose about it?


So, that actually is going to have problems with binary data in the file. However, you are correct it doesn't need to be terribly verbose (not as efficient as it could be, but that is easily fixable): https://gist.github.com/cbsmith/9755809

It's amazing though how often the solutions posted are just wrong.


You didn't import Scanner, and java.util.Scanner (what I assume you are using) does not have a readLine method. :)


The JVM is an awesome piece of tech - it still boggles my mind that runtime performance can get within 2x of native C speed for many applications.

Java itself less so. It is just far too verbose. I found that I would get completely frustrated coding in an editor without code generation. Using Netbeans to generate most of the boilerplate code was the only way I stayed sane coding in Java. I cannot imagine having to write lots of production Java code in a "normal" editor. If my IDE can figure out what I'm trying to do, so should my programming language.


Interesting polemic. The arguments aren't super compelling to me.

Ultimately Java is just a tool. The culture around it shapes the macro picture, but still.


Following up, the main argument is "Java doesn't have hashes built in".

Well, neither do C or C++. Nor does Scala, actually.

There are a lot of good things about Java: the JVM, interfaces, great libraries, reflection and code generation, and so on. For example, libraries like jMock.


C++ does include hash functions. Of course, most will just use std::unordered_map.

http://www.cplusplus.com/reference/functional/hash/


That isn't part of the language, that's part of the standard library. Just the same as Java, which has a standard hash function AND a hash table/map.

Ultimately, where does the language end and the standard library begin? What's the real difference? Yeah, so C++ and Java don't have built-in syntax for hash tables. That's it?
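And in practice the distinction barely matters: the standard-library hash map is one import away. A minimal sketch using nothing but java.util (the Main class name is just for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class Main {
    public static void main(String[] args) {
        // java.util.HashMap is the standard-library hash table;
        // every object's hashCode() supplies the "built-in" hash function.
        Map<String, Integer> counts = new HashMap<>();
        counts.put("java", 1);
        counts.put("c++", 1);
        counts.put("java", counts.get("java") + 1); // overwrite, as with any hash
        System.out.println(counts.get("java")); // prints 2
    }
}
```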


"Whatever you do, however hard you try, the code will come out mediocre, verbose, redundant, and bloated"

That is, unfortunately, a common misconception and an excuse for writing bad software in Java.

Java has its own problems (as if other languages don't have them): rigid object creation that has to be overcome by dependency injection, nulls that have to be overcome by using Optionals, badly designed date/time handling in the standard library, etc.

However, there is a huge number of good-quality, well-designed libraries that make Java shine.

- Guava gives us a functional-like programming style plus nice collection handling
- Joda-Time solves the problems with date/time handling
- Lombok (projectlombok.org) cleans getter/setter clutter out of your code
- Need a nice, collections-only library? There are plenty of them (for instance GS Collections)
- Need to build a super-performant server? Just use Vert.x or the Disruptor library; you get nice APIs and a lot of functionality for free
- Want to write nice, concurrent code without using low-level primitives? No problem, go with Akka
- Need to orchestrate action flows in your (huge) application? Do it easily, elegantly and almost without coding using Apache Camel
- The Google Guice dependency injection library provides a nice workaround for Java's rigid object creation strategy

And so on, and so on. The Java ecosystem is changing; there are new ideas that refresh the language despite its rigidity (no macros).


Use Java when you're paid by the hour.

Use Clojure when you're paid by the project.

Use Haskell when you need to ensure job security.


I knew a guy named Steve who ensured job security using Delphi. He wrote this malevolent and semi-sentient framework called The Framework, which managed to combine the ugliness of COBOL with the programmer-friendliness of... well, COBOL, now I think of it. Delphi has a couple of different flavours of inheritance built in -- object inheritance, controls owning subcontrols, TFrames embedded in TForms -- but The Framework increased that number somewhat. From memory, I think I counted nine different kinds of hierarchy, all intermingling in weird and utterly undocumented ways.

For all I know, he's probably still working there. The poor, poor bastard.


Have you submitted your story to thedailywtf?


I often considered it, but this was over a decade and a half ago, and I don't recall enough details to make a good story out of it.


> Use Haskell when you need to ensure job security.

I'd say: use Java when you need to ensure job security. And it does.


I'd say use C++ because it seems nobody writes in it any more. For ultimate job security, use MFC!


Still heavily used in games and finance! Though, you're right about MFC.


Ah yes, very true. I am not in games or finance but I use it day to day. Doesn't seem to be the language of the month and a lot of hatred is directed towards it, which I cannot fathom entirely.


TL;DR

"There are many bad java programmers, hence java is bad. Even I was bad at it. Saying that I like java is just sarcastic link bait."


He doesn't like java.


You can like something that you don't think is particularly useful or even productive. I like Brainfuck and Whitespace, but I'd never write code in them unless I was really depressed.


He does. People are missing the point.


No, pretty sure he was being demeaning to the point that no developer would read this and want to work in Java.


Yeah, not getting his point. Sure, you wouldn't want to work with it, but the reality is that letting go of the idea of writing perfect code helps you just get it done. It's sickening, but it's a valid point.


Which is not a viewpoint from someone who "likes" it. They have an objective, professional point of view. He doesn't like it though.


I've been programming in java for years and I still cannot make a program that compiles on the first try, unless I 'cheat' using an IDE or peeking at the API.

I couldn't even do the 'read-from-input write-to-output' thing mentioned in the example.

It's true that it's because of the language: I believe Java is more 'Enterprise' and sophisticated.

But I KNOW that because of this I tend to write better code (at least more standardized and better structured) in Java than in other languages I've had the same practice with, such as C, C++ or PHP, even though I don't need any IDE or manual of any kind to write such a simple program on those. And, no matter the language, it takes me the same time to do it right.

My conclusion is: If you like to 'hack together' a tool Java is probably not the best choice of language; If you want to build some big project on a team and keep it robust and maintainable using efficient libraries, it is.


What I liked about Java is:

* The IDE
* Staticness and strictness
* The JEE deployment model. I think this is overlooked and underappreciated. Having a standard way to specify database connections, queues, etc. is very nice. I never had to store any passwords in the source repository.


I'm a professional Java dev and couldn't do this from the top of my head without System.* documentation.

But if I can choose the language it's easy...

    #!/bin/bash
    cat
If people can't come up with a different solution they probably just suck...


> I'm a professional Java dev and couldn't do this from the top of my head without System.* documentation.

All that documentation will show you is that you use System.in to access standard input, and that it is an InputStream, and that you use System.out to access standard output, and it is a PrintStream. I'm sure an interviewer would be happy to provide you with those prompts. Could you solve the problem with that help?


Nope, had to look at the javadoc for those two classes too. I guess you don't memorize this stuff when Ctrl+Space in Eclipse constantly tells you what you need.

And it probably doesn't help that the java.io package is also really horrible.

read() returns byte[] but print() accepts only char[], so you'd need to cast or use write() which accepts byte[], but you need to pass it an offset and the length too: write(byte[] buf, int off, int len). Eh, that's just stupid. I'm happy eclipse tells me that stuff...

If I had to do this at an interview in Java, from the top of my head I'd probably ask if it's okay if I use Apache Commons IOUtils and try to IOUtils.copy().

But again, the article talks about language of your choice. I was just a bit shocked when I was able to write this in C and perl without autocomplete even if I practically never use C or perl.

Makes me wonder if it's really the IDEs that make you not memorize the standard library. Or if Java's standard library is really that horrible.

I think it's a combination of both.

Or maybe because I usually don't deal with this stuff in Java while implementing boring business logic.

Also, C:

    #include <stdio.h>

    int main(){
        char strbuf[1024];
        while(fgets(strbuf,1024,stdin)){
            fputs(strbuf,stdout); /* fgets keeps the newline; puts would add a second one */
        }
    }
Perl:

    perl -e 'while($a=<>){print $a}'


Fwiw, as pointed out elsewhere you can simplify the Perl 5 to:

    while (<>){ print };
Fwiw, in Perl 6, the canonical incantation is:

    .say for lines
where:

* thing.method calls method 'method' on thing (where thing is an object, or something that can behave as an object, which, in Perl 6, is any value). In the code above the 'say' method prints its arg, followed by a \n, to stdout.

* If thing isn't specified, the current topic (aka "it" aka $_) is assumed, so .say is being called on "it".

* 'for' sequentially sets the current topic ("it") to each of the items in the following list of things.

* 'lines' lazily reads lines (next chunk of text up to the next \n) from somewhere. The default somewhere is stdin.

A simpler incantation is:

    slurp.say
which slurps all of stdin as a list of lines and then says them all, but slurp isn't lazy, so it'll wait till it's read all of your stdin before writing any of it to stdout.


> Had to look at the javadoc for those two classes too.

Well, I'm a little surprised, but I wouldn't begrudge an interviewee a peek at the javadocs. You should be pretty familiar with the APIs and how to use them correctly though.

> read() returns byte[]

Actually, it returns an int. You're thinking of read(byte[]) I think... but it also returns an int.

> print() accepts only char[], so you'd need to cast or use write() which accepts byte[], but you need to pass it an offset and the length too: write(byte[] buf, int off, int len). Eh, that's just stupid. I'm happy eclipse tells me that stuff...

Eclipse won't tell you that you shouldn't use print(char), because you aren't trying to format a character. The offset & length is a pretty common Java idiom, and it's actually important for getting this right.
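Spelled out, the idiom under discussion looks something like the sketch below. The in-memory streams are just to keep it self-contained; for the interview question you'd pass System.in and System.out to the same copy method:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class Main {
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        int n;
        // read(byte[]) returns the number of bytes read, or -1 at EOF.
        // write(buf, 0, n) takes the offset and length so that a partial
        // read doesn't cause stale bytes at the end of buf to be written.
        while ((n = in.read(buf)) != -1)
            out.write(buf, 0, n);
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream("hello stream".getBytes("UTF-8")), sink);
        System.out.println(sink.toString("UTF-8")); // prints the copied bytes
    }
}
```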

> If I had to do this at an interview in Java, from the top of my head I'd probably ask if it's okay if I use Apache Commons IOUtils and try to IOUtils.copy().

That's perfectly legit... if you remember to flush the output stream.

> But again, the article talks about language of your choice. I was just a bit shocked when I was able to write this in C and perl without autocomplete even if I practically never use C or perl.

The C code is fine enough, though doesn't correctly detect when an error occurs.

The perl code is going to potentially chew up a lot of RAM with that input.

I agree though that Perl & C make this pretty easy to do right, and Java (along with a surprising number of higher level languages) doesn't.


So, this discussion has already demonstrated how Java programmers get this stuff wrong. Kind of head-shaking stuff. Here's a simple implementation. No exception handling logic (and some would no doubt quibble with allowing throws of Throwable), on the assumption that anything that actually generates an exception would be the kind of thing you'd not want to recover from, and it uses synchronous, single-byte-at-a-time (buffered) reads & writes, but it actually runs reasonably efficiently compared to most scripting languages.

https://gist.github.com/cbsmith/9755809
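For reference, a byte-at-a-time buffered copy along those lines might look like the sketch below (not necessarily the gist's exact code; in-memory streams stand in for System.in/System.out to keep it self-contained):

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class Main {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // The Buffered* wrappers make single-byte read()/write() cheap:
        // each call usually just bumps an index into an internal buffer.
        BufferedInputStream in =
            new BufferedInputStream(new ByteArrayInputStream("abc".getBytes("UTF-8")));
        BufferedOutputStream out = new BufferedOutputStream(sink);
        int b;
        while ((b = in.read()) != -1) // read() returns an int: the byte, or -1 at EOF
            out.write(b);
        out.flush(); // without this, buffered bytes never reach the sink
        System.out.println(sink.toString("UTF-8"));
    }
}
```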


> on the assumption that anything that actually generates an exception would actually be the kind of thing you'd not want to recover from

In that case it does exactly the wrong thing, by declaring checked exceptions in the method signature, which client code will be forced to deal with.

If you don't even understand how checked exceptions are supposed to work then no wonder you are hurting yourself on the language.


> In that case it does exactly the wrong thing, by declaring checked exceptions in the method signature, which client code will be forced to deal with.

> If you don't even understand how checked exceptions are supposed to work then no wonder you are hurting yourself on the language.

It's an error that you don't have a way to recover from. Reporting the error and leaving it to the client code (or in this case, the parent process) to deal with it is exactly what the declaration of the method specifies. That's actually exactly how exceptions are supposed to work.


1. Functional programmer makes a bold factual statement. So "bold" that it is almost perceived as trolling by imperative programmers.

2. Hacker News' usual suspects fail to grok remotely what he is even saying.

3. GOTO 1.


It's pretty much a fact of communication that if you can't make yourself understood, you've failed at it.


It's a common expression, though, that his post was based upon.

"[You don't get paid to write software as a Java dev, you get paid and judged by the amount of lines of code you write.]"

If Java and other imperative devs haven't heard that before then they need to consume more sources of knowledge and widely held opinions.

The quip about being absolved of any real responsibility as a Java dev is stereotypical, of course. And it doesn't apply solely to Java, but to the general mindset shared by the various big imperative enterprise languages. It is _possible_ to use such languages in a _cleaner_ fashion (with lots of hoop jumping), but the opportunity rarely presents itself in such enterprise environments because of politics and backwards thinking. Additionally, _why_ should you have to incorporate countless additional libraries and techniques just to "bend" the language into what it should have been in the first place? That's what everyone ridicules JavaScript for doing.

The best, most experienced programmers will naturally gravitate toward functional practices, which are basically SOLID in OOP land. Unfortunately, at the moment, only the luckier (or pluckier?) ones will actually find themselves in a job that lets them use a real functional language that deprecates the need to jump through so many hoops just to achieve a maintainable and immutable (and all the other good stuff) design.


Maybe instead of grinding out code he hates, he should find something he enjoys coding, or maybe don't code at all.


I have a decent amount of experience coding in big codebases using C#/Java - compiled, strongly typed languages - on the one hand, and Ruby/Javascript - interpreted, loosely typed - on the other. When I'm working in a big, unfamiliar codebase, I'm grateful for the help that the IDEs (Visual Studio/IntelliJ/NetBeans) can provide with refactoring and dead code detection. Would it be correct to say that, because of the so-called verbosity of Java, the IDEs are that much better at giving you instant feedback that you have broken something?


Hey, I'm trying to find someone who can build a shed for me, and all these qualified structural engineers I have interviewed don't even know the difference between a Pozidriv and a Phillips screwdriver. What the HELL do they teach them at University these days!

(sigh) I despair, honestly. Please folk, take the time to see how some of the largest scale software on the planet runs on the JVM. Horses for bleedin' courses folk.

Personally I write my software in Bash, Java/Kotlin and Javascript - depending on the context each of these have proven to be excellent technologies.


Reminds me of the old joke: Knock knock. "Who's there?" ......(pause)......... "Java"


Why do these kinds of posts produce so much noise on HN? It's not because HNewsers are silly or stupid. It's because programming languages really do matter. However, experience with Java isn't enough to have a wise perspective on Java, and experience with Javascript isn't enough to have a wise perspective on Javascript. All languages aren't alike and all languages aren't equally good (even if Turing equivalent). University exposure to various programming languages is helpful, but programming in a language amongst other experienced programmers is more valuable.

I learned C while in grad school, but I really learned it at IBM while working on the Unix kernel. That gave a useful perspective on the strengths and weaknesses of C. So I can say that Java has some real advantages over C for certain kinds of work while C is the right language for kernel work.

Here, on HN, I've learned that Java programmers like programming in Java and that Javascript programmers love programming in Javascript. But please don't pontificate on how wonderful functional programming is in Javascript if you haven't got a reasonable amount of experience with a real functional language (OCaml, Haskell, etc.). Or that Java is better than Python because [whatever] unless you've really programmed in Python too.


  ...this question comes up so often you'd think they'd just figure out that the
  chaining of streams is somewhat difficult and either make helpers to create various
  combinations or rethink the whole thing
http://stackoverflow.com/questions/309424/read-convert-an-in...

  java is a blue collar language
James Gosling, The Feel of Java http://www.dfjug.org/thefeelofjava.pdf

Java is indeed very workmanlike. Straightforward, predictable, reliable.

My observation is that the Java standard libraries (packages) are like a highly skilled and conscientious C programmer trying to write OOP. They don't want to cut off too much power from you. And that means you have to specify everything - exactly what kind of list do you want? In scripting languages you don't, because most of the time it really doesn't matter. (BTW: if you read some of the source, it is often written in a C-like style rather than a Java-like one.)

The low-level style of the libraries is at least half the reason Java has lost out to Python/Ruby etc. But it's also responsible for Java's wins.


I like the piece but I'm afraid some people will just read the title on the front page and assume that someone reputable is actually endorsing Java.


...except he is endorsing it.


Maybe he's not "someone reputable"?


If java was brainfuck, and we still got the JVM out of it, it would all be worth it.


My first thought when I saw that trivial programming question was to write it in Brainfuck.

My second thought was to come here to the comments to see how long it would be before the attempts started.

My third thought was, if you're allowed to use any language at all, I'd go for code golf. Not sure it's possible to do it in fewer characters than:

    #!/bin/sh
    cat


#!/bin/cat -


Clever. You win one internet.


I bet someone is going to post an article named "Why I hate Java"


I like Java well enough and I don't know any other way at the moment to make an app that can run both on the desktop and over the web, on all major platforms, in exactly the same way.

However it is true that reading a stream from stdin is a big pain. More importantly it's also true that students learn an over-objectified style of Java programming in school which generally produces very bad code. Students tend to create too many layers of objects and to over-use inheritance, resulting in an inflexible structure with many opaque interlinks between objects. In the real world specs change at any time, and one must program in such a way that unanticipated features can be added. Deciding what blocks of code should be broken into shared routines remains, in my view, something of an art form, and one that the current Java-based programming curriculum does little to develop.


> But it is not a language I would choose for answering test questions, unless maybe the grade was proportional to the number of lines of code written

Related to this: At Uni we had a terrible lecturer for our first-year C programming courses. We had a team project that none of us understood. A few days before the submission, when we didn't have anything that would compile, someone got hold of the marking system. The marking was automatic, and up to 5 of the 25 marks were awarded for the code:comment ratio. We deleted the lines that were causing it to fail to compile, then copy-n-pasted the whole code below again, but as one giant multi-line comment, securing us a 1:1 code:comment ratio. We got 7/25, 5 of which was for "Legibility and good coding practices".


Watching videos of Petr Mitrichev, who is arguably the best sport programmer, I feel that Java does allow you to implement the logic in your brain fast enough. I feel that's as much attention as a mere tool deserves, given that the main goal is actually solving the problem.


There's only one way to solve a given problem in Java? Tell that to my AbstractFactoryPoolBean.


Don't forget to use Generics.


But when you learn Java, there aren't any powerful language features you can use to solve many problems.

I know it's not pure, but Java does force you to think in an object-oriented way. And there are problems where that's an advantage.


Every Java file consists of a load of package imports followed by a class definition. In Java, packages and classes are not objects.

The majority of a class's definition is spent defining methods. In Java, methods are not objects.

A large amount of method code is spent looping, branching and exception handling. In Java, loops, branches and exception handlers are not objects.

Scattered liberally throughout all of this, there are type annotations. In Java, types are not objects.

Could you please explain how Java forces me to think in an object-oriented way?


Riiiight. So by that metric, C++ and Smalltalk aren't OOP either. What exactly is OOP in your world?


C++ isn't particularly OO.

Smalltalk is, since numbers, booleans, etc. are objects, classes are objects, methods (and functions in general, known as "blocks" in Smalltalk parlance) are objects.

Likewise, the only control flow mechanism is the method call. Branching (if/then/else) is a method of the boolean objects. Loops are done via the "times" method of numbers. It's turtles all the way down.


Got to "But when you learn Java, there aren't any powerful language features you can use to solve many problems. Instead, you spend your time learning a body of technique for solving problems in the language. Java has hashes, but if you are aware of them at all, they are just another piece of the immense Collections library, lost among the many other sorts of collections, and you have no particular reason to know about them or think about them." and then I stopped.

If you have to be taught about dictionaries (1), you're a crappy programmer anyway.

(1) I will accept the denomination "map" also, but they are not called "hashes" damnit.


Over the years I've found Java to be too chatty and a pain in the ass to use directly. I mainly use Groovy now which also runs on the JVM.

Just yesterday I wrote some extensions for Solr, you can do it in Groovy (hell even Javascript).

There's very little point in writing things in pure Java these days. Especially if you're just using Java libraries and gluing pieces together, use some scripting language that compiles to bytecode.

Here's one cool thing about Java: The documentation is probably the best of any development ecosystem out there. APIs are wonderfully documented. For an extreme dose of hilarity, compare Javadocs to Nodejs documentation.


I agree that there are plenty of warts to be found in various well-known Java libraries, and that there is a shocking amount of poorly architected Java out there, but I can't agree that it's the language's fault.

Java has become the language of "the noob." For a very large percentage of programmers, it's the first language that's presented to them. As such, there are endless examples of Java projects written by people who are still struggling with basic flow control, let alone project architecture. Some of these projects are written by smart people, and they're genuinely useful. As an example, think of the research biologist who's writing code for her thesis - code for which there's probably no measurable market.

Because it's the language of the noob, it's really easy to gain the kind of experience this author is describing. The kind where you just glide along, "turning the crank" not really learning anything for a long, long time. People come away honestly believing that because they've worked with the language for so long they're experts. That there's nothing left which will surprise them, or change the way they work in this language.

I've seen a lot of cases where these people bump up against some really cool things and never realize it. Because they don't understand it they quickly write it off as bad code/architecture/naming.

One of the most common examples I've seen of this is when people pull class names from the Spring application framework core, parading them around with no context and no knowledge of why they exist or why they might be incredibly useful, let alone why they sometimes make Java a joy to work with.

"AbstractSingletonProxyFactoryBean??! I don't care what it is, that's just stupid!" [1]

How ignorant.

> After all, you produced 576 classes that contain 10,000 lines of Java code, all of it seemingly essential, so you were doing your job.

I can agree that Java is a verbose language, and sometimes it's really difficult to express an abstraction succinctly as compared to some other languages. I can agree that it's also easy for people to run wild and over abstract. However, I think it's just as likely for people to mistake well abstracted code for needlessly verbose code.

> And nobody can glare at you and demand to know why you used 576 classes when you should have used 50...

Without a specific case, it's hard to figure out what's really going on here, but if this is a primary target metric, I'd say you're prioritizing the wrong things.

> ... because in Java doing it with only 50 classes is probably impossible.

Just throw separation of concerns out the window and you can write just about anything in one class.

1: http://docs.spring.io/spring/docs/2.5.x/api/org/springframew...


> I can agree that Java is a verbose language, and sometimes it's really difficult to express an abstraction succinctly as compared to some other languages. I can agree that it's also easy for people to run wild and over abstract. However, I think it's just as likely for people to mistake well abstracted code for needlessly verbose code.

See, this is a mistake. Abstraction is not an end in itself; it's there to make life easier and to make things easier to understand.

Bloat is one of the many things abstraction is supposed to reduce; if an abstraction is increasing cognitive overhead and reducing the signal-to-noise ratio, then we've got ourselves a problem.

For reasons that I cannot comprehend, Java seems to be the only language that seems to have excessive abstraction as a requirement. Not as a convenience, not as a nicety, but as a requirement to do things like Dependency Injection, Inversion of Control, Unit Testing, Mocks, logging, etc. The fact that you need to use entire frameworks to accomplish the functionality that in many languages is covered by a couple functions in a built-in module is a mystery to me.

I'm not the wisest expert, but I have over a decade in the field and I've worked on all sorts of different software, and Java (along with some very badly written .NET systems) seems to be the only technology that requires me to grok a dozen files to understand the basic workings of MVC systems with the most elementary of logic. And the thing is, 90% of that code is absolutely useless for the use cases the system actually serves, and none of its potential extensibility is ever used. You have to ask yourself how come neither Python nor Haskell has these monstrous frameworks.

Sure, it's a lot better now, after catching up from being 8 years behind other technologies, but the foundations are notoriously complicated for things that are effortless with other tools.

This points to me towards the idea that Java abstractions have simply been poorly chosen.


> See, this is a mistake. Abstraction is not an end in itself; it's there to make life easier and to make things easier to undestand.

This idea is often repeated in programmer's forums, yet I think it's simply not true. Abstraction is not there to make things easier to understand, it is there to empower you _after_ you understand the abstraction. It is there to pack the knowledge you _already have_ after working with some system, and being able to port it to similar systems.

Let's take math as an example. Understanding the "+" operator is much harder than understanding that if you have two apples and you get another one, then you have three apples. However, once you grasp the "+" operator, you can quickly solve a bunch of related problems where the "+" operator applies.

This does not mean that I disagree with the rest of your comment though. Java's (classical) means of expressing abstractions are fairly limited - basically, you need _interfaces_ for everything. In most cases, this leads to a lot of cognitive overhead that obscures the essence of the abstraction being used.

A simple example of this problem is the lack of function values and lambda types (pre jdk8). Due to these missing features, people expressed the same abstractions by defining interfaces that define some method, and then providing implementations as anonymous classes. That is, instead of defining a function type and then just implementing it, you now have an interface and weird class declarations in your code. To the untrained eye, this is much harder to understand. As a result, some of the programmer's cognitive effort is now spent on this, and hence it becomes harder to understand the more useful part of the code (which is likely some kind of abstraction).
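Concretely, here is the same tiny abstraction (ordering strings by length) expressed both ways; Comparator is just a stand-in for any single-method interface, and the Main class name is arbitrary:

```java
import java.util.Arrays;
import java.util.Comparator;

public class Main {
    public static void main(String[] args) {
        String[] words = {"bb", "a", "ccc"};

        // Pre-JDK8: the function value is smuggled in as an anonymous class
        Arrays.sort(words, new Comparator<String>() {
            @Override
            public int compare(String a, String b) {
                return Integer.compare(a.length(), b.length());
            }
        });

        // JDK8: the same abstraction as a lambda
        Arrays.sort(words, (a, b) -> Integer.compare(a.length(), b.length()));

        System.out.println(Arrays.toString(words)); // prints [a, bb, ccc]
    }
}
```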

Just to shed some more light on this, consider Haskell's _monads_. Monads are very hard abstractions to grasp. They are much harder to understand than just "doing whatever you want to do and calling it a day". However, they become extremely useful once you understand the abstraction and are able to use it without the cognitive overhead.


> Abstraction is not there to make things easier to understand, it is there to empower you _after_ you understand the abstraction. It is there to pack the knowledge you _already have_ after working with some system, and being able to port it to similar systems.

I think some abstractions exist for either reason, and some for both reasons, but either way thanks for saying it this way - you definitely put it better than I did.


In Python there simply doesn't seem to be a DI/IoC container available. So you end up half-assedly writing your own, or just using spaghetti code. I'm tempted to suggest that Python doesn't have these frameworks because it doesn't bother to support writing large programs. Likewise with testing, there's the unittest module but it's not really integrated with a build system or anything else. Java does in fact have some standard library functions for logging, but no-one wastes their time learning them - if all your real projects are going to use log4j, why not just use log4j all the time?

There are certainly unnecessary parts - there's a hell of a lot of bloat in Spring - but just doing without frameworks entirely is not the answer.


In Python there simply doesn't seem to be a DI/IoC container available.

Sure there are. There's even a version of Spring for Python. It's just that nobody uses them. You're free to guess why.


Because they are useful only after the project grows big. People who use them on small projects do it out of habit.

Which only confirms what parent comment wrote.


> In Python there simply doesn't seem to be a DI/IoC container available. So you end up half-assedly writing your own, or just using spaghetti code.

There are plenty available, just not one dominant one.

> I'm tempted to suggest that Python doesn't have these frameworks because it doesn't bother to support writing large programs.

That probably approximates the truth, in that the brittle "large program" architectural style whose harms IoC/DI approaches seek to mitigate is less popular where Python is used: large systems there (or at least their Python parts) favor smaller programs interacting through standardized communication protocols.

> Likewise with testing, there's the unittest module but it's not really integrated with a build system or anything else.

Since there is no "build" with python, the fact that its unittest module is not integrated with a build system isn't surprising. There are plenty of tools that support integrating unittest (or other python testing frameworks) in workflows, including utilities that do testing on every file save (similar to autotest in Ruby). Obviously, in the absence of a build step, they aren't integrated into a build system, but the workflows are often better because of the absence of a build step.


> In Python there simply doesn't seem to be a DI/IoC container available.

Well, last time I needed something similar in Python, I just loaded the settings from the database, imported the wanted code at runtime, and inserted the desired symbols into the local context. It took a bit more than a day to write and was very application-specific, but it saved a lot of trouble with those crazy constructors that show up in Java.


I am no Python expert, but there must be modules for inversion of control etc. in Python too (installable with easy_install etc.), as there should be for all scripting languages. (Is it really in the std lib of Java?! Ah, maybe I'm not that surprised...)

(I come from a Perl environment, which puts a big emphasis on tools/libraries and language extensions, because they are easy to build with automatic tests on nearly every platform/Perl version.)


It's not in the standard library for java, no, but Spring is very popular and "standard" in the same way as e.g. Rails in ruby.


Ok, I get what you're saying.

Well, it is quite natural that the tools for large code bases are more common in Java. Not only because Java is traditionally used on conceptually larger projects... :-)

Anyway, as a Perl guy I can tell you that DI is commonly used in Perl, if nothing else because of the testing culture.


> For reasons that I cannot comprehend, Java seems to be the only language that seems to have excessive abstraction as a requirement. Not as a convenience, not as a nicety, but as a requirement to do things like Dependency Injection, Inversion of Control, Unit Testing, Mocks, logging, etc. The fact that you need to use entire frameworks to accomplish the functionality that in many languages is covered by a couple functions in a built-in module is a mystery to me.

Modern PHP has cloned that. See for instance the Symfony 2 framework.


> For reasons that I cannot comprehend, Java seems to be the only language that seems to have excessive abstraction as a requirement. Not as a convenience, not as a nicety, but as a requirement to do things like Dependency Injection, Inversion of Control, Unit Testing, Mocks, logging, etc. The fact that you need to use entire frameworks to accomplish the functionality that in many languages is covered by a couple functions in a built-in module is a mystery to me.

So what are you proposing? That everything has to be built into the platform itself, hidden away from me? In fact, Java not doing so is a big advantage, because it allows me to change the implementation. Otherwise, everything has to be done by the platform developers, and I would depend on their resources.

In the Java world, if something is worthy of becoming part of the platform (either a new language feature such as try-with-resources or a new core API or runtime library like Joda Time), it is discussed within the JCP. This has created an ecosystem where I can plan for the next 10 years and not just for the next it-does-everything dynamic-language hype cycle.

BTW, if you want to reduce repetitive tasks, you should use code generation. That's the best abstraction, and Java is a nice simple language to generate.


> See, this is a mistake. Abstraction is not an end in itself; it's there to make life easier and to make things easier to understand.

> Bloat is one of the many things abstraction is supposed to reduce; if an abstraction is increasing cognitive overhead and reducing the signal-to-noise ratio, then we've got ourselves a problem.

I whole-heartedly agree, but I'd caution readers not to conflate abstraction with bloat. If it's bloat, chances are you're abstracting the wrong details, and working on a system suffering from poor cohesion.

> For reasons that I cannot comprehend, Java seems to be the only language that seems to have excessive abstraction as a requirement. Not as a convenience, not as a nicety, but as a requirement to do things like Dependency Injection, Inversion of Control, Unit Testing, Mocks, logging, etc. The fact that you need to use entire frameworks to accomplish the functionality that in many languages is covered by a couple functions in a built-in module is a mystery to me.

I'm not sure I follow. Are you saying that you think it's wrong or inconvenient that these features aren't somehow natively supported by the language? That is, that you'd like to have a grammar/syntax around each one of these things which ultimately boils down to a bunch of hidden runtime anyway?

Maybe it might help to clarify that I'm defending Java the language, not Java the runtime library.

> Java ... seems to be the only technology that requires me to grok a dozen files to understand basic workings of MVC systems with the most elementary of logic

I do agree that web frameworks in Java are one area where there's just tons of bloat, but I still don't think that has anything to do with how the language is designed. My advice: get a better MVC. There's nothing about MVC that says you must use a framework. (Edit: if it's the container that you're fighting with, there are tons of other options out there aside from the mainstream few).

Also, check out the network code in the Linux kernel sometime. Beautiful code, but holy shit do you need to shave a whole herd of yaks the first time you go to read it. The cognitive overhead is high, but each one of those abstractions is there for a reason.

> This points to me towards the idea that Java abstractions have simply been poorly chosen.

Again, given that I'm talking about the language alone, not the runtime library, if you still uphold your response I'd ask you to consider whose fault it might be that the abstractions used in your program(s) are poorly chosen. It's quite rare that it's the fault of the language.


> Java has become the language of "the noob." For a very large percentage of programmers, it's the first language that's presented to them.

Isn't that one of the design goals of the language? IMHO, the simplicity of Java and the small number of concepts is commendable, but I feel it could have provided identical features in a less verbose way. Take accessors. If Java had Python-like properties, the average size of a Java codebase would shrink by 50 % (though having public final fields everywhere as a convention would work equally well for non-API code - I probably changed a field getter into a computed property once, and it certainly wasn't in public-facing code). Same with qualified imports.
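
For readers who haven't felt the accessor pain, a sketch of the boilerplate in question next to the public-final-field convention mentioned above (class names invented for illustration):

```java
// Conventional bean: two lines of state, six lines of ceremony.
class PointBean {
    private int x;
    private int y;
    public int getX() { return x; }
    public void setX(int x) { this.x = x; }
    public int getY() { return y; }
    public void setY(int y) { this.y = y; }
}

// The public-final-field alternative for non-API code:
// immutable, and no accessor noise at the call site.
class Point {
    public final int x;
    public final int y;
    Point(int x, int y) { this.x = x; this.y = y; }
}

public class PointDemo {
    public static void main(String[] args) {
        Point p = new Point(2, 3);
        System.out.println(p.x + "," + p.y); // prints 2,3
    }
}
```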

That said, if you have an IDE, you can fold a lot of cruft away. And the business logic part of the code is often very readable (as long as it has been written by decent programmers, obviously).

One thing that puzzled me about the article is the rant about HashMap not being special. There is no special syntax for writing maps, but there is no special syntax in Haskell either, and I haven't seen anybody up in arms about it. If you don't know when you should reach for HashMap, your lack of knowledge in basic data structures will not be fixed by having special syntax. The Java collections library (enhanced by Guava) is one of the best things about the standard library.


There exists now a JEP for collection literals.

http://openjdk.java.net/jeps/186


I'm taking the SCJP this week... oh, so much simplicity x/


> Java has become the language of "the noob."

Along with PHP (for different purposes, though).


Why I like it when people are overconfident pompous noobs on the internet


I think this is some very advanced trolling... X-D

A great application of Poe's Law[1]

[1] http://en.wikipedia.org/wiki/Poe's_law


I couldn't care less what language I am using. I care about the quality of the libraries. That's what helps you get shit done fast. And JVM has a shit-ton of high quality libraries.


It's ironic how this kind of Java bashing comes from a Perl guy.


Java is one of the best, most elegant choices for certain types of problems. For other types of problems, it is horrid. This can be said of many programming languages.


Very misleading title for a great article. I will probably point people to this in the future when they ask me why I don't like Java (yet continue to use it).


> My current employer uses an online quiz to pre-screen applicants for open positions. The first question on the quiz is a triviality, just to let the candidate get familiar with the submission and testing system. The question is to write a program that copies standard input to standard output. Candidates are allowed to answer the questions using whatever language they prefer.

I wonder if "cat" would count as a valid answer.


I hate Java, and loved this article. Not what I was expecting. Now excuse me as I have 213 more classes to write each containing about 2 lines of real code.


"What are you? A RAM salesman?"


I love this. Keep calm and java on.


Is it just me (not a native English speaker), or is this article entirely sarcastic?


i certainly wouldn't want this Negative Nancy paying me any compliments ;)


He has a point. I once used one of these online technical interview sites. Figuring out how to submit the program's output in Java took longer than in Python or Ruby.


Copy standard input to standard out? It's like 5 lines of code in Java

  Scanner scan = new Scanner(System.in);
  while (scan.hasNext()) {
      System.out.println(scan.nextLine() + "\n");
  }


Oh good lord. This one adds in Scanner!

How do you think this code will handle binary data? How do you think it'll handle a simple two line text file?


There is no one-size-fits-all solution. Especially if the problem is imaginary.


> There is no one-size-fits-all solution. Especially if the problem is imaginary.

Yeah, but there are solutions that actually solve the problem.


What exactly is the problem? You can't just make up problems and ridicule the code for not solving them when the coder had no idea he was supposed to be solving them.

This solution (and whatever solution you might have in mind) likely both don't solve all sorts of problems. For instance, what if there is a power failure and you lose all the output? Is it persisting to disk? What if new input methods that aren't even thought of yet come to exist 5 years from now; is it compatible with that?


> This solution (and whatever solution you might have in mind) likely both don't solve all sorts of problems. For instance, what if there is a power failure and you lose all the output? Is it persisting to disk? What if new input methods that aren't even thought of yet come to exist 5 years from now; is it compatible with that?

You are missing the point. The code doesn't work. I'd love to hear any interpretation of the problem where that code actually is correct.


Are you, like, paid for litres of Java hate per second or something?


I'm not hating on Java. Java has perfectly good ways of solving this simply.

My hate is on the Java code being posted here. It's horribly bad.
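
For the record, a byte-level version that handles binary data and doesn't add newlines is just a buffered read/write loop (a sketch, not anyone's posted code):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class Cat {
    // Copy raw bytes: no Scanner, no charset decoding, no added newlines.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        copy(System.in, System.out);
    }
}
```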


Nice article! After reading it I can tell I like Java too.


The issue with java is not the language. The language is great.

The issue is the totally over-engineered APIs


I like java since I learned it.


I really love the JVM. My feelings about Java™ itself are the inverse: I hate it passionately.


This post is nerdbait. Not hn material.


"cat -" :)


what is this, fonts for ants?


I feel like I'm taking crazy pills! I invented the piano key necktie! What have you done Derek?!



What is this, monitors for titans?


I love Java :)


Isn't it all just

  StreamUtils.copy(System.in, System.out)
I'm pretty sure there is a function that just copies an input stream to an output stream in commons-io.

Still, I agree with him - I won't write Java in a test like that. It is almost impossible to write Java fast without an IDE to autocomplete everything for you. There are way too many names to remember. I can't even tell you what package StreamUtils is in, or if it's StreamUtils or FileUtils.


It is pretty much just that... although I think StreamUtils doesn't flush so you'd need to add that.


I love Java because when you add 'script' to it you have a pretty sweet language.


What, JavaFX Script?


Typical ESL Java programmer


nice article!


Nice article!


AGREEEE


:)




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: