
I was a total raving Python evangelist for the first 12 years of my coding career, and then got a job as the CTO for a Python based startup that had been running absent a technical leader for 5 years, with relatively junior coders making all the decisions. I still love Python, but I now have a completely different attitude to hyper-dynamic languages (like Python, Ruby, Clojure, Elixir, etc). In my new opinion, they are great for anyone making small projects, and good for disciplined, experienced teams making larger projects, but are really double-edged swords in large projects. If your team really knows what they're doing and can take advantage of the flexibility without creating a mess, they can let you move really fast and make elegant DSLs and so on. But if you let a team of juniors do whatever seems like a good idea, with nobody calling the shots who understands tech debt and the large-scale architecture problems, the mess that can be made is staggering. I never thought I'd say this before, but I would have been happier stepping into C++. :-/

Sure, this is a problem of people, not language. But there is something to the argument that absent discipline and experience, these can be dangerous. One can detangle a dog's breakfast in Java Spring a lot more easily.

(ever seen "import gc" in a Python program? yes, that means what you think it means..)
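For anyone who hasn't run into that: a toy, entirely hypothetical sketch of the pattern being hinted at - code that turns the collector off for "performance" and then has to drive it by hand because reference cycles pile up:

```python
import gc

class Node:
    """Parent <-> child references form cycles that refcounting alone can't free."""
    def __init__(self):
        self.parent = None
        self.children = []

def build_tree(width, depth):
    root = Node()
    if depth > 0:
        for _ in range(width):
            child = build_tree(width, depth - 1)
            child.parent = root          # back-reference completes the cycle
            root.children.append(child)
    return root

gc.disable()                              # someone saw a latency spike once...
trees = [build_tree(3, 5) for _ in range(100)]
del trees                                 # cycles are now unreachable but still alive
print(gc.collect() > 0)                   # ...so manual collections get sprinkled in
gc.enable()
```

Prints True: the manual `gc.collect()` call finds thousands of unreachable cyclic objects that refcounting couldn't release while the collector was off.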



This is my experience as well. Incidentally "this is a problem of people, not language" is one of my pet peeve statements (not that I'm mad at you, of course!) specifically because I've had it used against me enough to defend choosing, imo, bad tools. Everyone thinks they have developers that are good enough that they don't need the bumpers up when they bowl. But honestly, it's just almost never true (and if it is, it won't be for long as your company grows).

One company I worked at had "we hire the best!" as its slogan, and they used it to justify so much bad practice. No code reviews (we just don't hire people who write bad code), no unit testing (same reason), Python used for huge sprawling 10+ year old codebases on their third or fourth generation of maintainers, and no documentation anywhere. Everything is a spelunking expedition, and there's no type safety or anything to help you figure it out. There are more than a few thousand-plus line functions that make PyCharm chug every time you touch a character.

On the flip side, the Java and C++ codebases were similarly abused (no tests, no docs, four generations of maintainers/oral tradition, no code reviews ever) but are at least two orders of magnitude easier to traverse because the type system keeps everyone honest across decades in a way that dynamic languages just do not.


I actually do due diligence on tech firms now for a job, and you're 100% right that the tradeoffs change dramatically when the company grows. Which doesn't mean that using the best thing possible for your expert coders at the beginning is wrong. At the beginning, you need every advantage you can get, and if you have expert programmers, give them the sharpest knife you can find! But yeah, once your company is hiring hundreds, or even dozens of devs, and is being sold every handful of years from one investment fund to another, the same set of tradeoffs no longer necessarily apply and .NET and the JVM look pretty good.

I'm writing some Java at the moment for an Android app, and it's just killing me after doing Scheme and Python, but I have to admit, if that start-up had used Java or C#, they might have been better off in the long run.

Hiring really good, disciplined people is super, super hard. My current thought is that if you don't have a super-star CTO who can attract and keep real A+ coders, don't use advanced languages. And if you do have that CTO, do it, but make sure you are architecting to make the rewrite in C# or whatever doable when you sell your company. If I were starting my own business now, I'd use Clojure or Scala or F#, and design things so that in 5 years new owners had a decent path to switch to C# or Java if they wanted.


> I'm writing some Java at the moment for an Android app, and it's just killing me after doing Scheme and Python

Android isn't Java, though. It's some Frankenstein contraption made from a mutated version of Java from a decade ago.

> If I were starting my own business now, I'd use Clojure or Scala.

I personally wouldn't ever choose Scala for anything. Its tooling is horrible (sbt is a special hell, and scalac is as slow as a C++ compiler).

I'm currently in a code base like you describe: a Scala core with a switch to Java. It's not fun. We're stuck with legacy library and framework choices that don't fit into the Java ecosystem, and it slows down development considerably.


> ... and scalac is as slow as a C++ compiler

Scalac is as fast as javac if you don't overuse the Scala-specific features which are known to compile slowly - implicits and macros. But it is unfair to use these features and then complain about compile speed, or to compare it to a language that doesn't offer a similar feature set.

Also, in our Java/Scala project it turned out that Scala+Bloop was way faster (10x-100x) at incremental compilation than Java+Gradle. Bloop is based on SBT libraries so there are some things that SBT does right.


Thanks for the compliment. But, as the author of Bloop, I must say that Bloop is definitely not based on sbt and it departs a lot from the design decisions sbt made.


Android is the worst implementation of Java projects - horrible architecture, or lack thereof, and really bad API design. If anyone wants to learn Java, focus on the modern Java language and microservice-based frameworks and you can get the best of all worlds.


Interesting, I've only read Scala. Personally I like Clojure but thought I should mention it as I see businesses mixing Scala with Java successfully in my work.


You can, but interoperability is not that great. It can work, but often it takes a lot of massaging/converting even basic data types.

Kotlin has wayyy better interoperability with Java


Totally agree about sbt being gratuitously complex.

I wonder if Java is judiciously choosing the non-crazily-digressive parts of Scala, as they both evolve.


That is what Goetz said is the plan for Java. Move slow, let other languages experiment with language features, and then take the best features and implement them better.


Well, so far the “better” still remains to be seen... waiting for type-classes to represent monoids, functors, applicatives, monads and foldable.

So far I’ve just seen a complex and wild stream API that mimics it in practice, without all the mental model behind it.

Oh, and what’s this nonsense with Java Option?! Really?


We're yet to see if monoids, functors, and monads are worth it :)

For "better" I'd say you should look at things like virtual threads vs async/await, or Kotlin's when vs switch expressions.


If you program modern Java or Kotlin map and flatMap are everywhere. I'd say the debate whether functors and monads are useful is over.


Java and Kotlin support functors and monads as design patterns, i.e. this is how map/flatMap works with an Optional or a Stream.

The question is: is this support enough, or should the type system be enhanced so you can abstract over/compose a functor and/or a monad?

The answer to this question is IMO still open.


No, they are. Seriously, the mental effort I had to endure to hold all these wild APIs in my head! In hindsight it's infuriating; the gatekeeping, the patronizing condescension when I was struggling with what are mathematical trivialities.

Screw that... I should have been told sooner


Nice; as an ex-Scala person for a few years, that seems smart.


Sbt is optional, right? You can use Maven or whatever else if you want.


Androidified Java is the worst thing you can deal with, to be honest.


Can you just use Kotlin to avoid Android's Java, or are there instances where you need to dip into Java to get stuff done?


Kotlin is like tape over the ugly Android architecture: a little helpful, but still a monstrosity compared to iOS.


ok that makes me feel better. God it's awful.


Please use Kotlin on Android. Java on Android is a de-fanged and lame tiger with half the modern standard library and language features missing. Blame both Oracle and Google for this.


Having finished my first little app for Android in Java, I will certainly be switching. :-)


Why would you make technical decisions based on what hyponthenictal new owners would prefer in a hyponthenictal acquisition in 5 years?


Because that's how founders make money. The vast majority of exits now are purchases by private equity firms, and they will do a thorough technical due diligence and look at those decisions and what it will take to run the company the way they want to. (I do those diligences for a living) If you're doing a tiny lifestyle company with no exit plan, you wouldn't. Which ironically, is exactly what I'm doing on my own time, so I use Scheme for that. :-)


Most exits are purchases by private equity? I had no idea - are there any references for that available?


Here's one report I used for a presentation. The quote of interest:

"Over the last 12 months, 75 per cent of the most active “buyers” of $100m+ technology companies have been PE firms. This is despite the fact that the combined cash balances of the “Big Five” tech companies is an eye-watering $330 billion. Today’s most active tech buyers are private equity firms like Silver Lake, Francisco Partners, and Vista Equity. Not Google, Facebook or Amazon."

https://ftalphaville.ft.com/2019/01/07/1546862666000/Guest-p...

One of the things that is not obvious to people coming from the VC world is that once the company has gone from VC to PE, it's likely going to get flipped several more times from PE to PE. At a guess, I'd say 3/4 of what I look at are one PE to another. Depending on the firm, they will sell every 3 to 7 years or so.

iain


What's "hyponthenictal"?

You used it twice, but it's not on Google...


It's a misspelling of "hypothetical"


> I'm writing some Java at the moment for an Android app, and it's just killing me after doing Scheme and Python, but I have to admit, if that start-up had used Java or C#, they might have been better off in the long run.

That's a logical fallacy. Chances are that if they had used Java or C#, they might be dead by now (i.e. it sucks, but usually it's a good problem to have).


I know what you're getting at, and I agree with the sentiment very often, most of the time even. But in this particular case, I don't think so, and believe me, I spent a lot of time thinking about that...


So what are your thoughts?


Uh, hard. First, maybe don't buy companies with bad technical debt; just start from scratch? And always prefer two experts to 10 juniors in the beginning, even if you move more slowly at first. I don't know what would have worked well in that particular situation, but if I were to do it again and could influence the original team, I'd use Clojure. I believe the immutability and control of side effects would have been more of a benefit than types. That was where the real problems came from: the dynamic ability to create out-of-sight side effects and alter data in unexpected ways all over the damn place. Clojure's immutability plus spec would have been way better.


> Everything is a spelunking expedition, and there's no type safety or anything to help you figure it out.

For any significant Python codebase, I make sure to put extra effort into precise and strict type annotations so I can rely on type safety. Sometimes after fixing all the statically detected type problems, my program works on the first try.

Python was strongly typed from the beginning, before it was statically typed. When you add a string to a number in Python, you get a type error. Python's static type system, which is formally part of the language and has multiple implementations, is an elegant and natural fit for the language. Unlike TypeScript, where the type system is artificial and restricts what you can do with JavaScript.

Language features that made it into Ruby, like blocks and unless, were rejected from Python because they would have made it possible to create advanced DSLs: arcane, application-specific special languages, which is something the language maintainers aim to avoid.

PEP 484 defines the static type system: https://www.python.org/dev/peps/pep-0484/ (implementations include mypy and Pyre)
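A tiny self-contained illustration of both halves of that point (the function and its name here are made up for the example):

```python
# Strong runtime typing: Python raises rather than coercing across types.
try:
    "1" + 2
except TypeError as exc:
    print(type(exc).__name__)            # TypeError

# PEP 484 static typing layered on top: a hypothetical annotated function.
def total_cents(prices: list[float]) -> int:
    return round(sum(prices) * 100)

print(total_cents([1.25, 2.50]))         # 375

# CPython would run total_cents("oops") until it failed somewhere inside;
# a checker like mypy rejects the call before the program ever starts, with
# an error along the lines of:
#   Argument 1 to "total_cents" has incompatible type "str"
```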


Exactly my feeling, honestly. I was a fan of JavaScript and Python a couple of years ago, and I strongly hated (and still hate) Java for its verbosity. But I really have to admit that, especially with large teams, a statically typed language is definitely the way to go.

I don't feel confident when I program in Python or in JavaScript, precisely because of the lack of proper typing. One could argue that if you know exactly what you are doing, types can even be ignored, which is kind of true - but they're there to help the developer and their team. I don't see any good reason to use Python over Golang, for example, if we only consider the language itself and the confidence that you get from using a statically typed language.


I've often found that starting some quick-and-dirty thing in shell scripts is great, but once I get to around 100 lines of shell script code, I'm better off switching to python (or similar). Similarly, once I get to around 1000 lines of python code, I'm better off switching to compiled, statically typed language (e.g., Java, C, C++, etc.).

So my own heuristic has become:

    shell script when < 100 lines of script
    python code when < 1000 lines of code
    else statically typed, compiled language


I live by very suspiciously similar rules of thumb although my tolerance in the past for dynamic scripting languages was probably closer to several hundred lines of code.

However, I have to acknowledge that modern IDEs (and even advanced code editors like VSCode) have pretty good support for dynamic scripting languages these days that make working with them in larger code bases not entirely painful.


Yes, PyCharm was an amazing help.


I feel that Python has (among many, many other things) replaced Matlab and the likes.

These were the go-to "prototyping" languages people in science, R&D, etc. used (and still use!), and then you'd port those into more manageable languages for larger scale projects, or actual software products.

These days, Python does all that. Unless you need some extreme performance for your final product, a well-optimized Python program will work just fine.

But still, there's some of that matlab legacy left - instead of one comprehensive application, you have tons of smaller programs floating around, for one-off/single-issue uses.


On a continuing project with a company where everyone speaks Matlab as a second language, I succeeded in switching the guy writing Matlab code for the Matlab REST API to Python/numpy/scipy for the industrial analytics we need, saving us $25k a year in licence fees and VM costs.


This was my heuristic years ago before Python got type annotation support. Now there are linters that take type annotations into account, and LSP servers that can catch type errors as you write Python code.

I've found that using Python for large projects is both fine and productive as long as you utilize type annotations and modules. Ambiguity goes out the window when you do that.


I was super hopeful about type hints but they fell short of my expectations, unfortunately. They take you to a place full of kludges and workarounds and hacks, which is all... really unpythonic.

In the end I feel that, if you really want that, you should use a language that treats types as a first-class citizen. There are languages with most of Python's attractive features but actual optional typing (we use Groovy), or static typing with really good type inference (e.g. Kotlin), etc.


I was really happy when I first learned of type annotations being added to Python. However, much of that excitement evaporated when I read that it was just a 'hint' and not something that's enforced.


This was also my initial reaction. However, if you couple type annotations with an LSP server[1] that takes the annotations into consideration[2], you'll get on-the-fly type checking in your editor or IDE. That way you can catch type errors as you write your typed Python code.

Besides VS Code, these days lightweight editors like Kate, Sublime Text, Vim, etc can all take advantage of type checking LSP servers for Python.

[1] https://github.com/palantir/python-language-server

[2] https://github.com/tomv564/pyls-mypy


It's not enforced by CPython the interpreter. You run mypy to enforce the type hints: http://mypy-lang.org/

It's quite good.
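To make the division of labor concrete (hypothetical function, but the behavior is real): the interpreter treats annotations as inert metadata, and mypy is what actually checks them.

```python
# Annotations are metadata, not a runtime contract: CPython runs this happily.
def shout(msg: str) -> str:
    return str(msg).upper() + "!"

print(shout(42))                # 42! -- an int slips through with no error
print(shout.__annotations__)    # the hints just sit here until a tool reads them

# Enforcement happens out-of-band, e.g. `mypy app.py` flags the shout(42) call;
# adding --disallow-untyped-defs also fails any function with no hints at all.
```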


I like it a lot. Use it with --disallow-untyped-defs to make sure more is caught.


Nice guidelines! Though in my experience, the problem with the python code of < 1000 lines is that it tends to grow when you're not looking, and suddenly you find yourself needing to maintain (or rewrite) a Python monster.


What a really great way to think about it... I've been doing the same and hadn't thought of it that way (for decades).


Some interesting responses here, thanks. Not surprisingly, lots of people are saying "types would have saved you". I'm sure they would have helped, though I personally think they introduce a high cost too, and I'm a lisp-head nowadays, not a Haskeller.

The real culprits, thinking about it since my post, were side effects and mutability. Python lets you create incredibly difficult-to-trace chains of side effects and mutations, and has basically no decent ways to prevent this. In Scheme, my code is still super small and I can dynamically create all kinds of object-like things, but if I want private, I can make it god-damned private. In Python, anything can be changed by anything ("we're all consenting adults") and using side effects in weird ways is actually part of the idiom. I don't know how many times in Python literature I've seen some variant of "you don't need those baroque patterns because we can use 'import' as a singleton, running module initialization as the constructor". And so all the frameworks have crazy thread-local magic happening from bloody import statements!!! Do that too much and you have no idea what's happening where and why, and something as trivial as changing the order of imports can kill your app. Where I was, this had gotten so bad that the app couldn't even be turned on and tested in the normal way, and none of my predecessors had been willing to go through the pain of figuring out what the chain of imports was doing to bugger it up. (Because that didn't look like doing anything productive; I'm sure you all know the drill...)
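A stripped-down, hypothetical reconstruction of that trap, faked in a single file with synthetic modules so it can run standalone:

```python
import sys
import types

# "config" module: module-level mutable state, the de facto singleton.
config = types.ModuleType("config")
config.DEBUG = False
sys.modules["config"] = config

# "plugin" module: its import *body* mutates that state as a side effect,
# just like module-level initialization code in a real package would.
plugin = types.ModuleType("plugin")
exec("import config\nconfig.DEBUG = True", plugin.__dict__)
sys.modules["plugin"] = plugin

# Anything that read config.DEBUG before "plugin" was imported saw False;
# anything after sees True. Reorder the imports and the app behaves differently.
print(sys.modules["config"].DEBUG)   # True -- flipped by an import, not a call
```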

If I were doing it again, personally, I'd use Clojure and Spec, and worry more about mutability and side-effects than anything else. Just my two cents Canadian.


Thanks for making this point. People focus on types, but it's the whole smorgasbord of modern features missing from Python - lack of proper (idiomatic) support for functional programming, None types, etc.


I don't disagree, but I will add that most long-lived Spring applications I have worked with were disasters. Figuring out how a Spring application actually works (or why it doesn't) can be difficult. Upgrading the framework, or switching to a different one without breakage is often a non-trivial task.

The best codebases I have touched, in any language ecosystem, either didn't use a framework, or kept the framework firmly isolated from the other code. If you build your code "inside" a framework, it will eventually become an obstacle to change.

Furthermore, the best Java codebases I have seen avoided annotation-driven-development, reflection, classpath scanning, DI frameworks, AOP, etc. Anything that feels at all magical should be handled with great suspicion.


> Anything that feels at all magical should be handled with great suspicion.

This. True in any language.


This precise struggle is why Go was created, and also why Rob Pike said what he said. The industry needs a statically typed language that is performant and has good ergonomics (not too much typing).

I agree with you; asking everyone to be disciplined with their code in a dynamic language is a bit too much to ask (unfortunately).


If there were something similar to Django in the golang ecosystem, I would use golang more. Batteries-included web frameworks seem to be out of style these days, and no one invests in developing one for newer languages like golang. As a single developer working mostly alone, microservice-based development is quite painful.


Gobuffalo is close, but it's been a while since I've looked at it. What you said has been my experience too.

Here I go about the batteries included vs micro thing. People pick Flask because it looks approachable but then DIY features in. The same thing happens with Sinatra+Rails. If the python ecosystem grows and other communities come in, maybe they will bring their culture and standards in.

Take poetry for python. It's basically, "hey ... cargo/yarn/mix/bundler all kind of figured out these ergonomics". Poetry's "why" section in the README really resonated with me. Cross pollination of ideas across tribes is _good_.

But then, I'm biased/blub-paradox of course. And it's definitely in line with the productivity/DSLs/rapid vs types/verbosity/slower modes discussion.


Not sure about golang, but the D language has a batteries-included web framework based on the excellent vibe.d [1].

Fun fact: the D language forum website is probably one of the fastest and most responsive websites on the planet [2], and it's written in D, but not based on vibe.d or Diamond [3].

[1] https://code.dlang.org/packages/diamond

[2] https://news.ycombinator.com/item?id=16731302

[3] https://news.ycombinator.com/item?id=3592769


Second this. There's no rails/express for golang. Personally I'd love to see a next-gen Rails-like framework arise in Swift. That would be amazing in my opinion.


Go needs to be renamed to something that can be googled.


Golang?


Yeah. When you have two experts in a garage, it rocks. When you absolutely need to hire some new folks in little old Victoria in the next two months so you turn to bootcamps and interns, not so much.... ;-)


Victoria, BC? /me waves from Quadra (Island, not Street).

Not job hunting, but always happy to chat with people doing cool stuff sort-of locally.


This is just my humble opinion, but I'd argue that it is much easier to hire Go developers than Scala or Clojure developers, even outside Silicon Valley.


Too bad the lack of generics means it has awful ergonomics in reality.

I also think Rust's ? operator makes the "return err" pattern much nicer to deal with.


I like generics because they actually solve a lot of the things that OOP and inheritance were supposed to, without turning your dependencies into a ball of mud.

After 30 years of mediocre programming, the things I hate are unplanned/hidden side effects, train wrecks due to mutable state, and out-of-control dependencies. Error handling at this point seems like Weltschmerz.


Agreed on the second point, but when I think of "ergonomics", I'm mostly thinking about how much I have to type to get stuff done. For example, Java has generics and unchecked exceptions, but still sucks ergonomically without an IDE just due to syntax and imports.


I have done big and small Python/C/C++ projects. For small projects Python is pretty sweet: you type some commands and you are up and running. With C/C++ there always seems to be this 'big' first-time setup step. Yet long term that seems to flip around, like you have seen. That first-time setup works pretty well for a long time, whereas in Python the organization of the program becomes this dead weight you are kind of fighting. It is not awful, just kind of there. Now that is just my experience; I am sure others have had the opposite. But it is nice to see someone else out there thinking like I do about this: Python for nice small contained things, other languages for longer-term, bigger projects. But it is the same problem you have in most orgs: the prototype becomes the production thing :(

People always ask me 'how do I become a programmer?' The first thing I always tell them is 'you need to learn Python; it is fairly easy to pick up and learn.' I aim them at the python.org website and its excellent tutorials. Funny enough, it weeds out the people who do not really want to be a programmer but thought it sounded cool, as they have to stop and learn something.


Yeah, I hear you. Nowadays I do Scheme embedded in C/C++. The stuff I used to love Python for is now done in Scheme, and the outer containers are in C++. It's actually a really nice way to write code, for me at least. Forces a clean architecture on you, and you know you can always move anything out into C++ if you need either a library or lower-level access. I use S7, but you could use Guile, Gambit, Chicken similarly.


This is an interesting architecture. Can you provide some more details on how exactly it works? E.g. do you use a C++ library that interprets Scheme?


yeah, I use S7 Scheme. It's a complete scheme interpreter in one C file, super easy to embed, quite similar to Clojure linguistically. Basically all "business" logic is in Scheme, all OS/UI interaction is in C or C++. For me, it's amazing, because I write music apps and I can reuse my model code across desktop, phone, in Max/MSP or PureData, in web assembly, everywhere. Being able to prototype in Max is awesome. My model layer only knows about music, the high-level user actions (i.e. "play note", not "click button"), and Scheme. It doesn't know anything about widgets, phones, etc. All boundaries are crossed with the C FFI layer (which is really nice in S7) forcing a strict ports-and-adapters approach. There absolutely is upfront cost, but the liberation of being totally decoupled from any framework or vendor in my model code is awesome. I chose S7 because I'm doing music, but you could do similar with Guile, Chicken, Gambit, Embeddable Common Lisp, etc. And Scheme is so minimal that if I needed to switch Scheme implementations it would not be a big deal - I would only be rewriting the adapter layer.

Interesting, one of the most successful companies I've seen in my tech due diligence work was doing the same thing in high end scientific computing. They had written their own DSL, and a whole containing application layer in C that could run on any target platform, with GUI adapters for windows, unix, osx, etc. They were doing very well. And they could get amazing scientists to work for them super-productively because the DSL was designed around the scientists needs.


This sounds awesome, glad to hear it's working well for you. One question that comes to mind for the business stuff is security. Basically, are you accepting Scheme code from untrusted users, or is it just a bundled part of the app?

The reason I'm asking is that that run-time flexibility can completely bite you in the ass if someone can submit code that does unacceptable, unanticipated things. For example, I had an engagement with a company to review their main application, that they host. The issue was that while the app was in Java, it accepted user templates in (IIRC) BeanShell. One quick System.exit() in a template as a proof of concept, and Tomcat came thundering down.

The concept still works of course if you restrict the language accepted to some minimal DSL.


Interesting. This actually sounds very similar to game developers embedding Lua or V8 for various logic. I've used Lua myself in a similar way, embedded in a high perf application, with a user script driving the logic.


Yeah Naughty Dog was doing the same thing in Scheme apparently!


To be specific, it's the lack of static typing that makes large code bases in Python, Ruby, and especially JavaScript a huge pain compared to Java. It makes debugging much more difficult. Thankfully this is being fixed with things like TypeScript and similar projects for both Python and Ruby.

Some people may argue, "but documentation would make this a non-issue". Guarantees built into the platform itself are more reliable than human aspirations; otherwise, automatic memory management wouldn't have gotten popular as a feature.


I actually found that untraceable side effects from imports and the ability to monkey-patch were worse. Related to the typing, but not totally the same. I would pick something like Clojure or F# now to control that: side effects and coupling.


You can patch by linking a different library in C. What's the difference? I suppose it's easier in Python to patch. Seems like a good thing to me.
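For anyone who hasn't seen it, this is roughly how little ceremony a Python monkey patch takes - any code can rebind another module's function at runtime, process-wide (json is just a stand-in target here):

```python
import json

original_dumps = json.dumps

def noisy_dumps(obj, **kwargs):
    # something a misbehaving third-party library could do behind your back
    return original_dumps(obj, **kwargs).upper()

json.dumps = noisy_dumps             # one assignment patches every caller
print(json.dumps({"ok": True}))      # {"OK": TRUE}
json.dumps = original_dumps          # just as easy to forget to undo
```

The ease is exactly the double edge: no linker step, no recompile, and nothing stops two libraries from patching the same function and stomping on each other.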


Didn't realize that monkey patching wasn't limited to Ruby.


It tends to be frowned on in Python. Mostly because some fairly popular projects did it first, and if you monkey patch over them, you break them.

Consequently, there was a big disincentive to monkeypatching.


>But if you let a team of juniors do whatever seems like a good idea, with nobody calling the shots who understands tech debt and the large-scale architecture problems, the mess that can be made is staggering.

I've seen the same occur with Spring-based Java. To me this is similar to the old "no one ever got fired for buying IBM". The new phrase would be "no one has gotten fired for going with a statically typed language."

When someone writes bad Java code, we tend to blame the developer. When someone writes bad Python code, we blame Python.

If you think it's the lack of types that makes things more difficult to follow, you can always force a build failure if type hints aren't provided (which Python has supported for a while now), in the same way you can fail your Java builds if test coverage isn't up to par.


> but are really double-edged swords in large projects

Yeap. I ended up dragging my team to Golang due to this. Sure they didn't write unit tests and they still aren't writing unit tests. But at least now a compiler will take a look at the code before it's run. Lots of stupid errors at runtime disappeared.

For personal projects? Something like Python, Lua or Scheme is great. It can still be great in large teams, but they need to understand and care about what they are doing.

> but I would have been happier stepping into C++

Oh no you wouldn't. That's the nuclear foot gun.

I've been tasked (along with some great engineers, most more experienced than myself) with fixing a mess left by two other teams: a project that was severely over budget and very late.

It was the worst abomination ever known to man. Crashes in random places. Race conditions. Global variables. Memory corruption galore. Lack of understanding of C++ in general (copy constructors, assignment operators, initializer lists) leading to many more bugs. Leaked like a sieve.

A lot of this wouldn't be possible in other languages (memory corruption in general). Some issues would happen no matter what (two modules opening the same file and writing on it simultaneously), unless something like Rust or Haskell was used to make this more difficult.

So far, that's the only project on which I've contradicted Joel Spolsky: it was completely thrown away and rewritten. Again in C++, because that was the requirement, but with proper modules this time, smart pointers, and the like. It was completed in 3 months.


ok you're prob right. that sounds bad. :-)


> It was the worst abomination ever known to man. Crashes in random places. Race conditions. Global variables. Memory corruption galore.

All of these are trivially detected with msan, tsan, lsan and ubsan.


IMO I doubt switching languages would help. It's the same for any language. You can learn to code easily, but learning to write good code/architecture and to work together in a large team is really hard. It takes lots of experience and time before the team can come together.


Some languages and frameworks have the right defaults for scale.

Rails has a lot of magic that gets you running fast, but it doesn't even have a service layer. Instead it promotes the abomination that is mixins (concerns), fat models, and fat controllers: stuff that gets you running with minimal effort, but then leaves you crying over the code duplication all over the place and the lack of logical separation.

ASP.NET is more verbose out of the gate, but you're basically guided into stuff like the repository pattern, a service layer with POCO models, thin controllers, and IoC. Static typing gives you guarantees when reading the code (I've seen RoR concerns that relied on random fields being present in the target class, but had conditional logic with implicit assumptions about which class they would be included in; it was hell to reason about). It's verbose but consistent and built to scale. I'll take a dirty ASP.NET project over a dirty RoR project any day.


+1. Every time we look at a Ruby app in my tech diligence work, we wind up talking with the team about how they wish they had that service layer and weren't calling Active Record methods from controllers. It's a real Achilles heel of RoR once things get big.


You are not wrong. Any startup people reading this... your CTO is the second most important person at the company, maybe even the most important. They need to be awesome. You can't skip this.


On a large software team, about half the programmers are below the industry-wide average. Sure, you can carefully build a team of 10 where they're all above average. I've even seen a team of 30 where most were above average, and none were clearly below. But a team of 100? That's a lot harder.

This has serious implications for how you develop large systems. Don't try to get too sophisticated on a larger or longer-lasting project. You'll have people on it who won't be able to play at that level, and if you try to make them, they'll make a mess.

The same thing happens over time. Sure, your team is only 20 people... today. But if the program lives for three decades, how many people will work on it? 100? Will they all be superstars? Even those maintenance people you hire 20 years from now? Probably not.


> If your team really knows what they're doing and can take advantage of the flexibility without creating a mess, they can let you move really fast and make elegant DSLs and so on. [..snip..] I never thought I'd say this before, but I would have been happier stepping into C++.

Actually, I find the Nim language [1] to be a great alternative to C++ as far as compiled, statically typed, high-performance modern languages are concerned. Moreover, Nim's metaprogramming capabilities (generics, templates, macros) are second to none for creating elegant DSLs, and its syntax makes Python developers feel at home. Preempting questions about garbage collection: Nim lets you switch off the GC and manage memory manually if you so desire, or manage memory manually in some critical parts of your program while letting the rest be garbage collected.

[1] https://nim-lang.org/


> But if you let a team of juniors do whatever seems like a good idea, with nobody calling the shots who understands tech debt and the large-scale architecture problems, the mess that can be made is staggering.

Having spent several years doing .NET consulting, I believe that this is the crux of the problem, and not Python per se.

In .NET land (and probably Enterprise Java too), this often manifests as an apparent goal to get every single pattern from PoEAA [1] into every file, rather than as extensive metaprogramming. But the root cause is the same: juniors without appropriate guidance.

[1]: https://martinfowler.com/books/eaa.html


I was about to say, I'd like to know which language juniors don't make a mess of.


> Sure, this is a problem of people, not language.

I don't think so. It is a problem of language.

Or more specifically, of dynamically typed languages (DTLs) vs. statically typed ones (STLs).

We know for a fact now that DTLs simply don't scale to large code bases. And before dozens of people tell me there are a lot of large apps written in DTLs: this doesn't mean it was a good idea in the first place, nor that these large apps are easy to maintain and improve (they're not, and they would be much easier to maintain and improve with type annotations).


> We know for a fact now that DTLs simply don't scale to large code bases.

We don't know that. This is not a fact.


There are simply too many large projects done in dynamic languages for it not to be at least a workable idea for them.

Or they eventually move to a hybrid microservices approach.


(ever seen "import gc" in a Python program? yes, that means what you think it means..)

I recently spent half a day implementing a hacky workaround for something using gc. Once I was happy that I got it working, I immediately laughed at how absurd it was and deleted it.

I still don't think I will ever be happier in C++ but I'm only 11 years into python, and 6 years into coding full time, so who knows :-).


You know, as someone who did a lot of Python and dynamic languages, I would agree there is a tendency toward chaos. But then I have seen just as much chaotic Java code. I'd even go so far as to say it's more a function of the libs/framework than the language. I have seen a lot of terrible Flask code; Django code was much better on average, and often better than Spring code.


I have seen plenty of C# projects with these same problems.


I've seen plenty and worked on plenty too, but IME a lot of those issues were at the boundaries of the type system:

Abuse of untyped ViewBags to pass data to views, using ViewBags to pass things to Helpers, badly configured AOP or XML-based dependency injection, people using Dictionaries/dynamic/JSON instead of objects for passing data between application layers...

As soon as you have escape hatches you get those things that make refactoring a nightmare in large projects and cause new employees to always fail to grasp the full-picture because of how fragile projects were.

In the frontend I feel the same about Vuex. It's a great library, way more ergonomic than Redux and the defaults are great, but the fact that actions and getters are untyped by default in Typescript is a major source of bugs in projects I work on, even in codebases with 100% coverage.


Absolutely agree with you. But with the advent of microservices architecture patterns, it is possible to keep service code within the bounds of a maintainable size.


I'm by no means an expert architect yet, but has the argument for microservices in smaller orgs basically become:

"Since y'all can't seem to use classes and interfaces correctly, let's physically separate the software so changing the service contracts becomes a bigger pain in the ass"?

I'm not even trying to be sarcastic. I'm genuinely curious. I feel like 90% of the benefits of microservices would be accomplished with discipline and good code review, but those things of course are more easily said than done.

(I understand your Facebooks and Netflixes benefit from having completely separate teams responsible for services with more autonomy, but I'm talking about small to midsize orgs that seem to have adopted microservices.)


Microservices are different from language abstractions (classes, etc.) in two important ways:

1. More decoupling means you can use an entirely different language.

2. Services can be deployed, restarted, etc. independently.

But that comes with disadvantages. The contract then becomes message passing and serialization/deserialization, which is fairly primitive. You can't pass functions/callbacks, and simple static checks become a whole validation problem. It also limits your ability to use interesting types across services.
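A quick sketch of that limitation, using Python's stdlib `json` purely for illustration: inside one process an object can carry behavior, but once the contract is serialized data, only plain values survive the boundary.

```python
import json

# Inside one process, an object can carry behavior (a callback).
payload = {"user_id": 7, "on_done": lambda result: print(result)}

# Across a service boundary the contract is serialized data only:
# the callable has no serialized representation, so this fails.
try:
    json.dumps(payload)
except TypeError as exc:
    print("not serializable:", exc)

# Only plain data crosses the boundary.
print(json.dumps({"user_id": 7}))
```

The same constraint is why static checks across services turn into a separate validation problem: the types you can express in the contract are only what the serialization format supports.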

The XML craze of 20 years ago tried to tackle all of these problems (remember XML schemas?), but it became unwieldy and didn't really succeed despite a huge enterprise push.


In a mid-size shop, is another language an advantage?


It certainly can be. Sometimes you start in one language but then need to integrate with something else that really only works well with a different language.


The more I do this, the more I come to believe that boundaries, no matter how you do them, are the million dollar question.


Or it's the other way round.

In certain languages, developers may find it difficult to maintain code quality and good design in a codebase of a certain size. So they find other ways of achieving good design by introducing service boundaries.


But even static types (even of the Hindley-Milner variety) won't be of much help to a team of juniors with no technical leadership. The codebase will end up in the same place, needing a rewrite, because even a good typing discipline cannot substitute for hard-won lessons in program design.

In fact I never thought I'd say this - for the first two years of discovering sum types, I thought they were the silver bullet of computing, and certain experiences have sadly tempered me there :)

I still agree that types _can_ make the flow and transformation of data in the system clearer than if there were no types. Though without experience, we could very well end up with convoluted designs and that can be as difficult to untangle as a dynamic spaghetti.


If nothing else, a statically typed mess is easier to navigate, understand and refactor than a dynamically typed mess. Which I think is a big deal.


> I still love Python, but I now have a completely different attitude to hyper-dynamic languages (like Python, Ruby, Clojure, Elixir, etc).

Elixir is all about immutable structs and pattern matching. I don't think "hyper dynamic" is a fair characterization at all.


To be fair, Elixir has slowly become more and more static over time, via convention and tooling.


Even at the language level, every single variable is a constant—the opposite of "hyper dynamic".


The larger design patterns build on that immutability at the local level. Inheriting the design patterns from Erlang/OTP also makes larger programs much more feasible. I can be away from a piece of Elixir code for months and come back, and the structure is usually pretty straightforward to pick up again, primarily due to supervisor hierarchies combined with the lack of weird OOP hierarchies and entanglement.


I have found in my own work life that the problems you get from large Python code are quite different from those of large C code. The Python code usually starts out very good and readable, but as it gets extended and changed it tends to become a massive mess that is hard to navigate. The C code, in comparison, doesn't generally get extended or changed at all, since even the simplest change requires a large initial dive into the design, the custom types, and the details, so such code tends to mostly be left alone.

The upside with the Python code is that a complete rewrite of the parts that have been extended beyond recognition tends to be easier than expected.


> The C code, in comparison, doesn't generally get extended or changed at all

That sounds like a disadvantage tbh


Use the right tool for the job. Python is great except at the huge (or performance) end. The problem occurs when one tries to use it for everything. And it has tools now to help at the high end, though you have to choose to use them.


My experience as well. I was a Perl and then Python/R evangelist. But for the last 2 or 3 years I have discovered and been using Kotlin (JVM) extensively, and TypeScript to a lesser extent. And the more I dig into and learn about the JVM, the less I want to go back. I'm also interested in getting back to the Haskell, Scala, and Rust I toyed with, just to keep my mind open.


Programming languages don't kill projects -- people kill projects.


> If your team really knows what they're doing... they can let you move really fast and make elegant DSLs and so on.

I agree on this part. The problem is that many teams of 5 Ruby programmers are doing the work of 100 Java programmers. A Java project would usually have many teams, and nobody has the full picture of it. Then people create jobs out of nowhere because they have to do something... Just see how many projects have dedicated teams writing piles of code for Kubernetes.

I would rather hire much fewer elites for higher price instead.


Not surprising. The Lisp Curse has its own way of creeping in, but then again I'd take this any day over cubicle-farm software that is so sterile and stuck up that you need a 2-week-long tutorial in order to run anything (I'm looking at you, Google, with your fancy-schmancy Borg and other inscrutable infra).

http://www.winestockwebdesign.com/Essays/Lisp_Curse.html


If you have junior engineers, you can't let them run willy-nilly. You have to make them slow down, come up with a design (obviously I'm talking about >$medium_size projects), have a senior engineer review it, and then let them go. They will do much better and have more confidence in their work. Obviously you have to follow up weekly; they're junior engineers for a reason. Even a language like Rust or OCaml won't keep them from effing things up.


Have been on a similar journey within the company I'm at now. Starting out with fairly loose Python, and now seeing egregious bugs make it into production and hurt your team's reputation, you start asking "why the hell are we suffering from things that are highly preventable". You work your way through type hints, strict linting, code reviews, etc etc, but you end up realising you're in part just using a hammer to nail in a screw.


I feel like it's nothing to do with the language. The quality of the team will be reflected in the quality of the codebase. Period. Quality of teams/codebases is always going to follow some type of distribution.

Engineers are perpetually keeping mental tallies and notes in search of data on which tools, methods, and technologies will help them reach the promised land. I think really the majority of that is noise.


> (ever seen "import gc" in a Python program? yes, that means what you think it means..)

I'm using it sometimes, what's the issue?


I've used 'import gc' in Python programs. gc.get_referrers and gc.get_referents are very helpful.
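For example, a minimal sketch of hunting down an unexpected reference (the `cache`/`process` names here are made up for illustration):

```python
import gc

cache = []

def process():
    data = {"payload": "big"}
    cache.append(data)   # an accidental extra reference keeps `data` alive
    return data

obj = process()

# gc.get_referrers lists the objects holding a reference to `obj`.
# Here it reveals that the module-level `cache` list is keeping it alive.
for referrer in gc.get_referrers(obj):
    if referrer is cache:
        print("kept alive by:", referrer)
```

This is the legitimate debugging use case: not calling the collector, just asking it who is pointing at what.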


I'm guessing you've been doing this for a while though... when you see that in a code base that you know has major issues and was made by fairly new programmers, it's like "oh god please tell me that doesn't mean garbage collector, I'm going to need a stiff drink"... :-)


yes, I have been doing 'import gc' for 20 years. Well, technically, I stopped about 15 years ago because I haven't had any unexpected references hanging around for that long.


gc.get_stats() too.


"import" as a replacement for a singleton, such a loaded gun. :-(


> ever seen "import gc" in a Python program?

Yes, I see it all the time. People think that `gc.collect` collects all the garbage, but that's not true. It only checks for reference cycles.
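A minimal illustration of that distinction:

```python
import gc

class Node:
    def __init__(self):
        self.other = None

# Most objects die immediately via reference counting; no gc needed.
# But a cycle keeps each member's refcount above zero:
a, b = Node(), Node()
a.other, b.other = b, a
del a, b                 # still alive: each Node references the other

# Only the cyclic collector can reclaim them.
collected = gc.collect()
print("unreachable objects collected:", collected)
```

So a `gc.collect()` call that "fixes" a memory problem is usually a sign the code is creating cycles (or leaking references) somewhere upstream.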


Doesn't C++ have the exact same problem? People have style guides that prohibit using parts of the language. Maybe the problem is just better understood in C++ land.


My experience is that, unless I am getting something extremely powerful with OS-like aspirations (Common Lisp, Smalltalk, Forth), I'd rather have static typing in a professional setting as well.

So a hierarchy of choice for me would be like:

S: The languages mentioned above

A: Languages with Hindley–Milner type systems and inference (SML, OCaml)

B: Your typical blue-collar languages (C, Pascal, Go, Java, C#)

C: Dynamic scripting languages (JS, Perl, PHP, Lua)


This is true for any language, strongly typed or otherwise, with or without memory management. The evidence gathered in the last decades is pretty clear: there is little to no advantage to using a strongly typed language over something like Python or Ruby (I'm on mobile so I can't look up references, but there are really plenty). And that's just looking at bug density, without even considering how quickly you can build something (often thanks to the great ecosystem of packages) or how much easier it is to hire decent Python/Ruby/JS devs over, say, C++, Java, or Rust devs.

That said: I wish you the best of luck!


The smoking gun wasn't really the types to be honest. It was the way you can monkey patch and have massive daisy chains of effects from a simple import statement. Import in Python can pretty much be made to do anything you want...
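The kind of thing meant here, sketched in a few lines. Imagine the patching assignment living at the top level of some innocuous-looking module: merely importing that module silently rewires `math` for the whole process.

```python
import math

original_sqrt = math.sqrt

# A monkey patch: global in effect, invisible at every call site.
math.sqrt = lambda x: 0.0

print(math.sqrt(16))      # 0.0, for every caller in the process

math.sqrt = original_sqrt # undoing it is entirely on you
print(math.sqrt(16))      # 4.0
```

When such a patch sits three imports deep, the "daisy chain of effects" from a single `import` line becomes very hard to trace.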


But Python is strongly typed. If you say x=2, type(x) will say “int”.

It does suffer from an excessive overloading of operators (“a” times 6 shouldn’t work). But that’s neither here nor there.

If Python weren’t strongly typed, it wouldn’t even be conceptually possible to move it towards explicit typing, as the current push is doing.

Now, it’s not statically typed. You don’t get to assign values and interpretations to memory addresses. This is what “performant” people usually defend. But _that_ has nothing to do with typing assuming semantic roles in codebases.
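Concretely:

```python
# Strong typing: incompatible types raise instead of silently coercing
# (contrast with JavaScript, where 1 + "1" gives "11").
try:
    1 + "1"
except TypeError as exc:
    print("TypeError:", exc)

# An overloaded operator that does work, sometimes surprisingly:
print("a" * 6)    # 'aaaaaa' -- sequence repetition, not arithmetic
```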


Apologies for the "flexible" usage of the words "strongly typed". Arguably (as someone already pointed out) Python can be considered strongly typed (i.e. doing 1 + "1" won't cast the first operand to string, unlike e.g. JS), although there doesn't seem to be a clear definition of what strongly and weakly typed mean (https://en.wikipedia.org/wiki/Strong_and_weak_typing). So let's stick with "statically" typed, which in the case of Python would mean using something like mypy.
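For context, "statically typed" Python in this sense means annotations checked by an external tool. The sketch below runs fine under plain Python, which ignores the annotations at runtime; a checker like mypy would flag the bad call before the program ever executes (the `total` function is made up for illustration):

```python
def total(prices: list[float]) -> float:
    # Plain Python does not enforce the annotation at runtime...
    return sum(prices)

print(total([1.5, 2.0]))        # 3.5

# ...so this mistake only surfaces when the line actually runs;
# mypy would reject it statically with an incompatible-type error.
try:
    total(["1.5", "2.0"])
except TypeError as exc:
    print("runtime failure:", exc)
```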

Here are some references (there might be some overlap):

- https://labs.ig.com/static-typing-promise

- https://danluu.com/empirical-pl/

- https://vimeo.com/9270320

- https://medium.com/javascript-scene/the-shocking-secret-abou...

- https://www.researchgate.net/publication/259634489_An_empiri... (one of the original studies)

I hope it's clear I'm not implying that there are no advantages in using a statically typed language, only that it's often seen as a solution to a problem that doesn't originate there.

PS: what is up with the downvotes? What is this, Reddit?


> The evidence gathered in the last decades

Source?


I posted a (non exhaustive) list here: https://news.ycombinator.com/item?id=25001953



