My (unpopular) take--programmers have been 'gassed up' by a decade of overcompensation + title inflation.
People think the high pay and the fancy titles* they're (often) given reflect their value or intellect*, even subconsciously, and they behave accordingly.
*Sorry, I don't consider web programming (which comprises a majority of modern software development) "engineering"
*Many are some of the most intelligent people quite literally on Earth, or are otherwise exceptional.
Great job further eroding the trust of a prospective employer.
Require a formal degree in CS? That's gatekeeping.
Need to pass a whiteboard exam? Not representative of the actual work.
Live coding session? Biased against people who don't perform well under pressure.
Take home project? It's too much work to do for free.
Showcase a personal portfolio? Not fair to people with families or other obligations.
Either you enforce a minimal level of competency upfront in the form of an academic degree, industry-standard exams such as the PE exam, etc., OR you push the entire responsibility of vetting prospective applicants downstream to the employer, which is exactly why interviews are multi-week gauntlets.
The tech world likes to complain about all of this, but other occupations 100% DO have high standards - it's just that the cost is paid up-front.
Want to become a lawyer? - You've got to pass the LSAT, get into law school, and pass the bar.
Want to become a doctor? - You've got to pass the MCAT, get into medical school, and do residency.
Want to become a pilot? - You've got to get your PPL, pass your check ride, then get your instrument, multi-engine, and commercial ratings, and finally your ATP.
God there are some days that I ABSOLUTELY HATE THIS INDUSTRY.
For what it's worth, I've worked at multiple places that ran shell scripts just fine for their deploys.
- One had only 2 services [PHP] and handled over 1 billion requests a day. Deploys were trivial: copy the new files to the server over SSH and run a migration, with zero downtime.
- One was in an industry that didn't need "Webscale" (retirement accounts). Prod deploys were just docker commands run by Jenkins. We ran two servers per service from the day I joined to the day I left 4 years later (3x growth), and ultimately removed one service and one database during all that growth.
Another outstanding thing about both of these places was that we had all the testing environments you need, on-demand, in minutes.
The place I'm at now is trying to do Kubernetes and is failing miserably (an ongoing nightmare 4 months in, with probably at least 8 to go, when it was allegedly supposed to take only 3 total). It has one shared test environment, and it takes 3 hours to see your changes in it.
I don't fault Kubernetes directly, I fault the overall complexity. But at the end of the day Kubernetes feels like complexity trying to abstract over complexity, and I often find that's less successful than removing the complexity in the first place.
I don't think that struggle is really the defining feature here. If anything, many of the most toxic people I can think of in geeky circles are precisely the people who still have a chip on their shoulder about something that happened in middle school.
I think it's simpler than that: we sold out.
Tech in the 80s and 90s was the land of curious geeks who played with it because it was interesting, or because they had a goal they wanted to enable. But once tech became a powerhouse of investment, it was taken over by investors, financiers, and the kind of geeks who would play ball with them.
Some of them drank the kool-aid and became financiers themselves, corrupted by the same forces that corrupt bankers or politicians. Some of them sold out because hey, ping-pong table in the office, that's pretty cool! Evil never has ping-pong tables! Some of them sold out because times are hard and they wanted a job. And some of them don't realize they have sold out, because tech culture does a very good job of propagandizing selling out as a virtue.
You're right but you're looking at a very small part of the picture.
> After all, to the founders, this is the most important thing ever. It’s their ticket... But smart people don’t work like this. Smart thoughts don’t come out of exhausted brains. It’s a meat grinder for no reason. It’s sadistic.
They're smart, but they're angry.
Go around and ask the 22-year-old YC founders, "How do you deal with anger?" and their answer is "grind." I've asked 5 YC founders on different occasions, and every single one said grind! They even used that word.
Some kids never dealt with their anger: their parents hitting them, the capriciousness of cram school. Other kids took Women, Gender and Sexuality classes, or they painted, and addressed their anger earlier.
A YC founder once told me that "this" was "about" what sounded like college-age vengeance: proverbially proving all the haters wrong. Dude, that's nuts! These people are angry.
YC sincerely wants people to succeed. It's attractive for people who deal with their anger by working a lot. That's reasonable. Is it sustainable? I don't know, it's a complex question. I wouldn't do it to myself. But I don't feel angry every day.
That belief would be erroneous regardless of what compiler you used, because Rust's unsafe lets you do anything, including causing memory unsafety and other UB, even when using rustc.
Rust isn’t any kind of guarantee for end-users that some class of bugs doesn’t exist. It’s just a tool for programmers to make writing a certain class of bugs more difficult. If the programmer chooses to subvert the tool, they can.
(Also, I think most reasonable people would feel you were lying if you referred to a program that can't be compiled by rustc as "written in Rust". Maybe "written in a dialect of Rust" would be more accurate. Rust isn't like C, where there is a standard that multiple competing implementations target; the definition of Rust is "what rustc accepts".)
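To make that concrete, here's a minimal sketch of my own (nothing from the parent comment, names made up) showing code that stock rustc accepts without complaint yet still has undefined behavior:

    // Compiles cleanly with stock rustc, yet has undefined behavior:
    // `unsafe` hands responsibility for memory safety back to the programmer.
    fn main() {
        let dangling: *const i32;
        {
            let x = 42;
            dangling = &x as *const i32; // raw pointer to a local about to die
        } // `x` is gone here; `dangling` now points at dead stack space
        // Dereferencing a dangling pointer is UB. rustc accepts it because
        // the dereference sits inside an `unsafe` block.
        let y = unsafe { *dangling };
        println!("{y}");
    }

The point isn't that this is idiomatic (it obviously isn't); it's that "it compiles with rustc" was never the guarantee.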
>The Aztec C compiler would have originally been distributed on floppy disks, and is very small by modern standards.
If I remember correctly, Aztec C was from Manx Software Systems; Mark Williams was the company behind Coherent OS and its C compiler.
But yes, things were far easier in the 80s, even on the minicomputers I worked on back then. These days development is just a series of meetings, agile points, and scrums, with maybe 2 hours of real work per week. Many people now tend to do their real work off-hours, which is a sad situation.
But I am looking for 1 more piece of hardware; then I can set up a DOS machine to play with :)
>The Aztec compiler pre-dates ANSI C, and follows the archaic Kernighan & Ritchie syntax
I still do not like ANSI C standards after all these years.
Thanks, it didn't come easy. A few personal, professional, and academic failures before I got there myself. The last straw was when I realized I was the toxic asshole at a job. That job had lots of problems and my concerns were real, but my reaction wasn't helpful to anyone. I bounced, gave myself a mental reset, and have been more deliberate about it since then.
Building products myself, with my own rules on how they'll be built and how we get feedback from customers.
I fell out of love with programming after working at too many feature-shops where we'd churn out feature after feature (and in some cases blindly remove features), with nearly zero feedback from real users - just input from product managers with zero domain knowledge.
https://archive.is/CHgpq (a bit tricky to get to: the most recent capture showed the paywall, so I had to go to an older capture; figured this might not be intuitive even for a lot of HN users)
This "Big River" program seems like a proper use of classic "dogfooding" techniques (bastardizing that word to say testing your competitor's food, rather than its established meaning for eating your own food). It doesn't seem so different from a widget company testing and dismantling a competitor's hardware device. It's scaled up a lot, but anything related to global logistics has to be to get any good info.
The difference perhaps is that hardware is occasionally / often (depending on the widget) protected by patents. In logistics it's less likely to be (but still possible, especially given the existence of "business process" patents). Part of that scale here is also how many people they have working on it to get a super detailed picture of how their competitors are doing what they do.
I definitely have a cautious reaction to it, but I was expecting something that might violate regulations and statutes around trade secrets, and I don't see anything that obviously does. It's a bit creepy due to the nature of what investigating logistics involves, but it doesn't seem improper. There are plenty of practices Amazon perpetuates that I do believe are improper, but this didn't immediately strike me as one of them - though I'll keep an open mind if other people here make a strong case for it.
One of the things I don't think belongs in today's economy is a company that runs a platform/marketplace also competing on that platform/marketplace. That is something I strongly criticize Amazon for doing.
-------Edit---------
Response to a now-deleted reply: Yes, it's an issue if Big River is helping Amazon compete with its competitors, but it wouldn't be an issue if Amazon wasn't participating in its own marketplace. I'd propose the remedy to be "stop doing the bad thing" rather than "stop doing the thing which appears to be mostly fine on the surface but also may be dual-purposed to assist in doing the bad thing".
Where my own argument falls apart is store brands for supermarkets. I like buying affordable, quality products under the HEB, "Great Value" (Wal-Mart), and "Kirkland" (Costco) brands. I also don't think it's appropriate to make a law that prevents Amazon from doing it while exempting these other companies - or else both self-enforcement and enforcement by the government get too difficult. Business managers would rationalize that they're more of a "Costco/Kirkland" than an "Amazon/Amazon Basics", and no one within the org would be able to push back with "No, I cannot do that clearly illegal thing."
So I'm not sure what the right answer should be - whether that's Costco needing to divest Kirkland to a truly independent third party, perhaps paired with a rigorous quality-testing program so they can say "We recommend this brand because we know they meet our highest standards at a great price", or whether there's some other way to regulate it.
There is precedent for the general concept, if not the exact implementation I propose. The SEC has a wide range of strictly enforced regulations controlling what companies can do on markets they operate (and what anyone acting as a broker-dealer can do), to the point that it has mostly created a de facto ban on companies that run a stock/commodity/FX/etc. market from also participating in that market. The SEC can afford rules that slice the concept very close to the bone because it has very strong, harsh, and vigilant enforcement. For the SEC paradigm, see the Fair Access Rule, Regulation of Broker-Dealers / Duty of Best Execution, Market Maker Rules, Anti-Manipulation Rules, and Conflict of Interest policies. It's really very illegal for an exchange or a market operator to prioritize its own transactions, or those of its affiliated participants, over those of others.
I'm not sure the FTC/etc can afford that same luxury, as they haven't demonstrated an ability to enforce rules aggressively and universally.
While I can empathize here as someone with no degree and a job at FAANG, it's hard for me to have a lot of sympathy. I put the blame squarely on these institutions and our government for this situation.
The institutions should be held liable for this debt, not the tax payer, and our government should not guarantee or subsidize these loans. Higher education is important in our society, but the situation we're in now is a complete disaster.
18-year-olds taking on huge debt for useless degrees, debt that can't be discharged in bankruptcy. What could possibly go wrong?
Working my way up to calculus for the first time in my life, at nearly 40 years old. I've always hated math :(
When I was a kid, they always told me math would be super useful, especially if I liked computers. Well, 20+ years of a dev career later, I still have never used anything more than basic arithmetic and rudimentary algebra (to calculate responsive component sizes). But with web dev jobs going the way of the horse-drawn wagon, I figured it was time for a career change. Hoping to get into (civil/environmental) engineering instead, but I guess that field actually does use math, lol. We'll see how it goes...
In the meantime, also taking singing classes at the community college, and enjoying THAT way more. We performed at a nursing home a few weeks ago, and that brought SO much joy to the audience there, even though we're just a bunch of amateurs. It's just such a different reception than anything I've ever seen as a dev. Tech rarely inspires such joy.
If I could start all over again, I wish I would've pursued music over computer stuff. Much harder life though!
As far as I can tell, the only cryptocurrency that actually delivers on its name (i.e. being used as a currency) is Monero. Sure, it's all drugs and stolen credit cards, but it does undeniably solve a real world problem for its users instead of just being used as a vehicle for speculative investment.
With that said, I think if anyone comes up with a "killer app" for crypto, it'll be on the Ethereum chain. They seem to be the only ones who consistently work towards adding capabilities to the core technology.
Edit: I realize I haven't commented on the article at all. This sentence stood out to me:
> Today, we have all the tools we'll need, and indeed most of the tools we'll ever have, to build applications that are simultaneously cypherpunk and user-friendly. And so we should go out and do it.
Clearly, this is an important step. But the two examples he provides as beacons of what's possible (Daimo and Farcaster) don't inspire a lot of enthusiasm. Daimo is just a decentralized version of Venmo, and Farcaster is a protocol for building social networks on the blockchain, which is yet another tool and not an application.
I do still like reading Vitalik's thoughts. He's a pretty good writer, and it's evident that he spends a lot of time actually thinking about the topics he writes about.
So, as a 56yo who's been programming professionally since 2004, currently having trouble finding enough contract work, a bit of perspective:
When there are more openings than qualified people, even when you reduce qualifications to what is actually necessary, it seems like there's "nobody" available. Now, when the shoe is on the other foot, it seems like "nobody" is hiring. It's harsh, but it also doesn't take that much of a shift in the supply and/or demand to go from >1 to <1, even though the effect you feel is large.
There is no kind way to put this: a lot of stupid stuff got done by a lot of smart programmers in the last ten years. Meanwhile, boring old stuff like manufacturing was starved for programmers, and eventually gave up trying to get them. Now that it is possible, it takes some time for all the companies who couldn't get (or couldn't keep) programmers before to realize that it is once again possible. However, anecdotally, I see it happening, albeit slowly.
Programming that actually accomplishes something useful in the world is still a productive thing, and positions will get created. However, large sectors of the economy take time to pivot, so it is best to find a way to make ends meet in the meantime, and also do something useful (even if unpaid) with your programming skills, to keep your mind in practice.
Unlike you, I don't have a dislike of graphics. I do, however, see value in small, simple software. The Web is a runtime so complex that it takes huge organizations to implement.
Theoretically, you could sacrifice full compatibility by implementing only the APIs used by Google, Facebook, YouTube, Reddit, Amazon, etc. and have something much simpler. But that would still be a hard task, because you'd be building a big compatibility hack for certain websites - like the Wine compatibility layer, but for websites. Except that the websites could stop working at any time, and then you'd have to pile on more interfaces to keep up with them.
When evaluating software utility we often forget that websites are software and don't attempt to cost them in. Using them is a recurring cost in terms of complexity. They are definitely not free or even low cost.
I can't stand Git. The points the author makes in the intro are true. Developers learn just enough Git to do their jobs, which is unfortunately more Git than they actually understand. I'm including myself in this. I've been using Git for nearly 20 years (since the very beginning), and I still can't accomplish anything more than the most basic things without reading documentation and hitting up the net for help. Even if I stick with simple operations, I can get in over my head. WTF is "detached HEAD"? I've looked it up dozens of times, and as soon as I get past whatever I'm working on, I forget what it means. I don't know what the reflog is, other than that it has to do with some dark internals. Resolving merge conflicts during a rebase is a clusterfuck. And on any team I've ever worked on, the branch history is a nightmare to look at.
So I clicked on this link with the hope that it might be a kinder, simpler VCS. Especially after the article author talked about how bad Git's CLI is and how much better Jujutsu's is. But it looks like it's just as much of an overcomplicated clusterfuck, and since it's built on top of Git, you still get to experience all the pain of Git if anything goes wrong. And I wonder what kind of frightening scenario occurs when some contributors use Jujutsu and some use Git.
I understand that Git is powerful, and many of its features are valuable for large teams working on enormous projects (like the Linux kernel it was invented to manage). Hardly anyone needs that, though. I want a dead simple VCS for small project use that doesn't come with so many footguns. I know simpler VCSes exist, but without wide adoption, GitHub-like repos, and tooling support, using Git is still the path of least resistance.
> over several days, to run a work load an order of magnitude smaller
Here I sit, running a query on a fancy cloud-based tool we pay nontrivial amounts of money for, which takes ~15 minutes.
If I download the data set to a Linux box I can do the query in 3 seconds with grep and awk.
Oh but that is not The Way. So here I sit waiting ~15 minutes every time I need to fine tune and test the query.
Also, of course, the query now is written in the vendor's homegrown weird query language, which is lacking a lot of functionality, so whenever I need to do some different transformation or pull apart the data a bit differently, I get to file a feature request and wait a few months for it to be implemented. On the Linux box I could just change my awk parameters a little bit (or throw perl in the pipeline for heavier lifting) and be done in a minute. But hey, at least I can put the ticket in a blocked state for a few months while waiting for the vendor.
> It is [...] entirely optional to call free. If you don’t call free, memory usage will increase over time, but technically, it’s not a leak. As an optimization, you may choose to call free to reduce memory, but again, strictly optional.
This is beautiful! Unless your program is long-running, there's no point in ever calling free in your C programs. The system will free the memory for you when the program ends. Free-less C programming is an exhilarating experience that I recommend to anybody.
In the rare cases where your program needs to be long-running, leaks may become a real problem. In that case, it's better to write a long-running shell script that calls a pipeline of elegant free-less C programs.
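To make it concrete, here's a toy sketch of the style (my own example, with made-up names): a little filter that mallocs a copy of every input line, never calls free, and leans on process exit to clean everything up.

    /* Toy example of free-less C: every line of stdin is copied into
     * malloc'd memory that is never freed; the OS reclaims it all when
     * the process exits. Fine for a short-lived filter, terrible for a daemon. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *lines[1000];
        size_t n = 0;
        char buf[4096];

        while (n < 1000 && fgets(buf, sizeof buf, stdin)) {
            lines[n] = malloc(strlen(buf) + 1);   /* never freed, on purpose */
            if (!lines[n]) return 1;
            strcpy(lines[n], buf);
            n++;
        }

        /* print the lines in reverse order, then just exit */
        for (size_t i = n; i > 0; i--)
            fputs(lines[i - 1], stdout);
        return 0;
    }

Run it on a short file and it behaves exactly like a version with careful frees, minus all the bookkeeping.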
> Wanna build a video game that teaches developers how to code and use Twilio? Let's try it! Wanna build an AI application with Tony Hawk and have Tony Hawk debug the code live on stage? Sure!
These are the strange effects of ZIRP and infinite QE. Many companies never had to care at all about profit and could just do things "for fun", and still see valuations skyrocket as long as they hired more people. What a time.
35 MB seems small to you? That's an absolutely massive binary.
But to answer your question, you're going to see some performance improvements if your binary can fit into lower levels of the CPU cache.
Kdb+/q, for example, is less than 700 KB, which means the entire thing can fit comfortably in the L2 cache of a high-end server CPU. And that size is even more impressive considering it's only dynamically linked against libc and libpthread.
This small size contributes to the out-of-this-world performance the system has. And remember, that's both the language and the time-series DB.
> Java has a culture of over-engineering [which] Go successfully jettisoned
[looks at the code bases of several recent jobs]
[shakes head in violent disagreement]
If I'm having to open 6-8 different files just to follow what an HTTP handler does because it calls into an interface which calls into an interface which calls into an interface (and there's no possibility that any of these will ever have a different implementation) ... I think we're firmly into over-engineering territory.
Google builds non-PIE, non-PIC, static, profile-guided, link-time-optimized, and post-link-optimized binaries and probably DGAF about calling conventions.