Even as someone who is in charge, it can be incredibly difficult to effect change once this type of culture is established. The only way it can happen is if enough people know it's wrong up and down the chain and are able to consciously recognize the systemic problems and rally individuals to start pulling the cart out of the mud. This is easier said than done, because most people are willing to put up with a tremendous amount of bullshit as long as it is predictable bullshit and they don't feel responsible for it.
I also want to challenge your framing about learned helplessness by "the one in control". It's true that in many situations you won't have power and you need to pick your battles. But on the other hand, everyone has some degree of agency, and programming especially requires folks to understand what they are doing and how it fits into larger systems (both technological and human). As a manager, I have seen a tremendous amount of learned helplessness based on assumed constraints that simply were not true. Yes, in some cases it's justified, but the most successful people tend to have fewer assumed constraints, regularly take action to do what they can to improve things, and they aren't easily discouraged when (inevitably) outcomes fall short of their ideal vision.
If people in charge are serious about rooting out those beneath them in the management chain who lie or cheat to appear on-target, they need to provide anonymous whistleblowing channels that reach straight up to them.
But I think many people in charge see corruption below them as "headaches to deal with" and assume "if it's not squeaking it must be working," which I think is exactly backwards (it's the "if it compiles it's working" approach to management).
develop a network of senior people and build consensus around change. if you try to go this alone you are perceived as (and may very well be) a shortsighted iconoclast acting out some narcissistic fantasy.
modulo that, once you have established some legitimacy and buy-in from the people who matter... just lose your fear and do the right thing. sure, they can fire you.
One of the subtler points Eli Goldratt makes, indirectly, is that "this type of culture," as you put it, gets established in situations of abundance.
So Goldratt's description of "crunch time" is a company which starts out every quarter doing things in the (as they see it) "right way," then closes out every quarter doing everything the "wrong way," taking shortcuts or incurring considerable costs to finish projects so that the revenue comes in.
The spin I'm offering is: when you have a season of abundance, you never have to face crunch time, and you get, as you put it, "predictable bullshit and they don't feel responsible for it." Effecting change during crunch time, by contrast, seems much easier, as long as you can keep your head straight. Goldratt's starting point is, "maybe these 'shortcuts' we take under pressure are not the 'wrong way' per se."
I personally hate learning abstractions without examples, so let's get concrete. Your software project gets overdue, and you and your peers start approving each others' merge requests after only skimming them for flagrantly bad code. Rather than asking "is this method in the right class?" or "does this have adequate test coverage?", you are now looking for "okay, that's a raw SQL query, could that accidentally drop the whole database? no? approved! anything else we can add tests for and refactor in a month." That attitude, right?
Goldratt says, "no, pay attention to that!" ... you say "but that's not how it should be done, we must review code as we go, tests are good to have" and my-imaginary-version-of-him replies, "sure I understand that you have a need for code safety, I am not saying that you don't. But the fact remains, you accepted the bypassing of this process when crunch-time came, which means that it constrained the team in unacceptable ways overall. So if we believe in working smarter-not-harder, we need to get imaginative now, because the 'logical place' for code safety measures to get injected, is too slow. And there are lots of imaginative solutions. We could hire someone whose only job will be code review, they could start reviewing your code before you're even done writing it. Or we could mandate that everyone must commit their code to the main branch every single day with only pro-forma code review, this will force us to adopt out-of-band practices to make things safe. Other things like that."
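To make one of those "imaginative solutions" concrete: the skim-review described above ("could that accidentally drop the whole database?") is mechanical enough that part of it can be automated, so the pro-forma reviewer only has to eyeball what gets flagged. A minimal sketch in Python, with an assumed (and deliberately incomplete) keyword list; `flag_dangerous_sql` is a hypothetical helper, not a real tool:

```python
import re

# Hypothetical pre-review gate: scan the added lines of a diff for SQL
# that could destroy data. The pattern list is an assumption for
# illustration, not a complete safety net.
DANGEROUS = [
    re.compile(r"\bDROP\s+(TABLE|DATABASE)\b", re.IGNORECASE),
    re.compile(r"\bTRUNCATE\b", re.IGNORECASE),
    # A DELETE with no WHERE clause on the same line is suspicious.
    re.compile(r"\bDELETE\s+FROM\b(?!.*\bWHERE\b)", re.IGNORECASE),
]

def flag_dangerous_sql(diff_text: str) -> list[str]:
    """Return added diff lines ('+' prefix) matching a dangerous pattern."""
    flagged = []
    for line in diff_text.splitlines():
        # Only inspect added lines; skip the '+++ b/...' file header.
        if line.startswith("+") and not line.startswith("+++"):
            body = line[1:]
            if any(p.search(body) for p in DANGEROUS):
                flagged.append(body.strip())
    return flagged

diff = """\
+++ b/cleanup.py
+cur.execute("DELETE FROM sessions WHERE expired = 1")
+cur.execute("TRUNCATE audit_log")
-old_line = True
"""
print(flag_dangerous_sql(diff))  # only the TRUNCATE line is flagged
```

Wired into CI, something like this turns the crunch-time shortcut into a standing out-of-band practice: the review that was going to be cursory anyway is now cheap, consistent, and hard to skip.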
So what does effecting change look like in a system that does not have the luxury of abundance? It looks like a "crunch time" that never "ends", but does "get better." Goldratt's point is essentially that you don't want to meet this punch with a counterforce, as that will hurt: you instead want to yield with it and let its own momentum carry the both of you to a better solution.
I have a theory about how this works on software projects but I didn't quite get the chance to run the experiment at my last job, so if this intrigues you and you have the headcount to hire me... :)