The short version is to adopt modern tooling (the vcpkg suggestion is an excellent one) and dependency management rather than OS-specific tools (unless you are on Windows). Part of the reason for this mess is that the Unix world operates on an evergreen philosophy where nothing is truly backwards compatible out of the box without manual intervention. The modern web development and machine learning world runs on the opposite doctrine: programmer time is the most expensive commodity above all else; bandwidth is cheap, storage has a negligible cost, and horizontal scaling can sometimes fix compute-bound problems. Deployment processes are thus optimized for reliably reproducible builds. Docker is the classic example: bundle literally every dependency possible just to ensure that the build always succeeds, anywhere, anytime. It has its downsides, but it is still one of the most widely used deployment methods for a reason.
In the Windows world, you often find desktops with ten different copies of the "Windows C++/.NET redistributable" (the Windows version of the C++/CLR standard library's dynamically loaded artefacts) installed, because each individual app has its own specific dependencies and it's better for apps to bundle/specify them rather than rely on the OS to figure out what to load. The JavaScript, Julia, Rust, and Go ecosystems all have first-party support for pulling in driver binaries that may be gigabytes in size (because Nvidia is about as cooperative as a 3-year-old child). You don't waste time fiddling with autotools and ./configure and praying that everything will run. Just run `npm install` and most if not all of the popular dependency-heavy libraries will work out of the box.
To build on these suggestions: act as if your "installation" is your "deployment", and perform all the necessary checks to ensure your dependencies are there (and are the correct versions) before running. In .NET, this is mostly handled for you by the framework. In Go, nearly everything is compiled into a single binary, so (again) you don't have to worry about it. In JavaScript or Python, it's assumed that you can `npm install` or `pip install` your requirements and that the versions will match. From there, you can treat that as your final build and run it.
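In C++, CMake's `find_package` gives you the same fail-fast check at configure time rather than at run time. A minimal sketch, assuming a hypothetical project that depends on fmt and zlib (stand-ins for whatever you actually use):

```cmake
cmake_minimum_required(VERSION 3.20)
project(mygame CXX)  # hypothetical project name

# Fail the configure step immediately if a dependency is missing or too
# old, instead of discovering the problem at build or run time.
find_package(fmt 10.0 CONFIG REQUIRED)  # package names/versions are placeholders
find_package(ZLIB 1.2 REQUIRED)

add_executable(mygame src/main.cpp)
target_link_libraries(mygame PRIVATE fmt::fmt ZLIB::ZLIB)
```

If either package is absent or below the requested version, the configure step stops dead with a clear error, which is exactly the "check before running" behaviour you want.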
As a C++ game developer myself, I make sure that my dependencies are part of my repo as git submodules, so I can check out and build exactly the version I need from their git tags.
So if you are tagging your releases (your final outputs) in your git source tree, then going back to a version from 20 years ago is as simple as `git checkout v0.0.1`.
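The submodule route plugs straight into CMake, too. A sketch, assuming dependencies are vendored under `extern/` and spdlog stands in for a real dependency:

```cmake
# Root CMakeLists.txt (paths and names are hypothetical).
# Each directory under extern/ is a git submodule pinned to a tag, e.g.:
#   git submodule add https://github.com/gabime/spdlog extern/spdlog
#   git -C extern/spdlog checkout v1.12.0
add_subdirectory(extern/spdlog)

add_executable(mygame src/main.cpp)
# spdlog's own CMakeLists exports this target when built in-tree.
target_link_libraries(mygame PRIVATE spdlog::spdlog)
```

Because the submodule pin is just a commit hash in your tree, a `git checkout` of any old tag restores the exact dependency versions that release was built against.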
vcpkg for C++ dependencies is another option (my preferred one if you don't go the git submodule route), and ALWAYS USE CMAKE! Don't opt for some crazy build setup, or some internal build tool used by <insert FAANG here> that they force you to use (V8 team, if you're reading this, fix your build pipeline).
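The two pair nicely: vcpkg builds the packages and CMake consumes them through vcpkg's toolchain file. A sketch, again with fmt as a placeholder dependency:

```cmake
# Configure with vcpkg's toolchain file so find_package can see the
# packages vcpkg built (the path depends on where you cloned vcpkg):
#   cmake -B build -S . -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake
cmake_minimum_required(VERSION 3.20)
project(mygame CXX)

# With a vcpkg.json manifest next to this file listing "fmt", vcpkg
# builds it during the configure step and find_package just works.
find_package(fmt CONFIG REQUIRED)

add_executable(mygame src/main.cpp)
target_link_libraries(mygame PRIVATE fmt::fmt)
```

Manifest mode also means the dependency list lives in version control next to your code, so checking out an old tag brings back the matching dependency set.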
KISS. Keep it simple, slick. If your package isn't available in the OS package manager, it's time to adopt a package manager or a devops practice that lets you revert to any version of the code you need (the git submodule route).