I don't think anyone is proposing a transparent algorithm that can be audited. They're all proposing trendy, proprietary machine learning techniques sold as AI. If you're just talking about boring, transparent algorithms, we've always had the technology to devise those - it's called pen and paper. Mandatory-minimum sentences and three-strikes laws are clearly punishment algorithms.
The point you're making about the Brock Turner case is well taken; the judge was another standout Stanford athlete in a sport that doesn't have a professional level. My guess is that these algorithms won't be devised by poor black people who dropped out of high school, but they will have a lot to say about how much worse poor black people who dropped out of high school are than people from the demographics of the people who do devise them.
It's also important to note that today's algorithms can't seriously be audited either. We know, for example, that many municipalities have been caught installing red light cameras and then adjusting the yellow-light timing to maximize revenue rather than commuter safety. We know that some laws, particularly drug laws, have consequences designed to punish "undesirable groups" (minorities) rather than to make streets safer or protect the lives of citizens. So a pseudo-AI that is also a black box isn't necessarily worse than what we have today, because as of now we can't really audit the algorithm itself, only its output.