The definition of "company one keeps" is open to interpretation. How many Facebook connections away do you think you are from a drug dealer? Take him away, boys!
Social media also reveals strength of connection: how often you communicate with the drug dealer, at what times of day, and how often your GPS position sits in his house for five minutes at a stretch.
Probably everyone who isn't friendless on Facebook has a drug dealer in their immediate network. But using more data you can filter out the actual druggies with high confidence.
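To make the "more data" idea concrete, here is a minimal sketch of the kind of naive connection-strength heuristic being described. Every name, field, and weight here is made up for illustration; no real system's logic is implied:

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    # Hypothetical record of one contact event between two people.
    kind: str           # "message", "call", or "gps_colocation"
    minutes: float      # duration (0 for messages)
    late_night: bool    # did it happen between midnight and 5am?

# Made-up weights: any such weighting is an arbitrary editorial
# choice baked into the "heuristic".
WEIGHTS = {"message": 1.0, "call": 3.0, "gps_colocation": 10.0}

def connection_strength(log: list[Interaction]) -> float:
    """Score how 'connected' two accounts look from an interaction log."""
    score = 0.0
    for event in log:
        w = WEIGHTS[event.kind]
        if event.kind == "gps_colocation":
            w *= max(event.minutes / 5.0, 1.0)  # longer visits weigh more
        if event.late_night:
            w *= 2.0  # another arbitrary assumption about what looks suspicious
        score += w
    return score

log = [
    Interaction("message", 0, False),
    Interaction("gps_colocation", 15, True),
]
print(connection_strength(log))  # 1.0 + 10*(15/5)*2 = 61.0
```

The point of the sketch is that every constant above encodes a value judgement about what counts as suspicious; a closed-source version of the same thing hides those judgements entirely.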
> But using more data you can filter out the actual druggies with high confidence.
i'm not comfortable with our justice system acting on the "high confidence" of some proprietary heuristic with a closed-source implementation and little scientific evidence to support its claims. citizens have little ability to vet its workings in detail, to see whether it implements their values or the values enshrined in the constitution.
to say nothing of the fact that i have zero interest in using the criminal justice system to look for "druggies" (violent drug dealers, sure, because they're violent; drug abuse is a health problem that the criminal justice system is ill-equipped to deal with).
>to say nothing of the fact that i have zero interest in using the criminal justice system to look for "druggies" (violent drug dealers, sure, because they're violent; drug abuse is a health problem that the criminal justice system is ill-equipped to deal with).
Me neither; I was just using your example (connection to drug dealers). You could apply the same technique to connections to murderous gangs and make the same argument. Debating whether drugs are OK just distracts from the point.
can you explain why doing it with a computer might make it more valid? are you already highly confident that you have good heuristics to automate and scale? if so, what's your evidence? if the heuristics used by the software are proprietary, how do you judge whether they line up with your values without essentially running an experiment on society? doesn't that seem a little cruel if the software is an integral part of a process that violates citizens' constitutional rights?
a palantir employee does a great job of explaining why their technology is frightening, while pushing back on someone from New Orleans who wanted predictions with numerical rankings of potential offenders/victims:
> “The looming concern is that an opaque scoring algorithm substitutes the veneer of quantitative certainty for more holistic, qualitative judgement and human culpability,” Bowman wrote. “One of the lasting virtues of the SNA work we’ve done to date is that we’ve kept human analysts in the loop to ensure that networks are being explored and analyzed in a way that passes the straight-face test.”