Headless browsers are used to build bots that automate the gaming of web-based value systems, diluting the value for legitimate participants. Examples:
* create fake profiles in order to boost someone's "followers" in a social network where you can monetize your "influencer" status
* click a competitor's ads in a way that triggers fraud prevention from the ad network, effectively preventing the competitor from advertising there
If there's malicious code on the page, you could use this to block headless browsers (which might be security scanners) from loading or running the malicious code, such as CoinHive.
Rather than blocking a bot, it would make much more sense to CAPTCHA an IP that is producing a lot of traffic in a short time. Scraping has always been part of the web, and one should not believe that the information on a website will only ever be available on that website.
This approach only stops the most basic and laziest scrapers. Some people have tens of thousands of diverse IP addresses to use for scraping. Many of them will not give a shit about your bandwidth or server constraints and will cause your server to hit bottlenecks, making it slow and useless for everyone.
> it would make much more sense to CAPTCHA an ip that is producing a lot of traffic in a short time.
CAPTCHAs are useful, but they're an X/Y problem in the same way this headless-detection is: trying to distinguish human from bot, when the real solution would be to slow down (a portion of) the traffic.
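The "slow down a portion of the traffic" idea is essentially per-client rate limiting, e.g. a token bucket keyed by IP. A minimal sketch (hypothetical class and parameter names, Python just for illustration; a real deployment would use the web server's or a proxy's built-in limiter):

```python
import time

class TokenBucket:
    """Per-client token bucket: each request costs one token; tokens
    refill at `rate` per second, up to `capacity`. Clients that exceed
    the rate get throttled (or challenged) instead of hard-blocked."""

    def __init__(self, rate=5.0, capacity=10.0):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # burst allowance
        self.tokens = {}          # client_id -> tokens currently available
        self.last = {}            # client_id -> timestamp of last request

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        elapsed = now - self.last.get(client_id, now)
        self.last[client_id] = now
        # refill proportionally to the time since the last request
        tokens = min(self.capacity,
                     self.tokens.get(client_id, self.capacity) + elapsed * self.rate)
        if tokens >= 1.0:
            self.tokens[client_id] = tokens - 1.0
            return True
        self.tokens[client_id] = tokens
        return False
```

The nice property is that `allow()` returning False doesn't have to mean a 403: it can mean "serve a CAPTCHA", "add a delay", or "raise the proof-of-work difficulty" for that client only.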
Hashcash would seem like a better solution, since it doesn't lock anybody out (human or bot); it just slows them down to reduce server load. If some clients are higher priority than others (e.g. human users vs poorly-programmed bots), then use info like IP, cookies, etc. to slow down the low-priority requests, or even adjust the difficulty depending on how likely the client is to be causing load.
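For the unfamiliar, the Hashcash idea is: the server hands out a random challenge, and the client must find a nonce such that a hash of (challenge, nonce) has some number of leading zero bits before its request is served. Verification is one hash; solving costs roughly 2^difficulty hashes, and the server can raise the difficulty for suspicious clients. A minimal sketch (function names are my own, not a real Hashcash library):

```python
import hashlib
import os

def make_challenge() -> bytes:
    """Server side: issue a fresh random challenge per request."""
    return os.urandom(16)

def check(challenge: bytes, nonce: int, difficulty: int) -> bool:
    """Valid iff SHA-256(challenge || nonce) starts with `difficulty`
    zero bits, i.e. the digest, read as an integer, is small enough."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < 1 << (256 - difficulty)

def solve(challenge: bytes, difficulty: int) -> int:
    """Client side: brute-force a nonce; expected work ~ 2**difficulty."""
    nonce = 0
    while not check(challenge, nonce, difficulty):
        nonce += 1
    return nonce
```

A human's browser solving this once per page load barely notices a low difficulty, while a scraper making thousands of requests pays the cost thousands of times over, which is exactly the asymmetry you want.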