
I wish every company that crawls had such a page. Maybe a centralized directory of crawlers would be good for the internet.


I mean, technically you can just disallow all robots in your robots.txt and only allow the ones you know about and like?
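
A rough sketch of that allow-list approach (Googlebot is used purely as an illustrative crawler name; substitute whichever user agents you actually want to permit):

  # Block everything by default
  User-agent: *
  Disallow: /

  # Explicitly allow specific crawlers you know and trust
  User-agent: Googlebot
  Disallow:

Crawlers that honor robots.txt pick the most specific User-agent group that matches them, so the named crawler gets the empty Disallow (full access) while everything else falls under the wildcard block. Of course, this only works against crawlers that respect robots.txt in the first place.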



