Hacker News

It's really good when you're producing a lot of data quickly that you don't want to lose, and you want multiple consumers to be able to read it.

It's a complex tool that solves a complicated problem. But if you don't actually have that problem, then that's a whole lot of complexity for no gain.



Hypothetical use case I've been thinking about:

Say I want to log all HTTP requests to the server (I know "log" is a loaded keyword here), then process those logs into aggregates and stick them in a time-series database.

Would it be insane to "log" everything into Kafka? Or what would be the more "correct" tool for that job?


No, that's a common use case. Ultimately Kafka is just a big dumb pipe. The only question I'd ask is how much data you're expecting.
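To make the "big dumb pipe" shape concrete: the consumer side of this pipeline is mostly just rolling raw request events up into time buckets before writing them to the time-series store. A minimal sketch of that aggregation step, with the Kafka consumer stand-in being a plain list of dicts (the event fields `ts` and `status` and the function name are assumptions for illustration, not anything Kafka-specific):

```python
from collections import Counter
from datetime import datetime, timezone

def aggregate_per_minute(events):
    """Roll raw request-log events up into per-minute counts,
    keyed by (minute bucket, status class) -- the kind of record
    you'd then write to a time-series database."""
    counts = Counter()
    for ev in events:
        ts = datetime.fromtimestamp(ev["ts"], tz=timezone.utc)
        bucket = ts.replace(second=0, microsecond=0)      # truncate to the minute
        status_class = f"{ev['status'] // 100}xx"          # 200 -> "2xx", 503 -> "5xx"
        counts[(bucket.isoformat(), status_class)] += 1
    return dict(counts)

# Three requests landing in the same minute, one a server error.
events = [
    {"ts": 1700000000, "status": 200},
    {"ts": 1700000010, "status": 200},
    {"ts": 1700000030, "status": 503},
]
print(aggregate_per_minute(events))
```

In the real pipeline this loop would iterate over messages polled from a Kafka consumer instead of a list, and flush the counters to the store on an interval; the bucketing logic stays the same.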



