
Automatic differentiation is very nifty and deserves to be more widely known. Many machine learning people will already know reverse-mode AD under a different name, though: backpropagation.
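
To make the connection concrete, here's a minimal sketch of reverse-mode AD in Python: each operation records its inputs and the local partial derivatives, and one backward pass over the recorded graph accumulates the gradient of the output with respect to every input, which is exactly what backprop does. The names (Var, backward) are illustrative, not from any particular library.

  import math

  class Var:
      def __init__(self, value, parents=()):
          self.value = value        # forward value
          self.grad = 0.0           # adjoint, filled in by backward()
          self.parents = parents    # list of (parent Var, local partial derivative)

      def __add__(self, other):
          return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

      def __mul__(self, other):
          return Var(self.value * other.value,
                     [(self, other.value), (other, self.value)])

  def sin(x):
      return Var(math.sin(x.value), [(x, math.cos(x.value))])

  def backward(output):
      # Topologically order the graph so each node's adjoint is complete
      # before it is pushed to its parents (the chain rule, applied once per edge).
      order, seen = [], set()
      def visit(node):
          if id(node) not in seen:
              seen.add(id(node))
              for parent, _ in node.parents:
                  visit(parent)
              order.append(node)
      visit(output)
      output.grad = 1.0
      for node in reversed(order):
          for parent, local_grad in node.parents:
              parent.grad += node.grad * local_grad

  # f(x, y) = x*y + sin(x);  df/dx = y + cos(x),  df/dy = x
  x, y = Var(2.0), Var(3.0)
  f = x * y + sin(x)
  backward(f)
  print(f.value, x.grad, y.grad)   # 6.909..., 2.583..., 2.0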

The author might be interested in the Theano library, which is used to compute gradients for neural models in pylearn2. It supports performing algebraic simplifications on the computation graph before running AD. I'm not sure whether it supports exactly what's being proposed here, but it does seem to enable some interesting hybrid symbolic / automatic approaches.
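
For anyone who hasn't used it, a tiny sketch of the Theano workflow (the classic scalar example, not pylearn2-specific code): you build a symbolic expression graph, T.grad adds derivative nodes to that graph, and theano.function runs the graph optimizations (constant folding, algebraic simplification, etc.) before compiling it.

  import theano
  import theano.tensor as T

  x = T.dscalar('x')
  y = x ** 2 + T.sin(x)            # symbolic expression, not a number

  dy_dx = T.grad(y, x)             # symbolic derivative: more graph nodes

  f = theano.function([x], dy_dx)  # optimize + compile the graph
  print(f(2.0))                    # 2*2 + cos(2) ~= 3.584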



