
Working with dynamic neural net libraries that have autodiff (PyTorch, regular Torch, and I think MXNet now too) feels a lot like this: you just write normal functions, and after you execute them you can ask for gradients.
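A minimal sketch of what that looks like in PyTorch (assuming `torch` is installed): `f` below is an ordinary Python function with no graph-building API in sight, and the gradient is requested only after the call.

```python
import torch

def f(x):
    # Just a normal function: square, sine, sum.
    return (x ** 2).sin().sum()

x = torch.tensor([0.5, 1.0, 2.0], requires_grad=True)
y = f(x)      # executes eagerly, like any Python call
y.backward()  # afterwards, ask for gradients

# Analytically, d/dx sin(x^2) = 2x * cos(x^2).
print(x.grad)
```

Because the graph is built dynamically as the function runs, ordinary Python control flow (loops, `if` branches) works inside `f` too.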

