
Probably not realistic to run it locally for now. GPT-3 has about 175 billion parameters, and even in an optimistic scenario you need at least 2 bytes per parameter (fp16), so that's around 350 GB of GPU memory just for the weights. In practice you'd probably need something like 15 GPUs with at least 32 GB of memory each.
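A rough back-of-the-envelope in Python, just to make the arithmetic explicit (assuming fp16 weights at 2 bytes each and 32 GB cards; activations, KV cache, and framework overhead aren't counted, which is why the practical GPU count ends up above the bare minimum):

    import math

    def gpus_needed(num_params: float, bytes_per_param: float = 2, gpu_mem_gb: float = 32) -> int:
        """Minimum number of GPUs needed just to hold the model weights."""
        weight_gb = num_params * bytes_per_param / 1e9  # 175e9 * 2 / 1e9 = 350 GB
        return math.ceil(weight_gb / gpu_mem_gb)

    print(gpus_needed(175e9))  # -> 11 cards for weights alone; overhead pushes this toward ~15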


Isn’t there an abundance of GPUs from crypto farmers? ;)



