Hacker News

I think you can run BLOOM locally, but it's not quite as powerful as this iteration of ChatGPT. Also, the VRAM requirements are pretty high if you want to run the biggest model.

https://huggingface.co/bigscience/bloom
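To give a rough sense of "pretty high": a back-of-the-envelope sketch of how much memory just the weights take, assuming fp16 (2 bytes per parameter) and no quantization. The parameter counts are the ones BigScience publishes for the full model and two smaller checkpoints; activations and KV cache would add on top of this.

```python
# Rough VRAM needed just to hold the model weights (ignores activations,
# KV cache, and framework overhead). Assumes fp16/bf16, i.e. 2 bytes/param,
# with no quantization.
def weight_vram_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Gigabytes of memory for the raw weights at the given precision."""
    return n_params * bytes_per_param / 1e9

# The full 176B-parameter BLOOM: ~352 GB in fp16 -- far beyond any single GPU.
print(f"bloom (176B), fp16: {weight_vram_gb(176e9):.0f} GB")

# Smaller checkpoints from the same family are much more practical locally:
print(f"bloom-7b1,   fp16: {weight_vram_gb(7.1e9):.1f} GB")
print(f"bloom-560m,  fp16: {weight_vram_gb(560e6):.2f} GB")
```

So realistically, running the full model at home means multi-GPU sharding or aggressive quantization; the smaller checkpoints fit on a single consumer card.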


