It's probably not realistic to run it locally for now. GPT-3 has about 175 billion parameters, and even in the optimistic scenario you need at least 2 bytes per parameter (fp16), which works out to roughly 350 GB of GPU memory just for the weights. With 32 GB per GPU that's already 11 cards for the weights alone, and more like 15 once you account for activations and other overhead.
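A rough back-of-envelope sketch of that estimate (the 30% overhead factor for activations and framework memory is just an assumption, as is treating fp16 as the best case):

```python
# Back-of-envelope GPU memory estimate for serving GPT-3 (assumptions:
# fp16 weights, 32 GB cards, ~30% overhead for activations/KV cache).
params = 175e9           # GPT-3 parameter count
bytes_per_param = 2      # fp16/bf16, the optimistic scenario
overhead = 1.3           # assumed extra for activations and framework overhead
gpu_mem_gb = 32          # memory per GPU

weights_gb = params * bytes_per_param / 1e9
total_gb = weights_gb * overhead
gpus_needed = -(-total_gb // gpu_mem_gb)   # ceiling division

print(f"weights: ~{weights_gb:.0f} GB, with overhead: ~{total_gb:.0f} GB, "
      f"GPUs needed: {int(gpus_needed)}")
# -> weights: ~350 GB, with overhead: ~455 GB, GPUs needed: 15
```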