Instead of fake solutions that will most likely not work, it offers you the one you'll eventually end up with anyway: just counting patiently to 5, as many times as you need.
The performance is surprisingly similar to one of those FLAN models. You can't exactly expect much from the 7B version, but the 30B one ought to be interesting once it's up.
LLaMA: 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5. 1. 2. 3. 4. 5.
(gpt: https://cloud.typingmind.com/share/0841633b-5150-4f7c-a370-0... )