l1n on July 6, 2023 | on: Scaling Transformers to 1B Tokens
Claude's context is 100K, not 1M [1]. If you're somehow shoving in a million tokens, that could explain the issue you're having!
[1] https://www.anthropic.com/index/100k-context-windows
gamegoblin on July 6, 2023
Misremembered; the main thrust of the comment still stands: the 100K context window isn't "real", because it would be absurdly expensive to do it for real. They are using a lot of approximation tricks to get there.
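For a rough sense of why exact attention at 100K tokens would be so costly (a back-of-the-envelope sketch added here, not a figure Anthropic has published): naive self-attention materializes an n x n score matrix per head per layer, so memory and compute grow quadratically with context length.

    # Back-of-the-envelope only; real head/layer counts for Claude are unknown here.
    def attention_matrix_bytes(n_tokens: int, bytes_per_element: int = 2) -> int:
        # Memory for one n x n attention score matrix (fp16 by default).
        return n_tokens * n_tokens * bytes_per_element

    for n in (4_000, 32_000, 100_000, 1_000_000):
        gib = attention_matrix_bytes(n) / 2**30
        print(f"{n:>9,} tokens -> {gib:8.1f} GiB per head per layer")

    # Approximate output:
    #     4,000 tokens ->      0.0 GiB per head per layer
    #    32,000 tokens ->      1.9 GiB per head per layer
    #   100,000 tokens ->     18.6 GiB per head per layer
    # 1,000,000 tokens ->   1862.6 GiB per head per layer

That quadratic blow-up is why long-context models generally lean on sparse, windowed, or otherwise approximated attention rather than the full computation; which specific tricks any given vendor uses isn't public.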
phillipcarter on July 6, 2023
Yep, mistype on my end as well. Claude just fails to process the request if you get above 100k tokens (I've done that, heh).
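That failure is easy to hit when whole documents get stuffed into a prompt. A minimal, purely illustrative guard; the helper names and the roughly-4-characters-per-token heuristic below are assumptions, not an Anthropic API:

    CONTEXT_LIMIT = 100_000  # Claude's context window at the time of this thread

    def rough_token_count(text: str) -> int:
        # Crude heuristic: about 4 characters per token for English text.
        return len(text) // 4

    def check_prompt(prompt: str) -> None:
        est = rough_token_count(prompt)
        if est > CONTEXT_LIMIT:
            raise ValueError(
                f"Prompt is ~{est:,} tokens, over the {CONTEXT_LIMIT:,}-token window; "
                "the request would simply fail."
            )

    check_prompt("some long document " * 10)  # passes; raises only for huge prompts

A real tokenizer would give an exact count, but even a rough check avoids sending a request that is guaranteed to be rejected.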