IanCal
24 days ago
on: Prompt caching for cheaper LLM tokens
Do any providers offer this level of granularity? Anthropic requires explicit cache markers, for example.
jgeralnik
24 days ago
Anthropic requires explicit cache markers but will “look backwards” some amount, so you don’t need to land on the exact split point to get cached tokens.
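As a sketch of what an explicit cache marker looks like, here is the shape of a Messages API request with a `cache_control` breakpoint, per Anthropic's prompt-caching docs (the model name and document text are placeholders; this builds the payload only, it doesn't call the API):

```python
# Sketch of an Anthropic Messages API request body with an explicit cache
# breakpoint. Content up to and including the block carrying
# "cache_control" is eligible for caching on subsequent requests.
LONG_CONTEXT = "...large shared document, many thousands of tokens..."

request = {
    "model": "claude-3-5-sonnet-latest",  # placeholder model name
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": LONG_CONTEXT,
            # The explicit cache marker: everything before this point
            # becomes the cacheable prefix.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [
        {"role": "user", "content": "Summarize the document."}
    ],
}
```

Subsequent requests that share the same prefix up to the marker can then hit the cache even if the content diverges after it.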