Hacker News
Der_Einzige | 15 days ago | on: The Bitter Lesson of LLM Extensions
Zero discussion of LLM sampling. How do you leave such a gaping hole in a piece like this? I know it's not AI, because AI wouldn't be that sloppy.
ttkciar | 15 days ago
Local inference users are all about sampling, but users addicted to commercial inference services are wary of sampling, because they have to pay by the token.
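
For readers who haven't touched local inference: "sampling" here means the knobs applied to a model's raw logits at each generation step (temperature, top-p/top-k, repetition penalties, and so on). A minimal sketch of temperature plus nucleus (top-p) sampling, written as generic Python over a logits array rather than any particular library's API; the function name and defaults are illustrative only:

    import numpy as np

    def sample_token(logits, temperature=0.8, top_p=0.95, rng=None):
        """Pick one token id from raw logits using temperature scaling and nucleus (top-p) sampling."""
        rng = rng or np.random.default_rng()
        # Temperature < 1.0 sharpens the distribution; > 1.0 flattens it.
        scaled = logits / max(temperature, 1e-8)
        probs = np.exp(scaled - scaled.max())
        probs /= probs.sum()
        # Nucleus: keep the smallest set of top tokens whose total mass reaches top_p.
        order = np.argsort(probs)[::-1]
        cutoff = np.searchsorted(np.cumsum(probs[order]), top_p) + 1
        keep = order[:cutoff]
        return int(rng.choice(keep, p=probs[keep] / probs[keep].sum()))

    # Toy logits for a 5-token vocabulary.
    print(sample_token(np.array([2.0, 1.0, 0.5, -1.0, -3.0])))

Local runtimes expose these parameters directly per request, which is part of why local users experiment with them so freely.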