Hacker News

I know people are pushing back, taking "only" literally, but from a reasonable perspective, what causes LLMs (technically, their outputs) to give that impression is indeed the crux of what holds progress back: how and what LLMs learn from data. In my personal opinion, there's something fundamentally flawed that the whole field has yet to properly pinpoint and fix.


> there's something fundamentally flawed that the whole field has yet to properly pinpoint and fix.

Isn't it obvious?

It's all built around probability and statistics.

This is not how you reach definitive answers. Maybe the results make sense, and maybe they're just nice-sounding BS. You guess which one is the case.

The real catch: if you know enough to spot the BS, you probably didn't need to ask the question in the first place.


> It's all built around probability and statistics.

Yes, the world is probabilistic.

> This is not how you reach definitive answers.

Do go on. This is the only way to build anything approximating certainty in our world. Do you think that answers just exist? What kind of weird deterministic video-game world do you live in where that is not the case?


How many "r's" are in the word "strawberry"?

I'm certain this simple question has a definitive answer.


which is apparently hard to come up with by compiling lots of text into a statistical model of what text is most likely to come after your question
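The contrast the thread is drawing can be made concrete: the letter count is a deterministic computation, not a statistical prediction. A minimal sketch in Python (the `count_letter` helper is just illustrative, not anything from the thread):

```python
# Counting occurrences of a letter is deterministic: the same input
# always yields the same output, no probability involved.
def count_letter(word: str, letter: str) -> int:
    return word.lower().count(letter.lower())

print(count_letter("strawberry", "r"))  # prints 3
```

An LLM, by contrast, operates on tokens rather than characters and samples its output, which is one commonly cited reason questions like this trip it up.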




