Hacker News

An interesting concept that stood out to me. Committing the prompts instead of the resulting code only.

Is it really true that LLMs are non-deterministic? I thought that if you used the exact same input and seed with the temperature set to 0, you would get the same output. It would actually be interesting to probe the committed prompts to see how slight variants performed.
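The temperature-0 intuition can be sketched as greedy decoding. This is a hypothetical sampler (not any particular library's API): at temperature 0 it collapses to argmax, so the output is fully determined by the logits; above 0 it samples, and reproducibility then depends on the seed.

```python
import numpy as np

def sample_token(logits, temperature, rng):
    # Hypothetical sampler sketch. At temperature 0, decoding is
    # a greedy argmax over the logits: deterministic by construction.
    if temperature == 0:
        return int(np.argmax(logits))
    # Otherwise scale logits by temperature and sample from the
    # resulting softmax distribution; this is where a seed matters.
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))

logits = np.array([1.0, 3.0, 2.0])
rng = np.random.default_rng(seed=42)
# Temperature 0 always picks the max-logit token (index 1 here).
assert sample_token(logits, 0, rng) == 1
```

With temperature > 0, two runs agree only if they start from the same seed and see bit-identical logits, which is exactly where the hardware caveats discussed below come in.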




> I thought if you used the exact input and seed with the temperature set to 0 you would get the same output.

I think there can also be differences across hardware, and temperature is usually set higher than zero because it produces more "useful/interesting" outputs.


The LWN comments say that you are correct for local AIs (but not LLM services), modulo some caveats about compiler flags and hardware used.
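One concrete source of those hardware/compiler caveats: floating-point addition is not associative, so a parallel reduction that sums in a different order (different GPU kernels, different compiler flags) can change the low-order bits of the logits and flip near-ties. A minimal illustration:

```python
# Floating-point addition is not associative: the same three
# numbers summed in a different order give different results.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left == right)  # False
print(left, right)    # 0.6000000000000001 0.6
```

Running locally with a pinned stack fixes the reduction order, which is why determinism is achievable there but hard to guarantee across a hosted service's heterogeneous fleet.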


