
Can anyone explain what this means?


Possibly a huge leap forward in open-source model capability. GPT-4's prowess supposedly comes from a strong dataset + RLHF + MoE (Mixture of Experts).

Mixtral brings MoE to an already powerful open model.
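For anyone unfamiliar with the idea: an MoE layer replaces a single feed-forward block with several "expert" feed-forward networks plus a small router that sends each token to only a couple of them, so you get the capacity of a big model while paying the compute of a small one per token. Below is a minimal top-2 routing sketch in PyTorch; the class name, dimensions, and expert count are illustrative assumptions, not Mixtral's actual implementation.

  # Minimal sketch of a top-2 Mixture-of-Experts feed-forward layer.
  # Sizes and names are illustrative, not taken from Mixtral's code.
  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class MoELayer(nn.Module):
      def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
          super().__init__()
          self.top_k = top_k
          # Router: scores each token against every expert.
          self.router = nn.Linear(d_model, n_experts)
          # Experts: independent feed-forward networks.
          self.experts = nn.ModuleList(
              nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
              for _ in range(n_experts)
          )

      def forward(self, x):  # x: (n_tokens, d_model)
          scores = self.router(x)                        # (n_tokens, n_experts)
          weights, idx = scores.topk(self.top_k, dim=-1) # keep only the best experts per token
          weights = F.softmax(weights, dim=-1)           # mixing weights over the chosen experts
          out = torch.zeros_like(x)
          for k in range(self.top_k):
              for e, expert in enumerate(self.experts):
                  mask = idx[:, k] == e                  # tokens routed to expert e in slot k
                  if mask.any():
                      out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
          return out

  x = torch.randn(4, 512)
  print(MoELayer()(x).shape)  # torch.Size([4, 512])

The key point is that only top_k of the n_experts run for any given token, so total parameters can grow much faster than per-token compute.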



