
For those looking to build on top of this or other OpenAI-compatible LLM APIs -- you can have a look at Langroid[1] (I am the lead dev): switching to Cerebras (or Groq, or other LLMs/providers) is straightforward. E.g., after installing Langroid in your virtual env and setting CEREBRAS_API_KEY in your env or .env file, you can run a simple chat example[2] like this:

    python3 examples/basic/chat.py -m cerebras/llama3.1-70b
Specifying the model and setting up a basic chat is simple (and there are numerous other examples in the examples folder in the repo):

    import langroid.language_models as lm
    import langroid as lr
    llm_config = lm.OpenAIGPTConfig(chat_model="cerebras/llama3.1-70b")
    agent = lr.ChatAgent(
        lr.ChatAgentConfig(llm=llm_config, system_message="Be helpful but concise")
    )
    task = lr.Task(agent)
    task.run()
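For context, the "cerebras/llama3.1-70b" string above follows a provider-prefix convention: the part before the slash selects the OpenAI-compatible provider, and the rest is the model name sent to that provider's endpoint. A minimal sketch of how that kind of routing can work (the helper, the base-URL table, and the env-var names here are illustrative, not Langroid's actual internals):

```python
import os

# Illustrative mapping of provider prefix -> (OpenAI-compatible base URL, API-key env var).
# These values are assumptions for the sketch; Langroid does its own routing internally.
PROVIDERS = {
    "cerebras": ("https://api.cerebras.ai/v1", "CEREBRAS_API_KEY"),
    "groq": ("https://api.groq.com/openai/v1", "GROQ_API_KEY"),
}

def resolve_model(spec: str) -> tuple[str, str, str]:
    """Split 'provider/model' and look up the endpoint and API key for that provider."""
    provider, _, model = spec.partition("/")
    base_url, key_var = PROVIDERS[provider]
    return model, base_url, os.environ.get(key_var, "")

model, base_url, api_key = resolve_model("cerebras/llama3.1-70b")
# model is "llama3.1-70b"; base_url points at the Cerebras endpoint
```

The point of the convention is that only this one string changes when you swap providers; the rest of the agent/task setup stays the same.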
[1] https://github.com/langroid/langroid

[2] https://github.com/langroid/langroid/blob/main/examples/basi...

[3] Guide to using Langroid with non-OpenAI LLM APIs: https://langroid.github.io/langroid/tutorials/local-llm-setu...

