
Right now it only works with OpenAI chat models (gpt-3.5-turbo, gpt-4), but if there's interest I plan to extend it to support several backends. These would probably each be an existing library that implements structured output generation, like https://github.com/outlines-dev/outlines or https://github.com/guidance-ai/guidance. If you have ideas about how this should be done, let me know - a GitHub issue would be great so it's visible to others.
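For anyone landing here without context: the core pattern is a plain Python function whose return annotation determines the structure of the output. A minimal sketch - the Superhero model here is just an illustrative example:

    from magentic import prompt
    from pydantic import BaseModel

    class Superhero(BaseModel):
        name: str
        powers: list[str]

    # The return annotation tells magentic to have the LLM produce
    # a Superhero instance (implemented via OpenAI function calling).
    @prompt("Create a superhero named {name}.")
    def create_superhero(name: str) -> Superhero: ...

    hero = create_superhero("Garden Man")
    print(hero.powers)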


Oh, and some companies offer APIs that match the OpenAI API, and there are open-source projects that do the same for llama models running locally. Since those are compatible with the openai Python package, they should work with magentic too - though some of them do not support function calling.

See for example Anyscale Endpoints https://app.endpoints.anyscale.com/landing and https://github.com/AmineDiro/cria
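Switching over is usually just a matter of repointing the openai package at the other server. Rough sketch, assuming the pre-1.0 openai package; the URL, key, and model name are placeholders for whatever your server exposes:

    import openai

    # Placeholders: point these at your OpenAI-compatible server.
    openai.api_base = "http://localhost:8000/v1"
    openai.api_key = "unused-by-most-local-servers"

    response = openai.ChatCompletion.create(
        model="llama-2-7b-chat",  # whichever model the server hosts
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(response["choices"][0]["message"]["content"])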


There's also LocalAI[0], which allows the use of local LLMs behind an OpenAI-compatible API.

[0] https://github.com/go-skynet/LocalAI
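A quick way to sanity-check a running instance is to hit the chat completions endpoint directly. Rough sketch - 8080 is LocalAI's default port, and the model name is a placeholder for whatever model you've loaded:

    import requests

    # Expects a LocalAI instance on its default port with a model loaded.
    resp = requests.post(
        "http://localhost:8080/v1/chat/completions",
        json={
            "model": "ggml-gpt4all-j",  # placeholder model name
            "messages": [{"role": "user", "content": "Hello!"}],
        },
    )
    print(resp.json()["choices"][0]["message"]["content"])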


Thanks for sharing! LocalAI supports function calling[0], so most or all of magentic's features should work with it - I'm interested to see whether concurrent requests work. I'll test this out.

[0] https://localai.io/features/openai-functions/
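Untested, but since magentic makes its calls through the openai package, I'd expect pointing it at LocalAI to look roughly like this (default port assumed; the Weather model is just an example):

    import openai
    from magentic import prompt
    from pydantic import BaseModel

    # Untested assumption: redirecting the openai package should also
    # redirect magentic, since it uses openai under the hood.
    openai.api_base = "http://localhost:8080/v1"  # LocalAI's default port
    openai.api_key = "not-needed"

    class Weather(BaseModel):
        city: str
        summary: str

    # Structured output like this relies on function calling, which
    # LocalAI advertises support for.
    @prompt("Describe the current weather in {city}.")
    def describe_weather(city: str) -> Weather: ...

    print(describe_weather("Berlin"))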


I tried out guidance and encountered endless bugs.



