CSAI question before purchase about custom models

Hello guys,
I am running inference for some open-weight LLMs (oss120, Qwen3 Coder, etc.) on my own infrastructure, so we have an Ollama endpoint and a vLLM one, which exposes an OpenAI-compatible API endpoint. My question is: if I purchase the CSAI extension, will I be able to use those endpoints with it? I saw the `cs_ai_provider_register` function in the docs, which I think is great, but I just wanted to make sure before I buy. Also, should we use that to create a custom provider, or can we simply point the OpenAI-compatible provider at a different endpoint URL?

Best regards,
Alex


The GPT4All endpoint is also an OpenAI-compatible endpoint, so in the best case you can just add your URL to the GPT4All settings and everything should work. If you're comfortable with it, you can send me your endpoint URL in a secure note and I can test whether it works out of the box. With `cs_ai_provider_register` you can definitely build your own integration; it will just mean a little overhead on your end. If you go that route, though, you can ask us questions along the way. Have a great day.
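For reference, here is a minimal sketch of what an OpenAI-compatible chat-completion request to a local server looks like, which is what any of these endpoints would need to accept. The base URLs and model name are assumptions: vLLM serves its OpenAI-compatible API under `/v1` on port 8000 by default, and Ollama exposes one under `/v1` on port 11434; the model name must match whatever the server was launched with.

```python
import json

# Assumed default local endpoints (adjust to your own deployment):
VLLM_BASE_URL = "http://localhost:8000/v1"
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def chat_request(base_url: str, model: str, prompt: str) -> tuple[str, str]:
    """Build the URL and JSON body for an OpenAI-style chat completion."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,  # must match the model name the server was started with
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Example: a request aimed at the vLLM endpoint.
url, body = chat_request(VLLM_BASE_URL, "qwen3-coder", "Hello!")
```

If your endpoints answer requests shaped like this, any client that lets you override the base URL should be able to talk to them.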