Hello guys,
I'm running my own infrastructure for inference of some open-weight LLMs (oss120, Qwen3 Coder, etc.). We have an Ollama endpoint and a vLLM one, which exposes an OpenAI-compatible API endpoint. My question: if I purchase the CSAI extension, will I be able to use those endpoints with it? I saw the cs_ai_provider_register function in the docs, which I think is great, but I just wanted to make sure before I buy. Also, should we use that function to create a custom provider, or can we simply point the extension at a different endpoint URL for the OpenAI-compatible one?
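For reference, this is roughly how we talk to the vLLM endpoint today, since it speaks the standard OpenAI-compatible chat completions API. The base URL and model name below are placeholders for our setup, not anything from the CSAI docs:

```python
import json
import urllib.request

# Placeholder for our self-hosted vLLM server address.
VLLM_BASE_URL = "http://localhost:8000/v1"


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request (not yet sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request(VLLM_BASE_URL, "qwen3-coder", "Hello!")
print(req.full_url)
# Actually sending it requires the server to be up:
#   urllib.request.urlopen(req)
```

So ideally the extension could hit this same endpoint just by swapping the base URL, without needing a full custom provider.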
Best regards,
Alex