Fix "Messages is invalid" error for Ollama/custom LLM providers and add comprehensive AI documentation (#225)

* Add comprehensive AI/LLM configuration documentation
* Fix Chat.start! to use default model when model is nil or empty
* Ensure all controllers use Chat.default_model for consistency
* Move AI doc inside `hosting/`
* Probably too much error handling
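The `Chat.start!` fix described above can be sketched as follows. This is a minimal standalone illustration, not the actual app code: the class body, `DEFAULT_MODEL` value, and constructor are assumptions; only the nil-or-empty fallback behavior comes from the commit message.

```ruby
# Hypothetical sketch of the fix: fall back to the default model when
# the caller passes nil or an empty/blank string, so custom providers
# (e.g. Ollama) never receive an invalid model name.
class Chat
  DEFAULT_MODEL = "gpt-4.1" # assumed value; the real default is configurable

  def self.default_model
    DEFAULT_MODEL
  end

  def self.start!(model: nil)
    # Plain-Ruby equivalent of Rails' `model.blank?`
    model = default_model if model.nil? || model.strip.empty?
    new(model)
  end

  attr_reader :model

  def initialize(model)
    @model = model
  end
end
```

With this guard in place, controllers can pass user input straight through and still get a valid model when the field is nil or blank.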

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jjmata <187772+jjmata@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
Author: Copilot
Date: 2025-10-24 12:04:19 +02:00 (committed by GitHub)
Parent: 4f446307a7
Commit: a8f318c3f9
13 changed files with 833 additions and 11 deletions


@@ -35,6 +35,18 @@ class Provider::Openai < Provider
      DEFAULT_OPENAI_MODEL_PREFIXES.any? { |prefix| model.start_with?(prefix) }
    end

    def provider_name
      custom_provider? ? "Custom OpenAI-compatible (#{@uri_base})" : "OpenAI"
    end

    def supported_models_description
      if custom_provider?
        @default_model.present? ? "configured model: #{@default_model}" : "any model"
      else
        "models starting with: #{DEFAULT_OPENAI_MODEL_PREFIXES.join(', ')}"
      end
    end

    def custom_provider?
      @uri_base.present?
    end