Fix "Messages is invalid" error for Ollama/custom LLM providers and add comprehensive AI documentation (#225)

* Add comprehensive AI/LLM configuration documentation
* Fix Chat.start! to use default model when model is nil or empty
* Ensure all controllers use Chat.default_model for consistency
* Move AI doc inside `hosting/`
* Probably too much error handling

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jjmata <187772+jjmata@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
Copilot
2025-10-24 12:04:19 +02:00
committed by GitHub
parent 4f446307a7
commit a8f318c3f9
13 changed files with 833 additions and 11 deletions


@@ -92,7 +92,9 @@ module ApplicationHelper
   end

   def default_ai_model
-    ENV.fetch("OPENAI_MODEL", Setting.openai_model.presence || Provider::Openai::DEFAULT_MODEL)
+    # Always return a valid model, never nil or empty
+    # Delegates to Chat.default_model for consistency
+    Chat.default_model
   end

   # Renders Markdown text using Redcarpet
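The hunk above delegates the helper to `Chat.default_model` so that a nil or empty model setting can never reach the provider (the cause of the "Messages is invalid" error with Ollama and other custom providers). A minimal sketch of how such a method might implement the fallback chain, under assumptions: the `FALLBACK_MODEL` constant and keyword arguments are illustrative stand-ins, not the actual implementation (the real default lives in `Provider::Openai::DEFAULT_MODEL`):

```ruby
# Sketch (assumption) of a Chat.default_model that always returns a
# non-empty model name: ENV override, then stored setting, then a
# hard-coded provider default.
class Chat
  FALLBACK_MODEL = "gpt-4o".freeze # placeholder for Provider::Openai::DEFAULT_MODEL

  def self.default_model(env: ENV, setting_model: nil)
    candidate = env["OPENAI_MODEL"]
    candidate = setting_model if candidate.nil? || candidate.strip.empty?
    candidate = FALLBACK_MODEL if candidate.nil? || candidate.strip.empty?
    candidate
  end
end
```

Treating a blank string the same as nil matters here: a custom-provider setup often leaves `OPENAI_MODEL` set but empty, which the old `ENV.fetch` would have passed through unchanged.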