mirror of
https://github.com/we-promise/sure.git
synced 2026-04-19 12:04:08 +00:00
Fix "Messages is invalid" error for Ollama/custom LLM providers and add comprehensive AI documentation (#225)
* Add comprehensive AI/LLM configuration documentation
* Fix Chat.start! to use default model when model is nil or empty
* Ensure all controllers use Chat.default_model for consistency
* Move AI doc inside `hosting/`
* Probably too much error handling

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jjmata <187772+jjmata@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
@@ -92,7 +92,9 @@ module ApplicationHelper
   end

   def default_ai_model
-    ENV.fetch("OPENAI_MODEL", Setting.openai_model.presence || Provider::Openai::DEFAULT_MODEL)
+    # Always return a valid model, never nil or empty
+    # Delegates to Chat.default_model for consistency
+    Chat.default_model
   end

   # Renders Markdown text using Redcarpet
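The fallback this commit delegates to can be sketched in plain Ruby. This is a minimal illustration, not the repository's actual implementation: `FALLBACK_MODEL` stands in for `Provider::Openai::DEFAULT_MODEL`, and the `start!` signature here is an assumption used only to show how a nil or blank model is replaced before it can reach the provider and trigger the "Messages is invalid" error.

```ruby
# Hypothetical sketch of Chat.default_model / Chat.start! fallback behavior.
# FALLBACK_MODEL is a stand-in for Provider::Openai::DEFAULT_MODEL.
class Chat
  FALLBACK_MODEL = "default-model"

  # Always returns a non-empty model name: configured value if present,
  # otherwise the fallback. Whitespace-only values count as empty.
  def self.default_model
    configured = ENV["OPENAI_MODEL"].to_s.strip
    configured.empty? ? FALLBACK_MODEL : configured
  end

  # Illustrative signature: replace a nil/blank model with the default
  # before building the request payload.
  def self.start!(prompt, model: nil)
    model = default_model if model.to_s.strip.empty?
    { model: model, messages: [{ role: "user", content: prompt }] }
  end
end
```

With a blank or missing model, `Chat.start!("hi")[:model]` resolves to the fallback rather than passing an empty string through to the Ollama/custom provider.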