mirror of
https://github.com/we-promise/sure.git
synced 2026-04-07 14:31:25 +00:00
Fix "Messages is invalid" error for Ollama/custom LLM providers and add comprehensive AI documentation (#225)
* Add comprehensive AI/LLM configuration documentation
* Fix Chat.start! to use default model when model is nil or empty
* Ensure all controllers use Chat.default_model for consistency
* Move AI doc inside `hosting/`
* Probably too much error handling

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jjmata <187772+jjmata@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
@@ -9,7 +9,7 @@ class Api::V1::MessagesController < Api::V1::BaseController
     @message = @chat.messages.build(
       content: message_params[:content],
       type: "UserMessage",
-      ai_model: message_params[:model] || "gpt-4"
+      ai_model: message_params[:model].presence || Chat.default_model
     )

     if @message.save
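The fix hinges on a Ruby subtlety: an empty string is truthy, so `"" || "gpt-4"` returns `""` rather than the fallback, and a blank `model` param then reaches validation and triggers the "Messages is invalid" error. ActiveSupport's `.presence` returns `nil` for blank values, letting `||` fall through to `Chat.default_model`. A minimal plain-Ruby sketch (the `presence` helper and `DEFAULT_MODEL` below are simplified stand-ins for ActiveSupport's `Object#presence` and the app's `Chat.default_model`):

```ruby
# Simplified stand-in for ActiveSupport's presence:
# returns nil for nil or whitespace-only strings, else the value,
# so `||` can fall through to a default.
def presence(value)
  value.nil? || value.strip.empty? ? nil : value
end

# Hypothetical default; the app reads Chat.default_model instead.
DEFAULT_MODEL = "gpt-4.1"

# Old behavior: "" is truthy in Ruby, so the hardcoded default is skipped.
old_model = "" || "gpt-4"                    # stays ""

# New behavior: a blank model param falls through to the default,
# while an explicit model (e.g. an Ollama model name) is kept.
new_model  = presence("") || DEFAULT_MODEL
kept_model = presence("llama3") || DEFAULT_MODEL
```

This keeps custom-provider requests working: callers that pass a model name keep it, and callers that pass nothing (or an empty string) get the configured default instead of a blank `ai_model`.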