mirror of
https://github.com/we-promise/sure.git
synced 2026-04-19 20:14:08 +00:00
Fix "Messages is invalid" error for Ollama/custom LLM providers and add comprehensive AI documentation (#225)
* Add comprehensive AI/LLM configuration documentation
* Fix Chat.start! to use default model when model is nil or empty
* Ensure all controllers use Chat.default_model for consistency
* Move AI doc inside `hosting/`
* Probably too much error handling

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: jjmata <187772+jjmata@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
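The "use default model when model is nil or empty" fix can be sketched as a simple guard in `Chat.start!`. This is a hypothetical reconstruction: the commit message names `Chat.start!` and `Chat.default_model`, but the method bodies and the default model value below are assumptions, not the app's actual code.

```ruby
# Hypothetical sketch of the nil/empty-model guard described in the commit
# message. Chat.start! and Chat.default_model come from the message; the
# bodies and the DEFAULT_MODEL value are illustrative assumptions.
class Chat
  DEFAULT_MODEL = "gpt-4.1" # placeholder; the real default comes from config

  def self.default_model
    DEFAULT_MODEL
  end

  def self.start!(prompt, model: nil)
    # Fall back to the default when the caller passes nil or a blank string,
    # which previously surfaced as a "Messages is invalid" error for
    # Ollama/custom providers.
    model = default_model if model.nil? || model.strip.empty?
    new(model: model)
  end

  attr_reader :model

  def initialize(model:)
    @model = model
  end
end
```

With this guard, a blank or missing `model` argument no longer reaches the provider request unvalidated; the caller-supplied model is used only when it is actually present.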
@@ -35,6 +35,18 @@ class Provider::Openai < Provider
      DEFAULT_OPENAI_MODEL_PREFIXES.any? { |prefix| model.start_with?(prefix) }
    end

    def provider_name
      custom_provider? ? "Custom OpenAI-compatible (#{@uri_base})" : "OpenAI"
    end

    def supported_models_description
      if custom_provider?
        @default_model.present? ? "configured model: #{@default_model}" : "any model"
      else
        "models starting with: #{DEFAULT_OPENAI_MODEL_PREFIXES.join(', ')}"
      end
    end

    def custom_provider?
      @uri_base.present?
    end
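The diff's provider-description logic can be exercised in isolation with a minimal stand-alone sketch. Note the assumptions: `present?` is ActiveSupport in the real app (replaced here with a plain-Ruby helper), the class name `OpenaiProviderSketch` is invented for illustration, and the prefix list below is illustrative, not the repository's actual `DEFAULT_OPENAI_MODEL_PREFIXES`.

```ruby
# Stand-alone sketch of the provider-description methods from the diff.
# Illustrative prefix list; the real DEFAULT_OPENAI_MODEL_PREFIXES lives in
# Provider::Openai and may differ.
DEFAULT_OPENAI_MODEL_PREFIXES = %w[gpt- o1 o3].freeze

class OpenaiProviderSketch
  def initialize(uri_base: nil, default_model: nil)
    @uri_base = uri_base
    @default_model = default_model
  end

  def provider_name
    custom_provider? ? "Custom OpenAI-compatible (#{@uri_base})" : "OpenAI"
  end

  def supported_models_description
    if custom_provider?
      present?(@default_model) ? "configured model: #{@default_model}" : "any model"
    else
      "models starting with: #{DEFAULT_OPENAI_MODEL_PREFIXES.join(', ')}"
    end
  end

  # A provider is "custom" when a base URI (e.g. an Ollama endpoint) is set.
  def custom_provider?
    present?(@uri_base)
  end

  private

  # Plain-Ruby stand-in for ActiveSupport's Object#present?
  def present?(value)
    !value.nil? && !value.to_s.strip.empty?
  end
end
```

For example, a provider constructed with `uri_base: "http://localhost:11434/v1"` and no default model reports itself as custom and accepts "any model", while a provider with no `uri_base` falls back to the stock "OpenAI" identity and the prefix-based model description.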