Files
sure/app/views/chats/_error.html.erb
Claude 6be6388d39 fix: Display helpful error when LLM model lacks function calling support
When users configure an OpenAI-compatible provider (like OpenRouter) with
a model that doesn't support function calling, they previously saw only a
generic "404" error. This made troubleshooting difficult.

Changes:
- Add FunctionCallingNotSupportedError class with a clear, actionable message
- Detect 404 errors and tool-related error messages when using custom providers
- Update error partial to display the actual error message instead of generic text
- Add i18n support for the error message

Fixes #830

https://claude.ai/code/session_01EpuAVyy5qRV4hYjwPELff4
2026-01-30 00:36:03 +00:00
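
The commit message references a `FunctionCallingNotSupportedError` class and 404/tool-message detection for custom providers. The diff below only shows the view partial, so the following is a minimal sketch of what that error class and detection heuristic might look like; the class name comes from the commit message, but the constructor wording and the `detect?` heuristic are assumptions, not the actual implementation:

```ruby
# Hypothetical sketch (not the actual implementation from this commit).
# Raised when the configured LLM model rejects tool/function-call requests.
class FunctionCallingNotSupportedError < StandardError
  def initialize(model)
    super(
      "The model '#{model}' does not support function calling (tools). " \
      "Please choose a model that supports tool use."
    )
  end

  # Heuristic matching the commit description: a 404 from a custom
  # provider, or an error body mentioning tools/function calls, likely
  # means the selected model lacks tool support.
  def self.detect?(status:, body:)
    status == 404 || body.to_s.match?(/tool|function[_ ]?call/i)
  end
end
```

A caller handling a provider response could then raise `FunctionCallingNotSupportedError.new(model_name)` when `detect?` returns true, so the chat UI surfaces an actionable message instead of a bare 404.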


<%# locals: (chat:) %>
<%
  # Try to extract a meaningful, human-readable error message
  error_message = nil
  begin
    if chat.error.present?
      parsed = JSON.parse(chat.error) rescue nil
      if parsed.is_a?(Hash) && parsed["message"].present?
        error_message = parsed["message"]
      end
    end
  rescue
    # Fall back to the generic message below
  end
%>
<div id="chat-error" class="px-3 py-2 bg-red-100 border border-red-500 rounded-lg">
  <% if chat.debug_mode? %>
    <div class="overflow-x-auto text-xs p-4 bg-red-200 rounded-md mb-2">
      <code><%= chat.error %></code>
    </div>
  <% end %>
  <div class="flex flex-col gap-2">
    <% if error_message.present? %>
      <p class="text-xs text-red-500"><%= error_message %></p>
    <% else %>
      <p class="text-xs text-red-500">Failed to generate response. Please try again.</p>
    <% end %>
    <%= render DS::Button.new(
      text: "Retry",
      href: retry_chat_path(chat),
    ) %>
  </div>
</div>
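
The extraction logic in the partial's `<% %>` block can be expressed as a standalone method, shown here in plain Ruby (Rails' `present?` replaced with explicit nil/empty checks; the method name `extract_error_message` is illustrative, not from the codebase):

```ruby
require "json"

# Sketch of the partial's extraction logic: attempt to parse the stored
# error string as JSON and return its "message" field, or nil when the
# payload is missing, not valid JSON, or has no usable message.
def extract_error_message(raw_error)
  return nil if raw_error.nil? || raw_error.strip.empty?

  parsed = begin
    JSON.parse(raw_error)
  rescue JSON::ParserError
    nil
  end

  parsed["message"] if parsed.is_a?(Hash) && !parsed["message"].to_s.empty?
end
```

With this shape, a structured provider error like `{"message":"Model does not support tools"}` yields its message for display, while an opaque string such as `404 Not Found` yields `nil`, so the view falls back to the generic "Failed to generate response" text.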