mirror of
https://github.com/we-promise/sure.git
synced 2026-04-08 06:44:52 +00:00
* Implement support for generic OpenAI API
  - Routes requests to any OpenAI-compatible provider (Deepseek, Qwen, vLLM, LM Studio, Ollama).
  - Keeps support for pure OpenAI and uses the newer Responses API.
  - Uses the /chat/completions API for the generic providers.
  - If uri_base is not set, uses the default implementation.
* Fix JSON handling and indentation
* Fix linter error indent
* Fix tests to set env vars
* Fix updating settings
* Change to prefix checking for OAI models
* Fix check model if custom uri is set
* Change chat to sync calls
  Some local models don't support streaming. Revert to sync calls for the generic OAI API.
* Fix tests
* Fix tests
* Fix for GPT-5 message extraction
  - Finds the message output by filtering for "type" == "message" instead of assuming it's at index 0
  - Safely extracts the text using safe navigation operators (&.)
  - Raises a clear error if no message content is found
  - Parses the JSON as before
* Add more Langfuse logging
  - Add Langfuse to auto categorizer and merchant detector
  - Fix monitoring on streaming chat responses
  - Add Langfuse traces also for model errors now
* Update app/models/provider/openai.rb
  Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
  Signed-off-by: soky srm <sokysrm@gmail.com>
* Handle nil function results explicitly
* Expose some config vars
* Linter and nitpick comments
* Drop back to `gpt-4.1` as default for now
* Linter
* Fix for strict tool schema in Gemini
  - This fixes tool calling in the Gemini OpenAI API
  - Fix for getTransactions function: page size is not used
--------
Signed-off-by: soky srm <sokysrm@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
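The GPT-5 message-extraction fix described in the changelog can be sketched as follows. This is an illustrative reconstruction, not the repository's actual code: the method name and the exact payload shape are assumptions, but the logic follows the description — filter the Responses API output array for the entry whose `"type"` is `"message"` instead of indexing position 0, extract the text with safe navigation (`&.`), raise a clear error when nothing is found, and parse the JSON as before.

```ruby
require "json"

# Hypothetical sketch of the extraction fix. A Responses API payload
# may put reasoning items before the message, so the message is not
# guaranteed to sit at output[0].
def extract_message_text(response)
  output = response["output"] || []

  # Filter for the message entry instead of assuming index 0.
  message = output.find { |item| item["type"] == "message" }

  # Safe navigation: nil-safe even when no message entry exists.
  text = message&.dig("content", 0, "text")

  # Raise a clear error rather than failing later on a nil parse.
  raise "No message content found in model response" if text.nil?

  JSON.parse(text)
end
```

Used this way, a payload whose first output item is a reasoning block still yields the message text, and a payload with no message at all fails loudly instead of returning nil.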
32 lines
1.5 KiB
Plaintext
<%# locals: (chat: nil, message_hint: nil) %>

<div id="chat-form" class="space-y-2">
  <% model = chat && chat.persisted? ? [chat, Message.new] : Chat.new %>

  <%= form_with model: model,
                class: "flex lg:flex-col gap-2 bg-container px-2 py-1.5 rounded-full lg:rounded-lg shadow-border-xs",
                data: { chat_target: "form" } do |f| %>
    <%# In the future, this will be a dropdown with different AI models %>
    <%= f.hidden_field :ai_model, value: default_ai_model %>

    <%= f.text_area :content, placeholder: "Ask anything ...", value: message_hint,
                    class: "w-full border-0 focus:ring-0 text-sm resize-none px-1 bg-transparent",
                    data: { chat_target: "input", action: "input->chat#autoResize keydown->chat#handleInputKeyDown" },
                    rows: 1 %>

    <div class="flex items-center justify-between gap-1">
      <div class="items-center gap-1 hidden lg:flex">
        <%# These are disabled for now, but in the future, will all open specific menus with their own context and search %>
        <% ["plus", "command", "at-sign", "mouse-pointer-click"].each do |icon| %>
          <%= icon(icon, as_button: true, disabled: true, class: "cursor-not-allowed", title: "Coming soon") %>
        <% end %>
      </div>

      <%= icon("arrow-up", as_button: true, type: "submit") %>
    </div>
  <% end %>

  <p class="text-xs text-secondary">AI responses are informational only and are not financial advice.</p>
</div>