sure/app/views/settings/hostings/_openai_settings.html.erb
soky srm 8cd109a5b2 Implement support for generic OpenAI API (#213)
* Implement support for generic OpenAI API

- Routes requests to any OpenAI-compatible provider (DeepSeek, Qwen, vLLM, LM Studio, Ollama).
- Keeps support for plain OpenAI, which uses the newer Responses API.
- Uses the `/chat/completions` API for generic providers.
- If `uri_base` is not set, falls back to the default implementation.
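
The routing above can be sketched as follows. This is a hypothetical illustration, not the actual implementation; the method and symbol names are assumptions.

```ruby
# Illustrative sketch of the provider routing: when a custom uri_base is
# configured, requests go through the generic /chat/completions endpoint;
# otherwise the default OpenAI Responses API implementation is used.
def endpoint_for(uri_base)
  if uri_base.to_s.strip.empty?
    :responses_api      # plain OpenAI default
  else
    :chat_completions   # generic OpenAI-compatible provider
  end
end
```

For example, `endpoint_for(nil)` selects the Responses API, while `endpoint_for("http://localhost:11434/v1")` selects the generic path.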

* Fix JSON handling and indentation

* Fix linter indentation error

* Fix tests to set env vars

* Fix updating settings

* Change to prefix checking for OAI models

* Fix: check the model if a custom URI is set

* Change chat to sync calls

Some local models don't support streaming, so revert to synchronous calls for the generic OpenAI API.

* Fix tests

* Fix tests

* Fix for gpt-5 message extraction

- Finds the message output by filtering for `"type" == "message"` instead of assuming it is at index 0
- Safely extracts the text using safe-navigation operators (`&.`)
- Raises a clear error if no message content is found
- Parses the JSON as before
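
The extraction steps above can be sketched like this. The response shape is an assumption (an array of output items as returned by the Responses API); the method name is illustrative.

```ruby
require "json"

# Sketch of the fixed extraction: filter the output array for the item
# whose "type" is "message" rather than assuming index 0, pull the text
# via safe navigation (&.), raise clearly if nothing was found, then
# parse the JSON payload as before.
def extract_message_json(output)
  message = output.find { |item| item["type"] == "message" }
  text = message&.dig("content")&.first&.dig("text")
  raise "No message content found in model response" if text.nil?

  JSON.parse(text)
end
```

The index-0 assumption breaks for models like gpt-5 that may emit a reasoning item before the message item, which is why the filter is needed.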

* Add more langfuse logging

- Add Langfuse to the auto-categorizer and merchant detector
- Fix monitoring of streaming chat responses
- Also record Langfuse traces for model errors

* Update app/models/provider/openai.rb

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Signed-off-by: soky srm <sokysrm@gmail.com>

* handle nil function results explicitly
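
A minimal sketch of what "explicitly" might look like here, assuming the tool-call result is serialized to JSON before being sent back to the model; the method name is hypothetical.

```ruby
require "json"

# A nil tool result is serialized as an explicit JSON null rather than
# an empty string, so the model always receives well-formed JSON.
def serialize_function_result(result)
  result.nil? ? "null" : result.to_json
end
```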

* Expose some config vars

* Linter and nitpick comments

* Drop back to `gpt-4.1` as default for now

* Linter

* Fix for strict tool schema in Gemini

- Fixes tool calling via Gemini's OpenAI-compatible API
- Fix for the getTransactions function: the page-size parameter is not used
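
For context, strict-mode tool schemas typically require that every declared property appears in `required` and that `additionalProperties` is `false`. The sketch below is a hedged example of such a schema; the parameter name is hypothetical and the unused page-size parameter is simply omitted.

```ruby
# Hypothetical strict-mode parameter schema for a getTransactions-style
# tool: all properties listed in "required", no extra properties allowed.
def get_transactions_schema
  {
    type: "object",
    properties: {
      account_id: { type: "string" }  # illustrative parameter, not from the source
    },
    required: ["account_id"],
    additionalProperties: false
  }
end
```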

---------

Signed-off-by: soky srm <sokysrm@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
2025-10-22 16:02:50 +02:00


<div class="space-y-4">
  <div>
    <h2 class="font-medium mb-1"><%= t(".title") %></h2>
    <% if ENV["OPENAI_ACCESS_TOKEN"].present? %>
      <p class="text-sm text-secondary"><%= t(".env_configured_message") %></p>
    <% else %>
      <p class="text-secondary text-sm mb-4"><%= t(".description") %></p>
    <% end %>
  </div>

  <%= styled_form_with model: Setting.new,
                       url: settings_hosting_path,
                       method: :patch,
                       data: {
                         controller: "auto-submit-form",
                         "auto-submit-form-trigger-event-value": "blur"
                       } do |form| %>
    <%= form.password_field :openai_access_token,
          label: t(".access_token_label"),
          placeholder: t(".access_token_placeholder"),
          value: (Setting.openai_access_token.present? ? "********" : nil),
          autocomplete: "off",
          autocapitalize: "none",
          spellcheck: "false",
          inputmode: "text",
          disabled: ENV["OPENAI_ACCESS_TOKEN"].present?,
          data: { "auto-submit-form-target": "auto" } %>

    <%= form.text_field :openai_uri_base,
          label: t(".uri_base_label"),
          placeholder: t(".uri_base_placeholder"),
          value: Setting.openai_uri_base,
          autocomplete: "off",
          autocapitalize: "none",
          spellcheck: "false",
          inputmode: "url",
          disabled: ENV["OPENAI_URI_BASE"].present?,
          data: { "auto-submit-form-target": "auto" } %>

    <%= form.text_field :openai_model,
          label: t(".model_label"),
          placeholder: t(".model_placeholder"),
          value: Setting.openai_model,
          autocomplete: "off",
          autocapitalize: "none",
          spellcheck: "false",
          inputmode: "text",
          disabled: ENV["OPENAI_MODEL"].present?,
          data: { "auto-submit-form-target": "auto" } %>
  <% end %>
</div>