mirror of
https://github.com/we-promise/sure.git
synced 2026-04-09 07:14:47 +00:00
* Implement support for generic OpenAI API
  - Routes requests to any OpenAI-compatible provider (DeepSeek, Qwen, vLLM, LM Studio, Ollama).
  - Keeps support for pure OpenAI and uses the newer Responses API.
  - Uses the /chat/completions API for the generic providers.
  - If uri_base is not set, uses the default implementation.
* Fix JSON handling and indentation
* Fix linter error (indent)
* Fix tests to set env vars
* Fix updating settings
* Change to prefix checking for OAI models
* Fix model check when a custom URI is set
* Change chat to sync calls: some local models don't support streaming, so revert to sync calls for the generic OAI API
* Fix tests
* Fix tests
* Fix gpt-5 message extraction
  - Finds the message output by filtering for "type" == "message" instead of assuming it is at index 0
  - Safely extracts the text using safe navigation operators (&.)
  - Raises a clear error if no message content is found
  - Parses the JSON as before
* Add more Langfuse logging
  - Add Langfuse to the auto categorizer and merchant detector
  - Fix monitoring on streaming chat responses
  - Add Langfuse traces for model errors as well
* Update app/models/provider/openai.rb
* Handle nil function results explicitly
* Expose some config vars
* Linter and nitpick comments
* Drop back to `gpt-4.1` as the default for now
* Linter
* Fix strict tool schema in Gemini
  - Fixes tool calling via Gemini's OpenAI-compatible API
  - Fix for the getTransactions function: page size is not used
---------
Signed-off-by: soky srm <sokysrm@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Co-authored-by: Juan José Mata <juanjo.mata@gmail.com>
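The changelog's routing rule (use the official Responses API unless a custom uri_base is configured, in which case fall back to /chat/completions for generic providers) can be sketched as follows. This is a hypothetical illustration of the described behavior, not the repository's actual code; `chat_endpoint` is an invented name.

```ruby
# Hypothetical sketch of the endpoint-selection rule from the changelog:
# - no uri_base  -> official OpenAI Responses API
# - uri_base set -> generic provider via /chat/completions
def chat_endpoint(uri_base: nil)
  if uri_base.nil?
    "https://api.openai.com/v1/responses"
  else
    "#{uri_base.chomp("/")}/chat/completions"
  end
end

puts chat_endpoint
puts chat_endpoint(uri_base: "http://localhost:11434/v1")
```

With no argument this yields the Responses endpoint; pointing uri_base at, say, a local Ollama server yields that server's /chat/completions path.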
47 lines
977 B
Ruby
class Provider::Openai::ChatConfig
  def initialize(functions: [], function_results: [])
    @functions = functions
    @function_results = function_results
  end

  def tools
    functions.map do |fn|
      {
        type: "function",
        name: fn[:name],
        description: fn[:description],
        parameters: fn[:params_schema],
        strict: fn[:strict]
      }
    end
  end

  def build_input(prompt)
    results = function_results.map do |fn_result|
      # Handle nil explicitly to avoid serializing to "null"
      output = fn_result[:output]
      serialized_output = if output.nil?
        ""
      elsif output.is_a?(String)
        output
      else
        output.to_json
      end

      {
        type: "function_call_output",
        call_id: fn_result[:call_id],
        output: serialized_output
      }
    end

    [
      { role: "user", content: prompt },
      *results
    ]
  end

  private
    attr_reader :functions, :function_results
end
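The nil-handling rule in `build_input` ("handle nil function results explicitly" in the changelog) can be shown in isolation. This is a minimal standalone sketch of the same serialization logic; `serialize_function_output` is an invented helper name, not part of the class.

```ruby
require "json"

# Mirrors the serialization branch in ChatConfig#build_input:
# nil becomes an empty string (not the literal "null"), strings
# pass through unchanged, and anything else is JSON-encoded.
def serialize_function_output(output)
  if output.nil?
    ""
  elsif output.is_a?(String)
    output
  else
    output.to_json
  end
end

puts serialize_function_output(nil).inspect        # => ""
puts serialize_function_output("done").inspect     # => "done"
puts serialize_function_output({ count: 3 })       # => {"count":3}
```

Without the nil branch, `nil.to_json` would produce the string "null", which the model could misread as a meaningful result rather than an absent one.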