mirror of
https://github.com/we-promise/sure.git
synced 2026-04-08 14:54:49 +00:00
* Add SearchFamilyImportedFiles assistant function with vector store support

  Implement per-Family document search using OpenAI vector stores, allowing the AI assistant to search through uploaded financial documents (tax returns, statements, contracts, etc.). The architecture is modular, with a provider-agnostic VectorStoreConcept interface so other RAG backends can be added.

  Key components:
  - Assistant::Function::SearchFamilyImportedFiles — tool callable from any LLM
  - Provider::VectorStoreConcept — abstract vector store interface
  - Provider::Openai vector store methods (create, upload, search, delete)
  - Family::VectorSearchable concern with document management
  - FamilyDocument model for tracking uploaded files
  - Migration adding vector_store_id to families and a family_documents table

  https://claude.ai/code/session_01TSkKc7a9Yu2ugm1RvSf4dh

* Extract VectorStore adapter layer for swappable backends

  Replace the Provider::VectorStoreConcept mixin with a standalone adapter architecture under VectorStore::. This cleanly separates vector store concerns from the LLM provider and makes it trivial to swap backends.

  Components:
  - VectorStore::Base — abstract interface (create/delete/upload/remove/search)
  - VectorStore::Openai — uses the ruby-openai gem's native vector_stores.search
  - VectorStore::Pgvector — skeleton for local pgvector + embedding model
  - VectorStore::Qdrant — skeleton for a Qdrant vector DB
  - VectorStore::Registry — resolves the adapter from the VECTOR_STORE_PROVIDER env var
  - VectorStore::Response — success/failure wrapper (like Provider::Response)

  Consumers updated to go through VectorStore.adapter:
  - Family::VectorSearchable
  - Assistant::Function::SearchFamilyImportedFiles
  - FamilyDocument

  Removed: Provider::VectorStoreConcept and the vector store methods from Provider::Openai

  https://claude.ai/code/session_01TSkKc7a9Yu2ugm1RvSf4dh

* Add Vector Store configuration docs to ai.md

  Documents how to configure the document search feature, covering all three supported backends (OpenAI, pgvector, Qdrant), environment variables, Docker Compose examples, supported file types, and privacy considerations.

  https://claude.ai/code/session_01TSkKc7a9Yu2ugm1RvSf4dh

* No need to specify `imported` in code
* Missed a couple more places
* Tiny reordering for the human OCD
* Update app/models/assistant/function/search_family_files.rb

  Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
  Signed-off-by: Juan José Mata <jjmata@jjmata.com>

* PR comments
* More PR comments

---------

Signed-off-by: Juan José Mata <jjmata@jjmata.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
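The adapter layer described in the commit log can be sketched roughly as follows. The class names, the three backends, and the VECTOR_STORE_PROVIDER variable all come from the commits above; the method bodies and the `ADAPTERS` mapping are illustrative assumptions, not the repository's actual code.

```ruby
# Hedged sketch of the VectorStore adapter layer described in the commit log.
module VectorStore
  # Abstract interface (create/delete/upload/remove/search).
  class Base
    def create_store(name:)
      raise NotImplementedError
    end

    def delete_store(store_id:)
      raise NotImplementedError
    end

    def upload_file(store_id:, file_content:, filename:)
      raise NotImplementedError
    end

    def remove_file(store_id:, file_id:)
      raise NotImplementedError
    end

    def search(store_id:, query:, max_results: 10)
      raise NotImplementedError
    end
  end

  # Stubs standing in for the real adapters: VectorStore::Openai is the file
  # shown below; Pgvector and Qdrant are skeletons per the commit log.
  class Openai < Base; end
  class Pgvector < Base; end
  class Qdrant < Base; end

  # Resolves the configured adapter class from the VECTOR_STORE_PROVIDER
  # env var (mapping and default are assumptions).
  class Registry
    ADAPTERS = {
      "openai"   => Openai,
      "pgvector" => Pgvector,
      "qdrant"   => Qdrant
    }.freeze

    def self.resolve(provider = ENV.fetch("VECTOR_STORE_PROVIDER", "openai"))
      ADAPTERS.fetch(provider) do
        raise ArgumentError, "Unknown vector store provider: #{provider}"
      end
    end
  end
end
```

Keeping the resolution in one registry means consumers such as Family::VectorSearchable never reference a concrete backend, which is what makes the backends swappable via a single env var.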
90 lines
2.4 KiB
Ruby
# Adapter that delegates to OpenAI's hosted vector-store and file-search APIs.
#
# Requirements:
# - gem "ruby-openai" (already in Gemfile)
# - OPENAI_ACCESS_TOKEN env var or Setting.openai_access_token
#
# OpenAI manages chunking, embedding, and retrieval; we simply upload files
# and issue search queries.
class VectorStore::Openai < VectorStore::Base
  def initialize(access_token:, uri_base: nil)
    client_options = { access_token: access_token }
    client_options[:uri_base] = uri_base if uri_base.present?
    client_options[:request_timeout] = ENV.fetch("OPENAI_REQUEST_TIMEOUT", 60).to_i

    @client = ::OpenAI::Client.new(**client_options)
  end

  def create_store(name:)
    with_response do
      response = client.vector_stores.create(parameters: { name: name })
      { id: response["id"] }
    end
  end

  def delete_store(store_id:)
    with_response do
      client.vector_stores.delete(id: store_id)
    end
  end

  def upload_file(store_id:, file_content:, filename:)
    with_response do
      tempfile = Tempfile.new([ File.basename(filename, ".*"), File.extname(filename) ])
      begin
        tempfile.binmode
        tempfile.write(file_content)
        tempfile.rewind

        file_response = client.files.upload(
          parameters: { file: tempfile, purpose: "assistants" }
        )
        file_id = file_response["id"]

        begin
          client.vector_store_files.create(
            vector_store_id: store_id,
            parameters: { file_id: file_id }
          )
        rescue
          # Attaching failed; delete the orphaned file so it doesn't linger.
          client.files.delete(id: file_id) rescue nil
          raise
        end

        { file_id: file_id }
      ensure
        tempfile.close
        tempfile.unlink
      end
    end
  end

  def remove_file(store_id:, file_id:)
    with_response do
      client.vector_store_files.delete(vector_store_id: store_id, id: file_id)
    end
  end

  def search(store_id:, query:, max_results: 10)
    with_response do
      response = client.vector_stores.search(
        id: store_id,
        parameters: { query: query, max_num_results: max_results }
      )

      (response["data"] || []).map do |result|
        {
          content: Array(result["content"]).filter_map { |c| c["text"] }.join("\n"),
          filename: result["filename"],
          score: result["score"],
          file_id: result["file_id"]
        }
      end
    end
  end

  private
    attr_reader :client
end
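Every public method in this adapter wraps its work in `with_response`, which is not defined in this file. The commit log describes VectorStore::Response as a "success/failure wrapper (like Provider::Response)", so it plausibly looks something like the sketch below; the `success?`/`data`/`error` shape and the `Responder` module name are assumptions, not the repository's actual code.

```ruby
# Hedged sketch of the with_response helper this adapter relies on,
# modeled on the commit log's description of VectorStore::Response.
module VectorStore
  # Success/failure wrapper: holds either a data payload or a captured error.
  class Response
    attr_reader :data, :error

    def initialize(data: nil, error: nil)
      @data = data
      @error = error
    end

    def success?
      error.nil?
    end
  end

  module Responder
    # Runs the block and captures any raised error in the Response,
    # so adapter callers branch on success? instead of rescuing.
    def with_response
      Response.new(data: yield)
    rescue StandardError => e
      Response.new(error: e)
    end
  end
end
```

In the real code this helper presumably lives on VectorStore::Base, so every adapter method returns a Response that consumers like Family::VectorSearchable can branch on, rather than rescuing provider-specific exceptions themselves.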