Mirror of https://github.com/we-promise/sure.git (synced 2026-05-13 23:54:55 +00:00)
Merge branch 'main' into copilot/fix-twelvedata-api-limit-bug
Signed-off-by: Juan José Mata <juanjo.mata@gmail.com>
46  .cursor/rules/api-endpoint-consistency.mdc  Normal file
@@ -0,0 +1,46 @@
---
description: API endpoint consistency — checklist to run after every API endpoint commit (Minitest behavior, rswag docs-only, API key auth).
globs: app/controllers/api/v1/**/*.rb, spec/requests/api/v1/**/*.rb, test/controllers/api/v1/**/*.rb
alwaysApply: false
---

# API endpoint consistency (post-commit checklist)

When adding or modifying API v1 endpoints, ensure the following so behavior, docs, and auth stay consistent.

## 1. Minitest behavioral coverage

- **Location**: `test/controllers/api/v1/{resource}_controller_test.rb`
- **Scope**: All new or changed actions must have Minitest coverage here. Do not rely on rswag specs for behavioral assertions.
- **Pattern**:
  - Use `ApiKey.create!` (read and read_write scopes) and `api_headers(api_key)` → `{ "X-Api-Key" => api_key.display_key }`. Do not use OAuth/Bearer in these tests.
  - Cover: index/show (and create/update/destroy for write endpoints), read-only key blocking writes (403), invalid params (422), invalid date (422), not found (404), missing auth (401).
  - Follow existing API v1 test style: see [valuations_controller_test.rb](mdc:test/controllers/api/v1/valuations_controller_test.rb) and [transactions_controller_test.rb](mdc:test/controllers/api/v1/transactions_controller_test.rb).

## 2. rswag is docs-only

- **Location**: `spec/requests/api/v1/{resource}_spec.rb`
- **Rule**: These specs exist only for OpenAPI generation. Do not add `expect(...)` or `assert_*` (or any behavioral assertions). Use `run_test!` without custom assertion blocks so the spec only documents request/response and regenerates `docs/api/openapi.yaml`.
- **Regenerate**: After edits, run `RAILS_ENV=test bundle exec rake rswag:specs:swaggerize`.

## 3. Same API key auth in all rswag specs

- **Rule**: Every request spec in `spec/requests/api/v1/` must use the same API key auth pattern so generated docs are consistent.
- **Pattern** (match holdings_spec, trades_spec, transactions_spec, etc.):

```ruby
let(:api_key) do
  key = ApiKey.generate_secure_key
  ApiKey.create!(
    user: user,
    name: 'API Docs Key',
    key: key,
    scopes: %w[read_write],
    source: 'web'
  )
end

let(:'X-Api-Key') { api_key.plain_key }
```

- Do not use Doorkeeper/OAuth in these specs (no `Doorkeeper::Application`, `Doorkeeper::AccessToken`, or `Authorization: "Bearer ..."`). Use API key only. Note: [valuations_spec.rb](mdc:spec/requests/api/v1/valuations_spec.rb) currently uses OAuth; update it to the API key pattern above when editing that file.
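For orientation, `ApiKey.generate_secure_key` above is an application method; a minimal stand-in using only the Ruby standard library might look like the sketch below (the method name and key length are assumptions, not the app's actual implementation):

```ruby
require "securerandom"

# Hypothetical stand-in for ApiKey.generate_secure_key -- the real method
# lives in the application; the name and 32-byte length are assumptions.
def generate_secure_key(bytes = 32)
  SecureRandom.urlsafe_base64(bytes)
end

key = generate_secure_key
```

`urlsafe_base64` keeps the key header-safe (no `+`, `/`, or padding), which matters for a value sent in `X-Api-Key`.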
@@ -28,6 +28,8 @@ TWELVE_DATA_API_KEY =
OPENAI_ACCESS_TOKEN =
OPENAI_URI_BASE =
OPENAI_MODEL =
# OPENAI_REQUEST_TIMEOUT: Request timeout in seconds (default: 60)
# OPENAI_SUPPORTS_PDF_PROCESSING: Set to false for endpoints without vision support (default: true)

# (example: LM Studio/Docker config) OpenAI-compatible API endpoint config
# OPENAI_URI_BASE = http://host.docker.internal:1234/
@@ -46,3 +48,36 @@ LANGFUSE_HOST = https://cloud.langfuse.com

# Set to `true` to get error messages rendered in the /chats UI
AI_DEBUG_MODE =

# =============================================================================
# SSL/TLS Configuration for Self-Signed Certificates
# =============================================================================
# Use these settings when connecting to services with self-signed or internal
# CA certificates (e.g., self-hosted Keycloak, Authentik, or AI endpoints).
#
# SSL_CA_FILE: Path to custom CA certificate file (PEM format)
#   - The certificate that signed your server's SSL certificate
#   - Must be readable by the application
#   - Will be validated at startup
# SSL_CA_FILE = /certs/my-ca.crt
#
# SSL_VERIFY: Enable/disable SSL certificate verification
#   - Default: true (verification enabled)
#   - Set to "false" ONLY for development/testing
#   - WARNING: Disabling removes protection against man-in-the-middle attacks
# SSL_VERIFY = true
#
# SSL_DEBUG: Enable verbose SSL logging for troubleshooting
#   - Default: false
#   - When enabled, logs detailed SSL connection information
#   - Useful for diagnosing certificate issues
# SSL_DEBUG = false
#
# Example docker-compose.yml configuration:
#   services:
#     app:
#       environment:
#         SSL_CA_FILE: /certs/my-ca.crt
#         SSL_DEBUG: "true"
#       volumes:
#         - ./my-ca.crt:/certs/my-ca.crt:ro
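A minimal sketch of how an application might consume `SSL_VERIFY` and `SSL_CA_FILE`, using only Ruby's stdlib `openssl` (the helper name and wiring are assumptions; the app's actual code may differ):

```ruby
require "openssl"

# Hypothetical helper mirroring the SSL_* variables documented above.
# SSL_VERIFY=false disables peer verification (dev/testing only);
# SSL_CA_FILE adds a custom CA to the trust store.
def build_ssl_context(env)
  ctx = OpenSSL::SSL::SSLContext.new
  if env["SSL_VERIFY"] == "false"
    ctx.verify_mode = OpenSSL::SSL::VERIFY_NONE # removes MITM protection
  else
    ctx.verify_mode = OpenSSL::SSL::VERIFY_PEER
    if (ca = env["SSL_CA_FILE"])
      store = OpenSSL::X509::Store.new
      store.set_default_paths
      store.add_file(ca) # raises if the file is missing or unreadable
      ctx.cert_store = store
    end
  end
  ctx
end
```

The `add_file` failure mode is why the env example says the CA file "will be validated at startup": a bad path fails loudly rather than silently falling back to no verification.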
5  .gitattributes  vendored
@@ -7,3 +7,8 @@ db/schema.rb linguist-generated
vendor/* linguist-vendored
config/credentials/*.yml.enc diff=rails_credentials
config/credentials.yml.enc diff=rails_credentials

# Ensure consistent line endings for scripts and Ruby files to avoid shebang issues on Windows
bin/* text eol=lf
*.sh text eol=lf
*.rb text eol=lf
2  .github/workflows/flutter-build.yml  vendored
@@ -179,4 +179,4 @@ jobs:
          path: |
            mobile/build/ios/iphoneos/Runner.app
            mobile/build/ios-build-info.txt
          retention-days: 30
          retention-days: 30
30  .github/workflows/helm-release.yaml  vendored
@@ -6,6 +6,8 @@ on:
      - main
    paths:
      - 'charts/**'
    tags:
      - 'v*'
  workflow_dispatch:

jobs:
@@ -36,13 +38,22 @@ jobs:
        id: version
        run: |
          # Generate version like: 0.0.0-nightly.20251213.173045
          VERSION="0.0.0-nightly.$(date -u +'%Y%m%d.%H%M%S')"
          if [[ "${GITHUB_REF_TYPE}" == "tag" && "${GITHUB_REF_NAME}" == v* ]]; then
            VERSION="${GITHUB_REF_NAME#v}"
          else
            BASE_VERSION="$(git tag -l 'v*' | sed 's/^v//' | sort -V | tail -n 1)"
            if [[ -z "${BASE_VERSION}" ]]; then
              BASE_VERSION="0.0.0"
            fi
            VERSION="${BASE_VERSION}-nightly.$(date -u +'%Y%m%d.%H%M%S')"
          fi
          echo "version=$VERSION" >> $GITHUB_OUTPUT
          echo "Generated version: $VERSION"
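The branching in the version step above can be restated as a small function (the workflow itself is shell; the Ruby names and the fixed timestamp here are illustrative only):

```ruby
# Hypothetical Ruby restatement of the chart-version selection above:
# a v-prefixed tag becomes a release version; anything else becomes a
# nightly suffixed onto the newest v* tag, or 0.0.0 if no tag exists.
def compute_chart_version(ref_type:, ref_name:, latest_tag:, stamp:)
  if ref_type == "tag" && ref_name.start_with?("v")
    ref_name.delete_prefix("v")                 # release tag: strip the v
  else
    base = latest_tag.to_s.delete_prefix("v")
    base = "0.0.0" if base.empty?               # no tags yet
    "#{base}-nightly.#{stamp}"
  end
end
```

This mirrors why the workflow sorts tags with `sort -V`: the nightly base should be the semantically newest release, not the lexically last one.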
      - name: Update Chart.yaml version
        run: |
          sed -i "s/^version:.*/version: ${{ steps.version.outputs.version }}/" charts/sure/Chart.yaml
          sed -i "s/^appVersion:.*/appVersion: \"${{ steps.version.outputs.version }}\"/" charts/sure/Chart.yaml
          cat charts/sure/Chart.yaml

      - name: Add Helm repositories
@@ -83,5 +94,20 @@ jobs:
          git config user.name "$GIT_USER_NAME"
          git config user.email "$GIT_USER_EMAIL"
          git add .
          git commit -m "Release nightly: ${{ steps.version.outputs.version }}"
          if git diff --cached --quiet; then
            echo "No Helm chart updates to publish."
            exit 0
          fi
          if [[ "${GITHUB_REF_TYPE}" == "tag" && "${GITHUB_REF_NAME}" == v* ]]; then
            git commit -m "Release chart for ${{ github.ref_name }}"
          else
            git commit -m "Release nightly: ${{ steps.version.outputs.version }}"
          fi
          git push

      - name: Upload chart to GitHub Release
        if: startsWith(github.ref, 'refs/tags/v')
        uses: softprops/action-gh-release@v2
        with:
          tag_name: ${{ github.ref_name }}
          files: .cr-release-packages/*.tgz
114  .github/workflows/mobile-release.yml  vendored  Normal file
@@ -0,0 +1,114 @@
name: Mobile Release

on:
  push:
    tags:
      - 'mobile-v*'

permissions:
  contents: write

jobs:
  build:
    name: Build Mobile Apps
    uses: ./.github/workflows/flutter-build.yml
    secrets: inherit

  release:
    name: Create Mobile GitHub Release
    needs: [build]
    runs-on: ubuntu-latest
    timeout-minutes: 10

    permissions:
      contents: write

    steps:
      - name: Extract version from tag
        id: version
        run: |
          # Strip 'mobile-' prefix to get the version part (e.g., 'mobile-v1.0.0' -> 'v1.0.0')
          VERSION="${GITHUB_REF_NAME#mobile-}"
          echo "version=$VERSION" >> $GITHUB_OUTPUT
          echo "Extracted version: $VERSION"
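The tag handling above, plus the prerelease test the `Create GitHub Release` step applies later with `contains(...)`, can be restated compactly (hypothetical helper names; the workflow inlines both checks in shell and expression syntax):

```ruby
# Hypothetical restatement of the mobile tag handling: strip the mobile-
# prefix, and classify a tag as prerelease when it mentions alpha/beta/rc,
# matching the workflow's contains() checks.
def mobile_version(tag) = tag.delete_prefix("mobile-")

def prerelease?(tag)
  %w[alpha beta rc].any? { |marker| tag.include?(marker) }
end
```

So `mobile-v1.0.0-beta.1` yields version `v1.0.0-beta.1` and is marked prerelease, while `mobile-v1.0.0` produces a normal release.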
      - name: Download Android APK artifact
        uses: actions/download-artifact@v4.3.0
        with:
          name: app-release-apk
          path: ${{ runner.temp }}/mobile-artifacts

      - name: Download iOS build artifact
        uses: actions/download-artifact@v4.3.0
        with:
          name: ios-build-unsigned
          path: ${{ runner.temp }}/ios-build

      - name: Prepare release assets
        run: |
          set -euo pipefail
          mkdir -p ${{ runner.temp }}/release-assets

          echo "=== Downloaded artifacts ==="
          echo "Mobile artifacts:"
          ls -laR "${{ runner.temp }}/mobile-artifacts" || echo "No mobile-artifacts directory"
          echo "iOS build:"
          ls -laR "${{ runner.temp }}/ios-build" || echo "No ios-build directory"
          echo "==========================="

          # Copy debug APK if it exists
          if [ -f "${{ runner.temp }}/mobile-artifacts/app-debug.apk" ]; then
            cp "${{ runner.temp }}/mobile-artifacts/app-debug.apk" \
              "${{ runner.temp }}/release-assets/sure-${{ steps.version.outputs.version }}-debug.apk"
            echo "✓ Debug APK prepared"
          fi

          # Copy release APK if it exists
          if [ -f "${{ runner.temp }}/mobile-artifacts/app-release.apk" ]; then
            cp "${{ runner.temp }}/mobile-artifacts/app-release.apk" \
              "${{ runner.temp }}/release-assets/sure-${{ steps.version.outputs.version }}.apk"
            echo "✓ Release APK prepared"
          fi

          # Create iOS app archive (zip the .app bundle)
          if [ -d "${{ runner.temp }}/ios-build/ios/iphoneos/Runner.app" ]; then
            cd "${{ runner.temp }}/ios-build/ios/iphoneos"
            zip -r "${{ runner.temp }}/release-assets/sure-${{ steps.version.outputs.version }}-ios-unsigned.zip" Runner.app
            echo "✓ iOS build archive prepared"
          fi

          # Copy iOS build info
          if [ -f "${{ runner.temp }}/ios-build/ios-build-info.txt" ]; then
            cp "${{ runner.temp }}/ios-build/ios-build-info.txt" "${{ runner.temp }}/release-assets/"
          fi

          echo "Release assets:"
          ls -la "${{ runner.temp }}/release-assets/"

          # Fail early if no assets were produced
          if [ -z "$(ls -A "${{ runner.temp }}/release-assets/")" ]; then
            echo "::error::No release assets were produced"
            exit 1
          fi

      - name: Create GitHub Release
        uses: softprops/action-gh-release@v2
        with:
          tag_name: ${{ github.ref_name }}
          name: "${{ steps.version.outputs.version }} (Mobile)"
          draft: false
          prerelease: ${{ contains(github.ref_name, 'alpha') || contains(github.ref_name, 'beta') || contains(github.ref_name, 'rc') }}
          generate_release_notes: false
          files: |
            ${{ runner.temp }}/release-assets/*
          body: |
            ## Mobile-Only Release: ${{ steps.version.outputs.version }}

            This is a mobile-only release. It does not include server-side changes.

            ### Downloads

            - **Android APK**: Debug build for testing on Android devices
            - **iOS Build**: Unsigned iOS build (requires code signing for installation)

            > **Note**: These are builds intended for testing purposes. For production use, please build from source with proper signing credentials.
54  .github/workflows/publish.yml  vendored
@@ -340,21 +340,52 @@ jobs:
      contents: write

    steps:
      - name: Check out main branch
      - name: Determine source branch for tag
        id: source_branch
        run: |
          # Fetch all branches to find which one contains this tag's commit
          git init --quiet
          git remote add origin "https://github.com/${{ github.repository }}.git"
          git fetch origin --quiet

          # Find branches containing the tagged commit
          BRANCHES=$(git branch -r --contains ${{ github.sha }} | grep -v HEAD | sed 's/origin\///' | xargs)
          echo "Branches containing commit: $BRANCHES"

          # Prefer non-main branches (release branches) over main
          SOURCE_BRANCH="main"
          for branch in $BRANCHES; do
            if [ "$branch" != "main" ] && [ "$branch" != "master" ]; then
              SOURCE_BRANCH="$branch"
              break
            fi
          done

          echo "Selected source branch: $SOURCE_BRANCH"
          echo "branch=$SOURCE_BRANCH" >> $GITHUB_OUTPUT
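The branch-preference loop above reduces to "first branch that is not main/master, else main"; a compact restatement (hypothetical helper; the workflow inlines this logic in shell):

```ruby
# Hypothetical Ruby restatement of the source-branch selection above:
# prefer the first release branch over main/master, falling back to main.
def pick_source_branch(branches)
  branches.find { |b| !%w[main master].include?(b) } || "main"
end
```

The fallback matters because a tag cut directly from main yields a branch list containing only `main`, and the later push step still needs a valid target.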
      - name: Check out source branch
        uses: actions/checkout@v4.2.0
        with:
          ref: main
          ref: ${{ steps.source_branch.outputs.branch }}
          token: ${{ secrets.GH_PAT }}

      - name: Bump pre-release version
        run: |
          VERSION_FILE="config/initializers/version.rb"
          CHART_FILE="charts/sure/Chart.yaml"

          # Ensure version file exists
          if [ ! -f "$VERSION_FILE" ]; then
            echo "ERROR: Version file not found: $VERSION_FILE"
            exit 1
          fi

          # Ensure chart file exists
          if [ ! -f "$CHART_FILE" ]; then
            echo "ERROR: Chart file not found: $CHART_FILE"
            exit 1
          fi

          # Extract current version
          CURRENT_VERSION=$(grep -oP '"\K[0-9]+\.[0-9]+\.[0-9]+-(alpha|beta|rc)\.[0-9]+' "$VERSION_FILE")
@@ -394,12 +425,23 @@ jobs:
          echo "Updated version.rb:"
          grep "semver" "$VERSION_FILE"

          # Update Helm chart version and appVersion
          sed -i -E "s/^version: .*/version: ${NEW_VERSION}/" "$CHART_FILE"
          sed -i -E "s/^appVersion: .*/appVersion: \"${NEW_VERSION}\"/" "$CHART_FILE"

          # Verify the change
          echo "Updated Chart.yaml:"
          grep -E "^(version|appVersion):" "$CHART_FILE"

      - name: Commit and push version bump
        env:
          SOURCE_BRANCH: ${{ steps.source_branch.outputs.branch }}
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"

          git add config/initializers/version.rb
          git add charts/sure/Chart.yaml

          # Check if there are changes to commit
          if git diff --cached --quiet; then
@@ -409,9 +451,11 @@ jobs:

          git commit -m "Bump version to next iteration after ${{ github.ref_name }} release"

          echo "Pushing to branch: $SOURCE_BRANCH"

          # Push with retry logic
          attempts=0
          until git push origin main; do
          until git push origin HEAD:$SOURCE_BRANCH; do
            attempts=$((attempts + 1))
            if [[ $attempts -ge 4 ]]; then
              echo "ERROR: Failed to push after 4 attempts." >&2
@@ -420,5 +464,5 @@ jobs:
            delay=$((2 ** attempts))
            echo "Push failed (attempt $attempts). Retrying in ${delay} seconds..."
            sleep ${delay}
            git pull --rebase origin main
          done
            git pull --rebase origin $SOURCE_BRANCH
          done
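The push-with-retry loop above follows a standard exponential-backoff shape; a generic sketch (the helper is hypothetical — the workflow inlines it in shell around `git push`/`git pull --rebase`; the injectable sleeper is for illustration and testing):

```ruby
# Hypothetical generic form of the retry loop above: retry a block up to
# max_attempts times, backing off 2, 4, 8 ... seconds between tries.
def retry_with_backoff(max_attempts, sleeper: ->(s) { sleep(s) })
  attempts = 0
  begin
    yield
  rescue => e
    attempts += 1
    raise e if attempts >= max_attempts          # give up after the cap
    sleeper.call(2**attempts)                    # 2s, 4s, 8s ...
    retry
  end
end
```

With the cap at 4 attempts this matches the workflow's worst case: three sleeps (2 + 4 + 8 seconds) before failing.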
14  AGENTS.md
@@ -34,6 +34,20 @@
- Never commit secrets. Start from `.env.local.example`; use `.env.local` for development only.
- Run `bin/brakeman` before major PRs. Prefer environment variables over hard-coded values.

## API Development Guidelines

### OpenAPI Documentation (MANDATORY)
When adding or modifying API endpoints in `app/controllers/api/v1/`, you **MUST** create or update corresponding OpenAPI request specs for **DOCUMENTATION ONLY**:

1. **Location**: `spec/requests/api/v1/{resource}_spec.rb`
2. **Framework**: RSpec with rswag for OpenAPI generation
3. **Schemas**: Define reusable schemas in `spec/swagger_helper.rb`
4. **Generated Docs**: `docs/api/openapi.yaml`
5. **Regenerate**: Run `RAILS_ENV=test bundle exec rake rswag:specs:swaggerize` after changes

### Post-commit API consistency (LLM checklist)
After every API endpoint commit, ensure: (1) **Minitest** behavioral coverage in `test/controllers/api/v1/{resource}_controller_test.rb` (no behavioral assertions in rswag); (2) **rswag** remains docs-only (no `expect`/`assert_*` in `spec/requests/api/v1/`); (3) **rswag auth** uses the same API key pattern everywhere (`X-Api-Key`, not OAuth/Bearer). Full checklist: [.cursor/rules/api-endpoint-consistency.mdc](.cursor/rules/api-endpoint-consistency.mdc).

## Providers: Pending Transactions and FX Metadata (SimpleFIN/Plaid/Lunchflow)

- Pending detection
53  CLAUDE.md
@@ -82,6 +82,7 @@ The application provides both internal and external APIs:
- External API: `/api/v1/` namespace with Doorkeeper OAuth and API key authentication
- API responses use Jbuilder templates for JSON rendering
- Rate limiting via Rack Attack with configurable limits per API key
- **OpenAPI Documentation**: All API endpoints MUST have corresponding OpenAPI specs in `spec/requests/api/` using rswag. See `docs/api/openapi.yaml` for the generated documentation.

### Sync & Import System
Two primary data ingestion methods:
@@ -164,6 +165,7 @@ Sidekiq handles asynchronous tasks:
- Test helpers in `test/support/` for common scenarios
- Only test critical code paths that significantly increase confidence
- Write tests as you go, when required
- **API Endpoints require OpenAPI specs** in `spec/requests/api/` for documentation purposes ONLY, not tests (uses RSpec + rswag)

### Performance Considerations
- Database queries optimized with proper indexes
@@ -323,4 +325,53 @@ end
### Stubs and Mocks
- Use `mocha` gem
- Prefer `OpenStruct` for mock instances
- Only mock what's necessary
- Only mock what's necessary

## API Development Guidelines

### OpenAPI Documentation (MANDATORY)
When adding or modifying API endpoints in `app/controllers/api/v1/`, you **MUST** create or update corresponding OpenAPI request specs:

1. **Location**: `spec/requests/api/v1/{resource}_spec.rb`
2. **Framework**: RSpec with rswag for OpenAPI generation
3. **Schemas**: Define reusable schemas in `spec/swagger_helper.rb`
4. **Generated Docs**: `docs/api/openapi.yaml`

**Example structure for a new API endpoint:**
```ruby
# spec/requests/api/v1/widgets_spec.rb
require 'swagger_helper'

RSpec.describe 'API V1 Widgets', type: :request do
  path '/api/v1/widgets' do
    get 'List widgets' do
      tags 'Widgets'
      security [ { apiKeyAuth: [] } ]
      produces 'application/json'

      response '200', 'widgets listed' do
        schema '$ref' => '#/components/schemas/WidgetCollection'
        run_test!
      end
    end
  end
end
```

**Regenerate OpenAPI docs after changes:**
```bash
RAILS_ENV=test bundle exec rake rswag:specs:swaggerize
```

### Post-commit API consistency (issue #944)
After every API endpoint commit, ensure:

1. **Minitest behavioral coverage** — Add or update tests in `test/controllers/api/v1/{resource}_controller_test.rb`. Use API key and `api_headers` (X-Api-Key). Cover index/show, CRUD where relevant, 401/403/422/404. Do not rely on rswag for behavioral assertions.

2. **rswag docs-only** — Do not add `expect(...)` or `assert_*` in `spec/requests/api/v1/`. Use `run_test!` only so specs document request/response and regenerate `docs/api/openapi.yaml`.

3. **Same API key auth in rswag** — Every request spec in `spec/requests/api/v1/` must use the same API key pattern (`ApiKey.generate_secure_key`, `ApiKey.create!(...)`, `let(:'X-Api-Key') { api_key.plain_key }`). Do not use Doorkeeper/OAuth in those specs so generated docs stay consistent.

Full checklist and pattern: [.cursor/rules/api-endpoint-consistency.mdc](.cursor/rules/api-endpoint-consistency.mdc).

To verify the implementation: `ruby test/support/verify_api_endpoint_consistency.rb`. To scan the current APIs for violations: `ruby test/support/verify_api_endpoint_consistency.rb --compliance`.
6  Gemfile
@@ -60,6 +60,7 @@ gem "countries"
# OAuth & API Security
gem "doorkeeper"
gem "rack-attack", "~> 6.6"
gem "rack-cors"
gem "pundit"
gem "faraday"
gem "faraday-retry"
@@ -80,6 +81,7 @@ gem "rotp", "~> 6.3"
gem "rqrcode", "~> 3.0"
gem "activerecord-import"
gem "rubyzip", "~> 2.3"
gem "pdf-reader", "~> 2.12"

# OpenID Connect, OAuth & SAML authentication
gem "omniauth", "~> 2.1"
@@ -93,10 +95,6 @@ gem "omniauth-saml", "~> 2.1"
gem "aasm"
gem "after_commit_everywhere", "~> 1.0"

# Feature flags
gem "flipper"
gem "flipper-active_record"

# AI
gem "ruby-openai"
gem "langfuse-ruby", "~> 0.1.4", require: "langfuse"
38  Gemfile.lock
@@ -1,6 +1,7 @@
GEM
  remote: https://rubygems.org/
  specs:
    Ascii85 (2.0.1)
    aasm (5.5.1)
      concurrent-ruby (~> 1.0)
    actioncable (7.2.2.2)
@@ -79,6 +80,7 @@ GEM
    addressable (2.8.7)
      public_suffix (>= 2.0.2, < 7.0)
    aes_key_wrap (1.1.0)
    afm (1.0.0)
    after_commit_everywhere (1.6.0)
      activerecord (>= 4.2)
      activesupport
@@ -197,7 +199,7 @@ GEM
    event_stream_parser (1.0.0)
    faker (3.5.2)
      i18n (>= 1.8.11, < 2)
    faraday (2.13.2)
    faraday (2.14.1)
      faraday-net_http (>= 2.0, < 3.5)
      json
      logger
@@ -205,8 +207,8 @@ GEM
      faraday (>= 1, < 3)
    faraday-multipart (1.1.1)
      multipart-post (~> 2.0)
    faraday-net_http (3.4.1)
      net-http (>= 0.5.0)
    faraday-net_http (3.4.2)
      net-http (~> 0.5)
    faraday-retry (2.3.2)
      faraday (~> 2.0)
    ffi (1.17.2-aarch64-linux-gnu)
@@ -217,11 +219,6 @@ GEM
    ffi (1.17.2-x86_64-darwin)
    ffi (1.17.2-x86_64-linux-gnu)
    ffi (1.17.2-x86_64-linux-musl)
    flipper (1.3.6)
      concurrent-ruby (< 2)
    flipper-active_record (1.3.6)
      activerecord (>= 4.2, < 9)
      flipper (~> 1.3.6)
    foreman (0.88.1)
    fugit (1.11.1)
      et-orbi (~> 1, >= 1.2.11)
@@ -232,6 +229,7 @@ GEM
    globalid (1.2.1)
      activesupport (>= 6.1)
    hashdiff (1.2.0)
    hashery (2.1.2)
    hashie (5.0.0)
    heapy (0.2.0)
      thor
@@ -284,7 +282,7 @@ GEM
      actionview (>= 5.0.0)
      activesupport (>= 5.0.0)
    jmespath (1.6.2)
    json (2.12.2)
    json (2.18.1)
    json-jwt (1.16.7)
      activesupport (>= 4.2)
      aes_key_wrap
@@ -364,8 +362,8 @@ GEM
      bigdecimal (>= 3.1, < 5)
    multipart-post (2.4.1)
    mutex_m (0.3.0)
    net-http (0.6.0)
      uri
    net-http (0.9.1)
      uri (>= 0.11.1)
    net-imap (0.5.8)
      date
      net-protocol
@@ -446,6 +444,12 @@ GEM
    parser (3.3.8.0)
      ast (~> 2.4.1)
      racc
    pdf-reader (2.15.1)
      Ascii85 (>= 1.0, < 3.0, != 2.0.0)
      afm (>= 0.2.1, < 2)
      hashery (~> 2.0)
      ruby-rc4
      ttfunk
    pg (1.5.9)
    plaid (41.0.0)
      faraday (>= 1.0.1, < 3.0)
@@ -477,6 +481,9 @@ GEM
    rack (3.1.18)
    rack-attack (6.7.0)
      rack (>= 1.0, < 4)
    rack-cors (3.0.0)
      logger
      rack (>= 3.0.14)
    rack-mini-profiler (4.0.0)
      rack (>= 1.2.0)
    rack-oauth2 (2.2.1)
@@ -626,6 +633,7 @@ GEM
      faraday (>= 1)
      faraday-multipart (>= 1)
    ruby-progressbar (1.13.0)
    ruby-rc4 (0.1.5)
    ruby-saml (1.18.1)
      nokogiri (>= 1.13.10)
      rexml
@@ -709,6 +717,8 @@ GEM
      unicode-display_width (>= 1.1.1, < 4)
    thor (1.4.0)
    timeout (0.4.3)
    ttfunk (1.8.0)
      bigdecimal (~> 3.1)
    turbo-rails (2.0.16)
      actionpack (>= 7.1.0)
      railties (>= 7.1.0)
@@ -719,7 +729,7 @@ GEM
    unicode-display_width (3.1.4)
      unicode-emoji (~> 4.0, >= 4.0.4)
    unicode-emoji (4.0.4)
    uri (1.0.4)
    uri (1.1.1)
    useragent (0.16.11)
    validate_url (1.0.15)
      activemodel (>= 3.0.0)
@@ -788,8 +798,6 @@ DEPENDENCIES
  faraday
  faraday-multipart
  faraday-retry
  flipper
  flipper-active_record
  foreman
  hotwire-livereload
  hotwire_combobox
@@ -815,6 +823,7 @@ DEPENDENCIES
  omniauth_openid_connect
  ostruct
  pagy
  pdf-reader (~> 2.12)
  pg (~> 1.5)
  plaid
  posthog-ruby
@@ -822,6 +831,7 @@ DEPENDENCIES
  puma (>= 5.0)
  pundit
  rack-attack (~> 6.6)
  rack-cors
  rack-mini-profiler
  rails (~> 7.2.2)
  rails-settings-cached
@@ -1,16 +1,14 @@
<%= wrapper_element do %>
  <%= tag.dialog class: "w-full h-full bg-transparent theme-dark:backdrop:bg-alpha-black-900 backdrop:bg-overlay pt-[env(safe-area-inset-top)] pb-[env(safe-area-inset-bottom)] #{drawer? ? "lg:p-3" : "lg:p-1"}", **merged_opts do %>
  <%= tag.dialog class: "w-full h-full bg-transparent theme-dark:backdrop:bg-alpha-black-900 backdrop:bg-overlay pt-[env(safe-area-inset-top)] pb-[env(safe-area-inset-bottom)] #{(drawer? || responsive?) ? "lg:p-3" : "lg:p-1"}", **merged_opts do %>
    <%= tag.div class: dialog_outer_classes do %>
      <%= tag.div class: dialog_inner_classes, data: { DS__dialog_target: "content" } do %>
        <div class="grow overflow-y-auto py-4 space-y-4 flex flex-col">
          <% if header? %>
            <%= header %>
          <% end %>

          <% if body? %>
            <div class="px-4 grow">
              <%= body %>

              <% if sections.any? %>
                <div class="space-y-4">
                  <% sections.each do |section| %>
@@ -20,11 +18,9 @@
          <% end %>
        </div>
      <% end %>

      <%# Optional, for customizing dialogs %>
      <%= content %>
    </div>

    <% if actions? %>
      <div class="flex items-center gap-2 justify-end p-4">
        <% actions.each do |action| %>
@@ -1,9 +1,9 @@
class DS::Dialog < DesignSystemComponent
  renders_one :header, ->(title: nil, subtitle: nil, hide_close_icon: false, **opts, &block) do
  renders_one :header, ->(title: nil, subtitle: nil, custom_header: false, **opts, &block) do
    content_tag(:header, class: "px-4 flex flex-col gap-2", **opts) do
      title_div = content_tag(:div, class: "flex items-center justify-between gap-2") do
        title = content_tag(:h2, title, class: class_names("font-medium text-primary", drawer? ? "text-lg" : "")) if title
        close_icon = render DS::Button.new(variant: "icon", class: "ml-auto", icon: "x", tabindex: "-1", data: { action: "DS--dialog#close" }) unless hide_close_icon
        close_icon = close_button unless custom_header
        safe_join([ title, close_icon ].compact)
      end

@@ -33,7 +33,7 @@ class DS::Dialog < DesignSystemComponent
    end
  end

  attr_reader :variant, :auto_open, :reload_on_close, :width, :disable_frame, :content_class, :disable_click_outside, :opts
  attr_reader :variant, :auto_open, :reload_on_close, :width, :disable_frame, :content_class, :disable_click_outside, :opts, :responsive

  VARIANTS = %w[modal drawer].freeze
  WIDTHS = {
@@ -43,7 +43,7 @@ class DS::Dialog < DesignSystemComponent
    full: "lg:max-w-full"
  }.freeze

  def initialize(variant: "modal", auto_open: true, reload_on_close: false, width: "md", frame: nil, disable_frame: false, content_class: nil, disable_click_outside: false, **opts)
  def initialize(variant: "modal", auto_open: true, reload_on_close: false, width: "md", frame: nil, disable_frame: false, content_class: nil, disable_click_outside: false, responsive: false, **opts)
    @variant = variant.to_sym
    @auto_open = auto_open
    @reload_on_close = reload_on_close
@@ -52,6 +52,7 @@ class DS::Dialog < DesignSystemComponent
    @disable_frame = disable_frame
    @content_class = content_class
    @disable_click_outside = disable_click_outside
    @responsive = responsive
    @opts = opts
  end

@@ -69,7 +70,9 @@ class DS::Dialog < DesignSystemComponent
  end

  def dialog_outer_classes
    variant_classes = if drawer?
    variant_classes = if responsive?
      "items-center justify-center lg:items-end lg:justify-end"
    elsif drawer?
      "items-end justify-end"
    else
      "items-center justify-center"
@@ -82,7 +85,9 @@ class DS::Dialog < DesignSystemComponent
  end

  def dialog_inner_classes
    variant_classes = if drawer?
    variant_classes = if responsive?
      "max-h-full lg:h-full lg:w-[550px]"
    elsif drawer?
      "lg:w-[550px] h-full"
    else
      class_names(
@@ -116,4 +121,20 @@ class DS::Dialog < DesignSystemComponent
  def drawer?
    variant == :drawer
  end

  def responsive?
    @responsive
  end

  def close_button
    classes = responsive? ? "ml-auto hidden lg:flex" : "ml-auto"
    render DS::Button.new(
      variant: "icon",
      class: classes,
      icon: "x",
      title: I18n.t("common.close"),
      aria_label: I18n.t("common.close"),
      data: { action: "DS--dialog#close" }
    )
  end
end
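The variant-class branching added to `dialog_outer_classes` can be condensed into a plain-Ruby sketch (a standalone approximation of the component method, not the component itself): `responsive` dialogs center on small screens and dock to the right edge on `lg`, taking precedence over the plain drawer layout.

```ruby
# Hypothetical condensation of DS::Dialog#dialog_outer_classes above:
# responsive wins over drawer, which wins over the default centered modal.
def dialog_outer_classes(responsive:, drawer:)
  if responsive
    "items-center justify-center lg:items-end lg:justify-end"
  elsif drawer
    "items-end justify-end"
  else
    "items-center justify-center"
  end
end
```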
@@ -50,7 +50,8 @@ class DS::MenuItem < DesignSystemComponent
    data = merged_opts.delete(:data) || {}

    if confirm.present?
      data = data.merge(turbo_confirm: confirm.to_data_attribute)
      confirm_value = confirm.respond_to?(:to_data_attribute) ? confirm.to_data_attribute : confirm
      data = data.merge(turbo_confirm: confirm_value)
    end

    if frame.present?
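The change guards against confirm values that are plain strings (or hashes) rather than objects exposing `#to_data_attribute`. In isolation, the duck-typing looks like this (helper name and `CustomConfirm` struct are illustrative):

```ruby
require "json"

# Mirror of the defensive lookup above: use #to_data_attribute when the
# confirm object provides it, otherwise pass the value through unchanged.
def confirm_data_value(confirm)
  confirm.respond_to?(:to_data_attribute) ? confirm.to_data_attribute : confirm
end

# Hypothetical rich confirm object, standing in for the app's confirm class.
CustomConfirm = Struct.new(:title) do
  def to_data_attribute = { title: title }.to_json
end
```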
@@ -14,6 +14,7 @@ class AccountsController < ApplicationController
    @mercury_items = family.mercury_items.ordered.includes(:syncs, :mercury_accounts)
    @coinbase_items = family.coinbase_items.ordered.includes(:coinbase_accounts, :accounts, :syncs)
    @snaptrade_items = family.snaptrade_items.ordered.includes(:syncs, :snaptrade_accounts)
    @indexa_capital_items = family.indexa_capital_items.ordered.includes(:syncs, :indexa_capital_accounts)

    # Build sync stats maps for all providers
    build_sync_stats_maps
@@ -116,14 +117,11 @@ class AccountsController < ApplicationController
    # Capture provider accounts before clearing links (so we can destroy them)
    simplefin_account_to_destroy = @account.simplefin_account

    # Capture SnaptradeAccounts linked via AccountProvider
    # Destroying them will trigger delete_snaptrade_connection callback to free connection slots
    snaptrade_accounts_to_destroy = @account.account_providers
      .where(provider_type: "SnaptradeAccount")
      .map { |ap| SnaptradeAccount.find_by(id: ap.provider_id) }
      .compact

    # Remove new system links (account_providers join table)
    # SnaptradeAccount records are preserved (not destroyed) so users can relink later.
    # This follows the Plaid pattern where the provider account survives as "unlinked".
    # SnapTrade has limited connection slots (5 free), so preserving the record avoids
    # wasting a slot on reconnect.
    @account.account_providers.destroy_all

    # Remove legacy system links (foreign keys)
@@ -135,11 +133,6 @@ class AccountsController < ApplicationController
    # - SimplefinAccount only caches API data which is regenerated on reconnect
    # - If user reconnects SimpleFin later, a new SimplefinAccount will be created
    simplefin_account_to_destroy&.destroy!

    # Destroy SnaptradeAccount records to free up SnapTrade connection slots
    # The before_destroy callback will delete the connection from SnapTrade API
    # if no other accounts share the same authorization
    snaptrade_accounts_to_destroy.each(&:destroy!)
  end

  redirect_to accounts_path, notice: t("accounts.unlink.success")
@@ -277,5 +270,12 @@ class AccountsController < ApplicationController
        .count
      @coinbase_unlinked_count_map[item.id] = count
    end

    # IndexaCapital sync stats
    @indexa_capital_sync_stats_map = {}
    @indexa_capital_items.each do |item|
      latest_sync = item.syncs.ordered.first
      @indexa_capital_sync_stats_map[item.id] = latest_sync&.sync_stats || {}
    end
  end
end
@@ -6,6 +6,10 @@ module Api
  skip_before_action :authenticate_request!
  skip_before_action :check_api_key_rate_limit
  skip_before_action :log_api_access
  before_action :authenticate_request!, only: :enable_ai
  before_action :ensure_write_scope, only: :enable_ai
  before_action :check_api_key_rate_limit, only: :enable_ai
  before_action :log_api_access, only: :enable_ai

  def signup
    # Check if invite code is required
@@ -46,17 +50,15 @@ module Api
    InviteCode.claim!(params[:invite_code]) if params[:invite_code].present?

    # Create device and OAuth token
    device = create_or_update_device(user)
    token_response = create_oauth_token_for_device(user, device)
    begin
      device = MobileDevice.upsert_device!(user, device_params)
      token_response = device.issue_token!
    rescue ActiveRecord::RecordInvalid => e
      render json: { error: "Failed to register device: #{e.message}" }, status: :unprocessable_entity
      return
    end

    render json: token_response.merge(
      user: {
        id: user.id,
        email: user.email,
        first_name: user.first_name,
        last_name: user.last_name
      }
    ), status: :created
    render json: token_response.merge(user: mobile_user_payload(user)), status: :created
  else
    render json: { errors: user.errors.full_messages }, status: :unprocessable_entity
  end
@@ -84,22 +86,75 @@ module Api
    end

    # Create device and OAuth token
    device = create_or_update_device(user)
    token_response = create_oauth_token_for_device(user, device)
    begin
      device = MobileDevice.upsert_device!(user, device_params)
      token_response = device.issue_token!
    rescue ActiveRecord::RecordInvalid => e
      render json: { error: "Failed to register device: #{e.message}" }, status: :unprocessable_entity
      return
    end

    render json: token_response.merge(
      user: {
        id: user.id,
        email: user.email,
        first_name: user.first_name,
        last_name: user.last_name
      }
    )
    render json: token_response.merge(user: mobile_user_payload(user))
  else
    render json: { error: "Invalid email or password" }, status: :unauthorized
  end
end

def sso_exchange
  code = sso_exchange_params

  if code.blank?
    render json: { error: "invalid_or_expired_code", message: "Authorization code is required" }, status: :unauthorized
    return
  end

  cache_key = "mobile_sso:#{code}"
  cached = Rails.cache.read(cache_key)

  unless cached.present?
    render json: { error: "invalid_or_expired_code", message: "Authorization code is invalid or expired" }, status: :unauthorized
    return
  end

  # Atomic delete — only the request that successfully deletes the key may proceed.
  # This prevents a race where two concurrent requests both read the same code.
  unless Rails.cache.delete(cache_key)
    render json: { error: "invalid_or_expired_code", message: "Authorization code is invalid or expired" }, status: :unauthorized
    return
  end

  render json: {
    access_token: cached[:access_token],
    refresh_token: cached[:refresh_token],
    token_type: cached[:token_type],
    expires_in: cached[:expires_in],
    created_at: cached[:created_at],
    user: {
      id: cached[:user_id],
      email: cached[:user_email],
      first_name: cached[:user_first_name],
      last_name: cached[:user_last_name],
      ui_layout: cached[:user_ui_layout],
      ai_enabled: cached[:user_ai_enabled]
    }
  }
end
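The `sso_exchange` flow above relies on a delete that reports whether it removed anything, so a code can be redeemed at most once even under concurrent requests. Here is a minimal plain-Ruby sketch of that single-use pattern, using an in-memory store with a `Mutex` in place of `Rails.cache` (the class and method names are illustrative, not from the codebase):

```ruby
require "securerandom"

# A code may be redeemed at most once: only the caller whose delete
# actually removes the key "wins", mirroring the Rails.cache.delete guard.
class OneTimeCodeStore
  def initialize
    @mutex = Mutex.new
    @store = {}
  end

  def issue(payload)
    code = SecureRandom.hex(8)
    @mutex.synchronize { @store[code] = payload }
    code
  end

  # Returns the payload exactly once; nil for unknown or already-used codes.
  def redeem(code)
    @mutex.synchronize { @store.delete(code) } # Hash#delete returns nil if absent
  end
end

store = OneTimeCodeStore.new
code = store.issue(access_token: "abc123")

first  = store.redeem(code)  # => { access_token: "abc123" }
second = store.redeem(code)  # => nil (code already consumed)
```

The same property is what the controller checks: a falsy delete means another request already consumed the code, and the caller gets `invalid_or_expired_code`.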

def enable_ai
  user = current_resource_owner

  unless user.ai_available?
    render json: { error: "AI is not available for your account" }, status: :forbidden
    return
  end

  if user.update(ai_enabled: true)
    render json: { user: mobile_user_payload(user) }
  else
    render json: { errors: user.errors.full_messages }, status: :unprocessable_entity
  end
end

def refresh
  # Find the refresh token
  refresh_token = params[:refresh_token]
@@ -121,6 +176,7 @@ module Api
  new_token = Doorkeeper::AccessToken.create!(
    application: access_token.application,
    resource_owner_id: access_token.resource_owner_id,
    mobile_device_id: access_token.mobile_device_id,
    expires_in: 30.days.to_i,
    scopes: access_token.scopes,
    use_refresh_token: true
@@ -173,39 +229,28 @@ module Api
  required_fields.all? { |field| device[field].present? }
end

def create_or_update_device(user)
  # Handle both string and symbol keys
  device_data = params[:device].permit(:device_id, :device_name, :device_type, :os_version, :app_version)

  device = user.mobile_devices.find_or_initialize_by(device_id: device_data[:device_id])
  device.update!(device_data.merge(last_seen_at: Time.current))
  device
def device_params
  params.require(:device).permit(:device_id, :device_name, :device_type, :os_version, :app_version)
end

def create_oauth_token_for_device(user, device)
  # Create OAuth application for this device if needed
  oauth_app = device.create_oauth_application!

  # Revoke any existing tokens for this device
  device.revoke_all_tokens!

  # Create new access token with 30-day expiration
  access_token = Doorkeeper::AccessToken.create!(
    application: oauth_app,
    resource_owner_id: user.id,
    expires_in: 30.days.to_i,
    scopes: "read_write",
    use_refresh_token: true
  )
def sso_exchange_params
  params.require(:code)
end

def mobile_user_payload(user)
  {
    access_token: access_token.plaintext_token,
    refresh_token: access_token.plaintext_refresh_token,
    token_type: "Bearer",
    expires_in: access_token.expires_in,
    created_at: access_token.created_at.to_i
    id: user.id,
    email: user.email,
    first_name: user.first_name,
    last_name: user.last_name,
    ui_layout: user.ui_layout,
    ai_enabled: user.ai_enabled?
  }
end

def ensure_write_scope
  authorize_scope!(:write)
end
end
end
end

108 app/controllers/api/v1/holdings_controller.rb Normal file
@@ -0,0 +1,108 @@
# frozen_string_literal: true

class Api::V1::HoldingsController < Api::V1::BaseController
  include Pagy::Backend

  before_action :ensure_read_scope
  before_action :set_holding, only: [ :show ]

  def index
    family = current_resource_owner.family
    holdings_query = family.holdings.joins(:account).where(accounts: { status: [ "draft", "active" ] })

    holdings_query = apply_filters(holdings_query)
    holdings_query = holdings_query.includes(:account, :security).chronological

    @pagy, @holdings = pagy(
      holdings_query,
      page: safe_page_param,
      limit: safe_per_page_param
    )
    @per_page = safe_per_page_param

    render :index
  rescue ArgumentError => e
    render_validation_error(e.message, [ e.message ])
  rescue => e
    log_and_render_error("index", e)
  end

  def show
    render :show
  rescue => e
    log_and_render_error("show", e)
  end

  private

    def set_holding
      family = current_resource_owner.family
      @holding = family.holdings.joins(:account).where(accounts: { status: %w[draft active] }).find(params[:id])
    rescue ActiveRecord::RecordNotFound
      render json: { error: "not_found", message: "Holding not found" }, status: :not_found
    end

    def ensure_read_scope
      authorize_scope!(:read)
    end

    def apply_filters(query)
      if params[:account_id].present?
        query = query.where(account_id: params[:account_id])
      end
      if params[:account_ids].present?
        query = query.where(account_id: Array(params[:account_ids]))
      end
      if params[:date].present?
        query = query.where(date: parse_date!(params[:date], "date"))
      end
      if params[:start_date].present?
        query = query.where("holdings.date >= ?", parse_date!(params[:start_date], "start_date"))
      end
      if params[:end_date].present?
        query = query.where("holdings.date <= ?", parse_date!(params[:end_date], "end_date"))
      end
      if params[:security_id].present?
        query = query.where(security_id: params[:security_id])
      end
      query
    end

    def safe_page_param
      page = params[:page].to_i
      page > 0 ? page : 1
    end

    def safe_per_page_param
      per_page = params[:per_page].to_i
      case per_page
      when 1..100
        per_page
      else
        25
      end
    end
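The two pagination guards above can be exercised in isolation. This is a hedged sketch (standalone methods, not the controller's) showing why `String#to_i` returning 0 for garbage makes both fallbacks kick in:

```ruby
# page falls back to 1; per_page is accepted only in 1..100, else 25.
# String#to_i and nil.to_i both yield 0, which each guard rejects.
def safe_page(raw)
  page = raw.to_i
  page > 0 ? page : 1
end

def safe_per_page(raw)
  per_page = raw.to_i
  (1..100).cover?(per_page) ? per_page : 25
end

safe_page("3")        # => 3
safe_page("abc")      # => 1  ("abc".to_i == 0)
safe_per_page("50")   # => 50
safe_per_page("500")  # => 25 (out of range)
safe_per_page(nil)    # => 25 (nil.to_i == 0)
```

Clamping in the controller rather than trusting Pagy to reject bad input keeps the error surface small: malformed `page`/`per_page` never raise, they just degrade to defaults.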

    def parse_date!(value, param_name)
      Date.parse(value)
    rescue Date::Error, ArgumentError, TypeError
      raise ArgumentError, "Invalid #{param_name} format"
    end

    def render_validation_error(message, errors)
      render json: {
        error: "validation_failed",
        message: message,
        errors: errors
      }, status: :unprocessable_entity
    end

    def log_and_render_error(action, exception)
      Rails.logger.error "HoldingsController##{action} error: #{exception.message}"
      Rails.logger.error exception.backtrace.join("\n")
      render json: {
        error: "internal_server_error",
        message: "Error: #{exception.message}"
      }, status: :internal_server_error
    end
end

@@ -67,7 +67,7 @@ class Api::V1::ImportsController < Api::V1::BaseController
    }, status: :unprocessable_entity
  end

  unless Import::ALLOWED_MIME_TYPES.include?(file.content_type)
  unless Import::ALLOWED_CSV_MIME_TYPES.include?(file.content_type)
    return render json: {
      error: "invalid_file_type",
      message: "Invalid file type. Please upload a CSV file."

313 app/controllers/api/v1/trades_controller.rb Normal file
@@ -0,0 +1,313 @@
# frozen_string_literal: true

class Api::V1::TradesController < Api::V1::BaseController
  include Pagy::Backend

  before_action :ensure_read_scope, only: [ :index, :show ]
  before_action :ensure_write_scope, only: [ :create, :update, :destroy ]
  before_action :set_trade, only: [ :show, :update, :destroy ]

  def index
    family = current_resource_owner.family
    trades_query = family.trades.visible

    trades_query = apply_filters(trades_query)
    trades_query = trades_query.includes({ entry: :account }, :security, :category).reverse_chronological

    @pagy, @trades = pagy(
      trades_query,
      page: safe_page_param,
      limit: safe_per_page_param
    )
    @per_page = safe_per_page_param

    render :index
  rescue ArgumentError => e
    render_validation_error(e.message, [ e.message ])
  rescue => e
    log_and_render_error("index", e)
  end

  def show
    render :show
  rescue => e
    log_and_render_error("show", e)
  end

  def create
    unless trade_params[:account_id].present?
      return render_validation_error("Account ID is required", [ "Account ID is required" ])
    end

    account = current_resource_owner.family.accounts.visible.find(trade_params[:account_id])

    unless account.supports_trades?
      return render_validation_error(
        "Account does not support trades (investment or crypto exchange only)",
        [ "Account must be an investment or crypto exchange account" ]
      )
    end

    create_params = build_create_form_params(account)
    return if performed? # build_create_form_params may have rendered validation errors

    model = Trade::CreateForm.new(create_params).create

    unless model.persisted?
      errors = model.is_a?(Entry) ? model.errors.full_messages : [ "Trade could not be created" ]
      return render_validation_error("Trade could not be created", errors)
    end

    if model.is_a?(Entry)
      model.lock_saved_attributes!
      model.mark_user_modified!
      model.sync_account_later
      @trade = model.trade
    else
      @trade = model
    end

    apply_trade_create_options!
    return if performed?

    @entry = @trade.entry
    render :show, status: :created
  rescue ActiveRecord::RecordNotFound => e
    message = (e.model == "Account") ? "Account not found" : "Security not found"
    render json: { error: "not_found", message: message }, status: :not_found
  rescue => e
    log_and_render_error("create", e)
  end

  def update
    updatable = build_entry_params_for_update

    if @entry.update(updatable.except(:nature))
      @entry.lock_saved_attributes!
      @entry.mark_user_modified!
      @entry.sync_account_later
      @trade = @entry.trade
      render :show
    else
      render_validation_error("Trade could not be updated", @entry.errors.full_messages)
    end
  rescue => e
    log_and_render_error("update", e)
  end

  def destroy
    @entry = @trade.entry
    @entry.destroy!
    @entry.sync_account_later

    render json: { message: "Trade deleted successfully" }, status: :ok
  rescue => e
    log_and_render_error("destroy", e)
  end

  private

    def set_trade
      family = current_resource_owner.family
      @trade = family.trades.visible.find(params[:id])
      @entry = @trade.entry
    rescue ActiveRecord::RecordNotFound
      render json: { error: "not_found", message: "Trade not found" }, status: :not_found
    end

    def ensure_read_scope
      authorize_scope!(:read)
    end

    def ensure_write_scope
      authorize_scope!(:write)
    end

    def apply_filters(query)
      need_entry_join = params[:account_id].present? || params[:account_ids].present? ||
        params[:start_date].present? || params[:end_date].present?
      query = query.joins(:entry) if need_entry_join

      if params[:account_id].present?
        query = query.where(entries: { account_id: params[:account_id] })
      end
      if params[:account_ids].present?
        query = query.where(entries: { account_id: Array(params[:account_ids]) })
      end
      if params[:start_date].present?
        query = query.where("entries.date >= ?", parse_date!(params[:start_date], "start_date"))
      end
      if params[:end_date].present?
        query = query.where("entries.date <= ?", parse_date!(params[:end_date], "end_date"))
      end
      query
    end

    def trade_params
      params.require(:trade).permit(
        :account_id, :date, :qty, :price, :currency,
        :security_id, :ticker, :manual_ticker, :investment_activity_label, :category_id
      )
    end

    def trade_update_params
      params.require(:trade).permit(
        :name, :date, :amount, :currency, :notes, :nature, :type,
        :qty, :price, :investment_activity_label, :category_id
      )
    end

    def build_entry_params_for_update
      flat = trade_update_params.to_h
      entry_params = {
        name: flat[:name],
        date: flat[:date],
        amount: flat[:amount],
        currency: flat[:currency],
        notes: flat[:notes],
        entryable_type: "Trade",
        entryable_attributes: {
          id: @trade.id,
          investment_activity_label: flat[:investment_activity_label],
          category_id: flat[:category_id]
        }.compact_blank
      }.compact

      original_qty = flat[:qty]
      original_price = flat[:price]
      type_or_nature = flat[:type].presence || flat[:nature]

      if original_qty.present? || original_price.present?
        qty = original_qty.present? ? original_qty : @trade.qty.abs
        price = original_price.present? ? original_price : @trade.price
        is_sell = type_or_nature.present? ? trade_sell_from_type_or_nature?(type_or_nature) : @trade.qty.negative?
        signed_qty = is_sell ? -qty.to_d.abs : qty.to_d.abs
        entry_params[:entryable_attributes][:qty] = signed_qty
        entry_params[:amount] = signed_qty * price.to_d
        ticker = @trade.security&.ticker
        entry_params[:name] = Trade.build_name(is_sell ? "sell" : "buy", signed_qty.abs, ticker) if ticker.present?
        entry_params[:entryable_attributes][:investment_activity_label] = flat[:investment_activity_label].presence || @trade.investment_activity_label.presence || (is_sell ? "Sell" : "Buy")
      end

      entry_params
    end

    # True for sell: "sell" or "inflow". False for buy: "buy", "outflow", or blank. Keeps create (buy/sell) and update (type or nature) consistent.
    def trade_sell_from_type_or_nature?(value)
      return false if value.blank?

      normalized = value.to_s.downcase.strip
      %w[sell inflow].include?(normalized)
    end
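The sign convention used in `build_entry_params_for_update` can be sketched without the models: "sell"/"inflow" normalize to a negative quantity, and anything else ("buy", "outflow", blank) to positive, regardless of the sign the client sent. A minimal standalone version (method names are illustrative, not the controller's):

```ruby
require "bigdecimal"
require "bigdecimal/util"

# "sell"/"inflow" (any case, surrounding whitespace ignored) mean sell.
def sell?(value)
  %w[sell inflow].include?(value.to_s.downcase.strip)
end

# The sign is derived from the type, never from the submitted qty:
# sells are stored negative, buys positive.
def signed_qty(type_or_nature, qty)
  sell?(type_or_nature) ? -qty.to_d.abs : qty.to_d.abs
end

signed_qty("Sell", "10")    # => -10
signed_qty("inflow", "2.5") # => -2.5
signed_qty("buy", "-3")     # => 3 (sign comes from the type, not the input)
```

Taking `.abs` before applying the sign is what makes `type: "buy", qty: "-3"` safe: the client cannot smuggle a sell through a negative quantity.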

    def build_create_form_params(account)
      type = params.dig(:trade, :type).to_s.downcase
      unless %w[buy sell].include?(type)
        render_validation_error("Type must be buy or sell", [ "type must be 'buy' or 'sell'" ])
        return nil
      end

      ticker_value = nil
      manual_ticker_value = nil

      unless trade_params[:date].present?
        render_validation_error("Date is required", [ "date must be present" ])
        return nil
      end

      if trade_params[:security_id].present?
        security = Security.find(trade_params[:security_id])
        ticker_value = security.exchange_operating_mic.present? ? "#{security.ticker}|#{security.exchange_operating_mic}" : security.ticker
      elsif trade_params[:ticker].present?
        ticker_value = trade_params[:ticker]
      elsif trade_params[:manual_ticker].present?
        manual_ticker_value = trade_params[:manual_ticker]
      else
        render_validation_error("Security identifier required", [ "Provide security_id, ticker, or manual_ticker" ])
        return nil
      end

      qty_raw = trade_params[:qty].to_s.strip
      price_raw = trade_params[:price].to_s.strip
      return render_validation_error("Quantity and price are required", [ "qty and price must be present and positive" ]) if qty_raw.blank? || price_raw.blank?

      qty = qty_raw.to_d
      price = price_raw.to_d
      if qty <= 0 || price <= 0
        # Non-numeric input (e.g. "abc") becomes 0 with to_d; give a clearer message than "must be present"
        non_numeric = (qty.zero? && qty_raw !~ /\A0(\.0*)?\z/) || (price.zero? && price_raw !~ /\A0(\.0*)?\z/)
        return render_validation_error("Quantity and price must be valid numbers", [ "qty and price must be valid positive numbers" ]) if non_numeric
        return render_validation_error("Quantity and price are required", [ "qty and price must be present and positive" ])
      end

      {
        account: account,
        date: trade_params[:date],
        qty: qty,
        price: price,
        currency: trade_params[:currency].presence || account.currency,
        type: type,
        ticker: ticker_value,
        manual_ticker: manual_ticker_value
      }.compact
    end
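The regex probe in the numeric check above exists because `String#to_d` (from `bigdecimal/util`) never raises: it returns 0 for garbage, so `qty <= 0` alone cannot tell `"abc"` apart from a literal `"0"`. A small sketch of that distinction (the helper name is illustrative):

```ruby
require "bigdecimal/util"

"12.5".to_d == BigDecimal("12.5")  # => true
"abc".to_d                         # => 0 (no exception raised)
"0".to_d                           # => 0

# Zero because the input was garbage, not because it was an explicit zero?
def non_numeric_zero?(raw)
  raw.to_d.zero? && raw !~ /\A0(\.0*)?\z/
end

non_numeric_zero?("abc")  # => true  (report "must be valid numbers")
non_numeric_zero?("0")    # => false (a real zero; report "must be positive")
non_numeric_zero?("0.00") # => false
```

This is why the controller can return two different 422 messages from the same `qty <= 0 || price <= 0` branch.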

    def apply_trade_create_options!
      attrs = {}
      if trade_params[:investment_activity_label].present?
        label = trade_params[:investment_activity_label]
        unless Trade::ACTIVITY_LABELS.include?(label)
          render_validation_error("Invalid investment_activity_label", [ "investment_activity_label must be one of: #{Trade::ACTIVITY_LABELS.join(', ')}" ])
          return
        end
        attrs[:investment_activity_label] = label
      end
      if trade_params[:category_id].present?
        category = current_resource_owner.family.categories.find_by(id: trade_params[:category_id])
        unless category
          render_validation_error("Category not found or does not belong to your family", [ "category_id is invalid" ])
          return
        end
        attrs[:category_id] = category.id
      end
      @trade.update!(attrs) if attrs.any?
    end

    def render_validation_error(message, errors)
      render json: {
        error: "validation_failed",
        message: message,
        errors: errors
      }, status: :unprocessable_entity
    end

    def parse_date!(value, param_name)
      Date.parse(value)
    rescue Date::Error, ArgumentError, TypeError
      raise ArgumentError, "Invalid #{param_name} format"
    end

    def log_and_render_error(action, exception)
      Rails.logger.error "TradesController##{action} error: #{exception.message}"
      Rails.logger.error exception.backtrace.join("\n")
      render json: {
        error: "internal_server_error",
        message: "Error: #{exception.message}"
      }, status: :internal_server_error
    end

    def safe_page_param
      page = params[:page].to_i
      page > 0 ? page : 1
    end

    def safe_per_page_param
      per_page = params[:per_page].to_i
      case per_page
      when 1..100
        per_page
      else
        25
      end
    end
end

@@ -105,19 +105,29 @@ class Api::V1::TransactionsController < Api::V1::BaseController
  end

  def update
    if @entry.update(entry_params_for_update)
      @entry.sync_account_later
      @entry.lock_saved_attributes!
      @entry.transaction.lock_attr!(:tag_ids) if @entry.transaction.tags.any?
    Entry.transaction do
      if @entry.update(entry_params_for_update)
        # Handle tags separately - only when explicitly provided in the request
        # This allows clearing tags with tag_ids: [] while preserving tags when not specified
        if tags_provided?
          @entry.transaction.tag_ids = transaction_params[:tag_ids] || []
          @entry.transaction.save!
          @entry.transaction.lock_attr!(:tag_ids) if @entry.transaction.tags.any?
        end

      @transaction = @entry.transaction
      render :show
    else
      render json: {
        error: "validation_failed",
        message: "Transaction could not be updated",
        errors: @entry.errors.full_messages
      }, status: :unprocessable_entity
        @entry.sync_account_later
        @entry.lock_saved_attributes!

        @transaction = @entry.transaction
        render :show
      else
        render json: {
          error: "validation_failed",
          message: "Transaction could not be updated",
          errors: @entry.errors.full_messages
        }, status: :unprocessable_entity
        raise ActiveRecord::Rollback
      end
    end

  rescue => e
@@ -283,8 +293,9 @@ end
      entryable_attributes: {
        id: @entry.entryable_id,
        category_id: transaction_params[:category_id],
        merchant_id: transaction_params[:merchant_id],
        tag_ids: transaction_params[:tag_ids]
        merchant_id: transaction_params[:merchant_id]
        # Note: tag_ids handled separately in update action to distinguish
        # "not provided" from "explicitly set to empty"
      }.compact_blank
    }

@@ -296,6 +307,12 @@ end
    entry_params.compact
  end

  # Check if tag_ids was explicitly provided in the request.
  # This distinguishes between "user wants to update tags" vs "user didn't specify tags".
  def tags_provided?
    params[:transaction].key?(:tag_ids)
  end
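The `tags_provided?` check works because key presence and value presence are different questions: `tag_ids: []` is blank, so a `present?` test would wrongly treat "clear all tags" the same as "tags not mentioned". A sketch using a plain `Hash` in place of `ActionController::Parameters` (whose `key?` behaves analogously):

```ruby
# Hash#key? separates "tag_ids absent" (leave tags alone) from
# "tag_ids explicitly []" (clear all tags); [] is blank, so a
# presence check alone cannot draw that line.
def tags_provided?(txn_params)
  txn_params.key?(:tag_ids)
end

tags_provided?({ notes: "x" })              # => false -> keep existing tags
tags_provided?({ tag_ids: [] })             # => true  -> clear tags
tags_provided?({ tag_ids: [ "t1", "t2" ] }) # => true  -> replace tags
```

That is also why `tag_ids` was removed from `entryable_attributes` in this diff: routing it through `compact_blank` there would have silently dropped the empty-array case.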

  def calculate_signed_amount
    amount = transaction_params[:amount].to_f
    nature = transaction_params[:nature]

218 app/controllers/api/v1/valuations_controller.rb Normal file
@@ -0,0 +1,218 @@
# frozen_string_literal: true

class Api::V1::ValuationsController < Api::V1::BaseController
  before_action :ensure_read_scope, only: [ :show ]
  before_action :ensure_write_scope, only: [ :create, :update ]
  before_action :set_valuation, only: [ :show, :update ]

  def show
    render :show
  rescue => e
    Rails.logger.error "ValuationsController#show error: #{e.message}"
    Rails.logger.error e.backtrace.join("\n")

    render json: {
      error: "internal_server_error",
      message: "Error: #{e.message}"
    }, status: :internal_server_error
  end

  def create
    unless valuation_account_id.present?
      render json: {
        error: "validation_failed",
        message: "Account ID is required",
        errors: [ "Account ID is required" ]
      }, status: :unprocessable_entity
      return
    end

    unless valuation_params[:amount].present?
      render json: {
        error: "validation_failed",
        message: "Amount is required",
        errors: [ "Amount is required" ]
      }, status: :unprocessable_entity
      return
    end

    unless valuation_params[:date].present?
      render json: {
        error: "validation_failed",
        message: "Date is required",
        errors: [ "Date is required" ]
      }, status: :unprocessable_entity
      return
    end

    account = current_resource_owner.family.accounts.find(valuation_account_id)

    create_success = false
    error_payload = nil

    ActiveRecord::Base.transaction do
      result = account.create_reconciliation(
        balance: valuation_params[:amount],
        date: valuation_params[:date]
      )

      unless result.success?
        error_payload = {
          error: "validation_failed",
          message: "Valuation could not be created",
          errors: [ result.error_message ]
        }
        raise ActiveRecord::Rollback
      end

      @entry = account.entries.valuations.find_by!(date: valuation_params[:date])
      @valuation = @entry.entryable

      if valuation_params.key?(:notes)
        unless @entry.update(notes: valuation_params[:notes])
          error_payload = {
            error: "validation_failed",
            message: "Valuation could not be created",
            errors: @entry.errors.full_messages
          }
          raise ActiveRecord::Rollback
        end
      end

      create_success = true
    end

    unless create_success
      render json: error_payload, status: :unprocessable_entity
      return
    end

    render :show, status: :created

  rescue ActiveRecord::RecordNotFound
    render json: {
      error: "not_found",
      message: "Account or valuation entry not found"
    }, status: :not_found
  rescue => e
    Rails.logger.error "ValuationsController#create error: #{e.message}"
    Rails.logger.error e.backtrace.join("\n")

    render json: {
      error: "internal_server_error",
      message: "Error: #{e.message}"
    }, status: :internal_server_error
  end

  def update
    if valuation_params[:date].present? || valuation_params[:amount].present?
      unless valuation_params[:date].present? && valuation_params[:amount].present?
        render json: {
          error: "validation_failed",
          message: "Both amount and date are required when updating reconciliation",
          errors: [ "Amount and date must both be provided" ]
        }, status: :unprocessable_entity
        return
      end

      update_success = false
      error_payload = nil
      updated_entry = nil

      ActiveRecord::Base.transaction do
        result = @entry.account.update_reconciliation(
          @entry,
          balance: valuation_params[:amount],
          date: valuation_params[:date]
        )

        unless result.success?
          error_payload = {
            error: "validation_failed",
            message: "Valuation could not be updated",
            errors: [ result.error_message ]
          }
          raise ActiveRecord::Rollback
        end

        updated_entry = @entry.account.entries.valuations.find_by!(date: valuation_params[:date])

        if valuation_params.key?(:notes)
          unless updated_entry.update(notes: valuation_params[:notes])
            error_payload = {
              error: "validation_failed",
              message: "Valuation could not be updated",
              errors: updated_entry.errors.full_messages
            }
            raise ActiveRecord::Rollback
          end
        end

        update_success = true
      end

      unless update_success
        render json: error_payload, status: :unprocessable_entity
        return
      end

      @entry = updated_entry
      @valuation = @entry.entryable
      render :show
    else
      if valuation_params.key?(:notes)
        unless @entry.update(notes: valuation_params[:notes])
          render json: {
            error: "validation_failed",
            message: "Valuation could not be updated",
            errors: @entry.errors.full_messages
          }, status: :unprocessable_entity
          return
        end
      end
      @entry.reload
      @valuation = @entry.entryable
      render :show
    end

  rescue => e
    Rails.logger.error "ValuationsController#update error: #{e.message}"
    Rails.logger.error e.backtrace.join("\n")

    render json: {
      error: "internal_server_error",
      message: "Error: #{e.message}"
    }, status: :internal_server_error
  end

  private

    def set_valuation
      @entry = current_resource_owner.family
        .entries
        .where(entryable_type: "Valuation")
        .find(params[:id])
      @valuation = @entry.entryable
    rescue ActiveRecord::RecordNotFound
      render json: {
        error: "not_found",
        message: "Valuation not found"
      }, status: :not_found
    end

    def ensure_read_scope
      authorize_scope!(:read)
    end

    def ensure_write_scope
      authorize_scope!(:write)
    end

    def valuation_account_id
      params.dig(:valuation, :account_id)
    end

    def valuation_params
      params.require(:valuation).permit(:amount, :date, :notes)
    end
end

@@ -18,6 +18,28 @@ class ApplicationController < ActionController::Base
   helper_method :demo_config, :demo_host_match?, :show_demo_warning?

   private
+    def accept_pending_invitation_for(user)
+      return false if user.blank?
+
+      token = session[:pending_invitation_token]
+      return false if token.blank?
+
+      invitation = Invitation.pending.find_by(token: token.to_s)
+      return false unless invitation
+      return false unless invitation.accept_for(user)
+
+      session.delete(:pending_invitation_token)
+      true
+    end
+
+    def store_pending_invitation_if_valid
+      token = params[:invitation].to_s.presence
+      return if token.blank?
+
+      invitation = Invitation.pending.find_by(token: token)
+      session[:pending_invitation_token] = token if invitation
+    end
+
     def detect_os
       user_agent = request.user_agent
       @os = case user_agent
||||
@@ -42,7 +42,7 @@ class BudgetCategoriesController < ApplicationController
   end

   def set_budget
-    start_date = Budget.param_to_date(params[:budget_month_year])
+    start_date = Budget.param_to_date(params[:budget_month_year], family: Current.family)
     @budget = Current.family.budgets.find_by(start_date: start_date)
   end
 end

@@ -35,7 +35,7 @@ class BudgetsController < ApplicationController
   end

   def set_budget
-    start_date = Budget.param_to_date(params[:month_year])
+    start_date = Budget.param_to_date(params[:month_year], family: Current.family)
     @budget = Budget.find_or_bootstrap(Current.family, start_date: start_date)
     raise ActiveRecord::RecordNotFound unless @budget
   end
@@ -7,7 +7,15 @@ module Periodable

   private
     def set_period
-      @period = Period.from_key(params[:period] || Current.user&.default_period)
+      period_key = params[:period] || Current.user&.default_period
+
+      @period = if period_key == "current_month"
+        Period.current_month_for(Current.family)
+      elsif period_key == "last_month"
+        Period.last_month_for(Current.family)
+      else
+        Period.from_key(period_key)
+      end
     rescue Period::InvalidKeyError
       @period = Period.last_30_days
     end
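The rewritten `set_period` is a key dispatch with a rescue fallback. A minimal plain-Ruby sketch of that shape (the `PERIODS` table and `InvalidKeyError` here are stand-ins for `Period.from_key` and `Period::InvalidKeyError`, not part of the app):

```ruby
# Standalone sketch of Periodable#set_period's dispatch-with-fallback shape.
class InvalidKeyError < StandardError; end

PERIODS = {
  "last_30_days" => 30,
  "last_90_days" => 90
}.freeze

def period_from_key(key)
  # fetch with a block raises our error for unknown (or nil) keys
  PERIODS.fetch(key) { raise InvalidKeyError, key.to_s }
end

def resolve_period(key)
  period_from_key(key)
rescue InvalidKeyError
  PERIODS["last_30_days"] # same safe default as the rescue in set_period
end

puts resolve_period("last_90_days") # 90
puts resolve_period("bogus")        # falls back to 30
```

An unknown or missing `period` param never raises out of the controller; it degrades to the 30-day default, mirroring the `rescue Period::InvalidKeyError` branch above.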
@@ -4,6 +4,8 @@ class Import::ConfigurationsController < ApplicationController
   before_action :set_import

   def show
+    # PDF imports are auto-configured from AI extraction, skip to clean step
+    redirect_to import_clean_path(@import) if @import.is_a?(PdfImport)
   end

   def update

@@ -33,7 +33,7 @@ class Import::UploadsController < ApplicationController
   end

   def csv_str
-    @csv_str ||= upload_params[:csv_file]&.read || upload_params[:raw_file_str]
+    @csv_str ||= upload_params[:import_file]&.read || upload_params[:raw_file_str]
   end

   def csv_valid?(str)

@@ -48,6 +48,6 @@ class Import::UploadsController < ApplicationController
   end

   def upload_params
-    params.require(:import).permit(:raw_file_str, :csv_file, :col_sep)
+    params.require(:import).permit(:raw_file_str, :import_file, :col_sep)
   end
 end
@@ -1,7 +1,23 @@
 class ImportsController < ApplicationController
   include SettingsHelper

-  before_action :set_import, only: %i[show publish destroy revert apply_template]
+  before_action :set_import, only: %i[show update publish destroy revert apply_template]
+
+  def update
+    # Handle both pdf_import[account_id] and import[account_id] param formats
+    account_id = params.dig(:pdf_import, :account_id) || params.dig(:import, :account_id)
+
+    if account_id.present?
+      account = Current.family.accounts.find_by(id: account_id)
+      unless account
+        redirect_back_or_to import_path(@import), alert: t("imports.update.invalid_account", default: "Account not found.")
+        return
+      end
+      @import.update!(account: account)
+    end
+
+    redirect_to import_path(@import), notice: t("imports.update.account_saved", default: "Account saved.")
+  end

   def publish
     @import.publish_later

@@ -22,9 +38,27 @@ class ImportsController < ApplicationController

   def new
     @pending_import = Current.family.imports.ordered.pending.first
+    @document_upload_extensions = document_upload_supported_extensions
   end

   def create
+    file = import_params[:import_file]
+
+    if file.present? && document_upload_request?
+      create_document_import(file)
+      return
+    end
+
+    # Handle PDF file uploads - process with AI
+    if file.present? && Import::ALLOWED_PDF_MIME_TYPES.include?(file.content_type)
+      unless valid_pdf_file?(file)
+        redirect_to new_import_path, alert: t("imports.create.invalid_pdf")
+        return
+      end
+      create_pdf_import(file)
+      return
+    end
+
     type = params.dig(:import, :type).to_s
     type = "TransactionImport" unless Import::TYPES.include?(type)

@@ -35,35 +69,35 @@ class ImportsController < ApplicationController
       date_format: Current.family.date_format,
     )

-    if import_params[:csv_file].present?
-      file = import_params[:csv_file]
-
+    if file.present?
       if file.size > Import::MAX_CSV_SIZE
         import.destroy
-        redirect_to new_import_path, alert: "File is too large. Maximum size is #{Import::MAX_CSV_SIZE / 1.megabyte}MB."
+        redirect_to new_import_path, alert: t("imports.create.file_too_large", max_size: Import::MAX_CSV_SIZE / 1.megabyte)
         return
       end

-      unless Import::ALLOWED_MIME_TYPES.include?(file.content_type)
+      unless Import::ALLOWED_CSV_MIME_TYPES.include?(file.content_type)
         import.destroy
-        redirect_to new_import_path, alert: "Invalid file type. Please upload a CSV file."
+        redirect_to new_import_path, alert: t("imports.create.invalid_file_type")
         return
       end

       # Stream reading is not fully applicable here as we store the raw string in the DB,
       # but we have validated size beforehand to prevent memory exhaustion from massive files.
       import.update!(raw_file_str: file.read)
-      redirect_to import_configuration_path(import), notice: "CSV uploaded successfully."
+      redirect_to import_configuration_path(import), notice: t("imports.create.csv_uploaded")
     else
       redirect_to import_upload_path(import)
     end
   end

   def show
+    return unless @import.requires_csv_workflow?
+
     if !@import.uploaded?
-      redirect_to import_upload_path(@import), alert: "Please finalize your file upload."
+      redirect_to import_upload_path(@import), alert: t("imports.show.finalize_upload")
     elsif !@import.publishable?
-      redirect_to import_confirm_path(@import), alert: "Please finalize your mappings before proceeding."
+      redirect_to import_confirm_path(@import), alert: t("imports.show.finalize_mappings")
     end
   end

@@ -93,6 +127,79 @@ class ImportsController < ApplicationController
   end

   def import_params
-    params.require(:import).permit(:csv_file)
+    params.require(:import).permit(:import_file)
   end

+  def create_pdf_import(file)
+    if file.size > Import::MAX_PDF_SIZE
+      redirect_to new_import_path, alert: t("imports.create.pdf_too_large", max_size: Import::MAX_PDF_SIZE / 1.megabyte)
+      return
+    end
+
+    pdf_import = Current.family.imports.create!(type: "PdfImport")
+    pdf_import.pdf_file.attach(file)
+    pdf_import.process_with_ai_later
+
+    redirect_to import_path(pdf_import), notice: t("imports.create.pdf_processing")
+  end
+
+  def create_document_import(file)
+    adapter = VectorStore.adapter
+    unless adapter
+      redirect_to new_import_path, alert: t("imports.create.document_provider_not_configured")
+      return
+    end
+
+    if file.size > Import::MAX_PDF_SIZE
+      redirect_to new_import_path, alert: t("imports.create.document_too_large", max_size: Import::MAX_PDF_SIZE / 1.megabyte)
+      return
+    end
+
+    filename = file.original_filename.to_s
+    ext = File.extname(filename).downcase
+    supported_extensions = adapter.supported_extensions.map(&:downcase)
+
+    unless supported_extensions.include?(ext)
+      redirect_to new_import_path, alert: t("imports.create.invalid_document_file_type")
+      return
+    end
+
+    if ext == ".pdf"
+      unless valid_pdf_file?(file)
+        redirect_to new_import_path, alert: t("imports.create.invalid_pdf")
+        return
+      end
+
+      create_pdf_import(file)
+      return
+    end
+
+    family_document = Current.family.upload_document(
+      file_content: file.read,
+      filename: filename
+    )
+
+    if family_document
+      redirect_to new_import_path, notice: t("imports.create.document_uploaded")
+    else
+      redirect_to new_import_path, alert: t("imports.create.document_upload_failed")
+    end
+  end
+
+  def document_upload_supported_extensions
+    adapter = VectorStore.adapter
+    return [] unless adapter
+
+    adapter.supported_extensions.map(&:downcase).uniq.sort
+  end
+
+  def document_upload_request?
+    params.dig(:import, :type) == "DocumentImport"
+  end
+
+  def valid_pdf_file?(file)
+    header = file.read(5)
+    file.rewind
+    header&.start_with?("%PDF-")
+  end
 end
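The `valid_pdf_file?` guard above relies on the PDF magic bytes rather than the client-supplied content type. A self-contained sketch of the same check, using `StringIO` in place of an uploaded file:

```ruby
require "stringio"

# Same check as valid_pdf_file?: a real PDF begins with the ASCII
# header "%PDF-", so reading five bytes is enough to reject
# mislabeled uploads before any expensive processing.
def pdf_header?(io)
  header = io.read(5)
  io.rewind # leave the stream usable for the actual import afterwards
  header&.start_with?("%PDF-")
end

puts pdf_header?(StringIO.new("%PDF-1.7\n..."))  # true
puts pdf_header?(StringIO.new("<html></html>")) # false
```

Because uploaded files expose the same `read`/`rewind` interface, the rewind keeps the stream reusable for `create_pdf_import`, which attaches the full file afterwards.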
380 app/controllers/indexa_capital_items_controller.rb Normal file
@@ -0,0 +1,380 @@
# frozen_string_literal: true

class IndexaCapitalItemsController < ApplicationController
  ALLOWED_ACCOUNTABLE_TYPES = %w[Depository CreditCard Investment Loan OtherAsset OtherLiability Crypto Property Vehicle].freeze

  before_action :set_indexa_capital_item, only: [ :show, :edit, :update, :destroy, :sync, :setup_accounts, :complete_account_setup ]

  def index
    @indexa_capital_items = Current.family.indexa_capital_items.ordered
  end

  def show
  end

  def new
    @indexa_capital_item = Current.family.indexa_capital_items.build
  end

  def edit
  end

  def create
    @indexa_capital_item = Current.family.indexa_capital_items.build(indexa_capital_item_params)
    @indexa_capital_item.name ||= "IndexaCapital Connection"

    if @indexa_capital_item.save
      if turbo_frame_request?
        flash.now[:notice] = t(".success", default: "Successfully configured IndexaCapital.")
        @indexa_capital_items = Current.family.indexa_capital_items.ordered
        render turbo_stream: [
          turbo_stream.replace(
            "indexa_capital-providers-panel",
            partial: "settings/providers/indexa_capital_panel",
            locals: { indexa_capital_items: @indexa_capital_items }
          ),
          *flash_notification_stream_items
        ]
      else
        redirect_to settings_providers_path, notice: t(".success"), status: :see_other
      end
    else
      @error_message = @indexa_capital_item.errors.full_messages.join(", ")

      if turbo_frame_request?
        render turbo_stream: turbo_stream.replace(
          "indexa_capital-providers-panel",
          partial: "settings/providers/indexa_capital_panel",
          locals: { error_message: @error_message }
        ), status: :unprocessable_entity
      else
        redirect_to settings_providers_path, alert: @error_message, status: :unprocessable_entity
      end
    end
  end

  def update
    if @indexa_capital_item.update(indexa_capital_item_params)
      if turbo_frame_request?
        flash.now[:notice] = t(".success", default: "Successfully updated IndexaCapital configuration.")
        @indexa_capital_items = Current.family.indexa_capital_items.ordered
        render turbo_stream: [
          turbo_stream.replace(
            "indexa_capital-providers-panel",
            partial: "settings/providers/indexa_capital_panel",
            locals: { indexa_capital_items: @indexa_capital_items }
          ),
          *flash_notification_stream_items
        ]
      else
        redirect_to settings_providers_path, notice: t(".success"), status: :see_other
      end
    else
      @error_message = @indexa_capital_item.errors.full_messages.join(", ")

      if turbo_frame_request?
        render turbo_stream: turbo_stream.replace(
          "indexa_capital-providers-panel",
          partial: "settings/providers/indexa_capital_panel",
          locals: { error_message: @error_message }
        ), status: :unprocessable_entity
      else
        redirect_to settings_providers_path, alert: @error_message, status: :unprocessable_entity
      end
    end
  end

  def destroy
    @indexa_capital_item.destroy_later
    redirect_to settings_providers_path, notice: t(".success", default: "Scheduled IndexaCapital connection for deletion.")
  end

  def sync
    unless @indexa_capital_item.syncing?
      @indexa_capital_item.sync_later
    end

    respond_to do |format|
      format.html { redirect_back_or_to accounts_path }
      format.json { head :ok }
    end
  end

  # Collection actions for account linking flow

  def preload_accounts
    # Trigger a sync to fetch accounts from the provider
    indexa_capital_item = Current.family.indexa_capital_items.first
    unless indexa_capital_item&.credentials_configured?
      redirect_to settings_providers_path, alert: t(".no_credentials_configured")
      return
    end

    indexa_capital_item.sync_later unless indexa_capital_item.syncing?
    redirect_to select_accounts_indexa_capital_items_path(accountable_type: params[:accountable_type], return_to: params[:return_to])
  end

  def select_accounts
    @accountable_type = params[:accountable_type]
    @return_to = params[:return_to]

    indexa_capital_item = Current.family.indexa_capital_items.first
    unless indexa_capital_item&.credentials_configured?
      redirect_to settings_providers_path, alert: t(".no_credentials_configured")
      return
    end

    # Always fetch fresh data (accounts + balances) when user visits this page
    fetch_accounts_synchronously(indexa_capital_item)

    @indexa_capital_accounts = indexa_capital_item.indexa_capital_accounts
      .left_joins(:account_provider)
      .where(account_providers: { id: nil })
      .order(:name)
  end

  def link_accounts
    indexa_capital_item = Current.family.indexa_capital_items.first
    unless indexa_capital_item&.credentials_configured?
      redirect_to settings_providers_path, alert: t(".no_api_key")
      return
    end

    selected_ids = params[:selected_account_ids] || []
    if selected_ids.empty?
      redirect_to select_accounts_indexa_capital_items_path, alert: t(".no_accounts_selected")
      return
    end

    accountable_type = params[:accountable_type] || "Depository"
    created_count = 0
    already_linked_count = 0
    invalid_count = 0

    indexa_capital_item.indexa_capital_accounts.where(id: selected_ids).find_each do |indexa_capital_account|
      # Skip if already linked
      if indexa_capital_account.account_provider.present?
        already_linked_count += 1
        next
      end

      # Skip if invalid name
      if indexa_capital_account.name.blank?
        invalid_count += 1
        next
      end

      # Create Sure account and link
      link_indexa_capital_account(indexa_capital_account, accountable_type)
      created_count += 1
    rescue => e
      Rails.logger.error "IndexaCapitalItemsController#link_accounts - Failed to link account: #{e.message}"
    end

    if created_count > 0
      indexa_capital_item.sync_later unless indexa_capital_item.syncing?
      redirect_to accounts_path, notice: t(".success", count: created_count)
    else
      redirect_to select_accounts_indexa_capital_items_path, alert: t(".link_failed")
    end
  end

  def select_existing_account
    @account = Current.family.accounts.find(params[:account_id])
    @indexa_capital_item = Current.family.indexa_capital_items.first

    unless @indexa_capital_item&.credentials_configured?
      redirect_to settings_providers_path, alert: t(".no_credentials_configured")
      return
    end

    @indexa_capital_accounts = @indexa_capital_item.indexa_capital_accounts
      .left_joins(:account_provider)
      .where(account_providers: { id: nil })
      .order(:name)
  end

  def link_existing_account
    account = Current.family.accounts.find(params[:account_id])
    indexa_capital_item = Current.family.indexa_capital_items.first

    unless indexa_capital_item&.credentials_configured?
      redirect_to settings_providers_path, alert: t(".no_api_key")
      return
    end

    indexa_capital_account = indexa_capital_item.indexa_capital_accounts.find(params[:indexa_capital_account_id])

    if indexa_capital_account.account_provider.present?
      redirect_to account_path(account), alert: t(".provider_account_already_linked")
      return
    end

    indexa_capital_account.ensure_account_provider!(account)
    indexa_capital_item.sync_later unless indexa_capital_item.syncing?

    redirect_to account_path(account), notice: t(".success", account_name: account.name)
  end

  def setup_accounts
    @unlinked_accounts = @indexa_capital_item.unlinked_indexa_capital_accounts.order(:name)
  end

  def complete_account_setup
    account_configs = params[:accounts] || {}

    if account_configs.empty?
      redirect_to setup_accounts_indexa_capital_item_path(@indexa_capital_item), alert: t(".no_accounts")
      return
    end

    created_count = 0
    skipped_count = 0

    account_configs.each do |indexa_capital_account_id, config|
      next if config[:account_type] == "skip"

      indexa_capital_account = @indexa_capital_item.indexa_capital_accounts.find_by(id: indexa_capital_account_id)
      next unless indexa_capital_account
      next if indexa_capital_account.account_provider.present?

      accountable_type = infer_accountable_type(config[:account_type], config[:subtype])
      account = create_account_from_indexa_capital(indexa_capital_account, accountable_type, config)

      if account&.persisted?
        indexa_capital_account.ensure_account_provider!(account)
        indexa_capital_account.update!(sync_start_date: config[:sync_start_date]) if config[:sync_start_date].present?
        created_count += 1
      else
        skipped_count += 1
      end
    rescue => e
      Rails.logger.error "IndexaCapitalItemsController#complete_account_setup - Error: #{e.message}"
      skipped_count += 1
    end

    if created_count > 0
      @indexa_capital_item.sync_later unless @indexa_capital_item.syncing?
      redirect_to accounts_path, notice: t(".success", count: created_count)
    elsif skipped_count > 0 && created_count == 0
      redirect_to accounts_path, notice: t(".all_skipped")
    else
      redirect_to setup_accounts_indexa_capital_item_path(@indexa_capital_item), alert: t(".creation_failed", error: "Unknown error")
    end
  end

  private

    def set_indexa_capital_item
      @indexa_capital_item = Current.family.indexa_capital_items.find(params[:id])
    end

    def indexa_capital_item_params
      params.require(:indexa_capital_item).permit(
        :name,
        :sync_start_date,
        :api_token,
        :username,
        :document,
        :password
      )
    end

    def link_indexa_capital_account(indexa_capital_account, accountable_type)
      accountable_class = validated_accountable_class(accountable_type)

      account = Current.family.accounts.create!(
        name: indexa_capital_account.name,
        balance: indexa_capital_account.current_balance || 0,
        currency: indexa_capital_account.currency || "EUR",
        accountable: accountable_class.new
      )

      indexa_capital_account.ensure_account_provider!(account)
      account
    end

    def create_account_from_indexa_capital(indexa_capital_account, accountable_type, config)
      accountable_class = validated_accountable_class(accountable_type)
      accountable_attrs = {}

      # Set subtype if the accountable supports it
      if config[:subtype].present? && accountable_class.respond_to?(:subtypes)
        accountable_attrs[:subtype] = config[:subtype]
      end

      Current.family.accounts.create!(
        name: indexa_capital_account.name,
        balance: config[:balance].present? ? config[:balance].to_d : (indexa_capital_account.current_balance || 0),
        currency: indexa_capital_account.currency || "EUR",
        accountable: accountable_class.new(accountable_attrs)
      )
    end

    def infer_accountable_type(account_type, subtype = nil)
      case account_type&.downcase
      when "depository"
        "Depository"
      when "credit_card"
        "CreditCard"
      when "investment"
        "Investment"
      when "loan"
        "Loan"
      when "other_asset"
        "OtherAsset"
      when "other_liability"
        "OtherLiability"
      when "crypto"
        "Crypto"
      when "property"
        "Property"
      when "vehicle"
        "Vehicle"
      else
        "Depository"
      end
    end

    def validated_accountable_class(accountable_type)
      unless ALLOWED_ACCOUNTABLE_TYPES.include?(accountable_type)
        raise ArgumentError, "Invalid accountable type: #{accountable_type}"
      end

      accountable_type.constantize
    end

    def fetch_accounts_synchronously(indexa_capital_item)
      provider = indexa_capital_item.indexa_capital_provider
      return unless provider

      accounts_data = provider.list_accounts

      accounts_data.each do |account_data|
        account_number = account_data[:account_number].to_s
        next if account_number.blank?

        # Fetch current balance from performance endpoint
        balance = provider.get_account_balance(account_number: account_number)
        account_data[:current_balance] = balance
      rescue => e
        Rails.logger.warn "IndexaCapitalItemsController - Failed to fetch balance for #{account_number}: #{e.message}"
      end

      accounts_data.each do |account_data|
        account_number = account_data[:account_number].to_s
        next if account_number.blank?

        indexa_capital_account = indexa_capital_item.indexa_capital_accounts.find_or_initialize_by(
          indexa_capital_account_id: account_number
        )
        indexa_capital_account.upsert_from_indexa_capital!(account_data)
      end
    rescue Provider::IndexaCapital::AuthenticationError => e
      Rails.logger.error "IndexaCapitalItemsController - Auth failed during sync: #{e.message}"
      flash.now[:alert] = t("indexa_capital_items.select_accounts.api_error", message: e.message)
    rescue Provider::IndexaCapital::Error => e
      Rails.logger.error "IndexaCapitalItemsController - API error during sync: #{e.message}"
      flash.now[:alert] = t("indexa_capital_items.select_accounts.api_error", message: e.message)
    end
end
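`validated_accountable_class` whitelists the type name before `constantize` so user input can never be turned into an arbitrary constant lookup. A plain-Ruby sketch of that pattern (dummy classes and `Object.const_get` stand in for the app's accountables and Rails' `constantize`):

```ruby
# Dummy accountables so the sketch runs outside the app.
class Depository; end
class CreditCard; end

ALLOWED_ACCOUNTABLE_TYPES = %w[Depository CreditCard].freeze

def validated_accountable_class(accountable_type)
  # Reject anything not on the explicit whitelist before resolving it.
  unless ALLOWED_ACCOUNTABLE_TYPES.include?(accountable_type)
    raise ArgumentError, "Invalid accountable type: #{accountable_type}"
  end

  Object.const_get(accountable_type) # constantize without Rails
end

puts validated_accountable_class("Depository") # Depository
begin
  validated_accountable_class("Kernel") # real constant, but not whitelisted
rescue ArgumentError => e
  puts e.message
end
```

The check must come first: `"Kernel"` resolves fine via `const_get`, so only the whitelist keeps request params from instantiating unrelated classes.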
@@ -15,8 +15,16 @@ class InvitationsController < ApplicationController
   @invitation.inviter = Current.user

   if @invitation.save
-    InvitationMailer.invite_email(@invitation).deliver_later unless self_hosted?
-    flash[:notice] = t(".success")
+    normalized_email = @invitation.email.to_s.strip.downcase
+    existing_user = User.find_by(email: normalized_email)
+    if existing_user && @invitation.accept_for(existing_user)
+      flash[:notice] = t(".existing_user_added")
+    elsif existing_user
+      flash[:alert] = t(".failure")
+    else
+      InvitationMailer.invite_email(@invitation).deliver_later unless self_hosted?
+      flash[:notice] = t(".success")
+    end
   else
     flash[:alert] = t(".failure")
   end

@@ -28,7 +36,7 @@ class InvitationsController < ApplicationController
   @invitation = Invitation.find_by!(token: params[:id])

   if @invitation.pending?
-    redirect_to new_registration_path(invitation: @invitation.token)
+    render :accept_choice, layout: "auth"
   else
     raise ActiveRecord::RecordNotFound
   end
@@ -32,6 +32,7 @@ class MfaController < ApplicationController
   if @user&.verify_otp?(params[:code])
     session.delete(:mfa_user_id)
     @session = create_session_for(@user)
+    flash[:notice] = t("invitations.accept_choice.joined_household") if accept_pending_invitation_for(@user)
     redirect_to root_path
   else
     flash.now[:alert] = t(".invalid_code")
@@ -47,13 +47,17 @@ class OidcAccountsController < ApplicationController
   # Clear pending auth from session
   session.delete(:pending_oidc_auth)

   # Check if user has MFA enabled
   if user.otp_required?
     session[:mfa_user_id] = user.id
     redirect_to verify_mfa_path
   else
     @session = create_session_for(user)
-    redirect_to root_path, notice: "Account successfully linked to #{@pending_auth['provider']}"
+    notice = if accept_pending_invitation_for(user)
+      t("invitations.accept_choice.joined_household")
+    else
+      t("sessions.openid_connect.account_linked", provider: @pending_auth["provider"])
+    end
+    redirect_to root_path, notice: notice
   end
 else
   @email = params[:email]

@@ -139,9 +143,9 @@ class OidcAccountsController < ApplicationController
   # Clear pending auth from session
   session.delete(:pending_oidc_auth)

   # Create session and log them in
   @session = create_session_for(@user)
-  redirect_to root_path, notice: "Welcome! Your account has been created."
+  notice = accept_pending_invitation_for(@user) ? t("invitations.accept_choice.joined_household") : "Welcome! Your account has been created."
+  redirect_to root_path, notice: notice
 else
   render :new_user, status: :unprocessable_entity
 end
@@ -2,8 +2,13 @@ class PagesController < ApplicationController
   include Periodable

   skip_authentication only: %i[redis_configuration_error privacy terms]
+  before_action :ensure_intro_guest!, only: :intro

   def dashboard
+    if Current.user&.ui_layout_intro?
+      redirect_to chats_path and return
+    end
+
     @balance_sheet = Current.family.balance_sheet
     @investment_statement = Current.family.investment_statement
     @accounts = Current.family.accounts.visible.with_attached_logo

@@ -22,6 +27,10 @@ class PagesController < ApplicationController
     @breadcrumbs = [ [ "Home", root_path ], [ "Dashboard", nil ] ]
   end

+  def intro
+    @breadcrumbs = [ [ "Home", chats_path ], [ "Intro", nil ] ]
+  end
+
   def update_preferences
     if Current.user.update_dashboard_preferences(preferences_params)
       head :ok

@@ -268,4 +277,10 @@ class PagesController < ApplicationController
       end
     end
   end

+  def ensure_intro_guest!
+    return if Current.user&.guest?
+
+    redirect_to root_path, alert: t("pages.intro.not_authorized", default: "Intro is only available to guest users.")
+  end
 end
@@ -11,7 +11,7 @@ class RulesController < ApplicationController
   @sort_by = "name" unless allowed_columns.include?(@sort_by)
   @direction = "asc" unless [ "asc", "desc" ].include?(@direction)

-  @rules = Current.family.rules.order(@sort_by => @direction)
+  @rules = Current.family.rules.includes(conditions: :sub_conditions).order(@sort_by => @direction)

   # Fetch recent rule runs with pagination
   recent_runs_scope = RuleRun

@@ -128,6 +128,11 @@ class RulesController < ApplicationController
     redirect_back_or_to rules_path, notice: t("rules.apply_all.success")
   end

+  def clear_ai_cache
+    ClearAiCacheJob.perform_later(Current.family)
+    redirect_to rules_path, notice: t("rules.clear_ai_cache.success")
+  end
+
   private
     def set_rule
       @rule = Current.family.rules.find(params[:id])
@@ -1,6 +1,8 @@
 class SessionsController < ApplicationController
+  extend SslConfigurable
+
   before_action :set_session, only: :destroy
-  skip_authentication only: %i[index new create openid_connect failure post_logout]
+  skip_authentication only: %i[index new create openid_connect failure post_logout mobile_sso_start]

   layout "auth"

@@ -10,6 +12,10 @@ class SessionsController < ApplicationController
   end

   def new
+    store_pending_invitation_if_valid
+    # Clear any stale mobile SSO session flag from an abandoned mobile flow
+    session.delete(:mobile_sso)
+
     begin
       demo = Rails.application.config_for(:demo)
       @prefill_demo_credentials = demo_host_match?(demo)

@@ -29,6 +35,9 @@ class SessionsController < ApplicationController
   end

   def create
+    # Clear any stale mobile SSO session flag from an abandoned mobile flow
+    session.delete(:mobile_sso)
+
     user = nil

     if AuthConfig.local_login_enabled?

@@ -58,6 +67,7 @@ class SessionsController < ApplicationController
     else
       log_super_admin_override_login(user)
       @session = create_session_for(user)
+      flash[:notice] = t("invitations.accept_choice.joined_household") if accept_pending_invitation_for(user)
       redirect_to root_path
     end
   else

@@ -104,6 +114,34 @@ class SessionsController < ApplicationController
     redirect_to new_session_path, notice: t(".logout_successful")
   end

+  def mobile_sso_start
+    provider = params[:provider].to_s
+    configured_providers = Rails.configuration.x.auth.sso_providers.map { |p| p[:name].to_s }
+
+    unless configured_providers.include?(provider)
+      mobile_sso_redirect(error: "invalid_provider", message: "SSO provider not configured")
+      return
+    end
+
+    device_params = params.permit(:device_id, :device_name, :device_type, :os_version, :app_version)
+    unless device_params[:device_id].present? && device_params[:device_name].present? && device_params[:device_type].present?
+      mobile_sso_redirect(error: "missing_device_info", message: "Device information is required")
+      return
+    end
+
+    session[:mobile_sso] = {
+      device_id: device_params[:device_id],
+      device_name: device_params[:device_name],
+      device_type: device_params[:device_type],
+      os_version: device_params[:os_version],
+      app_version: device_params[:app_version]
+    }
+
+    # Render auto-submitting form to POST to OmniAuth (required by omniauth-rails_csrf_protection)
+    @provider = provider
+    render layout: false
+  end
+
   def openid_connect
     auth = request.env["omniauth.auth"]

@@ -122,22 +160,41 @@ class SessionsController < ApplicationController
   oidc_identity.record_authentication!
   oidc_identity.sync_user_attributes!(auth)

-  # Log successful SSO login
-  SsoAuditLog.log_login!(user: user, provider: auth.provider, request: request)
+  # Mobile SSO: issue Doorkeeper tokens and redirect to app
+  if session[:mobile_sso].present?
+    if user.otp_required?
+      session.delete(:mobile_sso)
+      mobile_sso_redirect(error: "mfa_not_supported", message: "MFA users should sign in with email and password")
+    else
+      handle_mobile_sso_callback(user)
+    end
+    return
+  end

   # Store id_token and provider for RP-initiated logout
   session[:id_token_hint] = auth.credentials&.id_token if auth.credentials&.id_token
   session[:sso_login_provider] = auth.provider

+  # Log successful SSO login
+  SsoAuditLog.log_login!(user: user, provider: auth.provider, request: request)
+
   # MFA check: If user has MFA enabled, require verification
   if user.otp_required?
     session[:mfa_user_id] = user.id
     redirect_to verify_mfa_path
   else
     @session = create_session_for(user)
+    flash[:notice] = t("invitations.accept_choice.joined_household") if accept_pending_invitation_for(user)
|
||||
redirect_to root_path
|
||||
end
|
||||
else
|
||||
# Mobile SSO with no linked identity - redirect back with error
|
||||
if session[:mobile_sso].present?
|
||||
session.delete(:mobile_sso)
|
||||
mobile_sso_redirect(error: "account_not_linked", message: "Please link your Google account from the web app first")
|
||||
return
|
||||
end
|
||||
|
||||
# No existing OIDC identity - need to link to account
|
||||
# Store auth data in session and redirect to linking page
|
||||
session[:pending_oidc_auth] = {
|
||||
@@ -164,6 +221,13 @@ class SessionsController < ApplicationController
|
||||
reason: sanitized_reason
|
||||
)
|
||||
|
||||
# Mobile SSO: redirect back to the app with error instead of web login page
|
||||
if session[:mobile_sso].present?
|
||||
session.delete(:mobile_sso)
|
||||
mobile_sso_redirect(error: sanitized_reason, message: "SSO authentication failed")
|
||||
return
|
||||
end
|
||||
|
||||
message = case sanitized_reason
|
||||
when "sso_provider_unavailable"
|
||||
t("sessions.failure.sso_provider_unavailable")
|
||||
@@ -177,6 +241,42 @@ class SessionsController < ApplicationController
|
||||
end
|
||||
|
||||
private
|
||||
def handle_mobile_sso_callback(user)
|
||||
device_info = session.delete(:mobile_sso)
|
||||
|
||||
unless device_info.present?
|
||||
mobile_sso_redirect(error: "missing_session", message: "Mobile SSO session expired")
|
||||
return
|
||||
end
|
||||
|
||||
device = MobileDevice.upsert_device!(user, device_info.symbolize_keys)
|
||||
token_response = device.issue_token!
|
||||
|
||||
# Store tokens behind a one-time authorization code instead of passing in URL
|
||||
authorization_code = SecureRandom.urlsafe_base64(32)
|
||||
Rails.cache.write(
|
||||
"mobile_sso:#{authorization_code}",
|
||||
token_response.merge(
|
||||
user_id: user.id,
|
||||
user_email: user.email,
|
||||
user_first_name: user.first_name,
|
||||
user_last_name: user.last_name,
|
||||
user_ui_layout: user.ui_layout,
|
||||
user_ai_enabled: user.ai_enabled?
|
||||
),
|
||||
expires_in: 5.minutes
|
||||
)
|
||||
|
||||
mobile_sso_redirect(code: authorization_code)
|
||||
rescue ActiveRecord::RecordInvalid => e
|
||||
Rails.logger.warn("[Mobile SSO] Device save failed: #{e.record.errors.full_messages.join(', ')}")
|
||||
mobile_sso_redirect(error: "device_error", message: "Unable to register device")
|
||||
end
|
||||
|
||||
def mobile_sso_redirect(params = {})
|
||||
redirect_to "sureapp://oauth/callback?#{params.to_query}", allow_other_host: true
|
||||
end
|
||||
|
||||
def set_session
|
||||
@session = Current.user.sessions.find(params[:id])
|
||||
end
|
||||
@@ -209,7 +309,7 @@ class SessionsController < ApplicationController
|
||||
if provider_config[:strategy] == "openid_connect" && provider_config[:issuer].present?
|
||||
begin
|
||||
discovery_url = discovery_url_for(provider_config[:issuer])
|
||||
response = Faraday.get(discovery_url) do |req|
|
||||
response = Faraday.new(ssl: self.class.faraday_ssl_options).get(discovery_url) do |req|
|
||||
req.options.timeout = 5
|
||||
req.options.open_timeout = 3
|
||||
end
|
||||
|
||||
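The one-time authorization-code handoff used in `handle_mobile_sso_callback` above can be sketched outside Rails. This is a minimal stand-in, assuming the essential property is single-use redemption: `Rails.cache` is replaced by a plain Hash (so there is no TTL), and the payload keys are illustrative, not the app's real token fields.

```ruby
require "securerandom"

# Minimal stand-in for the Rails.cache-backed code store: the code is
# single-use because redeeming it deletes the entry.
class OneTimeCodeStore
  def initialize
    @store = {}
  end

  # Store the payload under a fresh random code and return the code.
  def issue(payload)
    code = SecureRandom.urlsafe_base64(32)
    @store["mobile_sso:#{code}"] = payload
    code
  end

  # Return the payload once; a replayed code yields nil.
  def redeem(code)
    @store.delete("mobile_sso:#{code}")
  end
end

store = OneTimeCodeStore.new
code = store.issue({ access_token: "tok", user_id: 1 })  # hypothetical payload

first = store.redeem(code)   # payload comes back exactly once
second = store.redeem(code)  # replayed code returns nil
```

This is why the controller redirects with only `code` in the `sureapp://` URL: the tokens never appear in the redirect and a leaked code is dead after the first exchange.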
@@ -16,7 +16,7 @@ class Settings::ApiKeysController < ApplicationController
   def new
     # Allow regeneration by not redirecting if user explicitly wants to create a new key
     # Only redirect if user stumbles onto new page without explicit intent
-    redirect_to settings_api_key_path if Current.user.api_keys.active.exists? && !params[:regenerate]
+    redirect_to settings_api_key_path if Current.user.api_keys.active.visible.exists? && !params[:regenerate]
     @api_key = ApiKey.new
   end

@@ -25,8 +25,9 @@ class Settings::ApiKeysController < ApplicationController
     @api_key = Current.user.api_keys.build(api_key_params)
     @api_key.key = @plain_key

-    # Temporarily revoke existing keys for validation to pass
-    existing_keys = Current.user.api_keys.active
+    # Temporarily revoke existing visible keys for validation to pass
+    # (demo monitoring key is excluded and remains active)
+    existing_keys = Current.user.api_keys.active.visible
     existing_keys.each { |key| key.update_column(:revoked_at, Time.current) }

     if @api_key.save
@@ -40,7 +41,11 @@ class Settings::ApiKeysController < ApplicationController
   end

   def destroy
-    if @api_key&.revoke!
+    if @api_key.nil?
+      flash[:alert] = "API key not found"
+    elsif @api_key.demo_monitoring_key?
+      flash[:alert] = "This API key cannot be revoked"
+    elsif @api_key.revoke!
       flash[:notice] = "API key has been revoked successfully"
     else
       flash[:alert] = "Failed to revoke API key"
@@ -51,7 +56,7 @@ class Settings::ApiKeysController < ApplicationController
   private

     def set_api_key
-      @api_key = Current.user.api_keys.active.first
+      @api_key = Current.user.api_keys.active.visible.first
     end

     def api_key_params

@@ -25,6 +25,7 @@ class Settings::HostingsController < ApplicationController
     if @show_twelve_data_settings
       twelve_data_provider = Provider::Registry.get_provider(:twelve_data)
       @twelve_data_usage = twelve_data_provider&.usage
+      @plan_restricted_securities = Current.family.securities_with_plan_restrictions(provider: "TwelveData")
     end

     if @show_yahoo_finance_settings

@@ -1,5 +1,5 @@
 class Settings::ProfilesController < ApplicationController
-  layout "settings"
+  layout :layout_for_settings_profile

   def show
     @user = Current.user
@@ -36,4 +36,10 @@ class Settings::ProfilesController < ApplicationController

     redirect_to settings_profile_path
   end
+
+  private
+
+    def layout_for_settings_profile
+      Current.user&.ui_layout_intro? ? "application" : "settings"
+    end
 end

@@ -129,7 +129,8 @@ class Settings::ProvidersController < ApplicationController
       config.provider_key.to_s.casecmp("coinstats").zero? || \
       config.provider_key.to_s.casecmp("mercury").zero? || \
       config.provider_key.to_s.casecmp("coinbase").zero? || \
-      config.provider_key.to_s.casecmp("snaptrade").zero?
+      config.provider_key.to_s.casecmp("snaptrade").zero? || \
+      config.provider_key.to_s.casecmp("indexa_capital").zero?
   end

   # Providers page only needs to know whether any SimpleFin/Lunchflow connections exist with valid credentials
@@ -140,5 +141,6 @@ class Settings::ProvidersController < ApplicationController
     @mercury_items = Current.family.mercury_items.ordered.select(:id)
     @coinbase_items = Current.family.coinbase_items.ordered # Coinbase panel needs name and sync info for status display
     @snaptrade_items = Current.family.snaptrade_items.includes(:snaptrade_accounts).ordered
+    @indexa_capital_items = Current.family.indexa_capital_items.ordered.select(:id)
   end
 end

@@ -106,27 +106,17 @@ class SnaptradeItemsController < ApplicationController

   # Redirect user to SnapTrade connection portal
   def connect
-    # Ensure user is registered first
-    unless @snaptrade_item.user_registered?
-      begin
-        @snaptrade_item.ensure_user_registered!
-      rescue => e
-        Rails.logger.error "SnapTrade registration error: #{e.class} - #{e.message}\n#{e.backtrace&.first(5)&.join("\n")}"
-        redirect_to settings_providers_path, alert: t(".registration_failed", message: e.message)
-        return
-      end
-    end
+    @snaptrade_item.ensure_user_registered! unless @snaptrade_item.user_registered?

     # Get the connection portal URL - include item ID in callback for proper routing
     redirect_url = callback_snaptrade_items_url(item_id: @snaptrade_item.id)

-    begin
-      portal_url = @snaptrade_item.connection_portal_url(redirect_url: redirect_url)
-      redirect_to portal_url, allow_other_host: true
-    rescue => e
-      Rails.logger.error "SnapTrade connection portal error: #{e.class} - #{e.message}\n#{e.backtrace&.first(5)&.join("\n")}"
-      redirect_to settings_providers_path, alert: t(".portal_error", message: e.message)
-    end
+    portal_url = @snaptrade_item.connection_portal_url(redirect_url: redirect_url)
+    redirect_to portal_url, allow_other_host: true
+  rescue ActiveRecord::Encryption::Errors::Decryption => e
+    Rails.logger.error "SnapTrade decryption error for item #{@snaptrade_item.id}: #{e.class} - #{e.message}\n#{e.backtrace&.first(5)&.join("\n")}"
+    redirect_to settings_providers_path, alert: t(".decryption_failed")
+  rescue => e
+    Rails.logger.error "SnapTrade connection error: #{e.class} - #{e.message}\n#{e.backtrace&.first(5)&.join("\n")}"
+    redirect_to settings_providers_path, alert: t(".connection_failed", message: e.message)
   end

   # Handle callback from SnapTrade after user connects brokerage
@@ -164,6 +154,14 @@ class SnaptradeItemsController < ApplicationController
       @snaptrade_item.sync_later
     end

+    # Existing unlinked, visible investment/crypto accounts that could be linked instead of creating duplicates
+    @linkable_accounts = Current.family.accounts
+      .visible
+      .where(accountable_type: %w[Investment Crypto])
+      .left_joins(:account_providers)
+      .where(account_providers: { id: nil })
+      .order(:name)
+
     # Determine view state
     @syncing = @snaptrade_item.syncing?
     @waiting_for_sync = no_accounts && @syncing
@@ -379,9 +377,10 @@ class SnaptradeItemsController < ApplicationController
   def link_existing_account
     account_id = params[:account_id]
     snaptrade_account_id = params[:snaptrade_account_id]
+    snaptrade_item_id = params[:snaptrade_item_id]

     account = Current.family.accounts.find_by(id: account_id)
-    snaptrade_item = Current.family.snaptrade_items.first
+    snaptrade_item = Current.family.snaptrade_items.find_by(id: snaptrade_item_id)
     snaptrade_account = snaptrade_item&.snaptrade_accounts&.find_by(id: snaptrade_account_id)

     if account && snaptrade_account

@@ -6,7 +6,7 @@ class Transactions::BulkUpdatesController < ApplicationController
     updated = Current.family
       .entries
       .where(id: bulk_update_params[:entry_ids])
-      .bulk_update!(bulk_update_params)
+      .bulk_update!(bulk_update_params, update_tags: tags_provided?)

     redirect_back_or_to transactions_path, notice: "#{updated} transactions updated"
   end
@@ -16,4 +16,11 @@ class Transactions::BulkUpdatesController < ApplicationController
     params.require(:bulk_update)
       .permit(:date, :notes, :category_id, :merchant_id, entry_ids: [], tag_ids: [])
   end
+
+  # Check if tag_ids was explicitly provided in the request.
+  # This distinguishes between "user wants to update tags" vs "user didn't touch tags field".
+  def tags_provided?
+    bulk_update = params[:bulk_update]
+    bulk_update.respond_to?(:key?) && bulk_update.key?(:tag_ids)
+  end
 end

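The `tags_provided?` guard above keys on `key?` rather than presence, so an explicitly submitted empty `tag_ids` still counts as "update tags" while an absent field leaves existing tags alone. A plain-Ruby sketch of that distinction, with Hashes standing in for the Rails params object:

```ruby
# key? distinguishes "field was submitted" from "field absent from the form".
def tags_provided?(bulk_update)
  bulk_update.respond_to?(:key?) && bulk_update.key?(:tag_ids)
end

untouched = { date: "2024-01-01" }              # tags field not in the submission
cleared   = { date: "2024-01-01", tag_ids: [] } # user explicitly removed all tags

untouched_result = tags_provided?(untouched) # false: leave existing tags alone
cleared_result   = tags_provided?(cleared)   # true: clear the tags
```

A presence check (`tag_ids.present?`) would conflate the two cases and make it impossible to bulk-remove tags.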
@@ -24,11 +24,11 @@ class TransactionsController < ApplicationController

     @pagy, @transactions = pagy(base_scope, limit: safe_per_page)

-    # Load projected recurring transactions for next month
+    # Load projected recurring transactions for next 10 days
     @projected_recurring = Current.family.recurring_transactions
       .active
       .where("next_expected_date <= ? AND next_expected_date >= ?",
-             1.month.from_now.to_date,
+             10.days.from_now.to_date,
              Date.current)
       .includes(:merchant)
   end

@@ -10,7 +10,18 @@ class TransferMatchesController < ApplicationController
     @transfer = build_transfer
     Transfer.transaction do
       @transfer.save!
-      @transfer.outflow_transaction.update!(kind: Transfer.kind_for_account(@transfer.outflow_transaction.entry.account))
+
+      # Use DESTINATION (inflow) account for kind, matching Transfer::Creator logic
+      destination_account = @transfer.inflow_transaction.entry.account
+      outflow_kind = Transfer.kind_for_account(destination_account)
+      outflow_attrs = { kind: outflow_kind }
+
+      if outflow_kind == "investment_contribution"
+        category = destination_account.family.investment_contributions_category
+        outflow_attrs[:category] = category if category.present? && @transfer.outflow_transaction.category_id.blank?
+      end
+
+      @transfer.outflow_transaction.update!(outflow_attrs)
       @transfer.inflow_transaction.update!(kind: "funds_movement")
     end

@@ -55,7 +55,7 @@ class TransfersController < ApplicationController
     @transfer = Transfer
       .where(id: params[:id])
       .where(inflow_transaction_id: Current.family.transactions.select(:id))
-      .first
+      .first!
   end

   def transfer_params

@@ -12,6 +12,7 @@ class UsersController < ApplicationController

   def update
     @user = Current.user
+    return if moniker_change_requested? && !ensure_admin

     if email_changed?
       if @user.initiate_email_change(user_params[:email])
@@ -106,7 +107,7 @@ class UsersController < ApplicationController
     params.require(:user).permit(
       :first_name, :last_name, :email, :profile_image, :redirect_to, :delete_profile_image, :onboarded_at,
       :show_sidebar, :default_period, :default_account_order, :show_ai_sidebar, :ai_enabled, :theme, :set_onboarding_preferences_at, :set_onboarding_goals_at, :locale,
-      family_attributes: [ :name, :currency, :country, :date_format, :timezone, :locale, :id ],
+      family_attributes: [ :name, :currency, :country, :date_format, :timezone, :locale, :month_start_day, :moniker, :id ],
       goals: []
     )
   end
@@ -115,7 +116,17 @@ class UsersController < ApplicationController
     @user = Current.user
   end

+  def moniker_change_requested?
+    requested_moniker = params.dig(:user, :family_attributes, :moniker)
+    return false if requested_moniker.blank?
+
+    requested_moniker != Current.family.moniker
+  end
+
   def ensure_admin
-    redirect_to settings_profile_path, alert: I18n.t("users.reset.unauthorized") unless Current.user.admin?
+    return true if Current.user.admin?
+
+    redirect_to settings_profile_path, alert: I18n.t("users.reset.unauthorized")
+    false
   end
 end

@@ -70,6 +70,23 @@ module ApplicationHelper
     end
   end

+  def family_moniker
+    Current.family&.moniker_label || "Family"
+  end
+
+  def family_moniker_downcase
+    family_moniker.downcase
+  end
+
+  def family_moniker_plural
+    Current.family&.moniker_label_plural || "Families"
+  end
+
+  def family_moniker_plural_downcase
+    family_moniker_plural.downcase
+  end
+
   def format_money(number_or_money, options = {})
     return nil unless number_or_money

@@ -301,6 +301,7 @@ module LanguagesHelper
     NO: "🇳🇴 Norway",
     OM: "🇴🇲 Oman",
     PK: "🇵🇰 Pakistan",
+    PS: "🇵🇸 Palestine",
     PW: "🇵🇼 Palau",
     PA: "🇵🇦 Panama",
     PG: "🇵🇬 Papua New Guinea",

@@ -63,7 +63,7 @@ module SettingsHelper
     previous_setting = adjacent_setting(request.path, -1)
     next_setting = adjacent_setting(request.path, 1)

-    content_tag :div, class: "md:hidden flex flex-col gap-4" do
+    content_tag :div, class: "md:hidden flex flex-col gap-4 pb-[env(safe-area-inset-bottom)]" do
       concat(previous_setting)
       concat(next_setting)
     end

@@ -2,6 +2,18 @@ import { Controller } from "@hotwired/stimulus";

 // Connects to data-controller="onboarding"
 export default class extends Controller {
+  static targets = ["nameField", "monikerRadio"]
+  static values = {
+    householdNameLabel: String,
+    householdNamePlaceholder: String,
+    groupNameLabel: String,
+    groupNamePlaceholder: String
+  }
+
+  connect() {
+    this.updateNameFieldForCurrentMoniker();
+  }
+
   setLocale(event) {
     this.refreshWithParam("locale", event.target.value);
   }
@@ -18,6 +30,30 @@ export default class extends Controller {
     document.documentElement.setAttribute("data-theme", event.target.value);
   }

+  updateNameFieldForCurrentMoniker(event = null) {
+    if (!this.hasNameFieldTarget) {
+      return;
+    }
+
+    const selectedMonikerRadio = event?.target?.dataset?.onboardingMoniker ? event.target : this.monikerRadioTargets.find((radio) => radio.checked);
+    const selectedMoniker = selectedMonikerRadio?.dataset?.onboardingMoniker;
+    const isGroup = selectedMoniker === "Group";
+
+    this.nameFieldTarget.placeholder = isGroup ? this.groupNamePlaceholderValue : this.householdNamePlaceholderValue;
+
+    const label = this.nameFieldTarget.closest(".form-field")?.querySelector(".form-field__label");
+    if (!label) {
+      return;
+    }
+
+    if (isGroup) {
+      label.textContent = this.groupNameLabelValue;
+      return;
+    }
+
+    label.textContent = this.householdNameLabelValue;
+  }
+
   refreshWithParam(key, value) {
     const url = new URL(window.location);
     url.searchParams.set(key, value);

@@ -11,6 +11,9 @@ export default class extends Controller {
   ];

   remove(e) {
+    e.preventDefault();
+    e.stopPropagation();
+
     if (e.params.destroy) {
       this.destroyFieldTarget.value = true;
       this.element.classList.add("hidden");

@@ -26,6 +26,9 @@ export default class extends Controller {
   }

   remove(e) {
+    e.preventDefault();
+    e.stopPropagation();
+
     // Find the parent rules controller before removing the condition
     const rulesEl = this.element.closest('[data-controller~="rules"]');

@@ -349,7 +349,7 @@ export default class extends Controller {
     const dialog = this.element.closest("dialog");
     this.tooltip = d3.select(dialog || document.body)
       .append("div")
-      .attr("class", "bg-gray-700 text-white text-sm p-2 rounded pointer-events-none absolute z-50")
+      .attr("class", "bg-gray-700 text-white text-sm p-2 rounded pointer-events-none absolute z-50 top-0")
       .style("opacity", 0)
       .style("pointer-events", "none");
   }

46  app/javascript/controllers/viewport_controller.js  Normal file
@@ -0,0 +1,46 @@
+import { Controller } from "@hotwired/stimulus"
+
+export default class extends Controller {
+  static targets = ["content", "bottomNav"]
+
+  connect() {
+    this.updateViewport()
+    this.updateBottomSpacing()
+
+    window.addEventListener("resize", this.handleResize)
+    window.addEventListener("orientationchange", this.handleResize)
+
+    if (this.hasBottomNavTarget) {
+      this.resizeObserver = new ResizeObserver(() => {
+        this.updateBottomSpacing()
+      })
+      this.resizeObserver.observe(this.bottomNavTarget)
+    }
+  }
+
+  disconnect() {
+    window.removeEventListener("resize", this.handleResize)
+    window.removeEventListener("orientationchange", this.handleResize)
+
+    if (this.resizeObserver) {
+      this.resizeObserver.disconnect()
+    }
+  }
+
+  handleResize = () => {
+    this.updateViewport()
+    this.updateBottomSpacing()
+  }
+
+  updateViewport() {
+    const height = window.innerHeight
+    document.documentElement.style.setProperty("--app-height", `${height}px`)
+  }
+
+  updateBottomSpacing() {
+    if (!this.hasBottomNavTarget || !this.hasContentTarget) return
+
+    const navHeight = this.bottomNavTarget.offsetHeight
+    this.contentTarget.style.paddingBottom = `${navHeight}px`
+  }
+}

28  app/jobs/clear_ai_cache_job.rb  Normal file
@@ -0,0 +1,28 @@
+class ClearAiCacheJob < ApplicationJob
+  queue_as :low_priority
+
+  def perform(family)
+    if family.nil?
+      Rails.logger.warn("ClearAiCacheJob called with nil family, skipping")
+      return
+    end
+
+    Rails.logger.info("Clearing AI cache for family #{family.id}")
+
+    # Clear AI enrichment data for transactions
+    begin
+      count = Transaction.clear_ai_cache(family)
+      Rails.logger.info("Cleared AI cache for #{count} transactions")
+    rescue => e
+      Rails.logger.error("Failed to clear AI cache for transactions: #{e.message}")
+    end
+
+    # Clear AI enrichment data for entries
+    begin
+      count = Entry.clear_ai_cache(family)
+      Rails.logger.info("Cleared AI cache for #{count} entries")
+    rescue => e
+      Rails.logger.error("Failed to clear AI cache for entries: #{e.message}")
+    end
+  end
+end

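`ClearAiCacheJob` above rescues each clearing step independently so a failure in one table does not abort the other. The same best-effort pattern in isolation, as a minimal sketch (the step names and return values here are illustrative, not the job's real API):

```ruby
# Run each cleanup step best-effort; collect per-step outcomes instead of
# letting the first exception abort the remaining steps.
def run_best_effort(steps)
  steps.to_h do |name, step|
    [name, step.call]
  rescue => e
    [name, "failed: #{e.message}"]
  end
end

results = run_best_effort(
  transactions: -> { 42 },              # pretend 42 records were cleared
  entries: -> { raise "table locked" }  # this step fails, the job continues
)
# results[:transactions] == 42
# results[:entries] == "failed: table locked"
```

The trade-off is that partial failures surface only in logs (or, here, the results hash), so callers must not assume the cache is fully cleared when the job finishes without raising.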
41  app/jobs/indexa_capital_activities_fetch_job.rb  Normal file
@@ -0,0 +1,41 @@
+# frozen_string_literal: true
+
+class IndexaCapitalActivitiesFetchJob < ApplicationJob
+  queue_as :default
+
+  sidekiq_options lock: :until_executed,
+                  lock_args_method: ->(args) { args.first },
+                  on_conflict: :log
+
+  # Indexa Capital API does not provide an activities/transactions endpoint.
+  # This job simply clears the pending flag and broadcasts updates.
+  def perform(indexa_capital_account, start_date: nil, retry_count: 0)
+    @indexa_capital_account = indexa_capital_account
+    return clear_pending_flag unless @indexa_capital_account&.indexa_capital_item
+
+    Rails.logger.info "IndexaCapitalActivitiesFetchJob - No activities endpoint available for Indexa Capital, clearing pending flag"
+    clear_pending_flag
+    broadcast_updates
+  rescue => e
+    Rails.logger.error("IndexaCapitalActivitiesFetchJob error: #{e.class} - #{e.message}")
+    clear_pending_flag
+    raise
+  end
+
+  private
+
+  def clear_pending_flag
+    @indexa_capital_account&.update!(activities_fetch_pending: false)
+  end
+
+  def broadcast_updates
+    @indexa_capital_account.current_account&.broadcast_sync_complete
+    @indexa_capital_account.indexa_capital_item&.broadcast_replace_to(
+      @indexa_capital_account.indexa_capital_item.family,
+      target: "indexa_capital_item_#{@indexa_capital_account.indexa_capital_item.id}",
+      partial: "indexa_capital_items/indexa_capital_item"
+    )
+  rescue => e
+    Rails.logger.warn("IndexaCapitalActivitiesFetchJob - Broadcast failed: #{e.message}")
+  end
+end

55  app/jobs/indexa_capital_connection_cleanup_job.rb  Normal file
@@ -0,0 +1,55 @@
+# frozen_string_literal: true
+
+class IndexaCapitalConnectionCleanupJob < ApplicationJob
+  queue_as :default
+
+  def perform(indexa_capital_item_id:, authorization_id:, account_id:)
+    Rails.logger.info(
+      "IndexaCapitalConnectionCleanupJob - Cleaning up connection #{authorization_id} " \
+      "for former account #{account_id}"
+    )
+
+    indexa_capital_item = IndexaCapitalItem.find_by(id: indexa_capital_item_id)
+    return unless indexa_capital_item
+
+    # Check if other accounts still use this connection
+    if indexa_capital_item.indexa_capital_accounts
+                          .where(indexa_capital_authorization_id: authorization_id)
+                          .exists?
+      Rails.logger.info("IndexaCapitalConnectionCleanupJob - Connection still in use, skipping")
+      return
+    end
+
+    # Delete from provider API
+    delete_connection(indexa_capital_item, authorization_id)
+
+    Rails.logger.info("IndexaCapitalConnectionCleanupJob - Connection #{authorization_id} deleted")
+  rescue => e
+    Rails.logger.warn(
+      "IndexaCapitalConnectionCleanupJob - Failed: #{e.class} - #{e.message}"
+    )
+    # Don't raise - cleanup failures shouldn't block other operations
+  end
+
+  private
+
+  def delete_connection(indexa_capital_item, authorization_id)
+    provider = indexa_capital_item.indexa_capital_provider
+    return unless provider
+
+    credentials = indexa_capital_item.indexa_capital_credentials
+    return unless credentials
+
+    # TODO: Implement API call to delete connection
+    # Example:
+    #   provider.delete_connection(
+    #     authorization_id: authorization_id,
+    #     **credentials
+    #   )
+    nil # Placeholder until provider.delete_connection is implemented
+  rescue => e
+    Rails.logger.warn(
+      "IndexaCapitalConnectionCleanupJob - API delete failed: #{e.message}"
    )
+  end
+end

88  app/jobs/process_pdf_job.rb  Normal file
@@ -0,0 +1,88 @@
+class ProcessPdfJob < ApplicationJob
+  queue_as :medium_priority
+
+  def perform(pdf_import)
+    return unless pdf_import.is_a?(PdfImport)
+    return unless pdf_import.pdf_uploaded?
+    return if pdf_import.status == "complete"
+    return if pdf_import.ai_processed? && (!pdf_import.bank_statement? || pdf_import.rows_count > 0)
+
+    pdf_import.update!(status: :importing)
+
+    begin
+      process_result = pdf_import.process_with_ai
+      document_type = resolve_document_type(pdf_import, process_result)
+      upload_to_vector_store(pdf_import, document_type: document_type)
+
+      # For bank statements, extract transactions and generate import rows
+      if bank_statement_document?(document_type)
+        Rails.logger.info("ProcessPdfJob: Extracting transactions for bank statement import #{pdf_import.id}")
+        pdf_import.extract_transactions
+        Rails.logger.info("ProcessPdfJob: Extracted #{pdf_import.extracted_transactions.size} transactions")
+
+        pdf_import.generate_rows_from_extracted_data
+        pdf_import.sync_mappings
+        Rails.logger.info("ProcessPdfJob: Generated #{pdf_import.rows_count} import rows")
+      end
+
+      # Find the user who created this import (first admin or any user in the family)
+      user = pdf_import.family.users.find_by(role: :admin) || pdf_import.family.users.first
+
+      if user
+        pdf_import.send_next_steps_email(user)
+      end
+
+      # Bank statements with rows go to pending for user review/publish
+      # Non-bank statements are marked complete (no further action needed)
+      final_status = bank_statement_document?(document_type) && pdf_import.rows_count > 0 ? :pending : :complete
+      pdf_import.update!(status: final_status)
+    rescue StandardError => e
+      sanitized_error = sanitize_error_message(e)
+      Rails.logger.error("PDF processing failed for import #{pdf_import.id}: #{e.class.name} - #{sanitized_error}")
+      begin
+        pdf_import.update!(status: :failed, error: sanitized_error)
+      rescue StandardError => update_error
+        Rails.logger.error("Failed to update import status: #{update_error.message}")
+      end
+      raise
+    end
+  end
+
+  private
+
+  def sanitize_error_message(error)
+    case error
+    when RuntimeError, ArgumentError
+      I18n.t("imports.pdf_import.processing_failed_with_message",
+             message: error.message.truncate(500))
+    else
+      I18n.t("imports.pdf_import.processing_failed_generic",
+             error: error.class.name.demodulize)
+    end
+  end
+
+  def upload_to_vector_store(pdf_import, document_type:)
+    filename = pdf_import.pdf_file.filename.to_s
+    file_content = pdf_import.pdf_file_content
+
+    family_document = pdf_import.family.upload_document(
+      file_content: file_content,
+      filename: filename,
+      metadata: { "type" => document_type }
+    )
+
+    return if family_document
+
+    Rails.logger.warn("ProcessPdfJob: Vector store upload failed for import #{pdf_import.id}")
+  end
+
+  def resolve_document_type(pdf_import, process_result)
+    return process_result.document_type if process_result.respond_to?(:document_type) && process_result.document_type.present?
+
+    pdf_import.reload.document_type
+  end
+
+  def bank_statement_document?(document_type)
+    document_type == "bank_statement"
+  end
+end

@@ -8,7 +8,7 @@ class InvitationMailer < ApplicationMailer
       subject: t(
         ".subject",
         inviter: @invitation.inviter.display_name,
-        product: product_name
+        product_name: product_name
       )
     )
   end

12  app/mailers/pdf_import_mailer.rb  Normal file
@@ -0,0 +1,12 @@
+class PdfImportMailer < ApplicationMailer
+  def next_steps
+    @user = params[:user]
+    @pdf_import = params[:pdf_import]
+    @import_url = import_url(@pdf_import)
+
+    mail(
+      to: @user.email,
+      subject: t(".subject", product_name: product_name)
+    )
+  end
+end

@@ -36,7 +36,7 @@ class Account < ApplicationRecord
     manual.where.not(status: :pending_deletion)
   }

-  has_one_attached :logo
+  has_one_attached :logo, dependent: :purge_later

   delegated_type :accountable, types: Accountable::TYPES, dependent: :destroy
   delegate :subtype, to: :accountable, allow_nil: true
@@ -74,6 +74,11 @@ class Account < ApplicationRecord
   end

   class << self
+    def human_attribute_name(attribute, options = {})
+      options = { moniker: Current.family&.moniker_label || "Family" }.merge(options)
+      super(attribute, options)
+    end
+
     def create_and_sync(attributes, skip_initial_sync: false)
       attributes[:accountable_attributes] ||= {} # Ensure accountable is created, even if empty
       # Default cash_balance to balance unless explicitly provided (e.g., Crypto sets it to 0)

@@ -77,10 +77,14 @@ class Account::ProviderImportAdapter
    end

    # If still a new entry and this is a POSTED transaction, check for matching pending transactions
-   incoming_pending = extra.is_a?(Hash) && (
-     ActiveModel::Type::Boolean.new.cast(extra.dig("simplefin", "pending")) ||
-     ActiveModel::Type::Boolean.new.cast(extra.dig("plaid", "pending"))
-   )
+   incoming_pending = false
+   if extra.is_a?(Hash)
+     pending_extra = extra.with_indifferent_access
+     incoming_pending =
+       ActiveModel::Type::Boolean.new.cast(pending_extra.dig("simplefin", "pending")) ||
+       ActiveModel::Type::Boolean.new.cast(pending_extra.dig("plaid", "pending")) ||
+       ActiveModel::Type::Boolean.new.cast(pending_extra.dig("lunchflow", "pending"))
+   end

    if entry.new_record? && !incoming_pending
      pending_match = nil
@@ -686,6 +690,7 @@ class Account::ProviderImportAdapter
      .where(<<~SQL.squish)
        (transactions.extra -> 'simplefin' ->> 'pending')::boolean = true
        OR (transactions.extra -> 'plaid' ->> 'pending')::boolean = true
+       OR (transactions.extra -> 'lunchflow' ->> 'pending')::boolean = true
      SQL
      .order(date: :desc) # Prefer most recent pending transaction

@@ -731,6 +736,7 @@ class Account::ProviderImportAdapter
      .where(<<~SQL.squish)
        (transactions.extra -> 'simplefin' ->> 'pending')::boolean = true
        OR (transactions.extra -> 'plaid' ->> 'pending')::boolean = true
+       OR (transactions.extra -> 'lunchflow' ->> 'pending')::boolean = true
      SQL

    # If merchant_id is provided, prioritize matching by merchant
@@ -799,6 +805,7 @@ class Account::ProviderImportAdapter
      .where(<<~SQL.squish)
        (transactions.extra -> 'simplefin' ->> 'pending')::boolean = true
        OR (transactions.extra -> 'plaid' ->> 'pending')::boolean = true
+       OR (transactions.extra -> 'lunchflow' ->> 'pending')::boolean = true
      SQL

    # For low confidence, require BOTH merchant AND name match (stronger signal needed)
@@ -836,6 +843,11 @@ class Account::ProviderImportAdapter
    # Don't overwrite if already has a suggestion (keep first one found)
    return if existing_extra["potential_posted_match"].present?

+   # Don't suggest if the posted entry is also still pending (pending→pending match)
+   # Suggestions are only for pending→posted reconciliation
+   posted_transaction = posted_entry.entryable
+   return if posted_transaction.is_a?(Transaction) && posted_transaction.pending?
+
    pending_transaction.update!(
      extra: existing_extra.merge(
        "potential_posted_match" => {
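The rewritten pending-detection above relies on `ActiveModel::Type::Boolean.new.cast` treating provider-supplied strings like `"false"` and `"0"` as false. A minimal pure-Ruby stand-in of that casting behavior (illustrative only, not the Rails implementation; `cast_boolean` is a hypothetical helper name):

```ruby
# Values ActiveModel::Type::Boolean considers false; everything else
# non-nil/non-empty casts to true.
FALSE_VALUES = [false, 0, "0", "f", "F", "false", "FALSE", "off", "OFF"].freeze

def cast_boolean(value)
  return nil if value.nil? || value == ""
  !FALSE_VALUES.include?(value)
end

cast_boolean("true")  # => true
cast_boolean("0")     # => false
cast_boolean(nil)     # => nil
```

This is why the adapter can read `pending_extra.dig("simplefin", "pending")` without caring whether the provider sent a real boolean or a string flag.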
@@ -9,7 +9,8 @@ class ApiKey < ApplicationRecord
  end

  # Constants
- SOURCES = [ "web", "mobile" ].freeze
+ SOURCES = [ "web", "mobile", "monitoring" ].freeze
+ DEMO_MONITORING_KEY = "demo_monitoring_key_a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6"

  # Validations
  validates :display_key, presence: true, uniqueness: true
@@ -21,9 +22,11 @@ class ApiKey < ApplicationRecord

  # Callbacks
  before_validation :set_display_key
+ before_destroy :prevent_demo_monitoring_key_destroy!

  # Scopes
  scope :active, -> { where(revoked_at: nil).where("expires_at IS NULL OR expires_at > ?", Time.current) }
+ scope :visible, -> { where.not(display_key: DEMO_MONITORING_KEY) }

  # Class methods
  def self.find_by_value(plain_key)
@@ -57,9 +60,19 @@ class ApiKey < ApplicationRecord
  end

  def revoke!
+   raise ActiveRecord::RecordNotDestroyed, "Cannot revoke demo monitoring API key" if demo_monitoring_key?
    update!(revoked_at: Time.current)
  end

+ def delete
+   raise ActiveRecord::RecordNotDestroyed, "Cannot destroy demo monitoring API key" if demo_monitoring_key?
+   super
+ end
+
+ def demo_monitoring_key?
+   display_key == DEMO_MONITORING_KEY
+ end
+
  def update_last_used!
    update_column(:last_used_at, Time.current)
  end
@@ -95,4 +108,11 @@ class ApiKey < ApplicationRecord
      errors.add(:user, "can only have one active API key per source (#{source})")
    end
  end

+ def prevent_demo_monitoring_key_destroy!
+   return unless demo_monitoring_key?
+
+   errors.add(:base, "Cannot destroy demo monitoring API key")
+   throw(:abort)
+ end
end
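The `prevent_demo_monitoring_key_destroy!` callback above halts deletion with `throw(:abort)`. A hedged sketch of how that halting works in plain Ruby, assuming (as Rails does) that the callback chain runs inside a `catch(:abort)` block; `run_destroy_callbacks` and `demo_key` are hypothetical stand-ins, not app code:

```ruby
# Simulates a before_destroy chain: returns true if the destroy proceeds,
# false if a callback threw :abort.
def run_destroy_callbacks(demo_key)
  result = catch(:abort) do
    throw(:abort) if demo_key  # mirrors prevent_demo_monitoring_key_destroy!
    :destroyed
  end
  result == :destroyed
end

run_destroy_callbacks(true)   # => false (destroy halted)
run_destroy_callbacks(false)  # => true
```

Note the model also overrides `delete`, since `delete` bypasses callbacks entirely and would otherwise skip this guard.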
@@ -6,20 +6,60 @@ module Assistant::Configurable
      preferred_currency = Money::Currency.new(chat.user.family.currency)
      preferred_date_format = chat.user.family.date_format

-     {
-       instructions: default_instructions(preferred_currency, preferred_date_format),
-       functions: default_functions
-     }
+     if chat.user.ui_layout_intro?
+       {
+         instructions: intro_instructions(preferred_currency, preferred_date_format),
+         functions: []
+       }
+     else
+       {
+         instructions: default_instructions(preferred_currency, preferred_date_format),
+         functions: default_functions
+       }
+     end
    end

    private
+     def intro_instructions(preferred_currency, preferred_date_format)
+       <<~PROMPT
+         ## Your identity
+
+         You are Sure, a warm and curious financial guide welcoming a new household to the Sure personal finance application.
+
+         ## Your purpose
+
+         Host an introductory conversation that helps you understand the user's stage of life, financial responsibilities, and near-term priorities so future guidance feels personal and relevant.
+
+         ## Conversation approach
+
+         - Ask one thoughtful question at a time and tailor follow-ups based on what the user shares.
+         - Reflect key details back to the user to confirm understanding.
+         - Keep responses concise, friendly, and free of filler phrases.
+         - If the user requests detailed analytics, let them know the dashboard experience will cover it soon and guide them back to sharing context.
+
+         ## Information to uncover
+
+         - Household composition and stage of life milestones (education, career, retirement, dependents, caregiving, etc.).
+         - Primary financial goals, concerns, and timelines.
+         - Notable upcoming events or obligations.
+
+         ## Formatting guidelines
+
+         - Use markdown for any lists or emphasis.
+         - When money or timeframes are discussed, format currency with #{preferred_currency.symbol} (#{preferred_currency.iso_code}) and dates using #{preferred_date_format}.
+         - Do not call external tools or functions.
+       PROMPT
+     end

      def default_functions
        [
          Assistant::Function::GetTransactions,
          Assistant::Function::GetAccounts,
          Assistant::Function::GetHoldings,
          Assistant::Function::GetBalanceSheet,
-         Assistant::Function::GetIncomeStatement
+         Assistant::Function::GetIncomeStatement,
+         Assistant::Function::ImportBankStatement,
+         Assistant::Function::SearchFamilyFiles
        ]
      end
188  app/models/assistant/function/import_bank_statement.rb  Normal file
@@ -0,0 +1,188 @@
require "csv"

class Assistant::Function::ImportBankStatement < Assistant::Function
  class << self
    def name
      "import_bank_statement"
    end

    def description
      <<~INSTRUCTIONS
        Use this to import transactions from a bank statement PDF that has already been uploaded.

        This function will:
        1. Extract transaction data from the PDF using AI
        2. Create a transaction import with the extracted data
        3. Return the import ID and extracted transactions for review

        The PDF must have already been uploaded via the PDF import feature.
        Only use this for PDFs that are identified as bank statements.

        Example:

        ```
        import_bank_statement({
          pdf_import_id: "abc123-def456",
          account_id: "xyz789"
        })
        ```

        If account_id is not provided, you should ask the user which account to import to.
      INSTRUCTIONS
    end
  end

  def strict_mode?
    false
  end

  def params_schema
    build_schema(
      required: [ "pdf_import_id" ],
      properties: {
        pdf_import_id: {
          type: "string",
          description: "The ID of the PDF import to extract transactions from"
        },
        account_id: {
          type: "string",
          description: "The ID of the account to import transactions into. If not provided, will return available accounts."
        }
      }
    )
  end

  def call(params = {})
    pdf_import = family.imports.find_by(id: params["pdf_import_id"], type: "PdfImport")

    unless pdf_import
      return {
        success: false,
        error: "PDF import not found",
        message: "Could not find a PDF import with ID: #{params["pdf_import_id"]}"
      }
    end

    unless pdf_import.document_type == "bank_statement"
      return {
        success: false,
        error: "not_bank_statement",
        message: "This PDF is not a bank statement. Document type: #{pdf_import.document_type}",
        available_actions: [ "Use a different PDF that is a bank statement" ]
      }
    end

    # If no account specified, return available accounts
    if params["account_id"].blank?
      return {
        success: false,
        error: "account_required",
        message: "Please specify which account to import transactions into",
        available_accounts: family.accounts.visible.depository.map { |a| { id: a.id, name: a.name } }
      }
    end

    account = family.accounts.find_by(id: params["account_id"])
    unless account
      return {
        success: false,
        error: "account_not_found",
        message: "Account not found",
        available_accounts: family.accounts.visible.depository.map { |a| { id: a.id, name: a.name } }
      }
    end

    # Extract transactions from the PDF using provider
    provider = Provider::Registry.get_provider(:openai)
    unless provider
      return {
        success: false,
        error: "provider_not_configured",
        message: "OpenAI provider is not configured"
      }
    end

    response = provider.extract_bank_statement(
      pdf_content: pdf_import.pdf_file_content,
      model: openai_model,
      family: family
    )

    unless response.success?
      error_message = response.error&.message || "Unknown extraction error"
      return {
        success: false,
        error: "extraction_failed",
        message: "Failed to extract transactions: #{error_message}"
      }
    end

    result = response.data

    if result[:transactions].blank?
      return {
        success: false,
        error: "no_transactions_found",
        message: "Could not extract any transactions from the bank statement"
      }
    end

    # Create a CSV from extracted transactions
    csv_content = generate_csv(result[:transactions])

    # Create a TransactionImport
    import = family.imports.create!(
      type: "TransactionImport",
      account: account,
      raw_file_str: csv_content,
      date_col_label: "date",
      amount_col_label: "amount",
      name_col_label: "name",
      category_col_label: "category",
      notes_col_label: "notes",
      date_format: "%Y-%m-%d",
      signage_convention: "inflows_positive"
    )

    import.generate_rows_from_csv

    {
      success: true,
      import_id: import.id,
      transaction_count: result[:transactions].size,
      transactions_preview: result[:transactions].first(5),
      statement_period: result[:period],
      account_holder: result[:account_holder],
      message: "Successfully extracted #{result[:transactions].size} transactions. Import created with ID: #{import.id}. Review and publish when ready."
    }
  rescue Provider::ProviderError, Faraday::Error, Timeout::Error, RuntimeError => e
    Rails.logger.error("ImportBankStatement error: #{e.class.name} - #{e.message}")
    Rails.logger.error(e.backtrace.first(10).join("\n"))
    {
      success: false,
      error: "extraction_failed",
      message: "Failed to extract transactions: #{e.message.truncate(200)}"
    }
  end

  private

    def generate_csv(transactions)
      CSV.generate do |csv|
        csv << %w[date amount name category notes]
        transactions.each do |txn|
          csv << [
            txn[:date],
            txn[:amount],
            txn[:name] || txn[:description],
            txn[:category],
            txn[:notes]
          ]
        end
      end
    end

    def openai_model
      ENV["OPENAI_MODEL"].presence || Provider::Openai::DEFAULT_MODEL
    end
end
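The `generate_csv` helper in the new file builds the intermediate CSV that `TransactionImport` then parses. A self-contained sketch of that shape using Ruby's stdlib `CSV` (the sample transaction hash is invented data for illustration):

```ruby
require "csv"

# Mirrors generate_csv from the diff: header row, then one row per
# transaction, falling back to :description when :name is absent.
def generate_csv(transactions)
  CSV.generate do |csv|
    csv << %w[date amount name category notes]
    transactions.each do |txn|
      csv << [txn[:date], txn[:amount], txn[:name] || txn[:description], txn[:category], txn[:notes]]
    end
  end
end

sample = [{ date: "2024-01-05", amount: "-12.50", description: "Coffee", category: "Dining", notes: nil }]
csv = generate_csv(sample)
# csv begins with "date,amount,name,category,notes"
```

Note the column labels here must match the `*_col_label` values passed to `family.imports.create!`, which is why both sides of the diff hard-code the same five names.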
118  app/models/assistant/function/search_family_files.rb  Normal file
@@ -0,0 +1,118 @@
class Assistant::Function::SearchFamilyFiles < Assistant::Function
  class << self
    def name
      "search_family_files"
    end

    def description
      <<~DESC
        Search through documents that the family has uploaded to their financial document store.

        Use this when the user asks questions about their uploaded financial documents such as
        tax returns, bank statements, contracts, insurance policies, investment reports, or any
        other files they've imported.

        Returns relevant excerpts from matching documents along with the source filename and
        a relevance score.

        Supported file types include: PDF, DOCX, XLSX, PPTX, TXT, CSV, JSON, XML, HTML, MD,
        and common source code formats.

        Example:

        ```
        search_family_files({
          query: "What was the total income on my 2024 tax return?"
        })
        ```
      DESC
    end
  end

  def strict_mode?
    false
  end

  def params_schema
    build_schema(
      required: [ "query" ],
      properties: {
        query: {
          type: "string",
          description: "The search query to find relevant information in the family's uploaded documents"
        },
        max_results: {
          type: "integer",
          description: "Maximum number of results to return (default: 10, max: 20)"
        }
      }
    )
  end

  def call(params = {})
    query = params["query"]
    max_results = (params["max_results"] || 10).to_i.clamp(1, 20)

    unless family.vector_store_id.present?
      return {
        success: false,
        error: "no_documents",
        message: "No documents have been uploaded to the family document store yet."
      }
    end

    adapter = VectorStore.adapter

    unless adapter
      return {
        success: false,
        error: "provider_not_configured",
        message: "No vector store is configured. Set VECTOR_STORE_PROVIDER or configure OpenAI."
      }
    end

    response = adapter.search(
      store_id: family.vector_store_id,
      query: query,
      max_results: max_results
    )

    unless response.success?
      return {
        success: false,
        error: "search_failed",
        message: "Failed to search documents: #{response.error&.message}"
      }
    end

    results = response.data

    if results.empty?
      return {
        success: true,
        results: [],
        message: "No matching documents found for the query."
      }
    end

    {
      success: true,
      query: query,
      result_count: results.size,
      results: results.map do |result|
        {
          content: result[:content],
          filename: result[:filename],
          score: result[:score]
        }
      end
    }
  rescue => e
    Rails.logger.error("SearchFamilyFiles error: #{e.class.name} - #{e.message}")
    {
      success: false,
      error: "search_failed",
      message: "An error occurred while searching documents: #{e.message.truncate(200)}"
    }
  end
end
@@ -19,24 +19,41 @@ class Budget < ApplicationRecord
      date.strftime(PARAM_DATE_FORMAT).downcase
    end

-   def param_to_date(param)
-     Date.strptime(param, PARAM_DATE_FORMAT).beginning_of_month
+   def param_to_date(param, family: nil)
+     base_date = Date.strptime(param, PARAM_DATE_FORMAT)
+     if family&.uses_custom_month_start?
+       Date.new(base_date.year, base_date.month, family.month_start_day)
+     else
+       base_date.beginning_of_month
+     end
    end

    def budget_date_valid?(date, family:)
-     beginning_of_month = date.beginning_of_month
-
-     beginning_of_month >= oldest_valid_budget_date(family) && beginning_of_month <= Date.current.end_of_month
+     if family.uses_custom_month_start?
+       budget_start = family.custom_month_start_for(date)
+       budget_start >= oldest_valid_budget_date(family) && budget_start <= family.custom_month_end_for(Date.current)
+     else
+       beginning_of_month = date.beginning_of_month
+       beginning_of_month >= oldest_valid_budget_date(family) && beginning_of_month <= Date.current.end_of_month
+     end
    end

    def find_or_bootstrap(family, start_date:)
      return nil unless budget_date_valid?(start_date, family: family)

      Budget.transaction do
+       if family.uses_custom_month_start?
+         budget_start = family.custom_month_start_for(start_date)
+         budget_end = family.custom_month_end_for(start_date)
+       else
+         budget_start = start_date.beginning_of_month
+         budget_end = start_date.end_of_month
+       end
+
        budget = Budget.find_or_create_by!(
          family: family,
-         start_date: start_date.beginning_of_month,
-         end_date: start_date.end_of_month
+         start_date: budget_start,
+         end_date: budget_end
        ) do |b|
          b.currency = family.currency
        end
@@ -49,7 +66,6 @@ class Budget < ApplicationRecord

    private
      def oldest_valid_budget_date(family)
        # Allow going back to either the earliest entry date OR 2 years ago, whichever is earlier
        two_years_ago = 2.years.ago.beginning_of_month
        oldest_entry_date = family.oldest_entry_date.beginning_of_month
        [ two_years_ago, oldest_entry_date ].min
@@ -95,7 +111,15 @@ class Budget < ApplicationRecord
  end

  def name
-   start_date.strftime("%B %Y")
+   if family.uses_custom_month_start?
+     I18n.t(
+       "budgets.name.custom_range",
+       start: start_date.strftime("%b %d"),
+       end_date: end_date.strftime("%b %d, %Y")
+     )
+   else
+     I18n.t("budgets.name.month_year", month: start_date.strftime("%B %Y"))
+   end
  end

  def initialized?
@@ -111,7 +135,12 @@ class Budget < ApplicationRecord
  end

  def current?
-   start_date == Date.today.beginning_of_month && end_date == Date.today.end_of_month
+   if family.uses_custom_month_start?
+     current_period = family.current_custom_month_period
+     start_date == current_period.start_date && end_date == current_period.end_date
+   else
+     start_date == Date.current.beginning_of_month && end_date == Date.current.end_of_month
+   end
  end

  def previous_budget_param
@@ -155,11 +184,14 @@ class Budget < ApplicationRecord
  end

  def actual_spending
-   expense_totals.total
+   [ expense_totals.total - refunds_in_expense_categories, 0 ].max
  end

  def budget_category_actual_spending(budget_category)
-   expense_totals.category_totals.find { |ct| ct.category.id == budget_category.category.id }&.total || 0
+   cat_id = budget_category.category_id
+   expense = expense_totals_by_category[cat_id]&.total || 0
+   refund = income_totals_by_category[cat_id]&.total || 0
+   [ expense - refund, 0 ].max
  end

  def category_median_monthly_expense(category)
@@ -235,6 +267,14 @@ class Budget < ApplicationRecord
  end

  private
+   def refunds_in_expense_categories
+     expense_category_ids = budget_categories.map(&:category_id).to_set
+     income_totals.category_totals
+       .reject { |ct| ct.category.subcategory? }
+       .select { |ct| expense_category_ids.include?(ct.category.id) || ct.category.uncategorized? }
+       .sum(&:total)
+   end
+
    def income_statement
      @income_statement ||= family.income_statement
    end
@@ -246,4 +286,12 @@ class Budget < ApplicationRecord
    def income_totals
      @income_totals ||= family.income_statement.income_totals(period: period)
    end

+   def expense_totals_by_category
+     @expense_totals_by_category ||= expense_totals.category_totals.index_by { |ct| ct.category.id }
+   end
+
+   def income_totals_by_category
+     @income_totals_by_category ||= income_totals.category_totals.index_by { |ct| ct.category.id }
+   end
end
@@ -109,6 +109,14 @@ class Category < ApplicationRecord
    I18n.t(UNCATEGORIZED_NAME_KEY)
  end

+ # Returns all possible uncategorized names across all supported locales
+ # Used to detect uncategorized filter regardless of URL parameter language
+ def all_uncategorized_names
+   LanguagesHelper::SUPPORTED_LOCALES.map do |locale|
+     I18n.t(UNCATEGORIZED_NAME_KEY, locale: locale)
+   end.uniq
+ end
+
  # Helper to get the localized name for other investments
  def other_investments_name
    I18n.t(OTHER_INVESTMENTS_NAME_KEY)
@@ -119,6 +127,14 @@ class Category < ApplicationRecord
    I18n.t(INVESTMENT_CONTRIBUTIONS_NAME_KEY)
  end

+ # Returns all possible investment contributions names across all supported locales
+ # Used to detect investment contributions category regardless of locale
+ def all_investment_contributions_names
+   LanguagesHelper::SUPPORTED_LOCALES.map do |locale|
+     I18n.t(INVESTMENT_CONTRIBUTIONS_NAME_KEY, locale: locale)
+   end.uniq
+ end
+
  private
    def default_categories
      [
@@ -1,5 +1,11 @@
class CoinbaseAccount < ApplicationRecord
- include CurrencyNormalizable
+ include CurrencyNormalizable, Encryptable
+
+ # Encrypt raw payloads if ActiveRecord encryption is configured
+ if encryption_ready?
+   encrypts :raw_payload
+   encrypts :raw_transactions_payload
+ end

  belongs_to :coinbase_item
@@ -24,12 +24,13 @@ class CoinbaseItem < ApplicationRecord
  validates :api_secret, presence: true

  belongs_to :family
- has_one_attached :logo
+ has_one_attached :logo, dependent: :purge_later

  has_many :coinbase_accounts, dependent: :destroy
  has_many :accounts, through: :coinbase_accounts

  scope :active, -> { where(scheduled_for_deletion: false) }
  scope :syncable, -> { active }
  scope :ordered, -> { order(created_at: :desc) }
  scope :needs_update, -> { where(status: :requires_update) }
@@ -1,7 +1,13 @@
# Represents a single crypto token/coin within a CoinStats wallet.
# Each wallet address may have multiple CoinstatsAccounts (one per token).
class CoinstatsAccount < ApplicationRecord
- include CurrencyNormalizable
+ include CurrencyNormalizable, Encryptable
+
+ # Encrypt raw payloads if ActiveRecord encryption is configured
+ if encryption_ready?
+   encrypts :raw_payload
+   encrypts :raw_transactions_payload
+ end

  belongs_to :coinstats_item
@@ -22,12 +22,13 @@ class CoinstatsItem < ApplicationRecord
  validates :api_key, presence: true

  belongs_to :family
- has_one_attached :logo
+ has_one_attached :logo, dependent: :purge_later

  has_many :coinstats_accounts, dependent: :destroy
  has_many :accounts, through: :coinstats_accounts

  scope :active, -> { where(scheduled_for_deletion: false) }
  scope :syncable, -> { active }
  scope :ordered, -> { order(created_at: :desc) }
  scope :needs_update, -> { where(status: :requires_update) }
@@ -15,6 +15,8 @@ module Enrichable
  InvalidAttributeError = Class.new(StandardError)

  included do
+   has_many :data_enrichments, as: :enrichable, dependent: :destroy
+
    scope :enrichable, ->(attrs) {
      attrs = Array(attrs).map(&:to_s)
      json_condition = attrs.each_with_object({}) { |attr, hash| hash[attr] = true }
@@ -22,6 +24,22 @@ module Enrichable
    }
  end

+ class_methods do
+   # Override in models to define family-scoped query
+   def family_scope(family)
+     none
+   end
+
+   def clear_ai_cache(family)
+     count = 0
+     family_scope(family).find_each do |record|
+       record.clear_ai_cache
+       count += 1
+     end
+     count
+   end
+ end
+
  # Convenience method for a single attribute
  def enrich_attribute(attr, value, source:, metadata: {})
    enrich_attributes({ attr => value }, source:, metadata:)
@@ -124,6 +142,29 @@ module Enrichable
    end
  end

+ def clear_ai_cache
+   ActiveRecord::Base.transaction do
+     ai_enrichments = data_enrichments.where(source: "ai")
+
+     # Only unlock attributes where current value still matches what AI set
+     # If user changed the value, they took ownership - don't unlock
+     attrs_to_unlock = ai_enrichments.select do |enrichment|
+       attr_name = enrichment.attribute_name
+       current_value = respond_to?(attr_name) ? send(attr_name) : self[attr_name]
+       current_value.to_s == enrichment.value.to_s
+     end.map(&:attribute_name).uniq
+
+     # Batch unlock in a single update
+     if attrs_to_unlock.any?
+       new_locked_attrs = locked_attributes.except(*attrs_to_unlock)
+       update_column(:locked_attributes, new_locked_attrs) if new_locked_attrs != locked_attributes
+     end
+
+     # Delete AI enrichment records
+     ai_enrichments.delete_all
+   end
+ end
+
  private
    def log_enrichment(attribute_name:, attribute_value:, source:, metadata: {})
      de = DataEnrichment.find_or_create_by(
126
app/models/concerns/ssl_configurable.rb
Normal file
126
app/models/concerns/ssl_configurable.rb
Normal file
@@ -0,0 +1,126 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
# Provides centralized SSL configuration for HTTP clients.
|
||||
#
|
||||
# This module enables support for self-signed certificates in self-hosted
|
||||
# environments by reading configuration from Rails.configuration.x.ssl.
|
||||
#
|
||||
# Features:
|
||||
# - Custom CA certificate support for self-signed certificates
|
||||
# - Optional SSL verification bypass (for development/testing only)
|
||||
# - Debug logging for troubleshooting SSL issues
|
||||
#
|
||||
# Usage (extend for class methods — the only supported pattern):
|
||||
# class MyHttpClient
|
||||
# extend SslConfigurable
|
||||
#
|
||||
# def self.make_request
|
||||
# Faraday.new(url, ssl: faraday_ssl_options) { |f| ... }
|
||||
# end
|
||||
# end
|
||||
#
|
||||
# Environment Variables (configured in config/initializers/00_ssl.rb):
|
||||
# SSL_CA_FILE - Path to custom CA certificate file (PEM format)
|
||||
# SSL_VERIFY - Set to "false" to disable SSL verification
|
||||
# SSL_DEBUG - Set to "true" to enable verbose SSL logging
|
||||
module SslConfigurable
|
||||
# Returns SSL options hash for Faraday connections
|
||||
#
|
||||
# @return [Hash] SSL options for Faraday
|
||||
# @example
|
||||
# Faraday.new(url, ssl: faraday_ssl_options) do |f|
|
||||
# f.request :json
|
||||
# f.response :raise_error
|
||||
# end
|
||||
def faraday_ssl_options
|
||||
options = {}
|
||||
|
||||
options[:verify] = ssl_verify?
|
||||
|
||||
if ssl_ca_file.present?
|
||||
options[:ca_file] = ssl_ca_file
|
||||
log_ssl_debug("Faraday SSL: Using custom CA file: #{ssl_ca_file}")
|
||||
end
|
||||
|
||||
log_ssl_debug("Faraday SSL: Verification disabled") unless ssl_verify?
|
||||
log_ssl_debug("Faraday SSL options: #{options.inspect}") if options.present?
|
||||
|
||||
options
|
||||
end
|
||||
|
||||
# Returns SSL options hash for HTTParty requests
|
||||
#
|
||||
# @return [Hash] SSL options for HTTParty
|
||||
# @example
|
||||
# class MyProvider
|
||||
# include HTTParty
|
||||
# extend SslConfigurable
|
||||
# default_options.merge!(httparty_ssl_options)
|
||||
# end
|
||||
def httparty_ssl_options
|
||||
options = { verify: ssl_verify? }
|
||||
|
||||
if ssl_ca_file.present?
|
||||
    options[:ssl_ca_file] = ssl_ca_file
    log_ssl_debug("HTTParty SSL: Using custom CA file: #{ssl_ca_file}")
  end

  log_ssl_debug("HTTParty SSL: Verification disabled") unless ssl_verify?

  options
end

# Returns SSL verify mode for Net::HTTP
#
# @return [Integer] OpenSSL verify mode constant (VERIFY_PEER or VERIFY_NONE)
# @example
#   http = Net::HTTP.new(uri.host, uri.port)
#   http.use_ssl = true
#   http.verify_mode = net_http_verify_mode
#   http.ca_file = ssl_ca_file if ssl_ca_file.present?
def net_http_verify_mode
  mode = ssl_verify? ? OpenSSL::SSL::VERIFY_PEER : OpenSSL::SSL::VERIFY_NONE
  log_ssl_debug("Net::HTTP verify mode: #{mode == OpenSSL::SSL::VERIFY_PEER ? 'VERIFY_PEER' : 'VERIFY_NONE'}")
  mode
end

# Returns CA file path if configured
#
# @return [String, nil] Path to CA file or nil if not configured
def ssl_ca_file
  ssl_configuration.ca_file
end

# Returns whether SSL verification is enabled.
# nil and true both mean verification is enabled; only an explicit false disables it.
#
# @return [Boolean] true if SSL verification is enabled
def ssl_verify?
  ssl_configuration.verify != false
end

# Returns whether SSL debug logging is enabled
#
# @return [Boolean] true if debug logging is enabled
def ssl_debug?
  ssl_configuration.debug == true
end

private

# Returns the SSL configuration from Rails
#
# @return [ActiveSupport::OrderedOptions] SSL configuration
def ssl_configuration
  Rails.configuration.x.ssl
end

# Logs a debug message if SSL debug mode is enabled
#
# @param message [String] Message to log
def log_ssl_debug(message)
  return unless ssl_debug?

  Rails.logger.debug("[SSL Debug] #{message}")
end
end
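The `ssl_verify?` predicate above treats only an explicit `false` as disabling verification. A minimal standalone sketch of that truthiness rule, with `OpenStruct` standing in for `Rails.configuration.x.ssl`:

```ruby
require "ostruct"

# nil and true both enable verification; only an explicit false disables it
def ssl_verify?(config)
  config.verify != false
end

puts ssl_verify?(OpenStruct.new(verify: nil))   # unset -> enabled
puts ssl_verify?(OpenStruct.new(verify: true))  # explicit true -> enabled
puts ssl_verify?(OpenStruct.new(verify: false)) # explicit false -> disabled
```

This is why an unset `config.x.ssl.verify` keeps peer verification on by default.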
@@ -1,5 +1,5 @@
 class DataEnrichment < ApplicationRecord
   belongs_to :enrichable, polymorphic: true

-  enum :source, { rule: "rule", plaid: "plaid", simplefin: "simplefin", lunchflow: "lunchflow", synth: "synth", ai: "ai", enable_banking: "enable_banking", coinstats: "coinstats", mercury: "mercury" }
+  enum :source, { rule: "rule", plaid: "plaid", simplefin: "simplefin", lunchflow: "lunchflow", synth: "synth", ai: "ai", enable_banking: "enable_banking", coinstats: "coinstats", mercury: "mercury", indexa_capital: "indexa_capital" }
 end
@@ -93,6 +93,9 @@ class Demo::Generator
     puts "👥 Creating demo family..."
     family = create_family_and_users!("Demo Family", email, onboarded: true, subscribed: true)

+    puts "🔑 Creating monitoring API key..."
+    create_monitoring_api_key!(family)
+
     puts "📊 Creating realistic financial data..."
     create_realistic_categories!(family)
     create_realistic_accounts!(family)
@@ -168,7 +171,7 @@ class Demo::Generator
       onboarded_at: onboarded ? Time.current : nil
     )

-    # Member user
+    # Family member user
     family.users.create!(
       email: "partner_#{email}",
       first_name: "Eve",
@@ -181,6 +184,33 @@ class Demo::Generator
     family
   end

+  def create_monitoring_api_key!(family)
+    admin_user = family.users.find_by(role: "admin")
+    return unless admin_user
+
+    # Find an existing key scoped to this admin user by the deterministic display_key value
+    existing_key = admin_user.api_keys.find_by(display_key: ApiKey::DEMO_MONITORING_KEY)
+
+    if existing_key
+      puts "  → Using existing monitoring API key"
+      return existing_key
+    end
+
+    # Revoke any existing user-created web API keys to keep demo access predictable
+    # (the monitoring key uses the dedicated "monitoring" source and cannot be revoked).
+    admin_user.api_keys.active.visible.where(source: "web").find_each(&:revoke!)
+
+    api_key = admin_user.api_keys.create!(
+      name: "monitoring",
+      key: ApiKey::DEMO_MONITORING_KEY,
+      scopes: [ "read" ],
+      source: "monitoring"
+    )
+
+    puts "  → Created monitoring API key: #{ApiKey::DEMO_MONITORING_KEY}"
+    api_key
+  end
+
   def create_realistic_categories!(family)
     # Income categories (3 total)
     @salary_cat = family.categories.create!(name: "Salary", color: "#10b981", classification: "income")
@@ -17,12 +17,13 @@ class EnableBankingItem < ApplicationRecord
   validates :client_certificate, presence: true, on: :create

   belongs_to :family
-  has_one_attached :logo
+  has_one_attached :logo, dependent: :purge_later

   has_many :enable_banking_accounts, dependent: :destroy
   has_many :accounts, through: :enable_banking_accounts

   scope :active, -> { where(scheduled_for_deletion: false) }
   scope :syncable, -> { active }
   scope :ordered, -> { order(created_at: :desc) }
+  scope :needs_update, -> { where(status: :requires_update) }
@@ -42,6 +42,7 @@ class Entry < ApplicationRecord
       .where(<<~SQL.squish)
         (transactions.extra -> 'simplefin' ->> 'pending')::boolean = true
         OR (transactions.extra -> 'plaid' ->> 'pending')::boolean = true
+        OR (transactions.extra -> 'lunchflow' ->> 'pending')::boolean = true
       SQL
   }
@@ -56,6 +57,7 @@ class Entry < ApplicationRecord
         AND (
           (t.extra -> 'simplefin' ->> 'pending')::boolean = true
           OR (t.extra -> 'plaid' ->> 'pending')::boolean = true
+          OR (t.extra -> 'lunchflow' ->> 'pending')::boolean = true
         )
       )
     SQL
@@ -66,6 +68,11 @@ class Entry < ApplicationRecord
     pending.where("entries.date < ?", days.days.ago.to_date)
   }

+  # Family-scoped query for Enrichable#clear_ai_cache
+  def self.family_scope(family)
+    joins(:account).where(accounts: { family_id: family.id })
+  end
+
   # Auto-exclude stale pending transactions for an account
   # Called during sync to clean up pending transactions that never posted
   # @param account [Account] The account to clean up
@@ -113,6 +120,7 @@ class Entry < ApplicationRecord
       .where(<<~SQL.squish)
         (transactions.extra -> 'simplefin' ->> 'pending')::boolean IS NOT TRUE
         AND (transactions.extra -> 'plaid' ->> 'pending')::boolean IS NOT TRUE
+        AND (transactions.extra -> 'lunchflow' ->> 'pending')::boolean IS NOT TRUE
       SQL
       .limit(2) # Only need to know if 0, 1, or 2+ candidates
       .to_a # Load limited records to avoid COUNT(*) on .size
@@ -159,6 +167,7 @@ class Entry < ApplicationRecord
       .where(<<~SQL.squish)
         (transactions.extra -> 'simplefin' ->> 'pending')::boolean IS NOT TRUE
         AND (transactions.extra -> 'plaid' ->> 'pending')::boolean IS NOT TRUE
+        AND (transactions.extra -> 'lunchflow' ->> 'pending')::boolean IS NOT TRUE
       SQL

     # Match by name similarity (first 3 words)
@@ -314,27 +323,50 @@ class Entry < ApplicationRecord
     30.years.ago.to_date
   end

-  def bulk_update!(bulk_update_params)
+  # Bulk update entries with the given parameters.
+  #
+  # Tags are handled separately from other entryable attributes because they use
+  # a join table (taggings) rather than a direct column. This means:
+  #   - category_id: nil means "no category" (column value)
+  #   - tag_ids: [] means "delete all taggings" (join table operation)
+  #
+  # To avoid accidentally clearing tags when only updating other fields,
+  # tags are only modified when explicitly requested via update_tags: true.
+  #
+  # @param bulk_update_params [Hash] The parameters to update
+  # @param update_tags [Boolean] Whether to update tags (default: false)
+  def bulk_update!(bulk_update_params, update_tags: false)
     bulk_attributes = {
       date: bulk_update_params[:date],
       notes: bulk_update_params[:notes],
       entryable_attributes: {
         category_id: bulk_update_params[:category_id],
-        merchant_id: bulk_update_params[:merchant_id],
-        tag_ids: bulk_update_params[:tag_ids]
+        merchant_id: bulk_update_params[:merchant_id]
       }.compact_blank
     }.compact_blank

-    return 0 if bulk_attributes.blank?
+    tag_ids = Array.wrap(bulk_update_params[:tag_ids]).reject(&:blank?)
+    has_updates = bulk_attributes.present? || update_tags
+
+    return 0 unless has_updates

     transaction do
       all.each do |entry|
-        bulk_attributes[:entryable_attributes][:id] = entry.entryable_id if bulk_attributes[:entryable_attributes].present?
-        entry.update! bulk_attributes
+        # Update standard attributes
+        if bulk_attributes.present?
+          bulk_attributes[:entryable_attributes][:id] = entry.entryable_id if bulk_attributes[:entryable_attributes].present?
+          entry.update! bulk_attributes
+        end
+
+        # Handle tags separately - only when explicitly requested
+        if update_tags && entry.transaction?
+          entry.transaction.tag_ids = tag_ids
+          entry.transaction.save!
+          entry.entryable.lock_attr!(:tag_ids) if entry.transaction.tags.any?
+        end
+
         entry.lock_saved_attributes!
         entry.mark_user_modified!
-        entry.entryable.lock_attr!(:tag_ids) if entry.transaction? && entry.transaction.tags.any?
       end
     end
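The `update_tags:` guard in `bulk_update!` can be sketched in plain Ruby. This is a standalone illustration (no ActiveSupport; `compact_blank` and `blank?` are re-implemented), showing why a tag-less bulk edit leaves taggings untouched while `update_tags: true` still counts as work even with empty params:

```ruby
# Minimal stand-in for ActiveSupport's blank?
def blank?(v) = v.nil? || (v.respond_to?(:empty?) && v.empty?)

def build_bulk_attributes(params)
  entryable = {
    category_id: params[:category_id],
    merchant_id: params[:merchant_id]
  }.reject { |_, v| blank?(v) }  # compact_blank on the inner hash
  {
    date: params[:date],
    notes: params[:notes],
    entryable_attributes: entryable
  }.reject { |_, v| blank?(v) }  # compact_blank on the outer hash
end

def has_updates?(params, update_tags:)
  !build_bulk_attributes(params).empty? || update_tags
end

p has_updates?({ notes: "x" }, update_tags: false)  # notes only: tags untouched, update proceeds
p has_updates?({}, update_tags: false)              # nothing to do
p has_updates?({}, update_tags: true)               # explicit tag clear still proceeds
```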
@@ -70,6 +70,7 @@ class EntrySearch
         AND (
           (t.extra -> 'simplefin' ->> 'pending')::boolean = true
           OR (t.extra -> 'plaid' ->> 'pending')::boolean = true
+          OR (t.extra -> 'lunchflow' ->> 'pending')::boolean = true
         )
       )
     SQL
@@ -82,6 +83,7 @@ class EntrySearch
         AND (
           (t.extra -> 'simplefin' ->> 'pending')::boolean = true
           OR (t.extra -> 'plaid' ->> 'pending')::boolean = true
+          OR (t.extra -> 'lunchflow' ->> 'pending')::boolean = true
         )
       )
     SQL
@@ -1,9 +1,30 @@
 class Eval::Langfuse::Client
+  extend SslConfigurable
+
   BASE_URLS = {
     us: "https://us.cloud.langfuse.com/api/public",
     eu: "https://cloud.langfuse.com/api/public"
   }.freeze

+  # OpenSSL 3.x version threshold for the CRL workaround
+  # See: https://github.com/ruby/openssl/issues/619
+  OPENSSL_3_VERSION = 0x30000000
+
+  # CRL-related OpenSSL error codes that can be safely bypassed.
+  # These errors occur when a CRL (Certificate Revocation List) is unavailable.
+  def self.crl_errors
+    @crl_errors ||= begin
+      errors = [
+        OpenSSL::X509::V_ERR_UNABLE_TO_GET_CRL,
+        OpenSSL::X509::V_ERR_CRL_HAS_EXPIRED,
+        OpenSSL::X509::V_ERR_CRL_NOT_YET_VALID
+      ]
+      # V_ERR_UNABLE_TO_GET_CRL_ISSUER may not exist in all OpenSSL versions
+      errors << OpenSSL::X509::V_ERR_UNABLE_TO_GET_CRL_ISSUER if defined?(OpenSSL::X509::V_ERR_UNABLE_TO_GET_CRL_ISSUER)
+      errors.freeze
+    end
+  end
+
   class Error < StandardError; end
   class ConfigurationError < Error; end
   class ApiError < Error
@@ -176,12 +197,24 @@ class Eval::Langfuse::Client
     http.read_timeout = 30
     http.open_timeout = 10

-    # Fix for OpenSSL 3.x CRL checking issues
-    # See: https://github.com/ruby/openssl/issues/619
-    http.verify_mode = OpenSSL::SSL::VERIFY_PEER
-    if OpenSSL::OPENSSL_VERSION_NUMBER >= 0x30000000
-      # Disable CRL checking which can fail on some certificates
-      http.verify_callback = ->(_preverify_ok, _store_ctx) { true }
+    # Apply SSL configuration from centralized config
+    http.verify_mode = self.class.net_http_verify_mode
+    http.ca_file = self.class.ssl_ca_file if self.class.ssl_ca_file.present?
+
+    # Fix for OpenSSL 3.x CRL checking issues (only when verification is enabled)
+    # See: https://github.com/ruby/openssl/issues/619
+    # Only bypass CRL-specific errors, not all certificate verification
+    if self.class.ssl_verify? && OpenSSL::OPENSSL_VERSION_NUMBER >= OPENSSL_3_VERSION
+      crl_error_codes = self.class.crl_errors
+      http.verify_callback = ->(preverify_ok, store_ctx) {
+        # Bypass only CRL-specific errors (these fail when the CRL is unavailable)
+        # For all other errors, preserve the original verification result
+        if crl_error_codes.include?(store_ctx.error)
+          true
+        else
+          preverify_ok
+        end
+      }
     end

     response = http.request(request)
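The new `verify_callback` only bypasses CRL-availability failures and leaves every other verification failure intact, unlike the old callback that returned `true` unconditionally. A standalone sketch of that decision, with a stubbed store context (just responding to `#error`) standing in for `OpenSSL::X509::StoreContext`:

```ruby
require "openssl"

CRL_ERRORS = [
  OpenSSL::X509::V_ERR_UNABLE_TO_GET_CRL,
  OpenSSL::X509::V_ERR_CRL_HAS_EXPIRED,
  OpenSSL::X509::V_ERR_CRL_NOT_YET_VALID
].freeze

FakeCtx = Struct.new(:error)

verify_callback = ->(preverify_ok, store_ctx) {
  # Bypass only CRL-specific errors; otherwise keep OpenSSL's verdict
  CRL_ERRORS.include?(store_ctx.error) ? true : preverify_ok
}

# CRL unavailable: bypassed
puts verify_callback.call(false, FakeCtx.new(OpenSSL::X509::V_ERR_UNABLE_TO_GET_CRL))
# Any other failure (e.g. self-signed certificate) still fails
puts verify_callback.call(false, FakeCtx.new(OpenSSL::X509::V_ERR_DEPTH_ZERO_SELF_SIGNED_CERT))
```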
@@ -1,6 +1,8 @@
 class Family < ApplicationRecord
-  include PlaidConnectable, SimplefinConnectable, LunchflowConnectable, EnableBankingConnectable, Syncable, AutoTransferMatchable, Subscribeable
+  include Syncable, AutoTransferMatchable, Subscribeable, VectorSearchable
+  include PlaidConnectable, SimplefinConnectable, LunchflowConnectable, EnableBankingConnectable
+  include CoinbaseConnectable, CoinstatsConnectable, SnaptradeConnectable, MercuryConnectable
+  include IndexaCapitalConnectable

   DATE_FORMATS = [
     [ "MM-DD-YYYY", "%m-%d-%Y" ],
@@ -15,6 +17,9 @@ class Family < ApplicationRecord
     [ "YYYYMMDD", "%Y%m%d" ]
   ].freeze

+  MONIKERS = [ "Family", "Group" ].freeze
+
   has_many :users, dependent: :destroy
   has_many :accounts, dependent: :destroy
   has_many :invitations, dependent: :destroy
@@ -40,6 +45,42 @@ class Family < ApplicationRecord

   validates :locale, inclusion: { in: I18n.available_locales.map(&:to_s) }
   validates :date_format, inclusion: { in: DATE_FORMATS.map(&:last) }
+  validates :month_start_day, inclusion: { in: 1..28 }
+  validates :moniker, inclusion: { in: MONIKERS }
+
+  def moniker_label
+    moniker.presence || "Family"
+  end
+
+  def moniker_label_plural
+    moniker_label == "Group" ? "Groups" : "Families"
+  end
+
+  def uses_custom_month_start?
+    month_start_day != 1
+  end
+
+  def custom_month_start_for(date)
+    if date.day >= month_start_day
+      Date.new(date.year, date.month, month_start_day)
+    else
+      previous_month = date - 1.month
+      Date.new(previous_month.year, previous_month.month, month_start_day)
+    end
+  end
+
+  def custom_month_end_for(date)
+    start_date = custom_month_start_for(date)
+    next_month_start = start_date + 1.month
+    next_month_start - 1.day
+  end
+
+  def current_custom_month_period
+    start_date = custom_month_start_for(Date.current)
+    end_date = custom_month_end_for(Date.current)
+    Period.custom(start_date: start_date, end_date: end_date)
+  end
+
   def assigned_merchants
     merchant_ids = transactions.where.not(merchant_id: nil).pluck(:merchant_id).uniq
@@ -80,10 +121,47 @@ class Family < ApplicationRecord
     @income_statement ||= IncomeStatement.new(self)
   end

-  # Returns the Investment Contributions category for this family, or nil if not found.
-  # This is a bootstrapped category used for auto-categorizing transfers to investment accounts.
+  # Returns the Investment Contributions category for this family, creating it if it doesn't exist.
+  # This is used for auto-categorizing transfers to investment accounts.
+  # Always uses the family's locale to ensure consistent category naming across all users.
   def investment_contributions_category
-    categories.find_by(name: Category.investment_contributions_name)
+    # Find ALL legacy categories (created under the old request-locale behavior)
+    legacy = categories.where(name: Category.all_investment_contributions_names).order(:created_at).to_a
+
+    if legacy.any?
+      keeper = legacy.first
+      duplicates = legacy[1..]
+
+      # Reassign transactions and subcategories from duplicates to the keeper
+      if duplicates.any?
+        duplicate_ids = duplicates.map(&:id)
+        categories.where(parent_id: duplicate_ids).update_all(parent_id: keeper.id)
+        Transaction.where(category_id: duplicate_ids).update_all(category_id: keeper.id)
+        BudgetCategory.where(category_id: duplicate_ids).update_all(category_id: keeper.id)
+        categories.where(id: duplicate_ids).delete_all
+      end
+
+      # Rename the keeper to the family's locale name if needed
+      I18n.with_locale(locale) do
+        correct_name = Category.investment_contributions_name
+        keeper.update!(name: correct_name) unless keeper.name == correct_name
+      end
+      return keeper
+    end
+
+    # Create a new category using the family's locale
+    I18n.with_locale(locale) do
+      categories.find_or_create_by!(name: Category.investment_contributions_name) do |cat|
+        cat.color = "#0d9488"
+        cat.classification = "expense"
+        cat.lucide_icon = "trending-up"
+      end
+    end
+  rescue ActiveRecord::RecordNotUnique, ActiveRecord::RecordInvalid
+    # Handle race condition: another process created the category
+    I18n.with_locale(locale) do
+      categories.find_by!(name: Category.investment_contributions_name)
+    end
   end

   # Returns account IDs for tax-advantaged accounts (401k, IRA, HSA, etc.)
@@ -141,6 +219,27 @@ class Family < ApplicationRecord
     (requires_exchange_rates_data_provider? && ExchangeRate.provider.nil?)
   end

+  # Returns securities with plan restrictions for a specific provider
+  # @param provider [String] The provider name (e.g., "TwelveData")
+  # @return [Array<Hash>] Array of hashes with ticker, name, required_plan, provider
+  def securities_with_plan_restrictions(provider:)
+    security_ids = trades.joins(:security).pluck("securities.id").uniq
+    return [] if security_ids.empty?
+
+    restrictions = Security.plan_restrictions_for(security_ids, provider: provider)
+    return [] if restrictions.empty?
+
+    Security.where(id: restrictions.keys).map do |security|
+      restriction = restrictions[security.id]
+      {
+        ticker: security.ticker,
+        name: security.name,
+        required_plan: restriction[:required_plan],
+        provider: restriction[:provider]
+      }
+    end
+  end
+
   def oldest_entry_date
     entries.order(:date).first&.date || Date.current
   end
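The custom-month arithmetic in `custom_month_start_for` / `custom_month_end_for` can be exercised with stdlib `Date` alone. A sketch under that assumption (`prev_month` / `next_month` stand in for ActiveSupport's `1.month` math):

```ruby
require "date"

def custom_month_start_for(date, month_start_day)
  if date.day >= month_start_day
    Date.new(date.year, date.month, month_start_day)
  else
    # Before the start day: the period began in the previous calendar month
    prev = date.prev_month
    Date.new(prev.year, prev.month, month_start_day)
  end
end

def custom_month_end_for(date, month_start_day)
  # The period ends the day before the next period starts
  custom_month_start_for(date, month_start_day).next_month - 1
end

# With a month starting on the 15th, Jan 10 falls in the Dec 15 - Jan 14 period
puts custom_month_start_for(Date.new(2024, 1, 10), 15)  # 2023-12-15
puts custom_month_end_for(Date.new(2024, 1, 10), 15)    # 2024-01-14
```

The `1..28` validation on `month_start_day` in the model above is what keeps `Date.new(year, month, month_start_day)` valid in every month, including February.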
@@ -1,5 +1,5 @@
 module Family::AutoTransferMatchable
-  def transfer_match_candidates(date_window: 4)
+  def transfer_match_candidates(date_window: 4, exchange_rate_tolerance: 0.1)
     Entry.select([
       "inflow_candidates.entryable_id as inflow_transaction_id",
       "outflow_candidates.entryable_id as outflow_transaction_id",
@@ -39,7 +39,7 @@ module Family::AutoTransferMatchable
           inflow_candidates.amount = -outflow_candidates.amount
         ) OR (
           inflow_candidates.currency <> outflow_candidates.currency AND
-          ABS(inflow_candidates.amount / NULLIF(outflow_candidates.amount * exchange_rates.rate, 0)) BETWEEN 0.95 AND 1.05
+          ABS(inflow_candidates.amount / NULLIF(outflow_candidates.amount * exchange_rates.rate, 0)) BETWEEN #{1 - exchange_rate_tolerance} AND #{1 + exchange_rate_tolerance}
         )
       ")
       .where(existing_transfers: { id: nil })
@@ -69,8 +69,22 @@ module Family::AutoTransferMatchable
       # Another concurrent job created the transfer; safe to ignore
     end

-    Transaction.find(match.inflow_transaction_id).update!(kind: "funds_movement")
-    Transaction.find(match.outflow_transaction_id).update!(kind: Transfer.kind_for_account(Transaction.find(match.outflow_transaction_id).entry.account))
+    inflow_transaction = Transaction.find(match.inflow_transaction_id)
+    outflow_transaction = Transaction.find(match.outflow_transaction_id)
+
+    # The kind is determined by the DESTINATION account (inflow), matching Transfer::Creator logic
+    inflow_transaction.update!(kind: "funds_movement")
+    outflow_transaction.update!(kind: Transfer.kind_for_account(inflow_transaction.entry.account))
+
+    # Assign the Investment Contributions category for transfers to investment accounts
+    destination_account = Transaction.find(match.inflow_transaction_id).entry.account
+    if Transfer.kind_for_account(destination_account) == "investment_contribution"
+      outflow_txn = Transaction.find(match.outflow_transaction_id)
+      if outflow_txn.category_id.blank?
+        category = destination_account.family.investment_contributions_category
+        outflow_txn.update!(category: category) if category.present?
+      end
+    end

     used_transaction_ids << match.inflow_transaction_id
     used_transaction_ids << match.outflow_transaction_id
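The `exchange_rate_tolerance` parameter widens the cross-currency match window from the old hard-coded `BETWEEN 0.95 AND 1.05` to `1 ± tolerance`. The same check evaluated in Ruby rather than SQL, as an illustration:

```ruby
def rates_match?(inflow_amount, outflow_amount, rate, tolerance: 0.1)
  converted = outflow_amount * rate
  return false if converted.zero?  # mirrors NULLIF(..., 0) guarding division by zero

  ratio = (inflow_amount / converted).abs
  ratio.between?(1 - tolerance, 1 + tolerance)
end

puts rates_match?(94.0, 100.0, 1.0)                  # ratio 0.94, within the default 10%
puts rates_match?(94.0, 100.0, 1.0, tolerance: 0.05) # outside the old 5% window
puts rates_match?(100.0, 100.0, 0.0)                 # zero rate -> no match
```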
app/models/family/indexa_capital_connectable.rb (new file, 33 lines)
@@ -0,0 +1,33 @@
module Family::IndexaCapitalConnectable
  extend ActiveSupport::Concern

  included do
    has_many :indexa_capital_items, dependent: :destroy
  end

  def can_connect_indexa_capital?
    # Families can configure their own IndexaCapital credentials
    true
  end

  def create_indexa_capital_item!(username:, document:, password:, item_name: nil)
    indexa_capital_item = indexa_capital_items.create!(
      name: item_name || "Indexa Capital Connection",
      username: username,
      document: document,
      password: password
    )

    indexa_capital_item.sync_later

    indexa_capital_item
  end

  def has_indexa_capital_credentials?
    indexa_capital_items.where.not(api_token: [ nil, "" ]).or(
      indexa_capital_items.where.not(username: [ nil, "" ])
        .where.not(document: [ nil, "" ])
        .where.not(password: [ nil, "" ])
    ).exists?
  end
end
@@ -1,6 +1,25 @@
 class Family::Syncer
   attr_reader :family

+  # Registry of item association names that participate in family sync.
+  # Each model must:
+  #   1. Include Syncable
+  #   2. Define a `syncable` scope (items ready for auto-sync)
+  #
+  # To add a new provider: add its association name here.
+  # The model handles its own "ready to sync" logic via the syncable scope.
+  SYNCABLE_ITEM_ASSOCIATIONS = %i[
+    plaid_items
+    simplefin_items
+    lunchflow_items
+    enable_banking_items
+    indexa_capital_items
+    coinbase_items
+    coinstats_items
+    mercury_items
+    snaptrade_items
+  ].freeze
+
   def initialize(family)
     @family = family
   end
@@ -25,7 +44,15 @@ class Family::Syncer
   end

   private

+  # Collects all syncable items from registered providers plus manual accounts.
+  # Each provider model defines its own `syncable` scope that encapsulates
+  # the "ready to sync" business logic (active, configured, etc.)
   def child_syncables
-    family.plaid_items + family.simplefin_items.active + family.lunchflow_items.active + family.enable_banking_items.active + family.accounts.manual
+    provider_items = SYNCABLE_ITEM_ASSOCIATIONS.flat_map do |association|
+      family.public_send(association).syncable
+    end
+
+    provider_items + family.accounts.manual
   end
 end
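The registry pattern in `child_syncables` (a frozen list of association names walked via `public_send`, each delegating its own "ready" logic to a `syncable` scope) can be sketched in isolation. `FakeFamily` and `FakeProvider` below are hypothetical stand-ins, not the real models:

```ruby
SYNCABLE_ITEM_ASSOCIATIONS = %i[plaid_items simplefin_items].freeze

# Each provider decides for itself what "syncable" means
FakeProvider = Struct.new(:items) do
  def syncable = items.select { |i| i[:active] }
end

class FakeFamily
  def plaid_items     = FakeProvider.new([ { name: "chase", active: true } ])
  def simplefin_items = FakeProvider.new([ { name: "old", active: false }, { name: "new", active: true } ])

  def child_syncables
    # Adding a provider means adding one symbol to the registry, no new branching here
    SYNCABLE_ITEM_ASSOCIATIONS.flat_map { |assoc| public_send(assoc).syncable }
  end
end

p FakeFamily.new.child_syncables.map { |i| i[:name] }  # => ["chase", "new"]
```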
app/models/family/vector_searchable.rb (new file, 85 lines)
@@ -0,0 +1,85 @@
module Family::VectorSearchable
  extend ActiveSupport::Concern

  included do
    has_many :family_documents, dependent: :destroy
  end

  def ensure_vector_store!
    return vector_store_id if vector_store_id.present?

    adapter = vector_store_adapter
    return nil unless adapter

    response = adapter.create_store(name: "Family #{id} Documents")
    return nil unless response.success?

    if update(vector_store_id: response.data[:id])
      vector_store_id
    else
      adapter.delete_store(store_id: response.data[:id]) rescue nil
      nil
    end
  end

  def search_documents(query, max_results: 10)
    return [] unless vector_store_id.present?

    adapter = vector_store_adapter
    return [] unless adapter

    response = adapter.search(
      store_id: vector_store_id,
      query: query,
      max_results: max_results
    )

    response.success? ? response.data : []
  end

  def upload_document(file_content:, filename:, metadata: {})
    adapter = vector_store_adapter
    return nil unless adapter

    store_id = ensure_vector_store!
    return nil unless store_id

    response = adapter.upload_file(
      store_id: store_id,
      file_content: file_content,
      filename: filename
    )

    return nil unless response.success?

    family_documents.create!(
      filename: filename,
      content_type: Marcel::MimeType.for(name: filename),
      file_size: file_content.bytesize,
      provider_file_id: response.data[:file_id],
      status: "ready",
      metadata: metadata || {}
    )
  end

  def remove_document(family_document)
    adapter = vector_store_adapter
    return false unless adapter && vector_store_id.present? && family_document.provider_file_id.present?

    response = adapter.remove_file(
      store_id: vector_store_id,
      file_id: family_document.provider_file_id
    )

    return false unless response.success?

    family_document.destroy
    true
  end

  private

  def vector_store_adapter
    VectorStore.adapter
  end
end
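`Family::VectorSearchable` talks to its adapter only through a small contract: `create_store` / `upload_file` / `search` returning an object with `success?` and `data`. A hypothetical in-memory adapter satisfying that contract (illustrative only; `Response` and `InMemoryAdapter` are not the real `VectorStore` API):

```ruby
Response = Struct.new(:ok, :data) do
  def success? = ok
end

class InMemoryAdapter
  def initialize
    @stores = {}
  end

  def create_store(name:)
    id = "store_#{@stores.size + 1}"
    @stores[id] = []
    Response.new(true, { id: id })
  end

  def upload_file(store_id:, file_content:, filename:)
    @stores.fetch(store_id) << filename
    Response.new(true, { file_id: "file_#{filename}" })
  end

  def search(store_id:, query:, max_results: 10)
    matches = @stores.fetch(store_id).grep(/#{Regexp.escape(query)}/i)
    Response.new(true, matches.first(max_results))
  end
end

adapter = InMemoryAdapter.new
store_id = adapter.create_store(name: "Family 1 Documents").data[:id]
adapter.upload_file(store_id: store_id, file_content: "...", filename: "taxes_2024.pdf")
p adapter.search(store_id: store_id, query: "taxes").data  # => ["taxes_2024.pdf"]
```

An adapter shaped like this is handy in tests, since the concern never inspects anything beyond `success?` and `data`.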
app/models/family_document.rb (new file, 25 lines)
@@ -0,0 +1,25 @@
class FamilyDocument < ApplicationRecord
  belongs_to :family

  has_one_attached :file

  SUPPORTED_EXTENSIONS = VectorStore::Base::SUPPORTED_EXTENSIONS

  validates :filename, presence: true
  validates :status, inclusion: { in: %w[pending processing ready error] }

  scope :ready, -> { where(status: "ready") }

  def mark_ready!
    update!(status: "ready")
  end

  def mark_error!(error_message = nil)
    update!(status: "error", metadata: (metadata || {}).merge("error" => error_message))
  end

  def supported_extension?
    ext = File.extname(filename).downcase
    SUPPORTED_EXTENSIONS.include?(ext)
  end
end
@@ -112,7 +112,9 @@ class Holding < ApplicationRecord
       return new_source == "calculated"
     end

-    new_priority > cost_basis_source_priority
+    # Allow refreshes from the same source (e.g., new trades change the calculated cost basis,
+    # or providers send an updated cost basis).
+    new_priority >= cost_basis_source_priority
   end

   # Set cost_basis from user input (locks the value)

@@ -40,6 +40,15 @@ class Holding::CostBasisReconciler
     # Check priority - can the incoming source replace the existing?
     if existing_holding.cost_basis_replaceable_by?(incoming_source)
       if incoming_cost_basis.present?
+        # Avoid writes when nothing would change (common when re-materializing)
+        if existing_holding.cost_basis_source == incoming_source && existing_holding.cost_basis == incoming_cost_basis
+          return {
+            cost_basis: existing_holding.cost_basis,
+            cost_basis_source: existing_holding.cost_basis_source,
+            should_update: false
+          }
+        end
+
         return {
           cost_basis: incoming_cost_basis,
           cost_basis_source: incoming_source,
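The change from `>` to `>=` means a source can now refresh its own previous value, while lower-priority sources still cannot overwrite higher-priority ones. A sketch with a hypothetical priority table (the real model's priorities may differ):

```ruby
# Illustrative only: higher numbers win; user input outranks provider data,
# which outranks locally calculated values.
PRIORITY = { "calculated" => 1, "provider" => 2, "user" => 3 }.freeze

def replaceable_by?(current_source, new_source)
  # >= (not >) lets the same source refresh its own cost basis
  PRIORITY.fetch(new_source) >= PRIORITY.fetch(current_source)
end

puts replaceable_by?("provider", "provider")    # same-source refresh allowed
puts replaceable_by?("provider", "calculated")  # lower priority still blocked
puts replaceable_by?("provider", "user")        # user input always wins
```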
@@ -3,9 +3,13 @@ class Import < ApplicationRecord
   MappingError = Class.new(StandardError)

   MAX_CSV_SIZE = 10.megabytes
-  ALLOWED_MIME_TYPES = %w[text/csv text/plain application/vnd.ms-excel application/csv].freeze
+  MAX_PDF_SIZE = 25.megabytes
+  ALLOWED_CSV_MIME_TYPES = %w[text/csv text/plain application/vnd.ms-excel application/csv].freeze
+  ALLOWED_PDF_MIME_TYPES = %w[application/pdf].freeze

-  TYPES = %w[TransactionImport TradeImport AccountImport MintImport CategoryImport RuleImport].freeze
+  DOCUMENT_TYPES = %w[bank_statement credit_card_statement investment_statement financial_document contract other].freeze
+
+  TYPES = %w[TransactionImport TradeImport AccountImport MintImport CategoryImport RuleImport PdfImport].freeze
   SIGNAGE_CONVENTIONS = %w[inflows_positive inflows_negative]
   SEPARATORS = [ [ "Comma (,)", "," ], [ "Semicolon (;)", ";" ] ].freeze
@@ -134,6 +138,14 @@ class Import < ApplicationRecord
     []
   end

+  # Returns false for import types that don't need CSV column mapping (e.g., PdfImport).
+  # Override in subclasses that handle data extraction differently.
+  def requires_csv_workflow?
+    true
+  end
+
+  # Subclasses that require the CSV workflow must override this.
+  # Non-CSV imports (e.g., PdfImport) can return [].
+  def column_keys
+    raise NotImplementedError, "Subclass must implement column_keys"
+  end
app/models/indexa_capital_account.rb (new file, 99 lines)
@@ -0,0 +1,99 @@
# frozen_string_literal: true

class IndexaCapitalAccount < ApplicationRecord
  include CurrencyNormalizable
  include IndexaCapitalAccount::DataHelpers

  belongs_to :indexa_capital_item

  # Association through account_providers
  has_one :account_provider, as: :provider, dependent: :destroy
  has_one :account, through: :account_provider, source: :account
  has_one :linked_account, through: :account_provider, source: :account

  validates :name, :currency, presence: true

  # Scopes
  scope :with_linked, -> { joins(:account_provider) }
  scope :without_linked, -> { left_joins(:account_provider).where(account_providers: { id: nil }) }
  scope :ordered, -> { order(created_at: :desc) }

  # Callbacks
  after_destroy :enqueue_connection_cleanup

  # Helper to get the account via the account_providers system
  def current_account
    account
  end

  # Idempotently create or update the AccountProvider link.
  # CRITICAL: After creation, reload the association to avoid a stale nil.
  def ensure_account_provider!(linked_account)
    return nil unless linked_account

    provider = account_provider || build_account_provider
    provider.account = linked_account
    provider.save!

    # Reload to clear the cached nil value
    reload_account_provider
    account_provider
  end

  def upsert_from_indexa_capital!(account_data)
    data = sdk_object_to_hash(account_data).with_indifferent_access

    # Indexa Capital API field mapping:
    #   account_number → unique account identifier
    #   name           → display name (constructed by provider)
    #   type           → mutual / pension / epsv
    #   status         → active / inactive
    #   currency       → always EUR for Indexa Capital
    attrs = {
      indexa_capital_account_id: data[:account_number]&.to_s,
      account_number: data[:account_number]&.to_s,
      name: data[:name] || "Indexa Capital Account",
      currency: data[:currency] || "EUR",
      account_status: data[:status],
      account_type: data[:type],
      provider: "Indexa Capital",
      raw_payload: account_data
    }
    attrs[:current_balance] = data[:current_balance].to_d unless data[:current_balance].nil?

    update!(attrs)
  end

  # Store holdings snapshot; return early if empty to avoid setting timestamps incorrectly
  def upsert_holdings_snapshot!(holdings_data)
    return if holdings_data.blank?

    update!(
      raw_holdings_payload: holdings_data,
      last_holdings_sync: Time.current
    )
  end

  # Store activities snapshot; return early if empty to avoid setting timestamps incorrectly
  def upsert_activities_snapshot!(activities_data)
    return if activities_data.blank?

    update!(
      raw_activities_payload: activities_data,
      last_activities_sync: Time.current
    )
  end

  private

  def enqueue_connection_cleanup
    return unless indexa_capital_item
    return unless indexa_capital_authorization_id.present?

    IndexaCapitalConnectionCleanupJob.perform_later(
      indexa_capital_item_id: indexa_capital_item.id,
      authorization_id: indexa_capital_authorization_id,
      account_id: id
    )
  end
end
229
app/models/indexa_capital_account/activities_processor.rb
Normal file
229
app/models/indexa_capital_account/activities_processor.rb
Normal file
@@ -0,0 +1,229 @@
# frozen_string_literal: true

class IndexaCapitalAccount::ActivitiesProcessor
  include IndexaCapitalAccount::DataHelpers

  # Map provider activity types to Sure activity labels
  # TODO: Customize for your provider's activity types
  ACTIVITY_TYPE_TO_LABEL = {
    "BUY" => "Buy",
    "SELL" => "Sell",
    "DIVIDEND" => "Dividend",
    "DIV" => "Dividend",
    "CONTRIBUTION" => "Contribution",
    "WITHDRAWAL" => "Withdrawal",
    "TRANSFER_IN" => "Transfer",
    "TRANSFER_OUT" => "Transfer",
    "TRANSFER" => "Transfer",
    "INTEREST" => "Interest",
    "FEE" => "Fee",
    "TAX" => "Fee",
    "REINVEST" => "Reinvestment",
    "SPLIT" => "Other",
    "MERGER" => "Other",
    "OTHER" => "Other"
  }.freeze

  # Activity types that result in Trade records (involve securities)
  TRADE_TYPES = %w[BUY SELL REINVEST].freeze

  # Sell-side activity types (quantity should be negative)
  SELL_SIDE_TYPES = %w[SELL].freeze

  # Activity types that result in Transaction records (cash movements)
  CASH_TYPES = %w[DIVIDEND DIV CONTRIBUTION WITHDRAWAL TRANSFER_IN TRANSFER_OUT TRANSFER INTEREST FEE TAX].freeze

  def initialize(indexa_capital_account)
    @indexa_capital_account = indexa_capital_account
  end

  def process
    activities_data = @indexa_capital_account.raw_activities_payload
    return { trades: 0, transactions: 0 } if activities_data.blank?

    Rails.logger.info "IndexaCapitalAccount::ActivitiesProcessor - Processing #{activities_data.size} activities"

    @trades_count = 0
    @transactions_count = 0

    activities_data.each do |activity_data|
      process_activity(activity_data.with_indifferent_access)
    rescue => e
      Rails.logger.error "IndexaCapitalAccount::ActivitiesProcessor - Failed to process activity: #{e.message}"
      Rails.logger.error e.backtrace.first(5).join("\n") if e.backtrace
    end

    { trades: @trades_count, transactions: @transactions_count }
  end

  private

  def account
    @indexa_capital_account.current_account
  end

  def import_adapter
    @import_adapter ||= Account::ProviderImportAdapter.new(account)
  end

  def process_activity(data)
    # TODO: Customize activity type field name
    activity_type = (data[:type] || data[:activity_type])&.upcase
    return if activity_type.blank?

    # Get external ID for deduplication
    external_id = (data[:id] || data[:transaction_id]).to_s
    return if external_id.blank?

    Rails.logger.info "IndexaCapitalAccount::ActivitiesProcessor - Processing activity: type=#{activity_type}, id=#{external_id}"

    # Determine if this is a trade or cash activity
    if trade_activity?(activity_type)
      process_trade(data, activity_type, external_id)
    else
      process_cash_activity(data, activity_type, external_id)
    end
  end

  def trade_activity?(activity_type)
    TRADE_TYPES.include?(activity_type)
  end

  def process_trade(data, activity_type, external_id)
    # TODO: Customize ticker extraction based on your provider's format
    ticker = data[:symbol] || data[:ticker]
    if ticker.blank?
      Rails.logger.warn "IndexaCapitalAccount::ActivitiesProcessor - Skipping trade without symbol: #{external_id}"
      return
    end

    # Resolve security
    security = resolve_security(ticker, data)
    return unless security

    # TODO: Customize field names based on your provider's format
    quantity = parse_decimal(data[:units]) || parse_decimal(data[:quantity])
    price = parse_decimal(data[:price])

    if quantity.nil?
      Rails.logger.warn "IndexaCapitalAccount::ActivitiesProcessor - Skipping trade without quantity: #{external_id}"
      return
    end

    # Determine sign based on activity type (sell-side should be negative)
    quantity = if SELL_SIDE_TYPES.include?(activity_type)
      -quantity.abs
    else
      quantity.abs
    end

    # Calculate amount
    amount = if price
      quantity * price
    else
      parse_decimal(data[:amount]) || parse_decimal(data[:trade_value])
    end

    if amount.nil?
      Rails.logger.warn "IndexaCapitalAccount::ActivitiesProcessor - Skipping trade without amount: #{external_id}"
      return
    end

    # Get the activity date
    # TODO: Customize date field names
    activity_date = parse_date(data[:settlement_date]) ||
                    parse_date(data[:trade_date]) ||
                    parse_date(data[:date]) ||
                    Date.current

    currency = extract_currency(data, fallback: account.currency)
    description = data[:description] || "#{activity_type} #{ticker}"

    Rails.logger.info "IndexaCapitalAccount::ActivitiesProcessor - Importing trade: #{ticker} qty=#{quantity} price=#{price} date=#{activity_date}"

    result = import_adapter.import_trade(
      external_id: external_id,
      security: security,
      quantity: quantity,
      price: price,
      amount: amount,
      currency: currency,
      date: activity_date,
      name: description,
      source: "indexa_capital",
      activity_label: label_from_type(activity_type)
    )
    @trades_count += 1 if result
  end

  def process_cash_activity(data, activity_type, external_id)
    # TODO: Customize amount field names
    amount = parse_decimal(data[:amount]) ||
             parse_decimal(data[:net_amount])
    return if amount.nil?
    # Note: Zero-amount transactions (splits, free shares) are allowed

    # Get the activity date
    # TODO: Customize date field names
    activity_date = parse_date(data[:settlement_date]) ||
                    parse_date(data[:trade_date]) ||
                    parse_date(data[:date]) ||
                    Date.current

    # Build description
    symbol = data[:symbol] || data[:ticker]
    description = data[:description] || build_description(activity_type, symbol)

    # Normalize amount sign for certain activity types
    amount = normalize_cash_amount(amount, activity_type)

    currency = extract_currency(data, fallback: account.currency)

    Rails.logger.info "IndexaCapitalAccount::ActivitiesProcessor - Importing cash activity: type=#{activity_type} amount=#{amount} date=#{activity_date}"

    result = import_adapter.import_transaction(
      external_id: external_id,
      amount: amount,
      currency: currency,
      date: activity_date,
      name: description,
      source: "indexa_capital",
      investment_activity_label: label_from_type(activity_type)
    )
    @transactions_count += 1 if result
  end

  def normalize_cash_amount(amount, activity_type)
    case activity_type
    when "WITHDRAWAL", "TRANSFER_OUT", "FEE", "TAX"
      -amount.abs # These should be negative (money out)
    when "CONTRIBUTION", "TRANSFER_IN", "DIVIDEND", "DIV", "INTEREST"
      amount.abs # These should be positive (money in)
    else
      amount
    end
  end

  def build_description(activity_type, symbol)
    type_label = label_from_type(activity_type)
    if symbol.present?
      "#{type_label} - #{symbol}"
    else
      type_label
    end
  end

  def label_from_type(activity_type)
    normalized_type = activity_type&.upcase
    label = ACTIVITY_TYPE_TO_LABEL[normalized_type]

    if label.nil? && normalized_type.present?
      Rails.logger.warn(
        "IndexaCapitalAccount::ActivitiesProcessor - Unmapped activity type '#{normalized_type}' " \
        "for account #{@indexa_capital_account.id}. Consider adding to ACTIVITY_TYPE_TO_LABEL mapping."
      )
    end

    label || "Other"
  end
end
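The sign convention enforced by `normalize_cash_amount` above can be checked in isolation. This is a minimal standalone sketch of that rule (no Rails dependencies, plain method rather than the class), not the class itself:

```ruby
require "bigdecimal"

# Mirrors IndexaCapitalAccount::ActivitiesProcessor#normalize_cash_amount:
# money-out types are forced negative, money-in types are forced positive,
# and any other type keeps whatever sign the provider reported.
def normalize_cash_amount(amount, activity_type)
  case activity_type
  when "WITHDRAWAL", "TRANSFER_OUT", "FEE", "TAX"
    -amount.abs
  when "CONTRIBUTION", "TRANSFER_IN", "DIVIDEND", "DIV", "INTEREST"
    amount.abs
  else
    amount
  end
end

normalize_cash_amount(BigDecimal("50"), "FEE")        # => -50
normalize_cash_amount(BigDecimal("-120"), "DIVIDEND") # => 120
normalize_cash_amount(BigDecimal("-3"), "SPLIT")      # => -3 (sign preserved)
```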
156
app/models/indexa_capital_account/data_helpers.rb
Normal file
@@ -0,0 +1,156 @@
# frozen_string_literal: true

module IndexaCapitalAccount::DataHelpers
  extend ActiveSupport::Concern

  private

  # Convert SDK objects to hashes via JSON round-trip
  # Many SDKs return objects that don't have proper #to_h methods
  def sdk_object_to_hash(obj)
    return obj if obj.is_a?(Hash)

    if obj.respond_to?(:to_json)
      JSON.parse(obj.to_json)
    elsif obj.respond_to?(:to_h)
      obj.to_h
    else
      obj
    end
  rescue JSON::ParserError, TypeError
    obj.respond_to?(:to_h) ? obj.to_h : {}
  end

  def parse_decimal(value)
    return nil if value.nil?

    case value
    when BigDecimal
      value
    when String
      BigDecimal(value)
    when Numeric
      BigDecimal(value.to_s)
    else
      nil
    end
  rescue ArgumentError => e
    Rails.logger.error("IndexaCapitalAccount::DataHelpers - Failed to parse decimal value: #{value.inspect} - #{e.message}")
    nil
  end

  def parse_date(date_value)
    return nil if date_value.nil?

    case date_value
    when Date
      date_value
    when String
      # Use Time.zone.parse for external timestamps (Rails timezone guidelines)
      Time.zone.parse(date_value)&.to_date
    when Time, DateTime, ActiveSupport::TimeWithZone
      date_value.to_date
    else
      nil
    end
  rescue ArgumentError, TypeError => e
    Rails.logger.error("IndexaCapitalAccount::DataHelpers - Failed to parse date: #{date_value.inspect} - #{e.message}")
    nil
  end

  # Find or create security with race condition handling
  def resolve_security(symbol, symbol_data = {})
    ticker = symbol.to_s.upcase.strip
    return nil if ticker.blank?

    security = Security.find_by(ticker: ticker)

    # If security exists but has a bad name (looks like a hash), update it
    if security && security.name&.start_with?("{")
      new_name = extract_security_name(symbol_data, ticker)
      Rails.logger.info "IndexaCapitalAccount::DataHelpers - Fixing security name: #{security.name.first(50)}... -> #{new_name}"
      security.update!(name: new_name)
    end

    return security if security

    # Create new security
    security_name = extract_security_name(symbol_data, ticker)

    Rails.logger.info "IndexaCapitalAccount::DataHelpers - Creating security: ticker=#{ticker}, name=#{security_name}"

    Security.create!(
      ticker: ticker,
      name: security_name,
      exchange_mic: extract_exchange(symbol_data),
      country_code: extract_country_code(symbol_data)
    )
  rescue ActiveRecord::RecordInvalid, ActiveRecord::RecordNotUnique => e
    # Handle race condition - another process may have created it
    Rails.logger.error "IndexaCapitalAccount::DataHelpers - Failed to create security #{ticker}: #{e.message}"
    Security.find_by(ticker: ticker)
  end

  def extract_security_name(symbol_data, fallback_ticker)
    symbol_data = symbol_data.with_indifferent_access if symbol_data.respond_to?(:with_indifferent_access)

    # Try various paths where the name might be
    name = symbol_data[:name] || symbol_data[:description]

    # If description is missing or looks like a type description, use ticker
    if name.blank? || name.is_a?(Hash) || name =~ /^(COMMON STOCK|CRYPTOCURRENCY|ETF|MUTUAL FUND)$/i
      name = fallback_ticker
    end

    # Titleize for readability if it's all caps
    name = name.titleize if name == name.upcase && name.length > 4

    name
  end

  def extract_exchange(symbol_data)
    symbol_data = symbol_data.with_indifferent_access if symbol_data.respond_to?(:with_indifferent_access)

    exchange = symbol_data[:exchange]
    return nil unless exchange.is_a?(Hash)

    exchange.with_indifferent_access[:mic_code] || exchange.with_indifferent_access[:id]
  end

  def extract_country_code(symbol_data)
    symbol_data = symbol_data.with_indifferent_access if symbol_data.respond_to?(:with_indifferent_access)

    # Try to extract country from currency or exchange
    currency = symbol_data[:currency]
    currency = currency.dig(:code) if currency.is_a?(Hash)

    case currency
    when "USD"
      "US"
    when "CAD"
      "CA"
    when "GBP", "GBX"
      "GB"
    when "EUR"
      nil # Could be many countries
    else
      nil
    end
  end

  # Handle currency as string or object (API inconsistency)
  def extract_currency(data, fallback: nil)
    data = data.with_indifferent_access if data.respond_to?(:with_indifferent_access)

    currency_data = data[:currency]
    return fallback if currency_data.blank?

    if currency_data.is_a?(Hash)
      currency_data.with_indifferent_access[:code] || fallback
    elsif currency_data.is_a?(String)
      currency_data.upcase
    else
      fallback
    end
  end
end
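The coercion behavior of `parse_decimal` above is easy to exercise outside Rails. This is a standalone sketch of the same case logic (the Rails.logger call is dropped), not the concern itself:

```ruby
require "bigdecimal"

# Standalone version of DataHelpers#parse_decimal: coerce provider values
# (BigDecimal, String, Numeric) to BigDecimal; anything else, nil, or an
# unparseable string yields nil instead of raising.
def parse_decimal(value)
  case value
  when nil then nil
  when BigDecimal then value
  when String then BigDecimal(value)
  when Numeric then BigDecimal(value.to_s)
  end
rescue ArgumentError
  nil
end

parse_decimal("12.5")  # => BigDecimal 12.5
parse_decimal(3)       # => BigDecimal 3
parse_decimal("n/a")   # => nil (BigDecimal raises ArgumentError)
parse_decimal([])      # => nil (unhandled type)
```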
130
app/models/indexa_capital_account/holdings_processor.rb
Normal file
@@ -0,0 +1,130 @@
# frozen_string_literal: true

class IndexaCapitalAccount::HoldingsProcessor
  include IndexaCapitalAccount::DataHelpers

  def initialize(indexa_capital_account)
    @indexa_capital_account = indexa_capital_account
  end

  def process
    return unless account.present?

    holdings_data = @indexa_capital_account.raw_holdings_payload
    return if holdings_data.blank?

    Rails.logger.info "IndexaCapitalAccount::HoldingsProcessor - Processing #{holdings_data.size} holdings"

    holdings_data.each_with_index do |holding_data, idx|
      Rails.logger.info "IndexaCapitalAccount::HoldingsProcessor - Processing holding #{idx + 1}/#{holdings_data.size}"
      process_holding(holding_data.with_indifferent_access)
    rescue => e
      Rails.logger.error "IndexaCapitalAccount::HoldingsProcessor - Failed to process holding #{idx + 1}: #{e.class} - #{e.message}"
      Rails.logger.error e.backtrace.first(5).join("\n") if e.backtrace
    end
  end

  private

  def account
    @indexa_capital_account.current_account
  end

  def import_adapter
    @import_adapter ||= Account::ProviderImportAdapter.new(account)
  end

  # Indexa Capital fiscal-results field mapping:
  #   instrument.identifier (ISIN) -> ticker
  #   instrument.name              -> security name
  #   titles                       -> quantity (number of shares/units)
  #   price                        -> current price per unit
  #   amount                       -> total market value
  #   cost_price                   -> average purchase price (cost basis per unit)
  #   cost_amount                  -> total cost basis
  #   profit_loss                  -> unrealized P&L
  #   subscription_date            -> purchase date
  def process_holding(data)
    ticker = extract_ticker(data)
    return if ticker.blank?

    Rails.logger.info "IndexaCapitalAccount::HoldingsProcessor - Processing holding for ticker: #{ticker}"

    security = resolve_security(ticker, data)
    return unless security

    quantity = parse_decimal(data[:titles]) || parse_decimal(data[:quantity]) || parse_decimal(data[:units])
    price = parse_decimal(data[:price])
    return if quantity.nil? || price.nil?

    amount = parse_decimal(data[:amount]) || (quantity * price)
    currency = "EUR" # Indexa Capital is EUR-only
    holding_date = Date.current

    Rails.logger.info "IndexaCapitalAccount::HoldingsProcessor - Importing holding: #{ticker} qty=#{quantity} price=#{price} currency=#{currency}"

    import_adapter.import_holding(
      security: security,
      quantity: quantity,
      amount: amount,
      currency: currency,
      date: holding_date,
      price: price,
      account_provider_id: @indexa_capital_account.account_provider&.id,
      source: "indexa_capital",
      delete_future_holdings: false
    )

    # Store cost basis from cost_price (average purchase price per unit)
    cost_price = parse_decimal(data[:cost_price])
    update_holding_cost_basis(security, cost_price) if cost_price.present?
  end

  # Extract ISIN from instrument data as ticker
  def extract_ticker(data)
    # Indexa Capital uses ISIN codes nested under instrument
    instrument = data[:instrument]
    if instrument.is_a?(Hash)
      instrument = instrument.with_indifferent_access
      return instrument[:identifier] || instrument[:isin]
    end

    # Fallback to flat fields
    data[:isin] || data[:identifier] || data[:symbol] || data[:ticker]
  end

  # Override security name extraction for Indexa Capital
  def extract_security_name(symbol_data, fallback_ticker)
    symbol_data = symbol_data.with_indifferent_access if symbol_data.respond_to?(:with_indifferent_access)

    instrument = symbol_data[:instrument]
    if instrument.is_a?(Hash)
      instrument = instrument.with_indifferent_access
      name = instrument[:name] || instrument[:description]
      return name if name.present?
    end

    name = symbol_data[:name] || symbol_data[:description]
    return fallback_ticker if name.blank? || name.is_a?(Hash)

    name
  end

  def update_holding_cost_basis(security, cost_price)
    holding = account.holdings
                     .where(security: security)
                     .where("cost_basis_source != 'manual' OR cost_basis_source IS NULL")
                     .order(date: :desc)
                     .first

    return unless holding

    cost_basis = parse_decimal(cost_price)
    return if cost_basis.nil?

    holding.update!(
      cost_basis: cost_basis,
      cost_basis_source: "provider"
    )
  end
end
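The nested-ISIN lookup in `extract_ticker` above can be sketched standalone. This version uses plain string keys instead of `with_indifferent_access` (an ActiveSupport dependency), so the key style is an assumption of the sketch, not of the class:

```ruby
# Standalone sketch of HoldingsProcessor#extract_ticker: Indexa Capital nests
# the ISIN under instrument.identifier; flatter payload shapes may carry it at
# the top level. Note that when an instrument hash is present, the method
# returns from inside the branch and does NOT fall through to the flat fields.
def extract_ticker(data)
  instrument = data["instrument"]
  if instrument.is_a?(Hash)
    return instrument["identifier"] || instrument["isin"]
  end

  data["isin"] || data["identifier"] || data["symbol"] || data["ticker"]
end

extract_ticker("instrument" => { "identifier" => "IE00B03HCZ61" }) # => "IE00B03HCZ61"
extract_ticker("symbol" => "VWRL")                                 # => "VWRL"
extract_ticker("instrument" => {}, "symbol" => "VWRL")             # => nil (no fall-through)
```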
116
app/models/indexa_capital_account/processor.rb
Normal file
@@ -0,0 +1,116 @@
# frozen_string_literal: true

class IndexaCapitalAccount::Processor
  include IndexaCapitalAccount::DataHelpers

  attr_reader :indexa_capital_account

  def initialize(indexa_capital_account)
    @indexa_capital_account = indexa_capital_account
  end

  def process
    account = indexa_capital_account.current_account
    return unless account

    Rails.logger.info "IndexaCapitalAccount::Processor - Processing account #{indexa_capital_account.id} -> Sure account #{account.id}"

    # Update account balance FIRST (before processing transactions/holdings/activities)
    update_account_balance(account)

    # Process holdings
    holdings_count = indexa_capital_account.raw_holdings_payload&.size || 0
    Rails.logger.info "IndexaCapitalAccount::Processor - Holdings payload has #{holdings_count} items"

    if indexa_capital_account.raw_holdings_payload.present?
      Rails.logger.info "IndexaCapitalAccount::Processor - Processing holdings..."
      IndexaCapitalAccount::HoldingsProcessor.new(indexa_capital_account).process
    else
      Rails.logger.warn "IndexaCapitalAccount::Processor - No holdings payload to process"
    end

    # Process activities (trades, dividends, etc.)
    activities_count = indexa_capital_account.raw_activities_payload&.size || 0
    Rails.logger.info "IndexaCapitalAccount::Processor - Activities payload has #{activities_count} items"

    if indexa_capital_account.raw_activities_payload.present?
      Rails.logger.info "IndexaCapitalAccount::Processor - Processing activities..."
      IndexaCapitalAccount::ActivitiesProcessor.new(indexa_capital_account).process
    else
      Rails.logger.warn "IndexaCapitalAccount::Processor - No activities payload to process"
    end

    # Trigger immediate UI refresh so entries appear in the activity feed
    account.broadcast_sync_complete
    Rails.logger.info "IndexaCapitalAccount::Processor - Broadcast sync complete for account #{account.id}"

    { holdings_processed: holdings_count > 0, activities_processed: activities_count > 0 }
  end

  private

  def update_account_balance(account)
    # Calculate total balance and cash balance from provider data
    total_balance = calculate_total_balance
    cash_balance = calculate_cash_balance

    Rails.logger.info "IndexaCapitalAccount::Processor - Balance update: total=#{total_balance}, cash=#{cash_balance}"

    # Update the cached fields on the account
    account.assign_attributes(
      balance: total_balance,
      cash_balance: cash_balance,
      currency: indexa_capital_account.currency || account.currency
    )
    account.save!

    # Create or update the current balance anchor valuation for linked accounts
    # This is critical for reverse sync to work correctly
    account.set_current_balance(total_balance)
  end

  def calculate_total_balance
    # Calculate total from holdings + cash for accuracy
    holdings_value = calculate_holdings_value
    cash_value = indexa_capital_account.cash_balance || 0

    calculated_total = holdings_value + cash_value

    # Use calculated total if we have holdings, otherwise trust API value
    if holdings_value > 0
      Rails.logger.info "IndexaCapitalAccount::Processor - Using calculated total: holdings=#{holdings_value} + cash=#{cash_value} = #{calculated_total}"
      calculated_total
    elsif indexa_capital_account.current_balance.present?
      Rails.logger.info "IndexaCapitalAccount::Processor - Using API total: #{indexa_capital_account.current_balance}"
      indexa_capital_account.current_balance
    else
      calculated_total
    end
  end

  def calculate_cash_balance
    # Use provider's cash_balance directly
    # Note: Can be negative for margin accounts
    cash = indexa_capital_account.cash_balance
    Rails.logger.info "IndexaCapitalAccount::Processor - Cash balance from API: #{cash.inspect}"
    cash || BigDecimal("0")
  end

  def calculate_holdings_value
    holdings_data = indexa_capital_account.raw_holdings_payload || []
    return 0 if holdings_data.empty?

    holdings_data.sum do |holding|
      data = holding.is_a?(Hash) ? holding.with_indifferent_access : {}
      # Indexa Capital: amount = total market value, or titles * price
      amount = parse_decimal(data[:amount])
      if amount
        amount
      else
        titles = parse_decimal(data[:titles] || data[:quantity] || data[:units]) || 0
        price = parse_decimal(data[:price]) || 0
        titles * price
      end
    end
  end
end
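The fallback order in `calculate_total_balance` above (calculated holdings+cash when holdings exist, else the API-reported balance, else the cash-only calculated total) can be sketched with plain values. The method name and keyword arguments here are illustrative, not part of the class:

```ruby
require "bigdecimal"

# Standalone sketch of Processor#calculate_total_balance's decision rule.
def total_balance(holdings_value:, cash_value:, api_balance:)
  calculated = holdings_value + cash_value
  if holdings_value > 0
    calculated      # holdings present: trust our own sum
  elsif api_balance
    api_balance     # no holdings: trust the API-reported balance
  else
    calculated      # nothing else available: cash-only total
  end
end

total_balance(holdings_value: BigDecimal("900"), cash_value: BigDecimal("100"),
              api_balance: BigDecimal("950"))                                   # => 1000
total_balance(holdings_value: BigDecimal("0"), cash_value: BigDecimal("100"),
              api_balance: BigDecimal("950"))                                   # => 950
total_balance(holdings_value: BigDecimal("0"), cash_value: BigDecimal("100"),
              api_balance: nil)                                                 # => 100
```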
181
app/models/indexa_capital_item.rb
Normal file
@@ -0,0 +1,181 @@
# frozen_string_literal: true

class IndexaCapitalItem < ApplicationRecord
  include Syncable, Provided, Unlinking

  enum :status, { good: "good", requires_update: "requires_update" }, default: :good

  # Helper to detect if ActiveRecord Encryption is configured for this app
  def self.encryption_ready?
    creds_ready = Rails.application.credentials.active_record_encryption.present?
    env_ready = ENV["ACTIVE_RECORD_ENCRYPTION_PRIMARY_KEY"].present? &&
                ENV["ACTIVE_RECORD_ENCRYPTION_DETERMINISTIC_KEY"].present? &&
                ENV["ACTIVE_RECORD_ENCRYPTION_KEY_DERIVATION_SALT"].present?
    creds_ready || env_ready
  end

  # Encrypt sensitive credentials if ActiveRecord encryption is configured
  if encryption_ready?
    encrypts :password, deterministic: true
    encrypts :api_token, deterministic: true
  end

  validates :name, presence: true
  validate :credentials_present_on_create, on: :create

  belongs_to :family
  has_one_attached :logo, dependent: :purge_later

  has_many :indexa_capital_accounts, dependent: :destroy
  has_many :accounts, through: :indexa_capital_accounts

  scope :active, -> { where(scheduled_for_deletion: false) }
  scope :syncable, -> { active }
  scope :ordered, -> { order(created_at: :desc) }
  scope :needs_update, -> { where(status: :requires_update) }

  def syncer
    IndexaCapitalItem::Syncer.new(self)
  end

  def destroy_later
    update!(scheduled_for_deletion: true)
    DestroyJob.perform_later(self)
  end

  # Override syncing? to include background activities fetch
  def syncing?
    super || indexa_capital_accounts.where(activities_fetch_pending: true).exists?
  end

  # Import data from provider API
  def import_latest_indexa_capital_data(sync: nil)
    provider = indexa_capital_provider
    unless provider
      Rails.logger.error "IndexaCapitalItem #{id} - Cannot import: provider is not configured"
      raise StandardError, I18n.t("indexa_capital_items.errors.provider_not_configured")
    end

    IndexaCapitalItem::Importer.new(self, indexa_capital_provider: provider, sync: sync).import
  rescue => e
    Rails.logger.error "IndexaCapitalItem #{id} - Failed to import data: #{e.message}"
    raise
  end

  # Process linked accounts after data import
  def process_accounts
    return [] if indexa_capital_accounts.empty?

    results = []
    linked_indexa_capital_accounts.includes(account_provider: :account).each do |indexa_capital_account|
      begin
        result = IndexaCapitalAccount::Processor.new(indexa_capital_account).process
        results << { indexa_capital_account_id: indexa_capital_account.id, success: true, result: result }
      rescue => e
        Rails.logger.error "IndexaCapitalItem #{id} - Failed to process account #{indexa_capital_account.id}: #{e.message}"
        results << { indexa_capital_account_id: indexa_capital_account.id, success: false, error: e.message }
      end
    end

    results
  end

  # Schedule sync jobs for all linked accounts
  def schedule_account_syncs(parent_sync: nil, window_start_date: nil, window_end_date: nil)
    return [] if accounts.empty?

    results = []
    accounts.visible.each do |account|
      begin
        account.sync_later(
          parent_sync: parent_sync,
          window_start_date: window_start_date,
          window_end_date: window_end_date
        )
        results << { account_id: account.id, success: true }
      rescue => e
        Rails.logger.error "IndexaCapitalItem #{id} - Failed to schedule sync for account #{account.id}: #{e.message}"
        results << { account_id: account.id, success: false, error: e.message }
      end
    end

    results
  end

  def upsert_indexa_capital_snapshot!(accounts_snapshot)
    assign_attributes(
      raw_payload: accounts_snapshot
    )

    save!
  end

  def has_completed_initial_setup?
    accounts.any?
  end

  # Linked accounts (have AccountProvider association)
  def linked_indexa_capital_accounts
    indexa_capital_accounts.joins(:account_provider)
  end

  # Unlinked accounts (no AccountProvider association)
  def unlinked_indexa_capital_accounts
    indexa_capital_accounts.left_joins(:account_provider).where(account_providers: { id: nil })
  end

  def sync_status_summary
    total_accounts = total_accounts_count
    linked_count = linked_accounts_count
    unlinked_count = unlinked_accounts_count

    if total_accounts == 0
      I18n.t("indexa_capital_items.sync_status.no_accounts")
    elsif unlinked_count == 0
      I18n.t("indexa_capital_items.sync_status.synced", count: linked_count)
    else
      I18n.t("indexa_capital_items.sync_status.synced_with_setup", linked: linked_count, unlinked: unlinked_count)
    end
  end

  def linked_accounts_count
    indexa_capital_accounts.joins(:account_provider).count
  end

  def unlinked_accounts_count
    indexa_capital_accounts.left_joins(:account_provider).where(account_providers: { id: nil }).count
  end

  def total_accounts_count
    indexa_capital_accounts.count
  end

  def institution_display_name
    institution_name.presence || institution_domain.presence || name
  end

  def connected_institutions
    indexa_capital_accounts.includes(:account)
                           .where.not(institution_metadata: nil)
                           .map { |acc| acc.institution_metadata }
                           .uniq { |inst| inst["name"] || inst["institution_name"] }
  end

  def institution_summary
    institutions = connected_institutions
    case institutions.count
    when 0
      I18n.t("indexa_capital_items.institution_summary.none")
    else
      I18n.t("indexa_capital_items.institution_summary.count", count: institutions.count)
    end
  end

  private

  def credentials_present_on_create
    return if credentials_configured?

    errors.add(:base, "Either INDEXA_API_TOKEN env var or username/document/password credentials are required")
  end
end
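The env-var half of `encryption_ready?` above checks three specific ActiveRecord Encryption variables. A standalone sketch of just that check, with an injectable env hash for testing (the method name and injection parameter are inventions of this sketch, not the model's API):

```ruby
# Standalone sketch of the env-var branch of IndexaCapitalItem.encryption_ready?.
# All three keys must be present and non-empty for encryption to be usable.
REQUIRED_ENCRYPTION_VARS = %w[
  ACTIVE_RECORD_ENCRYPTION_PRIMARY_KEY
  ACTIVE_RECORD_ENCRYPTION_DETERMINISTIC_KEY
  ACTIVE_RECORD_ENCRYPTION_KEY_DERIVATION_SALT
].freeze

def encryption_env_ready?(env = ENV)
  REQUIRED_ENCRYPTION_VARS.all? { |key| !env[key].to_s.empty? }
end

encryption_env_ready?({})                                                       # => false
encryption_env_ready?("ACTIVE_RECORD_ENCRYPTION_PRIMARY_KEY" => "k")            # => false
encryption_env_ready?(REQUIRED_ENCRYPTION_VARS.to_h { |k| [k, "secret"] })      # => true
```

Guarding `encrypts` behind this check lets the same codebase run with or without encryption keys configured, at the cost of storing credentials in plaintext when they are absent.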
157
app/models/indexa_capital_item/importer.rb
Normal file
@@ -0,0 +1,157 @@
|
||||
# frozen_string_literal: true
|
||||
|
||||
class IndexaCapitalItem::Importer
|
||||
include SyncStats::Collector
|
||||
include IndexaCapitalAccount::DataHelpers
|
||||
|
||||
attr_reader :indexa_capital_item, :indexa_capital_provider, :sync
|
||||
|
||||
def initialize(indexa_capital_item, indexa_capital_provider:, sync: nil)
|
||||
@indexa_capital_item = indexa_capital_item
|
||||
@indexa_capital_provider = indexa_capital_provider
|
||||
@sync = sync
|
||||
end
|
||||
|
||||
class CredentialsError < StandardError; end
|
||||
|
||||
def import
|
||||
Rails.logger.info "IndexaCapitalItem::Importer - Starting import for item #{indexa_capital_item.id}"
|
||||
|
||||
unless indexa_capital_provider
|
||||
raise CredentialsError, "No IndexaCapital provider configured for item #{indexa_capital_item.id}"
|
||||
end
|
||||
|
||||
# Step 1: Fetch and store all accounts
|
||||
import_accounts
|
||||
|
||||
# Step 2: For LINKED accounts only, fetch holdings data
|
||||
linked_accounts = IndexaCapitalAccount
|
||||
.where(indexa_capital_item_id: indexa_capital_item.id)
|
||||
.joins(:account_provider)
|
||||
|
||||
Rails.logger.info "IndexaCapitalItem::Importer - Found #{linked_accounts.count} linked accounts to process"
|
||||
|
||||
linked_accounts.each do |indexa_capital_account|
|
||||
Rails.logger.info "IndexaCapitalItem::Importer - Processing linked account #{indexa_capital_account.id}"
|
||||
import_holdings(indexa_capital_account)
|
||||
end
|
||||
|
||||
# Update raw payload on the item
|
||||
indexa_capital_item.upsert_indexa_capital_snapshot!(stats)
|
||||
rescue Provider::IndexaCapital::AuthenticationError
|
||||
indexa_capital_item.update!(status: :requires_update)
|
||||
raise
|
||||
end
|
||||
|
||||
private
|
||||
|
||||
def stats
|
||||
@stats ||= {}
|
||||
end
|
||||
|
||||
def persist_stats!
|
||||
return unless sync&.respond_to?(:sync_stats)
|
||||
merged = (sync.sync_stats || {}).merge(stats)
|
||||
sync.update_columns(sync_stats: merged)
|
||||
end
|
||||
    def import_accounts
      Rails.logger.info "IndexaCapitalItem::Importer - Fetching accounts from Indexa Capital API"

      accounts_data = indexa_capital_provider.list_accounts

      stats["api_requests"] = stats.fetch("api_requests", 0) + 1
      stats["total_accounts"] = accounts_data.size

      upstream_account_ids = []

      accounts_data.each do |account_data|
        import_account(account_data)
        upstream_account_ids << account_data[:account_number].to_s if account_data[:account_number]
      rescue => e
        Rails.logger.error "IndexaCapitalItem::Importer - Failed to import account: #{e.message}"
        stats["accounts_skipped"] = stats.fetch("accounts_skipped", 0) + 1
        register_error(e, account_data: account_data)
      end

      persist_stats!

      # Clean up accounts that no longer exist upstream
      prune_removed_accounts(upstream_account_ids)
    end

    def import_account(account_data)
      account_number = account_data[:account_number].to_s
      return if account_number.blank?

      # Fetch current balance from performance endpoint
      begin
        balance = indexa_capital_provider.get_account_balance(account_number: account_number)
        account_data[:current_balance] = balance
        stats["api_requests"] = stats.fetch("api_requests", 0) + 1
      rescue => e
        Rails.logger.warn "IndexaCapitalItem::Importer - Failed to fetch balance for #{account_number}: #{e.message}"
      end

      indexa_capital_account = indexa_capital_item.indexa_capital_accounts.find_or_initialize_by(
        indexa_capital_account_id: account_number
      )

      indexa_capital_account.upsert_from_indexa_capital!(account_data)

      stats["accounts_imported"] = stats.fetch("accounts_imported", 0) + 1
    end

    def import_holdings(indexa_capital_account)
      account_number = indexa_capital_account.indexa_capital_account_id
      Rails.logger.info "IndexaCapitalItem::Importer - Fetching holdings for account #{account_number}"

      begin
        holdings_data = indexa_capital_provider.get_holdings(account_number: account_number)

        stats["api_requests"] = stats.fetch("api_requests", 0) + 1

        # The API returns fiscal-results, which may be a hash with an array inside
        holdings_array = normalize_holdings_response(holdings_data)

        if holdings_array.any?
          holdings_hashes = holdings_array.map { |h| sdk_object_to_hash(h) }
          indexa_capital_account.upsert_holdings_snapshot!(holdings_hashes)
          stats["holdings_found"] = stats.fetch("holdings_found", 0) + holdings_array.size
        end
      rescue => e
        Rails.logger.warn "IndexaCapitalItem::Importer - Failed to fetch holdings: #{e.message}"
        register_error(e, context: "holdings", account_id: indexa_capital_account.id)
      end
    end

    # fiscal-results response may be an array or a hash containing an array
    def normalize_holdings_response(data)
      return data if data.is_a?(Array)
      return [] if data.nil?

      # Try common response shapes
      data[:fiscal_results] || data[:results] || data[:positions] || data[:data] || []
    end
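The shape normalization above is plain Ruby and can be exercised standalone: arrays pass through, `nil` becomes `[]`, and common hash wrappers are unwrapped in priority order (assuming symbol-keyed response hashes, as in the source):

```ruby
# Arrays pass through unchanged; nil is treated as "no holdings";
# hashes are unwrapped by trying common wrapper keys in order.
def normalize_holdings_response(data)
  return data if data.is_a?(Array)
  return [] if data.nil?

  data[:fiscal_results] || data[:results] || data[:positions] || data[:data] || []
end
```

Note that a hash with none of the expected keys falls through to `[]`, so an unrecognized shape is treated the same as an empty response rather than raising.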
    def prune_removed_accounts(upstream_account_ids)
      return if upstream_account_ids.empty?

      removed = indexa_capital_item.indexa_capital_accounts
        .where.not(indexa_capital_account_id: upstream_account_ids)

      if removed.any?
        Rails.logger.info "IndexaCapitalItem::Importer - Pruning #{removed.count} removed accounts"
        removed.destroy_all
      end
    end
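The early return on an empty upstream list is what keeps a failed or empty fetch from wiping every local account. A pure-Ruby sketch of that pruning decision (the `ids_to_prune` helper name is hypothetical):

```ruby
# An empty upstream list is treated as "no data", so nothing is pruned;
# otherwise, prune whatever exists locally but not upstream.
def ids_to_prune(local_ids, upstream_ids)
  return [] if upstream_ids.empty?
  local_ids - upstream_ids
end
```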
    def register_error(error, **context)
      stats["errors"] ||= []
      stats["errors"] << {
        message: error.message,
        context: context.to_s,
        timestamp: Time.current.iso8601
      }
    end
end
37
app/models/indexa_capital_item/provided.rb
Normal file
@@ -0,0 +1,37 @@
# frozen_string_literal: true

module IndexaCapitalItem::Provided
  extend ActiveSupport::Concern

  def indexa_capital_provider
    return nil unless credentials_configured?

    token = resolved_api_token
    if token.present?
      Provider::IndexaCapital.new(api_token: token)
    else
      Provider::IndexaCapital.new(
        username: username,
        document: document,
        password: password
      )
    end
  end

  def indexa_capital_credentials
    return nil unless credentials_configured?

    { username: username, document: document, password: password }
  end

  def credentials_configured?
    resolved_api_token.present? || (username.present? && document.present? && password.present?)
  end

  private

    # Priority: stored token > env token
    def resolved_api_token
      api_token.presence || ENV["INDEXA_API_TOKEN"].presence
    end
end
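The `presence ||` chain in `resolved_api_token` means a stored token always wins over the environment token, and blank strings count as absent. A hypothetical pure-Ruby analogue (Rails' `#presence` is approximated with a strip/empty check):

```ruby
# First non-blank candidate wins: stored token, then env token, else nil.
def resolve_token(stored, env_token)
  [stored, env_token].find { |t| t && !t.strip.empty? }
end
```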
86
app/models/indexa_capital_item/syncer.rb
Normal file
@@ -0,0 +1,86 @@
# frozen_string_literal: true

class IndexaCapitalItem::Syncer
  include SyncStats::Collector

  attr_reader :indexa_capital_item

  def initialize(indexa_capital_item)
    @indexa_capital_item = indexa_capital_item
  end

  def perform_sync(sync)
    Rails.logger.info "IndexaCapitalItem::Syncer - Starting sync for item #{indexa_capital_item.id}"

    # Phase 1: Import data from provider API
    sync.update!(status_text: I18n.t("indexa_capital_items.sync.status.importing")) if sync.respond_to?(:status_text)
    indexa_capital_item.import_latest_indexa_capital_data(sync: sync)

    # Phase 2: Collect setup statistics
    finalize_setup_counts(sync)

    # Phase 3: Process data for linked accounts
    linked_indexa_capital_accounts = indexa_capital_item.linked_indexa_capital_accounts.includes(account_provider: :account)
    if linked_indexa_capital_accounts.any?
      sync.update!(status_text: I18n.t("indexa_capital_items.sync.status.processing")) if sync.respond_to?(:status_text)
      mark_import_started(sync)
      indexa_capital_item.process_accounts

      # Phase 4: Schedule balance calculations
      sync.update!(status_text: I18n.t("indexa_capital_items.sync.status.calculating")) if sync.respond_to?(:status_text)
      indexa_capital_item.schedule_account_syncs(
        parent_sync: sync,
        window_start_date: sync.window_start_date,
        window_end_date: sync.window_end_date
      )

      # Phase 5: Collect statistics
      account_ids = linked_indexa_capital_accounts.filter_map { |pa| pa.current_account&.id }
      collect_transaction_stats(sync, account_ids: account_ids, source: "indexa_capital")
      collect_trades_stats(sync, account_ids: account_ids, source: "indexa_capital")
      collect_holdings_stats(sync, holdings_count: count_holdings, label: "processed")
    end

    # Mark sync health
    collect_health_stats(sync, errors: nil)
  rescue Provider::IndexaCapital::AuthenticationError => e
    indexa_capital_item.update!(status: :requires_update)
    collect_health_stats(sync, errors: [ { message: e.message, category: "auth_error" } ])
    raise
  rescue => e
    collect_health_stats(sync, errors: [ { message: e.message, category: "sync_error" } ])
    raise
  end

  # Public: called by Sync after finalization
  def perform_post_sync
    # Override for post-sync cleanup if needed
  end

  private

    def count_holdings
      indexa_capital_item.linked_indexa_capital_accounts.sum { |pa| Array(pa.raw_holdings_payload).size }
    end

    def mark_import_started(sync)
      # Mark that we're now processing imported data
      sync.update!(status_text: I18n.t("indexa_capital_items.sync.status.importing_data")) if sync.respond_to?(:status_text)
    end

    def finalize_setup_counts(sync)
      sync.update!(status_text: I18n.t("indexa_capital_items.sync.status.checking_setup")) if sync.respond_to?(:status_text)

      unlinked_count = indexa_capital_item.unlinked_accounts_count

      if unlinked_count > 0
        indexa_capital_item.update!(pending_account_setup: true)
        sync.update!(status_text: I18n.t("indexa_capital_items.sync.status.needs_setup", count: unlinked_count)) if sync.respond_to?(:status_text)
      else
        indexa_capital_item.update!(pending_account_setup: false)
      end

      # Collect setup stats
      collect_setup_stats(sync, provider_accounts: indexa_capital_item.indexa_capital_accounts)
    end
end
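`count_holdings` leans on Ruby's `Kernel#Array` coercion: `nil` payloads become `[]` (contributing zero), arrays pass through, and a lone hash is wrapped as a one-pair array. A standalone sketch of that counting logic, detached from the ActiveRecord relation:

```ruby
# Sum holding counts across payloads; Array() makes nil contribute 0.
def count_holdings(payloads)
  payloads.sum { |p| Array(p).size }
end
```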
49
app/models/indexa_capital_item/unlinking.rb
Normal file
@@ -0,0 +1,49 @@
# frozen_string_literal: true

module IndexaCapitalItem::Unlinking
  # Concern that encapsulates unlinking logic for an IndexaCapital item.
  extend ActiveSupport::Concern

  # Idempotently remove all connections between this IndexaCapital item and local accounts.
  # - Detaches any AccountProvider links for each IndexaCapitalAccount
  # - Detaches Holdings that point at the AccountProvider links
  # Returns a per-account result payload for observability.
  def unlink_all!(dry_run: false)
    results = []

    indexa_capital_accounts.find_each do |provider_account|
      links = AccountProvider.where(provider_type: "IndexaCapitalAccount", provider_id: provider_account.id).to_a
      link_ids = links.map(&:id)
      result = {
        provider_account_id: provider_account.id,
        name: provider_account.name,
        provider_link_ids: link_ids
      }
      results << result

      next if dry_run

      begin
        ActiveRecord::Base.transaction do
          # Detach holdings for any provider links found
          if link_ids.any?
            Holding.where(account_provider_id: link_ids).update_all(account_provider_id: nil)
          end

          # Destroy all provider links
          links.each(&:destroy!)
        end
      rescue StandardError => e
        Rails.logger.warn(
          "IndexaCapitalItem Unlinker: failed to fully unlink provider account ##{provider_account.id} (links=#{link_ids.inspect}): #{e.class} - #{e.message}"
        )
        # Record error for observability; continue with other accounts
        result[:error] = e.message
      end
    end

    results
  end
end
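The `dry_run:` keyword lets callers preview what would be unlinked without mutating anything: the result payload is always built, but destruction is skipped. A minimal pure-Ruby sketch of that pattern, with hypothetical hash stand-ins for the ActiveRecord models:

```ruby
# Build the per-account report unconditionally; mutate only when not a dry run.
def unlink_all(accounts, dry_run: false)
  accounts.map do |account|
    result = { id: account[:id], link_ids: account[:links].dup }
    account[:links].clear unless dry_run
    result
  end
end
```

A dry run returns the same report shape as a real run, which makes it easy to log or display before committing to the destructive pass.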
@@ -11,7 +11,7 @@ class Invitation < ApplicationRecord
   end
 
   validates :email, presence: true, format: { with: URI::MailTo::EMAIL_REGEXP }
-  validates :role, presence: true, inclusion: { in: %w[admin member] }
+  validates :role, presence: true, inclusion: { in: %w[admin member guest] }
   validates :token, presence: true, uniqueness: true
   validates_uniqueness_of :email, scope: :family_id, message: "has already been invited to this family"
   validate :inviter_is_admin
@@ -26,8 +26,26 @@ class Invitation < ApplicationRecord
     accepted_at.nil? && expires_at > Time.current
   end
 
+  def accept_for(user)
+    return false if user.blank?
+    return false unless pending?
+    return false unless emails_match?(user)
+
+    transaction do
+      user.update!(family_id: family_id, role: role.to_s)
+      update!(accepted_at: Time.current)
+    end
+    true
+  end
+
   private
 
+    def emails_match?(user)
+      inv_email = email.to_s.strip.downcase
+      usr_email = user.email.to_s.strip.downcase
+      inv_email.present? && usr_email.present? && inv_email == usr_email
+    end
+
     def generate_token
       loop do
         self.token = SecureRandom.hex(32)
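The email comparison in `emails_match?` normalizes both sides (trim whitespace, lower-case) and refuses to match blanks, so `nil` or empty emails never accept an invitation. A standalone version of that check (plain strings instead of model attributes):

```ruby
# Trim and lower-case both emails; blank emails never match anything.
def emails_match?(invitation_email, user_email)
  a = invitation_email.to_s.strip.downcase
  b = user_email.to_s.strip.downcase
  !a.empty? && !b.empty? && a == b
end
```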
@@ -73,10 +73,20 @@ class LunchflowEntry::Processor
    base_temp_id = content_hash_for_transaction(data)
    temp_id_with_prefix = "lunchflow_pending_#{base_temp_id}"

    # Handle collisions: if this external_id already exists for this account,
    # append a counter to make it unique. This prevents multiple pending
    # transactions with identical attributes (e.g., two same-day Uber rides)
    # from colliding. We check both the account's entries and the current raw
    # payload being processed.
    #
    # If an entry with this external_id already exists AND is still pending,
    # reuse the same ID for re-sync; the import adapter's skip logic will handle
    # user edits correctly. We DON'T check whether attributes match - user edits
    # should not cause duplicates.
    if entry_exists_with_external_id?(temp_id_with_prefix)
      existing_entry = account.entries.find_by(external_id: temp_id_with_prefix, source: "lunchflow")
      if existing_entry && existing_entry.entryable.is_a?(Transaction) && existing_entry.entryable.pending?
        Rails.logger.debug "Lunchflow: Reusing ID #{temp_id_with_prefix} for re-synced pending transaction"
        return temp_id_with_prefix
      end
    end

    # Handle true collisions: multiple different transactions with the same
    # attributes (e.g., two Uber rides on the same day for the same amount
    # within the same sync)
    final_id = temp_id_with_prefix
    counter = 1
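The collision strategy the comments describe, appending an incrementing counter until the external_id is unused, can be sketched standalone (the `unique_external_id` helper name is hypothetical):

```ruby
# Try the base id first; on collision, append _1, _2, ... until free.
def unique_external_id(base, taken)
  candidate = base
  counter = 1
  while taken.include?(candidate)
    candidate = "#{base}_#{counter}"
    counter += 1
  end
  candidate
end
```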
Some files were not shown because too many files have changed in this diff.