Performance improvements in holding calculation pipeline (#1579)

* Performance improvements in holding calculation pipeline

Investment accounts with large histories were pegging CPU at 100% during
sync. Root cause was a cluster of quadratic and superlinear algorithms in
the inner holding calculation loop. All are replaced with O(1) hash lookups
built from single-pass indexes over the already-loaded data.

Holding::PortfolioCache - load_prices:

  Three O(SxN) patterns inside the per-security loop:

  1. DB prices: `security.prices.where(...)` fired one SQL query per
     security (N+1). Replaced with a single bulk query before the loop:

       Security::Price.where(security_id: ..., date: ...).group_by(&:security_id)

     70 securities -> 70 queries becomes 1.

  2. Trade prices: `trades.select { |t| t.entryable.security_id == id }`
     scanned the full trades array for every security - O(SxT). Replaced
     with trades_by_security_id, pre-indexed once from the loaded array.

  3. Holding prices: `holdings.select { |h| h.security_id == id }` - same
     O(SxH) pattern. Replaced with holdings_by_security_id.

  Prices are now indexed into prices_by_date and prices_by_date_and_source
  hashes during load_prices, making get_price O(1) instead of scanning the
  flat prices array on every lookup.
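The per-security scans above collapse into one pass each. A minimal plain-Ruby sketch of the indexing pattern (the Structs here are illustrative stand-ins for the loaded ActiveRecord records, not the app's real classes):

```ruby
# Illustrative stand-ins for the already-loaded records.
Trade = Struct.new(:security_id, :date)
Price = Struct.new(:security_id, :date, :amount)

trades = [Trade.new(1, "2024-01-02"), Trade.new(2, "2024-01-02"), Trade.new(1, "2024-01-03")]
prices = [Price.new(1, "2024-01-02", 10.0), Price.new(1, "2024-01-03", 11.0)]

# Build each index once (O(T) and O(P)) instead of running a .select scan
# over the full array for every security (O(SxT) / O(SxP)):
trades_by_security_id = trades.group_by(&:security_id)
prices_by_date        = prices.group_by(&:date)

trades_by_security_id[1].map(&:date)  # => ["2024-01-02", "2024-01-03"]
```

Each subsequent per-security lookup is then a single hash hit rather than a scan.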

Holding::PortfolioCache - get_trades / get_price:

  - get_trades(date:): `trades.select { |t| t.date == date }` (O(T) scan)
    replaced with trades_by_date hash (O(1)).

  - get_price: two `prices.select { |p| p.date == date ... }.min_by` linear
    scans replaced with direct hash lookups into prices_by_date and
    prices_by_date_and_source.
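A sketch of the two lookup shapes (the composite [date, source] key is inferred from the hash name above; the Struct is an illustrative stand-in):

```ruby
Price = Struct.new(:security_id, :date, :source, :amount)

prices = [
  Price.new(1, "2024-01-02", "db",    10.0),
  Price.new(1, "2024-01-02", "trade", 10.5)
]

# Built once during load_prices:
prices_by_date            = prices.group_by(&:date)
prices_by_date_and_source = prices.group_by { |p| [p.date, p.source] }

# get_price becomes two O(1) hash hits instead of two linear scans:
exact = prices_by_date_and_source[["2024-01-02", "db"]]&.first
any   = prices_by_date["2024-01-02"]&.first
```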

Holding::PortfolioCache - collect_unique_securities:

  `holdings.map(&:security)` traversed the security association on every
  holding record (N+1 if not preloaded). Replaced with a pluck of
  security_ids followed by a single Security.where(id: ...) batch load.
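The shape of the change, sketched in plain Ruby (the hash stands in for the securities table; a real app would issue Security.where(id: ids) as the single batch query):

```ruby
Holding = Struct.new(:security_id)
holdings = [Holding.new(1), Holding.new(2), Holding.new(1)]

# Stand-in for the securities table.
securities_table = { 1 => "AAPL", 2 => "MSFT", 3 => "GOOG" }

ids        = holdings.map(&:security_id).uniq   # stands in for the pluck
securities = securities_table.values_at(*ids)   # one batch load, not N+1
# securities => ["AAPL", "MSFT"]
```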

Holding::ForwardCalculator / ReverseCalculator:

  `holdings += build_holdings(...)` allocated a new array copy on every
  iteration - O(N) per day x thousands of days = O(D^2) total allocations.
  Replaced with holdings.concat(...) which appends in place, O(1).
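The difference is observable from object identity: `+=` rebinds the variable to a freshly allocated array, while `concat` mutates the receiver in place.

```ruby
holdings = [:h1]
in_place_id = holdings.object_id
holdings.concat([:h2])   # appends in place; holdings is still the same array object

copied = [:h1]
copied_id = copied.object_id
copied += [:h2]          # allocates and copies a brand-new array every iteration
```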

Holding::ReverseCalculator - precompute_cost_basis:

  Old: walked every date from account.start_date to Date.current (O(D)),
  writing a cost_basis entry for every security on every date. For an
  account with 2 trades over 9,250 days this wrote ~18,500 hash entries
  and consumed the full date range in the outer loop regardless of trade
  density.

  New: walks only buy trades (O(T)), appending one [date, avg_cost]
  snapshot per trade. cost_basis_for binary-searches the sparse snapshot
  array - O(log T) per lookup. Memory drops from O(DxS) to O(T).
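A minimal sketch of the sparse-snapshot lookup, assuming (per the description) a date-sorted array of [date, avg_cost] pairs and a binary search for the last snapshot on or before the requested date:

```ruby
require "date"

# One [date, avg_cost] snapshot per buy trade, kept in date order.
snapshots = [
  [Date.new(2024, 1, 10), 100.0],
  [Date.new(2024, 3, 5),  110.0]
]

# Returns the cost basis in effect on `date`: the last snapshot at or
# before it, found via binary search (O(log T)); nil before the first buy.
def cost_basis_for(snapshots, date)
  idx = (0...snapshots.size).bsearch { |i| snapshots[i][0] > date }
  idx = idx.nil? ? snapshots.size - 1 : idx - 1
  idx.negative? ? nil : snapshots[idx][1]
end

cost_basis_for(snapshots, Date.new(2024, 2, 1))  # => 100.0 (carried forward)
cost_basis_for(snapshots, Date.new(2024, 1, 1))  # => nil (before first buy)
```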

Holding::Gapfillable:

  `security_holdings.find { |h| h.date == date }` was called on every
  date in the gapfill range - O(H) per date, O(HxD) total. Replaced with
  security_holdings.index_by(&:date) built once before the loop, making
  each date lookup O(1).
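index_by comes from ActiveSupport; the plain-Ruby equivalent of the pattern (with an illustrative Struct standing in for the holding records):

```ruby
Holding = Struct.new(:date, :qty)
security_holdings = [Holding.new("2024-01-02", 5), Holding.new("2024-01-03", 8)]

# ActiveSupport: security_holdings.index_by(&:date)
# Plain-Ruby equivalent, built once before the gapfill loop:
holdings_by_date = security_holdings.to_h { |h| [h.date, h] }

holdings_by_date["2024-01-03"].qty  # => 8
```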

Holding::Materializer - purge_stale_holdings:

  `account.entries.trades.map { |entry| entry.entryable.security_id }.uniq`
  loaded all trade entry records into Ruby then traversed the entryable
  association on each (N+1). Replaced with account.trades.pluck(:security_id).uniq
  (single SQL query returning only the IDs).

In testing, these changes reduced the sync time of an account with 25 years
of history and 70 securities from about 90 minutes to under 3 minutes.

* Lint fix

* Lint fix

* addressing the open review nits I agreed with:

* return dup'd arrays from PortfolioCache#get_trades so callers can't mutate memoized cache state
* use the precomputed security-id indexes in collect_unique_securities
* keep security-id dedupe in SQL via distinct.pluck(:security_id)
* tighten the DB price preload to select only needed columns
* harden cost-basis assertions with assert_in_delta

* Back out unnecessary AI slop

* Add back dup to trades array returned from memoized hash

trades_by_date[date] returns a live reference into the memoized hash.
Any caller that mutates the result would silently corrupt the cache for
subsequent calls on the same date within the same sync run. Add .dup to
return a shallow copy, matching the safety of the original select path.
This commit is contained in:
wps260
2026-05-04 18:24:33 -05:00
committed by GitHub
parent 9cc52b9d35
commit c294cbf54b
6 changed files with 169 additions and 51 deletions
@@ -158,6 +158,84 @@ class Holding::ReverseCalculatorTest < ActiveSupport::TestCase
end
end
  # cost_basis_for

  test "cost_basis_for returns nil when there are no buy trades" do
    security = Security.create!(ticker: "TST", name: "Test")
    calc = calculator_with_trades(security)

    assert_nil cost_basis_for(calc, security, Date.current)
  end

  test "cost_basis_for returns nil for dates before the first buy" do
    security = Security.create!(ticker: "TST", name: "Test")
    buy_date = 5.days.ago.to_date

    calc = calculator_with_trades(security) do
      create_trade(security, account: @account, qty: 10, price: 100, date: buy_date)
    end

    assert_nil cost_basis_for(calc, security, buy_date - 1)
  end

  test "cost_basis_for returns weighted average cost on buy date" do
    security = Security.create!(ticker: "TST", name: "Test")
    buy_date = 5.days.ago.to_date

    calc = calculator_with_trades(security) do
      create_trade(security, account: @account, qty: 10, price: 100, date: buy_date)
    end

    assert_in_delta 100.0, cost_basis_for(calc, security, buy_date).to_f, 1e-6
  end

  test "cost_basis_for carries forward to dates between buys" do
    security = Security.create!(ticker: "TST", name: "Test")
    first_buy = 10.days.ago.to_date
    second_buy = 3.days.ago.to_date

    calc = calculator_with_trades(security) do
      create_trade(security, account: @account, qty: 10, price: 100, date: first_buy)
      create_trade(security, account: @account, qty: 5, price: 130, date: second_buy)
    end

    # Between the two buys, cost basis is from the first buy only
    assert_in_delta 100.0, cost_basis_for(calc, security, first_buy + 1).to_f, 1e-6
    assert_in_delta 100.0, cost_basis_for(calc, security, second_buy - 1).to_f, 1e-6

    # After second buy: WAC = (10*100 + 5*130) / 15 = 110.0
    assert_in_delta 110.0, cost_basis_for(calc, security, second_buy).to_f, 1e-6
    assert_in_delta 110.0, cost_basis_for(calc, security, Date.current).to_f, 1e-6
  end

  test "cost_basis_for accumulates multiple buys on the same date" do
    security = Security.create!(ticker: "TST", name: "Test")
    buy_date = 5.days.ago.to_date

    calc = calculator_with_trades(security) do
      create_trade(security, account: @account, qty: 10, price: 100, date: buy_date)
      create_trade(security, account: @account, qty: 5, price: 130, date: buy_date)
    end

    # WAC = (10*100 + 5*130) / 15 = 110.0, not the intermediate value after only the first trade
    assert_in_delta 110.0, cost_basis_for(calc, security, buy_date).to_f, 1e-6
  end

  test "cost_basis_for ignores sell trades" do
    security = Security.create!(ticker: "TST", name: "Test")
    buy_date = 10.days.ago.to_date
    sell_date = 5.days.ago.to_date

    calc = calculator_with_trades(security) do
      create_trade(security, account: @account, qty: 10, price: 100, date: buy_date)
      create_trade(security, account: @account, qty: -5, price: 120, date: sell_date)
    end

    # Sell does not change cost basis
    assert_in_delta 100.0, cost_basis_for(calc, security, sell_date).to_f, 1e-6
    assert_in_delta 100.0, cost_basis_for(calc, security, Date.current).to_f, 1e-6
  end

  private

  def assert_holdings(expected, calculated)
    expected.each do |expected_entry|
@@ -202,6 +280,18 @@ class Holding::ReverseCalculatorTest < ActiveSupport::TestCase
  # Brokerage Cash: $5,000
  # Holdings Value: $15,000
  # Total Balance: $20,000

  def calculator_with_trades(security)
    yield if block_given?

    snapshot = OpenStruct.new(to_h: { security.id => 10 })
    calc = Holding::ReverseCalculator.new(@account, portfolio_snapshot: snapshot)
    calc.send(:precompute_cost_basis)
    calc
  end

  def cost_basis_for(calc, security, date)
    calc.send(:cost_basis_for, security.id, date)
  end

  def load_today_portfolio
    @account.update!(cash_balance: 5000)