Feat: /import endpoint & drag-n-drop imports (#501)

* Implement API v1 Imports controller

- Add Api::V1::ImportsController with index, show, and create actions
- Add Jbuilder views for index and show
- Add integration tests
- Implement row generation logic in create action
- Update routes

* Validate import account belongs to family

- Add validation to Import model to ensure account belongs to the same family
- Add regression test case in Api::V1::ImportsControllerTest

* Update docs to be more detailed

* Rescue StandardError instead of bare rescue in ImportsController

* Optimize Imports API and fix documentation

- Implement rows_count counter cache for Imports
- Preload rows in Api::V1::ImportsController#show
- Update documentation to show correct OAuth scopes

* Fix formatting in ImportsControllerTest

* Permit all import parameters and fix unknown attribute error

* Restore API routes for auth, chats, and messages

* Remove PR summary

* Fix trailing whitespace and configured? test failure

- Update Import#configured? to use rows_count for performance and consistency
- Mock rows_count in TransactionImportTest
- Fix trailing whitespace in migration

* Harden security and fix mass assignment in ImportsController

- Handle type and account_id explicitly in create action
- Rename import_params to import_config_params for clarity
- Validate type against Import::TYPES

* Fix MintImport rows_count update and migration whitespace

- Update MintImport#generate_rows_from_csv to update rows_count counter cache
- Fix trailing whitespace and final newline in AddRowsCountToImports migration

* Implement full-screen Drag and Drop CSV import on Transactions page

- Add DragAndDropImport Stimulus controller listening on document
- Add full-screen overlay with icon and text to Transactions index
- Update ImportsController to handle direct file uploads via create action
- Add system test for drag and drop functionality

* Implement Drag and Drop CSV upload on Import Upload page

- Add drag-and-drop-import controller to import/uploads/show
- Add full-screen overlay to import/uploads/show
- Annotate upload form and input with drag-and-drop targets
- Add PR_SUMMARY.md

* Remove PR summary

* Add file validation to ImportsController

- Validate file size (max 10MB) and MIME type in create action
- Prevent memory exhaustion and invalid file processing
- Define MAX_CSV_SIZE and ALLOWED_MIME_TYPES in Import model
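The checks above reduce to a size cap plus a MIME-type allowlist. A minimal plain-Ruby sketch of that logic (the constants mirror the ones added to the Import model; `validate_upload` and the `OpenStruct` stand-in for the uploaded file are illustrative, not the app's actual objects):

```ruby
require "ostruct"

# Mirrors Import::MAX_CSV_SIZE and Import::ALLOWED_MIME_TYPES from the diff
MAX_CSV_SIZE = 10 * 1024 * 1024
ALLOWED_MIME_TYPES = %w[text/csv text/plain application/vnd.ms-excel application/csv].freeze

# Returns an error symbol, or nil when the upload passes both checks
def validate_upload(file)
  return :file_too_large unless file.size <= MAX_CSV_SIZE
  return :invalid_file_type unless ALLOWED_MIME_TYPES.include?(file.content_type)
  nil
end

ok  = OpenStruct.new(size: 100, content_type: "text/csv")
big = OpenStruct.new(size: MAX_CSV_SIZE + 1, content_type: "text/csv")
pdf = OpenStruct.new(size: 100, content_type: "application/pdf")

puts validate_upload(ok).inspect   # nil
puts validate_upload(big).inspect  # :file_too_large
puts validate_upload(pdf).inspect  # :invalid_file_type
```

Checking size before reading the file body is what prevents the memory-exhaustion case the bullet mentions.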

* Refactor dragLeave logic with counter pattern to prevent flickering
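The counter pattern tracks nested dragenter/dragleave pairs: child elements fire their own enter/leave events, so a plain show/hide toggle flickers as the cursor crosses them. A language-agnostic sketch of the idea in Ruby (class and method names are illustrative; the real implementation is the Stimulus controller later in this diff):

```ruby
# Overlay is visible exactly while the drag depth is positive.
# Each dragenter increments, each dragleave decrements; the overlay
# only hides when the counter returns to zero (drag left the window).
class DragDepthTracker
  attr_reader :depth

  def initialize
    @depth = 0
  end

  def enter
    @depth += 1
  end

  def leave
    @depth -= 1
    @depth = 0 if @depth < 0  # clamp against unbalanced events
  end

  def drop
    @depth = 0  # drop always dismisses the overlay
  end

  def overlay_visible?
    @depth > 0
  end
end

t = DragDepthTracker.new
t.enter                   # drag enters the window
t.enter                   # drag moves over a child element
t.leave                   # leaves the child: overlay must stay up
puts t.overlay_visible?   # true
t.leave                   # leaves the window
puts t.overlay_visible?   # false
```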

* Extract shared drag-and-drop overlay partial

- Create app/views/imports/_drag_drop_overlay.html.erb
- Update transactions/index and import/uploads/show to use the partial
- Reduce code duplication in views

* Update Brakeman and harden ImportsController security

- Update brakeman to 7.1.2
- Explicitly handle type assignment in ImportsController#create to avoid mass assignment
- Remove :type from permitted import parameters

* Fix trailing whitespace in DragAndDropImportTest

* Don't commit LLM comments as file

* Fix: add API validation

---------

Co-authored-by: Carlos Adames <cj@Carloss-MacBook-Air.local>
Co-authored-by: Juan José Mata <jjmata@jjmata.com>
Co-authored-by: sokie <sokysrm@gmail.com>
Committed by GitHub on behalf of Carlos Adames, 2026-01-10 11:39:18 -04:00
parent 61caaf056c · commit b56dbdb9eb · 22 changed files with 723 additions and 19 deletions


@@ -0,0 +1,171 @@
# frozen_string_literal: true
class Api::V1::ImportsController < Api::V1::BaseController
include Pagy::Backend
# Ensure proper scope authorization
before_action :ensure_read_scope, only: [ :index, :show ]
before_action :ensure_write_scope, only: [ :create ]
before_action :set_import, only: [ :show ]
def index
family = current_resource_owner.family
imports_query = family.imports.ordered
# Apply filters
if params[:status].present?
imports_query = imports_query.where(status: params[:status])
end
if params[:type].present?
imports_query = imports_query.where(type: params[:type])
end
# Pagination
@pagy, @imports = pagy(
imports_query,
page: safe_page_param,
limit: safe_per_page_param
)
@per_page = safe_per_page_param
render :index
rescue StandardError => e
Rails.logger.error "ImportsController#index error: #{e.message}"
render json: { error: "internal_server_error", message: e.message }, status: :internal_server_error
end
def show
render :show
rescue StandardError => e
Rails.logger.error "ImportsController#show error: #{e.message}"
render json: { error: "internal_server_error", message: e.message }, status: :internal_server_error
end
def create
family = current_resource_owner.family
# 1. Determine type and validate
type = params[:type].to_s
type = "TransactionImport" unless Import::TYPES.include?(type)
# 2. Build the import object with permitted config attributes
@import = family.imports.build(import_config_params)
@import.type = type
@import.account_id = params[:account_id] if params[:account_id].present?
# 3. Attach the uploaded file if present (with validation)
if params[:file].present?
file = params[:file]
if file.size > Import::MAX_CSV_SIZE
return render json: {
error: "file_too_large",
message: "File is too large. Maximum size is #{Import::MAX_CSV_SIZE / 1.megabyte}MB."
}, status: :unprocessable_entity
end
unless Import::ALLOWED_MIME_TYPES.include?(file.content_type)
return render json: {
error: "invalid_file_type",
message: "Invalid file type. Please upload a CSV file."
}, status: :unprocessable_entity
end
@import.raw_file_str = file.read
elsif params[:raw_file_content].present?
if params[:raw_file_content].bytesize > Import::MAX_CSV_SIZE
return render json: {
error: "content_too_large",
message: "Content is too large. Maximum size is #{Import::MAX_CSV_SIZE / 1.megabyte}MB."
}, status: :unprocessable_entity
end
@import.raw_file_str = params[:raw_file_content]
end
# 4. Save and Process
if @import.save
# Generate rows if file content was provided
if @import.uploaded?
begin
@import.generate_rows_from_csv
@import.reload
rescue StandardError => e
Rails.logger.error "Row generation failed for import #{@import.id}: #{e.message}"
end
end
# If the import is configured (has rows), we can try to auto-publish or just leave it as pending
# For API simplicity, if enough info is provided, we might want to trigger processing
if @import.configured? && params[:publish] == "true"
@import.publish_later
end
render :show, status: :created
else
render json: {
error: "validation_failed",
message: "Import could not be created",
errors: @import.errors.full_messages
}, status: :unprocessable_entity
end
rescue StandardError => e
Rails.logger.error "ImportsController#create error: #{e.message}"
render json: { error: "internal_server_error", message: e.message }, status: :internal_server_error
end
private
def set_import
@import = current_resource_owner.family.imports.includes(:rows).find(params[:id])
rescue ActiveRecord::RecordNotFound
render json: { error: "not_found", message: "Import not found" }, status: :not_found
end
def ensure_read_scope
authorize_scope!(:read)
end
def ensure_write_scope
authorize_scope!(:write)
end
def import_config_params
params.permit(
:date_col_label,
:amount_col_label,
:name_col_label,
:category_col_label,
:tags_col_label,
:notes_col_label,
:account_col_label,
:qty_col_label,
:ticker_col_label,
:price_col_label,
:entity_type_col_label,
:currency_col_label,
:exchange_operating_mic_col_label,
:date_format,
:number_format,
:signage_convention,
:col_sep,
:amount_type_strategy,
:amount_type_inflow_value
)
end
def safe_page_param
page = params[:page].to_i
page > 0 ? page : 1
end
def safe_per_page_param
per_page = params[:per_page].to_i
(1..100).include?(per_page) ? per_page : 25
end
end


@@ -26,14 +26,38 @@ class ImportsController < ApplicationController
   end
   def create
+    type = params.dig(:import, :type).to_s
+    type = "TransactionImport" unless Import::TYPES.include?(type)
     account = Current.family.accounts.find_by(id: params.dig(:import, :account_id))
     import = Current.family.imports.create!(
-      type: import_params[:type],
+      type: type,
       account: account,
       date_format: Current.family.date_format,
     )
-    redirect_to import_upload_path(import)
+    if import_params[:csv_file].present?
+      file = import_params[:csv_file]
+      if file.size > Import::MAX_CSV_SIZE
+        import.destroy
+        redirect_to new_import_path, alert: "File is too large. Maximum size is #{Import::MAX_CSV_SIZE / 1.megabyte}MB."
+        return
+      end
+      unless Import::ALLOWED_MIME_TYPES.include?(file.content_type)
+        import.destroy
+        redirect_to new_import_path, alert: "Invalid file type. Please upload a CSV file."
+        return
+      end
+      # Stream reading is not fully applicable here as we store the raw string in the DB,
+      # but we have validated size beforehand to prevent memory exhaustion from massive files.
+      import.update!(raw_file_str: file.read)
+      redirect_to import_configuration_path(import), notice: "CSV uploaded successfully."
+    else
+      redirect_to import_upload_path(import)
+    end
   end
   def show
@@ -70,6 +94,6 @@ class ImportsController < ApplicationController
   end
   def import_params
-    params.require(:import).permit(:type)
+    params.require(:import).permit(:csv_file)
   end
 end


@@ -0,0 +1,65 @@
import { Controller } from "@hotwired/stimulus"
export default class extends Controller {
static targets = ["input", "form", "overlay"]
dragDepth = 0
connect() {
this.boundDragOver = this.dragOver.bind(this)
this.boundDragEnter = this.dragEnter.bind(this)
this.boundDragLeave = this.dragLeave.bind(this)
this.boundDrop = this.drop.bind(this)
// Listen on the document to catch drags anywhere
document.addEventListener("dragover", this.boundDragOver)
document.addEventListener("dragenter", this.boundDragEnter)
document.addEventListener("dragleave", this.boundDragLeave)
document.addEventListener("drop", this.boundDrop)
}
disconnect() {
document.removeEventListener("dragover", this.boundDragOver)
document.removeEventListener("dragenter", this.boundDragEnter)
document.removeEventListener("dragleave", this.boundDragLeave)
document.removeEventListener("drop", this.boundDrop)
}
dragEnter(event) {
event.preventDefault()
this.dragDepth++
if (this.dragDepth === 1) {
this.overlayTarget.classList.remove("hidden")
}
}
dragOver(event) {
event.preventDefault()
}
dragLeave(event) {
event.preventDefault()
this.dragDepth--
if (this.dragDepth <= 0) {
this.dragDepth = 0
this.overlayTarget.classList.add("hidden")
}
}
drop(event) {
event.preventDefault()
this.dragDepth = 0
this.overlayTarget.classList.add("hidden")
if (event.dataTransfer.files.length > 0) {
const file = event.dataTransfer.files[0]
// Simple validation
if (file.type === "text/csv" || file.name.toLowerCase().endsWith(".csv")) {
this.inputTarget.files = event.dataTransfer.files
this.formTarget.requestSubmit()
} else {
alert("Please upload a valid CSV file.")
}
}
}
}


@@ -42,7 +42,7 @@ class AccountImport < Import
   def dry_run
     {
-      accounts: rows.count
+      accounts: rows_count
     }
   end


@@ -42,7 +42,7 @@ class CategoryImport < Import
   end
   def dry_run
-    { categories: rows.count }
+    { categories: rows_count }
   end
   def csv_template


@@ -2,6 +2,9 @@ class Import < ApplicationRecord
   MaxRowCountExceededError = Class.new(StandardError)
   MappingError = Class.new(StandardError)
+  MAX_CSV_SIZE = 10.megabytes
+  ALLOWED_MIME_TYPES = %w[text/csv text/plain application/vnd.ms-excel application/csv].freeze
   TYPES = %w[TransactionImport TradeImport AccountImport MintImport CategoryImport RuleImport].freeze
   SIGNAGE_CONVENTIONS = %w[inflows_positive inflows_negative]
   SEPARATORS = [ [ "Comma (,)", "," ], [ "Semicolon (;)", ";" ] ].freeze
@@ -36,6 +39,7 @@ class Import < ApplicationRecord
   validates :col_sep, inclusion: { in: SEPARATORS.map(&:last) }
   validates :signage_convention, inclusion: { in: SIGNAGE_CONVENTIONS }, allow_nil: true
   validates :number_format, presence: true, inclusion: { in: NUMBER_FORMATS.keys }
+  validate :account_belongs_to_family
   has_many :rows, dependent: :destroy
   has_many :mappings, dependent: :destroy
@@ -110,7 +114,7 @@ class Import < ApplicationRecord
   def dry_run
     mappings = {
-      transactions: rows.count,
+      transactions: rows_count,
       categories: Import::CategoryMapping.for_import(self).creational.count,
       tags: Import::TagMapping.for_import(self).creational.count
     }
@@ -152,6 +156,7 @@ class Import < ApplicationRecord
     end
     rows.insert_all!(mapped_rows)
+    update_column(:rows_count, rows.count)
   end
   def sync_mappings
@@ -181,7 +186,7 @@ class Import < ApplicationRecord
   end
   def configured?
-    uploaded? && rows.any?
+    uploaded? && rows_count > 0
   end
   def cleaned?
@@ -232,7 +237,7 @@ class Import < ApplicationRecord
   private
   def row_count_exceeded?
-    rows.count > max_row_count
+    rows_count > max_row_count
   end
   def import!
@@ -288,4 +293,11 @@ class Import < ApplicationRecord
   def set_default_number_format
     self.number_format ||= "1,234.56" # Default to US/UK format
   end
+  def account_belongs_to_family
+    return if account.nil?
+    return if account.family_id == family_id
+    errors.add(:account, "must belong to your family")
+  end
 end


@@ -1,5 +1,5 @@
 class Import::Row < ApplicationRecord
-  belongs_to :import
+  belongs_to :import, counter_cache: true
   validates :amount, numericality: true, allow_blank: true
   validates :currency, presence: true
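With `counter_cache: true`, Rails bumps the parent's `rows_count` column whenever a child row is created or destroyed, so reads never need a `COUNT(*)`. A rough plain-Ruby sketch of that bookkeeping contract (`FakeImport`/`FakeRow` are illustrative stand-ins, not the app's models or Rails internals):

```ruby
class FakeImport
  attr_accessor :rows_count

  def initialize
    @rows_count = 0
  end
end

class FakeRow
  def initialize(import)
    @import = import
    @import.rows_count += 1   # what the counter cache does on create
  end

  def destroy
    @import.rows_count -= 1   # ...and on destroy
  end
end

import = FakeImport.new
rows = Array.new(3) { FakeRow.new(import) }
puts import.rows_count   # 3
rows.first.destroy
puts import.rows_count   # 2
```

Note that bulk writes like `insert_all!` skip these callbacks, which is why the model diffs also call `update_column(:rows_count, rows.count)` after generating rows.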


@@ -18,6 +18,7 @@ class MintImport < Import
     end
     rows.insert_all!(mapped_rows)
+    update_column(:rows_count, rows.count)
   end
   def import!
def import! def import!


@@ -20,7 +20,7 @@ class RuleImport < Import
   end
   def dry_run
-    { rules: rows.count }
+    { rules: rows_count }
   end
   def csv_template
def csv_template def csv_template


@@ -54,7 +54,7 @@ class TradeImport < Import
   end
   def dry_run
-    mappings = { transactions: rows.count }
+    mappings = { transactions: rows_count }
     mappings.merge(
       accounts: Import::AccountMapping.for_import(self).creational.count


@@ -0,0 +1,21 @@
json.data do
json.array! @imports do |import|
json.id import.id
json.type import.type
json.status import.status
json.created_at import.created_at
json.updated_at import.updated_at
json.account_id import.account_id
json.rows_count import.rows_count
json.error import.error if import.error.present?
end
end
json.meta do
json.current_page @pagy.page
json.next_page @pagy.next
json.prev_page @pagy.prev
json.total_pages @pagy.pages
json.total_count @pagy.count
json.per_page @per_page
end


@@ -0,0 +1,30 @@
json.data do
json.id @import.id
json.type @import.type
json.status @import.status
json.created_at @import.created_at
json.updated_at @import.updated_at
json.account_id @import.account_id
json.error @import.error if @import.error.present?
json.configuration do
json.date_col_label @import.date_col_label
json.amount_col_label @import.amount_col_label
json.name_col_label @import.name_col_label
json.category_col_label @import.category_col_label
json.tags_col_label @import.tags_col_label
json.notes_col_label @import.notes_col_label
json.account_col_label @import.account_col_label
json.date_format @import.date_format
json.number_format @import.number_format
json.signage_convention @import.signage_convention
end
json.stats do
json.rows_count @import.rows_count
json.valid_rows_count @import.rows.select(&:valid?).count if @import.rows.loaded?
end
# Only show a subset of rows for preview if needed, or link to a separate rows endpoint
# json.sample_rows @import.rows.limit(5)
end


@@ -4,7 +4,10 @@
 <%= content_for :previous_path, imports_path %>
-<div class="space-y-4">
+<div class="space-y-4" data-controller="drag-and-drop-import">
+  <!-- Overlay -->
+  <%= render "imports/drag_drop_overlay", title: "Drop CSV to upload", subtitle: "Your file will be uploaded automatically" %>
   <div class="space-y-4 mx-auto max-w-md">
     <div class="text-center space-y-2">
       <h1 class="text-3xl text-primary font-medium"><%= t(".title") %></h1>
@@ -18,7 +21,7 @@
   <% end %>
   <% tabs.with_panel(tab_id: "csv-upload") do %>
-    <%= styled_form_with model: @import, scope: :import, url: import_upload_path(@import), multipart: true, class: "space-y-2" do |form| %>
+    <%= styled_form_with model: @import, scope: :import, url: import_upload_path(@import), multipart: true, class: "space-y-2", data: { drag_and_drop_import_target: "form" } do |form| %>
       <%= form.select :col_sep, Import::SEPARATORS, label: true %>
       <% if @import.type == "TransactionImport" || @import.type == "TradeImport" %>
@@ -41,7 +44,7 @@
         <p class="text-md font-medium text-primary"></p>
       </div>
-      <%= form.file_field :csv_file, class: "hidden", "data-auto-submit-form-target": "auto", "data-file-upload-target": "input" %>
+      <%= form.file_field :csv_file, class: "hidden", "data-auto-submit-form-target": "auto", "data-file-upload-target": "input", "data-drag-and-drop-import-target": "input" %>
     </div>
   </div>


@@ -0,0 +1,7 @@
<div data-drag-and-drop-import-target="overlay" class="fixed inset-0 bg-primary/20 backdrop-blur-sm z-50 hidden flex items-center justify-center pointer-events-none">
<div class="text-center p-8 bg-container rounded-xl shadow-2xl border-2 border-dashed border-primary animate-in fade-in zoom-in duration-200">
<%= icon("upload", size: "xl", class: "text-primary mb-4 mx-auto w-16 h-16") %>
<h3 class="text-2xl font-semibold text-primary mb-2"><%= title %></h3>
<p class="text-secondary text-base"><%= subtitle %></p>
</div>
</div>


@@ -46,10 +46,18 @@
 <%= render "summary", totals: @search.totals %>
 <div id="transactions"
-     data-controller="bulk-select checkbox-toggle"
+     data-controller="bulk-select checkbox-toggle drag-and-drop-import"
      data-bulk-select-singular-label-value="<%= t(".transaction") %>"
      data-bulk-select-plural-label-value="<%= t(".transactions") %>"
-     class="flex flex-col bg-container rounded-xl shadow-border-xs px-3 py-4 lg:p-4">
+     class="flex flex-col bg-container rounded-xl shadow-border-xs px-3 py-4 lg:p-4 relative group">
+  <%= form_with url: imports_path, method: :post, class: "hidden", data: { drag_and_drop_import_target: "form" } do |f| %>
+    <%= f.hidden_field "import[type]", value: "TransactionImport" %>
+    <%= f.file_field "import[csv_file]", class: "hidden", data: { drag_and_drop_import_target: "input" }, accept: ".csv" %>
+  <% end %>
+  <%= render "imports/drag_drop_overlay", title: "Drop CSV to import", subtitle: "Upload transactions directly" %>
   <%= render "transactions/searches/search" %>
   <div id="entry-selection-bar" data-bulk-select-target="selectionBar" class="flex justify-center hidden">


@@ -285,11 +285,12 @@ Rails.application.routes.draw do
     post "auth/refresh", to: "auth#refresh"
     # Production API endpoints
-    resources :accounts, only: [ :index ]
+    resources :accounts, only: [ :index, :show ]
     resources :categories, only: [ :index, :show ]
     resources :transactions, only: [ :index, :show, :create, :update, :destroy ]
-    resource :usage, only: [ :show ], controller: "usage"
-    resource :sync, only: [ :create ], controller: "sync"
+    resources :imports, only: [ :index, :show, :create ]
+    resource :usage, only: [ :show ], controller: :usage
+    post :sync, to: "sync#create"
     resources :chats, only: [ :index, :show, :create, :update, :destroy ] do
       resources :messages, only: [ :create ] do


@@ -0,0 +1,16 @@
class AddRowsCountToImports < ActiveRecord::Migration[7.2]
def up
add_column :imports, :rows_count, :integer, default: 0, null: false
say_with_time "Backfilling rows_count for imports" do
Import.reset_column_information
Import.find_each do |import|
Import.reset_counters(import.id, :rows)
end
end
end
def down
remove_column :imports, :rows_count
end
end
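The backfill above uses `reset_counters`, which recomputes each import's cached count from its actual rows. A plain-Ruby sketch of that tally under stated assumptions (hashes stand in for the `imports` and `import_rows` tables; this is not the migration's actual code path):

```ruby
# One hash per import_rows record; imports maps id => cached counter
rows = [
  { import_id: 1 }, { import_id: 1 }, { import_id: 2 }
]
imports = { 1 => { rows_count: 0 }, 2 => { rows_count: 0 }, 3 => { rows_count: 0 } }

# Equivalent of reset_counters: count real child rows per parent,
# writing 0 for parents with no rows at all
tallies = rows.group_by { |r| r[:import_id] }.transform_values(&:size)
imports.each { |id, attrs| attrs[:rows_count] = tallies.fetch(id, 0) }

puts imports[1][:rows_count]  # 2
puts imports[3][:rows_count]  # 0
```

The `fetch(id, 0)` default matters: imports that never generated rows must end at 0, not keep a stale value.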

db/schema.rb (generated)

@@ -588,6 +588,7 @@ ActiveRecord::Schema[7.2].define(version: 2026_01_10_122603) do
     t.string "exchange_operating_mic_col_label"
     t.string "amount_type_strategy", default: "signed_amount"
     t.string "amount_type_inflow_value"
+    t.integer "rows_count", default: 0, null: false
     t.index ["family_id"], name: "index_imports_on_family_id"
   end

docs/api/imports.md (new file)

@@ -0,0 +1,101 @@
# Imports API Documentation
The Imports API allows external applications to programmatically upload and process financial data from CSV files. This API supports creating transaction imports, configuring column mappings, and triggering the import process.
## Authentication requirements
All import endpoints require an OAuth2 access token or API key that grants the appropriate scope (`read` or `read_write`).
## Available endpoints
| Endpoint | Scope | Description |
| --- | --- | --- |
| `GET /api/v1/imports` | `read` | List imports with filtering and pagination. |
| `GET /api/v1/imports/{id}` | `read` | Retrieve a single import with configuration and statistics. |
| `POST /api/v1/imports` | `read_write` | Create a new import and optionally trigger processing. |
## Filtering options
The `GET /api/v1/imports` endpoint supports the following query parameters:
| Parameter | Type | Description |
| --- | --- | --- |
| `page` | integer | Page number (default: 1) |
| `per_page` | integer | Items per page (default: 25, max: 100) |
| `status` | string | Filter by status: `pending`, `importing`, `complete`, `failed`, `reverting`, `revert_failed` |
| `type` | string | Filter by import type: `TransactionImport`, `TradeImport`, etc. |
## Import object
An import response includes configuration and processing statistics:
```json
{
"data": {
"id": "uuid",
"type": "TransactionImport",
"status": "pending",
"created_at": "2024-01-15T10:30:00Z",
"updated_at": "2024-01-15T10:30:00Z",
"account_id": "uuid",
"configuration": {
"date_col_label": "date",
"amount_col_label": "amount",
"name_col_label": "name",
"category_col_label": "category",
"tags_col_label": "tags",
"notes_col_label": "notes",
"account_col_label": null,
"date_format": "%m/%d/%Y",
"number_format": "1,234.56",
"signage_convention": "inflows_positive"
},
"stats": {
"rows_count": 150,
"valid_rows_count": 150
}
}
}
```
## Creating an import
When creating an import, you must provide the file content and the column mappings.
### Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| `raw_file_content` | string | The raw CSV content as a string. |
| `file` | file | Alternatively, the CSV file can be uploaded as a multipart form-data part. |
| `account_id` | uuid | Optional. The ID of the account to import into. |
| `date_col_label` | string | The header name for the date column. |
| `amount_col_label` | string | The header name for the amount column. |
| `name_col_label` | string | The header name for the transaction name column. |
| `publish` | boolean | If `true`, the import will be automatically queued for processing if configuration is valid. |
Example request body:
```json
{
"raw_file_content": "date,amount,name\n01/01/2024,10.00,Test",
"date_col_label": "date",
"amount_col_label": "amount",
"name_col_label": "name",
"account_id": "uuid",
"publish": "true"
}
```
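From Ruby, building that request might look like the sketch below (the host and token are placeholders, and the request is only constructed here, not sent):

```ruby
require "net/http"
require "json"
require "uri"

uri = URI("https://example.test/api/v1/imports")  # placeholder host

request = Net::HTTP::Post.new(uri)
request["Authorization"] = "Bearer YOUR_TOKEN"    # placeholder token
request["Content-Type"]  = "application/json"
request.body = JSON.generate(
  raw_file_content: "date,amount,name\n01/01/2024,10.00,Test",
  date_col_label: "date",
  amount_col_label: "amount",
  name_col_label: "name",
  publish: "true"
)

# To actually send it:
# Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
puts request.method                               # POST
puts JSON.parse(request.body)["date_col_label"]   # date
```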
## Error responses
Errors conform to the shared `ErrorResponse` schema:
```json
{
"error": "error_code",
"message": "Human readable error message",
"errors": ["Optional array of validation errors"]
}
```


@@ -0,0 +1,206 @@
require "test_helper"
class Api::V1::ImportsControllerTest < ActionDispatch::IntegrationTest
setup do
@family = families(:dylan_family)
@user = users(:family_admin)
@account = accounts(:depository)
@import = imports(:transaction)
@token = valid_token_for(@user)
end
test "should list imports" do
get api_v1_imports_url, headers: { Authorization: "Bearer #{@token}" }
assert_response :success
json_response = JSON.parse(response.body)
assert_not_empty json_response["data"]
assert_equal @family.imports.count, json_response["meta"]["total_count"]
end
test "should show import" do
get api_v1_import_url(@import), headers: { Authorization: "Bearer #{@token}" }
assert_response :success
json_response = JSON.parse(response.body)
assert_equal @import.id, json_response["data"]["id"]
assert_equal @import.status, json_response["data"]["status"]
end
test "should create import with raw content" do
csv_content = "date,amount,name\n2023-01-01,-10.00,Test Transaction"
assert_difference("Import.count") do
post api_v1_imports_url,
params: {
raw_file_content: csv_content,
date_col_label: "date",
amount_col_label: "amount",
name_col_label: "name",
account_id: @account.id
},
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :created
json_response = JSON.parse(response.body)
assert_equal "pending", json_response["data"]["status"]
created_import = Import.find(json_response["data"]["id"])
assert_equal csv_content, created_import.raw_file_str
end
test "should create import and generate rows when configured" do
csv_content = "date,amount,name\n2023-01-01,-10.00,Test Transaction"
assert_difference([ "Import.count", "Import::Row.count" ], 1) do
post api_v1_imports_url,
params: {
raw_file_content: csv_content,
date_col_label: "date",
amount_col_label: "amount",
name_col_label: "name",
account_id: @account.id
},
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :created
json_response = JSON.parse(response.body)
import = Import.find(json_response["data"]["id"])
assert_equal 1, import.rows_count
assert_equal "Test Transaction", import.rows.first.name
assert_equal "-10.00", import.rows.first.amount # Normalized
end
test "should create import and auto-publish when configured and requested" do
csv_content = "date,amount,name\n2023-01-01,-10.00,Test Transaction"
assert_enqueued_with(job: ImportJob) do
post api_v1_imports_url,
params: {
raw_file_content: csv_content,
date_col_label: "date",
amount_col_label: "amount",
name_col_label: "name",
account_id: @account.id,
date_format: "%Y-%m-%d",
publish: "true"
},
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :created
json_response = JSON.parse(response.body)
assert_equal "importing", json_response["data"]["status"]
end
test "should not create import for account in another family" do
other_family = Family.create!(name: "Other Family", currency: "USD", locale: "en")
other_depository = Depository.create!(subtype: "checking")
other_account = Account.create!(family: other_family, name: "Other Account", currency: "USD", classification: "asset", accountable: other_depository, balance: 0)
csv_content = "date,amount,name\n2023-01-01,-10.00,Test Transaction"
post api_v1_imports_url,
params: {
raw_file_content: csv_content,
account_id: other_account.id
},
headers: { Authorization: "Bearer #{@token}" }
assert_response :unprocessable_entity
json_response = JSON.parse(response.body)
assert_includes json_response["errors"], "Account must belong to your family"
end
test "should reject file upload exceeding max size" do
large_file = Rack::Test::UploadedFile.new(
StringIO.new("x" * (Import::MAX_CSV_SIZE + 1)),
"text/csv",
original_filename: "large.csv"
)
assert_no_difference("Import.count") do
post api_v1_imports_url,
params: { file: large_file },
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :unprocessable_entity
json_response = JSON.parse(response.body)
assert_equal "file_too_large", json_response["error"]
end
test "should reject file upload with invalid mime type" do
invalid_file = Rack::Test::UploadedFile.new(
StringIO.new("not a csv"),
"application/pdf",
original_filename: "document.pdf"
)
assert_no_difference("Import.count") do
post api_v1_imports_url,
params: { file: invalid_file },
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :unprocessable_entity
json_response = JSON.parse(response.body)
assert_equal "invalid_file_type", json_response["error"]
end
test "should reject raw content exceeding max size" do
# Use a small test limit to avoid Rack request size limits
test_limit = 1.kilobyte
large_content = "x" * (test_limit + 1)
original_value = Import::MAX_CSV_SIZE
Import.send(:remove_const, :MAX_CSV_SIZE)
Import.const_set(:MAX_CSV_SIZE, test_limit)
assert_no_difference("Import.count") do
post api_v1_imports_url,
params: { raw_file_content: large_content },
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :unprocessable_entity
json_response = JSON.parse(response.body)
assert_equal "content_too_large", json_response["error"]
ensure
Import.send(:remove_const, :MAX_CSV_SIZE)
Import.const_set(:MAX_CSV_SIZE, original_value)
end
test "should accept file upload with valid csv mime type" do
csv_content = "date,amount,name\n2023-01-01,-10.00,Test Transaction"
valid_file = Rack::Test::UploadedFile.new(
StringIO.new(csv_content),
"text/csv",
original_filename: "transactions.csv"
)
assert_difference("Import.count") do
post api_v1_imports_url,
params: {
file: valid_file,
date_col_label: "date",
amount_col_label: "amount",
name_col_label: "name",
account_id: @account.id
},
headers: { Authorization: "Bearer #{@token}" }
end
assert_response :created
end
private
def valid_token_for(user)
application = Doorkeeper::Application.create!(name: "Test App", redirect_uri: "urn:ietf:wg:oauth:2.0:oob", scopes: "read read_write")
Doorkeeper::AccessToken.create!(application: application, resource_owner_id: user.id, scopes: "read read_write").token
end
end


@@ -14,6 +14,7 @@ class TransactionImportTest < ActiveSupport::TestCase
   test "configured? if uploaded and rows are generated" do
     @import.expects(:uploaded?).returns(true).once
+    @import.expects(:rows_count).returns(1).once
     assert @import.configured?
   end


@@ -0,0 +1,36 @@
require "application_system_test_case"
class DragAndDropImportTest < ApplicationSystemTestCase
setup do
sign_in users(:family_admin)
end
test "upload csv via hidden input on transactions index" do
visit transactions_path
assert_selector "#transactions[data-controller*='drag-and-drop-import']"
# We can't easily simulate a true native drag-and-drop in headless chrome via Capybara without complex JS.
# However, we can verify that the hidden form exists and works when a file is "dropped" (input populated).
# The Stimulus controller's job is just to transfer the dropped file to the input and submit.
file_path = file_fixture("imports/transactions.csv")
# Manually make form and input visible
execute_script("
var form = document.querySelector('form[action=\"#{imports_path}\"]');
form.classList.remove('hidden');
var input = document.querySelector('input[name=\"import[csv_file]\"]');
input.classList.remove('hidden');
input.style.display = 'block';
")
attach_file "import[csv_file]", file_path
# Submit the form manually since we bypassed the 'drop' event listener which triggers submit
find("form[action='#{imports_path}']").evaluate_script("this.requestSubmit()")
assert_text "CSV uploaded successfully"
assert_text "Configure your import"
end
end