CSV Uploads with Live Progress Using Turbo Streams
CSV imports are one of those features every business app needs eventually. User uploads a file, system processes thousands of rows, user waits... and waits... and wonders if anything is happening.
With Turbo Streams and Action Cable, you can show live progress as each row imports. No JavaScript polling. No page refreshes. The progress bar fills, row counts update, errors appear—all in real time.
Here's how to build it.
What We're Building
1. User uploads a CSV file
2. Server creates an Import record and kicks off a background job
3. Page shows the import in "processing" state
4. Background job processes rows, broadcasting progress after each batch
5. User sees live updates: progress bar, row count, any errors
6. Job finishes, status changes to "completed" or "failed"
No page refresh required. User can even navigate away and come back—the progress is persisted.
The Import Model
First, a model to track import state:
rails g model Import status:string total_rows:integer processed_rows:integer failed_rows:integer error_messages:text filename:string
Migration:
class CreateImports < ActiveRecord::Migration[7.1]
def change
create_table :imports do |t|
t.string :status, default: "pending"
t.string :filename
t.integer :total_rows, default: 0
t.integer :processed_rows, default: 0
t.integer :failed_rows, default: 0
t.text :error_messages
t.timestamps
end
end
end
Model with status helpers and broadcasting:
class Import < ApplicationRecord
# Store errors as JSON array
serialize :error_messages, coder: JSON

# serialize has no default: option, so normalize nil to an empty array
def error_messages
  super || []
end
# Status helpers
def pending?
status == "pending"
end
def processing?
status == "processing"
end
def completed?
status == "completed"
end
def failed?
status == "failed"
end
def progress_percentage
return 0 if total_rows.zero?
((processed_rows.to_f / total_rows) * 100).round
end
# Broadcast updates to the import's stream
def broadcast_progress
broadcast_replace_to self,
target: "import_#{id}",
partial: "imports/import",
locals: { import: self }
end
end
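The rounding behavior of progress_percentage is easy to sanity-check outside Rails. Here is the same calculation as a standalone method (a plain-Ruby sketch, not part of the app code):

```ruby
# Plain-Ruby version of Import#progress_percentage for quick verification
def progress_percentage(processed_rows, total_rows)
  return 0 if total_rows.zero?  # guard against division by zero
  ((processed_rows.to_f / total_rows) * 100).round
end

puts progress_percentage(0, 0)    # zero-row file stays at 0
puts progress_percentage(50, 200)
puts progress_percentage(1, 3)    # rounds rather than truncating
```

The `.round` matters: truncating would leave the bar at 99% on the last partial batch of a large file.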
The Controller
class ImportsController < ApplicationController
def index
@imports = Import.order(created_at: :desc).limit(20)
end
def show
@import = Import.find(params[:id])
end
def new
@import = Import.new
end
def create
file = params[:file]
unless file.present?
redirect_to new_import_path, alert: "Please select a file"
return
end
# Count rows (subtract 1 for the header; guard against an empty file)
total_rows = [File.readlines(file.tempfile.path).count - 1, 0].max
@import = Import.create!(
status: "pending",
filename: file.original_filename,
total_rows: total_rows
)
# Store file and kick off job
@import.file.attach(file)
CsvImportJob.perform_later(@import.id)
redirect_to @import, notice: "Import started"
end
end
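The header-subtraction in the controller is worth a quick test: without a floor, an empty upload would record -1 total rows. A plain-Ruby sketch of the counting logic (the `data_row_count` helper name is illustrative, not from the app):

```ruby
require "tempfile"

# Subtract the header row, but never go negative for an empty file
def data_row_count(path)
  [File.readlines(path).count - 1, 0].max
end

count = Tempfile.create(["sample", ".csv"]) do |f|
  f.write("name,price,sku\nWidget,9.99,W-1\nGadget,19.99,G-1\n")
  f.flush
  data_row_count(f.path)  # block's value is returned by Tempfile.create
end

puts count  # 2 data rows
```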
Add Active Storage for file attachment:
class Import < ApplicationRecord
  has_one_attached :file

  # ... rest of model
end
The Background Job
This is where the magic happens. Process rows in batches, broadcast progress after each batch:
class CsvImportJob < ApplicationJob
queue_as :default
BATCH_SIZE = 50
def perform(import_id)
import = Import.find(import_id)
import.update!(status: "processing")
import.broadcast_progress
begin
process_csv(import)
import.update!(status: "completed")
rescue => e
import.update!(
status: "failed",
error_messages: import.error_messages + ["Fatal error: #{e.message}"]
)
end
import.broadcast_progress
end
private
def process_csv(import)
require "csv"
csv_content = import.file.download
csv = CSV.parse(csv_content, headers: true)
csv.each_slice(BATCH_SIZE) do |batch|
batch.each do |row|
process_row(import, row)
end
# Broadcast after each batch
import.broadcast_progress
end
end
def process_row(import, row)
# Your actual import logic here
# Example: creating a Product from CSV row
Product.create!(
name: row["name"],
price: row["price"],
sku: row["sku"]
)
import.increment!(:processed_rows)
rescue => e
import.increment!(:failed_rows)
import.increment!(:processed_rows)
import.update!(
error_messages: import.error_messages + ["Row #{import.processed_rows}: #{e.message}"]
)
end
end
Key points:
• each_slice(BATCH_SIZE) processes rows in chunks
• broadcast_progress fires after each batch, not each row (too noisy)
• Errors are caught per-row, so one bad row doesn't kill the whole import
• The final broadcast happens whether the import succeeded or failed
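The batch-then-broadcast loop is plain Ruby once the Rails pieces are stripped away. In this sketch, counters stand in for process_row and broadcast_progress:

```ruby
require "csv"

csv_data = <<~DATA
  name,price,sku
  Widget,9.99,W-1
  Gadget,19.99,G-1
  Gizmo,4.99,Z-1
DATA

batch_size = 2
processed = 0
broadcasts = 0

# CSV::Table includes Enumerable, so each_slice works directly on the parsed rows
CSV.parse(csv_data, headers: true).each_slice(batch_size) do |batch|
  batch.each { |row| processed += 1 }  # stand-in for process_row
  broadcasts += 1                      # stand-in for broadcast_progress
end

puts processed   # 3 rows
puts broadcasts  # 2 batches: one of 2 rows, one of 1
```

Three rows with a batch size of 2 produce two broadcasts: each_slice yields the final short batch too, so the last rows are never silently skipped.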
The Views
Show page with Turbo Stream subscription:
<!-- app/views/imports/show.html.erb -->
<%= turbo_stream_from @import %>

<h1>Import: <%= @import.filename %></h1>

<div id="import_<%= @import.id %>">
  <%= render "imports/import", import: @import %>
</div>

<%= link_to "Back to Imports", imports_path %>
The turbo_stream_from @import helper subscribes the page to updates for this specific import. When the job calls broadcast_progress, this page receives the update.

The import partial:
<!-- app/views/imports/_import.html.erb -->
<div class="import-status">
<div class="status-badge status-<%= import.status %>">
<%= import.status.titleize %>
</div>
<% if import.processing? || import.completed? %>
<div class="progress-container">
<div class="progress-bar" style="width: <%= import.progress_percentage %>%"></div>
</div>
<div class="progress-stats">
<span><%= import.processed_rows %> / <%= import.total_rows %> rows</span>
<span><%= import.progress_percentage %>%</span>
</div>
<% end %>
<% if import.failed_rows > 0 %>
<div class="error-summary">
<strong><%= import.failed_rows %> rows failed</strong>
<% if import.error_messages.any? %>
<ul class="error-list">
<% import.error_messages.last(10).each do |error| %>
<li><%= error %></li>
<% end %>
<% if import.error_messages.size > 10 %>
<li>... and <%= import.error_messages.size - 10 %> more</li>
<% end %>
</ul>
<% end %>
</div>
<% end %>
<% if import.completed? %>
<div class="success-message">
Import completed! <%= import.processed_rows - import.failed_rows %> rows imported successfully.
</div>
<% end %>
</div>
Upload form:
<!-- app/views/imports/new.html.erb -->
<h1>Upload CSV</h1>
<%= form_with url: imports_path, method: :post, local: true do |f| %>
<div class="field">
<%= f.label :file, "Select CSV file" %>
<%= f.file_field :file, accept: ".csv" %>
</div>
<%= f.submit "Upload and Import" %>
<% end %>
Basic Styling
.progress-container {
background: #e0e0e0;
border-radius: 4px;
height: 24px;
margin: 1rem 0;
overflow: hidden;
}
.progress-bar {
background: #4caf50;
height: 100%;
transition: width 0.3s ease;
}
.status-badge {
display: inline-block;
padding: 0.25rem 0.75rem;
border-radius: 4px;
font-weight: bold;
}
.status-pending { background: #fff3cd; color: #856404; }
.status-processing { background: #cce5ff; color: #004085; }
.status-completed { background: #d4edda; color: #155724; }
.status-failed { background: #f8d7da; color: #721c24; }
.error-list {
max-height: 200px;
overflow-y: auto;
font-size: 0.875rem;
color: #721c24;
}
.progress-stats {
display: flex;
justify-content: space-between;
font-size: 0.875rem;
color: #666;
}
Routes
Rails.application.routes.draw do
  resources :imports, only: [:index, :show, :new, :create]
end
The Flow
1. User visits /imports/new and uploads a CSV
2. Controller creates the Import, attaches the file, enqueues the job, redirects to /imports/123
3. Show page renders with turbo_stream_from @import, which subscribes to this import's channel
4. Job starts, updates status to "processing", broadcasts
5. Page receives the broadcast, replaces #import_123 with the updated partial
6. Job processes 50 rows, broadcasts updated counts
7. Progress bar moves, row count updates—no page refresh
8. Job finishes, broadcasts final status
9. User sees "completed" with final stats
Handling Large Files
For very large CSVs, a few adjustments:
Stream the file instead of loading it all:
def process_csv(import)
require "csv"
# Download to temp file for streaming
tempfile = Tempfile.new(["import", ".csv"])
tempfile.binmode
tempfile.write(import.file.download)
tempfile.rewind
processed = 0
failed = 0
CSV.foreach(tempfile.path, headers: true) do |row|
process_row(import, row)
processed += 1
# Broadcast every 50 rows
if processed % BATCH_SIZE == 0
import.broadcast_progress
end
end
import.broadcast_progress
ensure
tempfile&.close
tempfile&.unlink
end
Increase batch size for faster imports:
BATCH_SIZE = 100 # or 200, depending on row complexity
Broadcasting too frequently creates overhead. Every 50-100 rows is a good balance between responsiveness and performance.
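To make that concrete: for a 10,000-row file, broadcasting per row means 10,000 cable messages, while the modulo check cuts it to one per batch plus the final broadcast. A quick count of what the throttled loop actually sends:

```ruby
batch_size = 100
total_rows = 10_000
broadcasts = 0

(1..total_rows).each do |i|
  # stand-in for: import.broadcast_progress if processed % BATCH_SIZE == 0
  broadcasts += 1 if i % batch_size == 0
end
broadcasts += 1  # the final broadcast after the loop

puts broadcasts  # 101 messages instead of 10,000
```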
Use bulk inserts:
def process_batch(import, rows)
valid_records = []
rows.each do |row|
record = Product.new(
name: row["name"],
price: row["price"],
sku: row["sku"]
)
if record.valid?
valid_records << record.attributes.except("id", "created_at", "updated_at")
else
import.increment!(:failed_rows)
import.error_messages += ["Row: #{record.errors.full_messages.join(', ')}"]
end
end
Product.insert_all(valid_records) if valid_records.any?
import.increment!(:processed_rows, rows.size)
import.save!
end
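The valid/invalid split in process_batch is essentially a partition. Here is the same idea in plain Ruby, with a stand-in validity check in place of ActiveModel validations:

```ruby
rows = [
  { "name" => "Widget", "price" => "9.99",  "sku" => "W-1" },
  { "name" => "",       "price" => "5.00",  "sku" => "W-2" },  # blank name: invalid
  { "name" => "Gadget", "price" => "19.99", "sku" => "G-1" },
]

# Stand-in for record.valid? — real code would run ActiveModel validations
valid, invalid = rows.partition { |row| !row["name"].to_s.strip.empty? }

puts valid.size    # rows headed for insert_all
puts invalid.size  # rows that would bump failed_rows
```

One design note: insert_all skips validations and callbacks entirely, which is exactly why the validity check has to happen in Ruby before the bulk write.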
Multiple Imports at Once
If users can see multiple imports on one page (like an index), subscribe to each:
<!-- app/views/imports/index.html.erb -->
<h1>Imports</h1>
<% @imports.each do |import| %>
<%= turbo_stream_from import %>
<div id="import_<%= import.id %>">
<%= render "imports/import", import: import %>
</div>
<% end %>
Each import gets its own subscription. Updates for import #123 only affect that div.
Canceling an Import
Add a cancel mechanism:
class Import < ApplicationRecord
def cancel!
update!(status: "canceled")
end
def canceled?
status == "canceled"
end
end
Check in the job:
def process_csv(import)
  CSV.foreach(tempfile.path, headers: true).with_index(1) do |row, i|
    # Re-check for cancellation once per batch; reloading on every row hammers the database
    if i % BATCH_SIZE == 0
      import.reload
      if import.canceled?
        import.broadcast_progress
        return
      end
    end
    process_row(import, row)
  end
end
Add cancel button to the view:
<% if import.processing? %>
  <%= button_to "Cancel Import", cancel_import_path(import), method: :post %>
<% end %>
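The cancel_import_path helper assumes a member route and a matching controller action that the post hasn't shown yet. A minimal sketch of both:

```ruby
# config/routes.rb — member route backing cancel_import_path
resources :imports, only: [:index, :show, :new, :create] do
  post :cancel, on: :member
end

# app/controllers/imports_controller.rb — the corresponding action
def cancel
  @import = Import.find(params[:id])
  @import.cancel!
  redirect_to @import, notice: "Import canceled"
end
```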
Common Gotchas
Action Cable not configured: Make sure Redis is set up for production. Development uses async adapter by default.
# config/cable.yml
production:
adapter: redis
url: <%= ENV.fetch("REDIS_URL") %>
Job queue not running: In development, Active Job uses the in-process async adapter by default, so jobs appear to "just work". For real async processing, start Sidekiq or whatever queue adapter you're using.
N+1 on broadcasts: If your partial loads associations, eager-load them:
def broadcast_progress
  # Reload with the attachment eager-loaded so the partial doesn't trigger extra queries
  import = Import.includes(:file_attachment).find(id)
  broadcast_replace_to self,
    target: "import_#{id}",
    partial: "imports/import",
    locals: { import: import }
end
Too many broadcasts: Broadcasting every single row is too chatty. Batch them. 50-100 rows per broadcast is a good starting point.
The Result
User uploads a 10,000 row CSV. Instead of staring at a spinner for 30 seconds wondering if anything is happening, they see:
• Status change from "Pending" to "Processing"
• Progress bar filling up in real-time
• Row count incrementing: "500 / 10,000 rows... 1,000 / 10,000 rows..."
• Any errors appearing as they happen
• Final "Completed" status with summary
All without writing a single line of JavaScript. The server broadcasts HTML fragments, Turbo applies them. That's it.
This pattern works for any long-running background process: data migrations, report generation, API syncs, bulk operations. Anywhere users need to see progress without refreshing.