How to Handle File Uploads in React: Buffering, Progress & Preview
Tips on building a smooth, client-side file upload component


Uploading large files in a React application can be tricky: you don’t want to freeze the UI, lose users when uploads hang, or overwhelm your backend with massive payloads.
In this tutorial, we’ll build a smooth, client-side file upload component that:
- Reads files in buffered chunks.
- Displays a live progress bar.
- Shows a preview (for images) before uploading.
- Sends each chunk to your API endpoint.
Table of Contents
- Why Buffering Matters
- Setting Up the File Input & State
- Reading Files in Chunks
- Displaying a Progress Bar
- Rendering an Image Preview
- Uploading Chunks to Your Server
- Putting It All Together
- Further Reading & Internal Links
1. Why Buffering Matters
When you attempt to upload a multi-megabyte file all at once:
- The UI freezes while the browser processes the data.
- Network failures force you to restart from zero.
- Memory usage spikes, affecting performance.
By slicing the file into smaller chunks (e.g. 512 KB each), you can:
- Stream data progressively with the Fetch API.
- Update a progress indicator after each chunk.
- Retry individual chunks on failure.
2. Setting Up the File Input & State
First, create a React component with state to hold the selected file, upload progress, and preview URL.
```jsx
import React, { useState } from 'react';

export default function FileUploader() {
  const [file, setFile] = useState(null);
  const [previewURL, setPreviewURL] = useState('');
  const [progress, setProgress] = useState(0);

  function handleFileChange(e) {
    const selected = e.target.files[0];
    if (selected) {
      setFile(selected);
      setPreviewURL(URL.createObjectURL(selected));
    }
  }

  return (
    <div>
      <input type="file" onChange={handleFileChange} />
      {previewURL && (
        <img src={previewURL} alt="Preview" style={{ maxWidth: 200 }} />
      )}
      {file && <button onClick={() => uploadInChunks(file)}>Upload</button>}
      {progress > 0 && <progress value={progress} max="100" />}
    </div>
  );
}
```
3. Reading Files in Chunks
Use the File API’s slice method to break the file into buffers:
```js
async function* sliceFile(file, chunkSize = 512 * 1024) {
  let offset = 0;
  while (offset < file.size) {
    const chunk = file.slice(offset, offset + chunkSize);
    yield chunk;
    offset += chunkSize;
  }
}
```
This async generator yields Blob chunks you can upload sequentially.
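To sanity-check the chunking arithmetic outside the browser, you can run the same generator against an in-memory Blob (a global in Node 18+). This is an illustrative sketch for testing only, not part of the component; the countChunks helper is our own name:

```javascript
// Same generator as above, repeated so this sketch is self-contained.
async function* sliceFile(file, chunkSize = 512 * 1024) {
  let offset = 0;
  while (offset < file.size) {
    yield file.slice(offset, offset + chunkSize);
    offset += chunkSize;
  }
}

// Hypothetical helper: count the chunks and confirm every byte is covered once.
async function countChunks(blob, chunkSize) {
  let count = 0;
  let bytes = 0;
  for await (const chunk of sliceFile(blob, chunkSize)) {
    count++;
    bytes += chunk.size;
  }
  return { count, bytes };
}

// A 1200 KB blob sliced into 512 KB chunks yields 3 chunks (512 + 512 + 176 KB).
const blob = new Blob([new Uint8Array(1200 * 1024)]);
```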
4. Displaying a Progress Bar
Inside your upload loop, update progress after each chunk:
```js
const chunkSize = 512 * 1024; // 512 KB — must match sliceFile's default

async function uploadInChunks(file) {
  const totalChunks = Math.ceil(file.size / chunkSize);
  let uploaded = 0;
  for await (const chunk of sliceFile(file, chunkSize)) {
    await uploadChunk(chunk, file.name, uploaded);
    uploaded++;
    // setProgress comes from the component's state (see the full component in Section 7)
    setProgress(Math.round((uploaded / totalChunks) * 100));
  }
}
```
With setProgress, React re-renders the <progress> element, giving live feedback.
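If you prefer, the percentage math can live in a small pure function, which keeps the rounding logic easy to test on its own. The chunkProgress name here is our own, not part of the component above:

```javascript
// Sketch: pure helper for the progress calculation used in the upload loop.
function chunkProgress(uploadedChunks, fileSize, chunkSize = 512 * 1024) {
  const totalChunks = Math.ceil(fileSize / chunkSize);
  return Math.round((uploadedChunks / totalChunks) * 100);
}
```

In the loop you would then call setProgress(chunkProgress(uploaded, file.size)).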
5. Rendering an Image Preview
We already generate a preview URL via URL.createObjectURL. To clean up memory:
```js
useEffect(() => {
  return () => {
    if (previewURL) URL.revokeObjectURL(previewURL);
  };
}, [previewURL]);
```
This releases the blob URL when the component unmounts or a new file is selected.
6. Uploading Chunks to Your Server
Assuming your server expects:
```text
POST /api/upload
Headers:
  Content-Range: bytes {start}-{end}/{total}
Body: raw chunk bytes
```
Implement uploadChunk like so. Note that the file object isn't in scope inside this function, so we pass the total file size as a fourth argument:

```js
async function uploadChunk(chunk, fileName, chunkIndex, totalSize) {
  const start = chunkIndex * chunkSize;
  const end = start + chunk.size - 1;
  const res = await fetch('/api/upload', {
    method: 'POST',
    headers: {
      'Content-Range': `bytes ${start}-${end}/${totalSize}`,
      'X-File-Name': fileName,
    },
    body: chunk,
  });
  if (!res.ok) {
    throw new Error(`Chunk upload failed: ${res.statusText}`);
  }
}
```
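Section 1 promised the ability to retry individual chunks on failure. One way to get that is a small retry wrapper around the upload call; this is a hedged sketch (the uploadWithRetry name and backoff values are our own), not part of the component above:

```javascript
// Sketch: retry a single chunk upload a few times before giving up.
// `uploadFn` stands in for a call to uploadChunk; delays use linear backoff.
async function uploadWithRetry(uploadFn, attempts = 3, delayMs = 500) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await uploadFn();
    } catch (err) {
      lastError = err;
      // Wait before the next attempt; skip the wait after the final failure.
      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, delayMs * (i + 1)));
      }
    }
  }
  throw lastError;
}
```

In the upload loop you would wrap each call, e.g. await uploadWithRetry(() => uploadChunk(chunk, /* … */)), so a transient network error only re-sends one chunk instead of restarting the whole file.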
On the backend, you can assemble the chunks in order. See our guide on API vs. Webhooks: What's the Difference? for webhook-based ingestion patterns.
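Whatever server framework you use, the first step of assembly is parsing the Content-Range header into offsets. Here's a minimal, hypothetical parser sketch (your framework may already provide one):

```javascript
// Sketch: parse a Content-Range header like "bytes 0-524287/1228800"
// into numeric offsets the server can use to position the chunk on disk.
function parseContentRange(header) {
  const match = /^bytes (\d+)-(\d+)\/(\d+)$/.exec(header);
  if (!match) throw new Error(`Malformed Content-Range: ${header}`);
  const [, start, end, total] = match.map(Number);
  return { start, end, total };
}
```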
7. Putting It All Together
Here’s the full uploader component with everything wired:
```jsx
import React, { useState, useEffect } from 'react';

const chunkSize = 512 * 1024; // 512 KB

async function* sliceFile(file) { /*…*/ }
async function uploadChunk(chunk, fileName, chunkIndex, totalSize) { /*…*/ }

export default function FileUploader() {
  const [file, setFile] = useState(null);
  const [previewURL, setPreviewURL] = useState('');
  const [progress, setProgress] = useState(0);

  useEffect(() => () => previewURL && URL.revokeObjectURL(previewURL), [previewURL]);

  async function handleUpload() {
    const totalChunks = Math.ceil(file.size / chunkSize);
    let uploaded = 0;
    for await (const chunk of sliceFile(file)) {
      await uploadChunk(chunk, file.name, uploaded, file.size);
      uploaded++;
      setProgress(Math.round((uploaded / totalChunks) * 100));
    }
  }

  return (
    <div>
      <input
        type="file"
        onChange={e => {
          const f = e.target.files[0];
          if (!f) return; // user cancelled the file picker
          setFile(f);
          setPreviewURL(URL.createObjectURL(f));
          setProgress(0);
        }}
      />
      {previewURL && <img src={previewURL} alt="Preview" style={{ maxWidth: 200 }} />}
      {file && <button onClick={handleUpload}>Upload</button>}
      {progress > 0 && <progress value={progress} max="100">{progress}%</progress>}
    </div>
  );
}
```
8. Further Reading & Internal Links
- Routing in React: share the upload state via URL query → How React Routing Works Using a Headless CMS
- API vs. Webhooks: backend chunk assembly patterns → API vs Webhooks: What's the Difference
- CDN Assets & Caching: serve uploaded files efficiently → CDNs, Assets and Caching
- Content Fetch API: if you need to fetch metadata after upload → Content Fetch API
With buffering, progress indicators, and previews in place, your React app can handle large file uploads without sacrificing performance or user experience.

About the Author
Agility CMS is Canada's original headless CMS platform. Since 2002, Agility has helped companies across Canada and around the world better manage their content. Marketers are free to create the content they want, when they want it. Developers are empowered to build what they want, how they want.
- Get a demo for a personalized walkthrough.
- Try for FREE and experience Agility CMS.
- Contact us with your questions.