How to Track Download Progress with the Fetch API in JavaScript

Introduction

When downloading large files, data sets, or media through fetch(), users stare at a blank screen with no idea whether the request is 5% done or 95% done. Unlike the older XMLHttpRequest, which provided a simple progress event, fetch() does not offer a built-in progress callback. This was a deliberate design choice: the Fetch API uses streams as its underlying mechanism for reading response bodies, and progress tracking is built on top of that streaming architecture.

The key to tracking download progress with fetch() lies in the response.body property, which is a ReadableStream. Instead of waiting for the entire response to download and then receiving it all at once (which is what response.json() or response.text() do internally), you can read the stream chunk by chunk, tracking how many bytes have arrived so far compared to the total expected size.

In this guide, you will learn how ReadableStream works, how to read a response body incrementally, how to calculate download progress, and how to build practical progress indicators for real-world applications.

Why fetch() Has No Built-In Progress Event

With the older XMLHttpRequest, tracking progress was straightforward:

// The old XHR way (simple but outdated)
const xhr = new XMLHttpRequest();
xhr.open('GET', '/large-file.zip');

xhr.onprogress = function(event) {
  if (event.lengthComputable) {
    const percent = (event.loaded / event.total) * 100;
    console.log(`${percent.toFixed(1)}% downloaded`);
  }
};

xhr.onload = function() {
  console.log('Download complete');
};

xhr.send();

The Fetch API replaced this event-based model with streams. Streams are more powerful and flexible because they let you process data as it arrives, transform it on the fly, pipe it to other destinations, and cancel reading at any point. The trade-off is that progress tracking requires a bit more code.
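For example, a reader can stop consuming a stream partway through with cancel(), which discards the rest of the body. A minimal sketch (the readFirstChunk helper is illustrative; a locally constructed ReadableStream stands in for response.body so the snippet runs without a network request):

```javascript
async function readFirstChunk(stream) {
  const reader = stream.getReader();
  const { value } = await reader.read();
  // cancel() stops reading early: remaining chunks are discarded and the
  // underlying source is told to release its resources
  await reader.cancel('no longer needed');
  return new TextDecoder().decode(value);
}

// A locally constructed stream stands in for response.body, so the
// sketch runs without a network request
const demoStream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode('first'));
    controller.enqueue(new TextEncoder().encode('second'));
    controller.close();
  }
});

readFirstChunk(demoStream).then(text => console.log(text)); // logs "first"
```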

note

This article covers download progress (tracking data arriving from the server). Upload progress with fetch() is a different matter entirely. As of now, the Fetch API does not support upload progress tracking natively. For upload progress, you still need XMLHttpRequest or the newer fetch() upload streaming proposals.

Understanding response.body as a ReadableStream

When you call fetch(), the returned Response object has a body property that is a ReadableStream. This stream delivers the response data in small pieces called chunks as they arrive from the network.

What Is a ReadableStream?

A ReadableStream represents a source of data that you can read from sequentially. Think of it like a water pipe: data flows through it continuously, and you can catch it bucket by bucket (chunk by chunk) rather than waiting for the entire reservoir to fill.

const response = await fetch('https://example.com/large-file.json');

console.log(response.body);
// ReadableStream { locked: false }

console.log(response.body instanceof ReadableStream);
// true

The Reader Pattern

To read from a ReadableStream, you obtain a reader using the getReader() method. The reader provides a read() method that returns a Promise resolving to an object with two properties:

  • value: A Uint8Array containing the bytes of the current chunk
  • done: A boolean indicating whether the stream has ended

const response = await fetch('https://example.com/data.json');
const reader = response.body.getReader();

// Read one chunk
const { value, done } = await reader.read();

console.log(done); // false (there's probably more data)
console.log(value); // Uint8Array(16384) [ 123, 34, 110, 97, ... ]

Reading the Entire Stream Chunk by Chunk

To read the complete response, you call read() in a loop until done is true:

const response = await fetch('https://example.com/data.json');
const reader = response.body.getReader();

const chunks = [];

while (true) {
  const { value, done } = await reader.read();

  if (done) {
    break;
  }

  chunks.push(value);
  console.log(`Received chunk: ${value.length} bytes`);
}

console.log('Stream complete');

Output (example):

Received chunk: 16384 bytes
Received chunk: 16384 bytes
Received chunk: 16384 bytes
Received chunk: 8192 bytes
Stream complete

Each chunk is a Uint8Array (a typed array of bytes). The size of each chunk is determined by the browser and the network conditions. You have no control over chunk sizes, and they can vary significantly between reads.

Tracking Download Progress

To show meaningful progress, you need two pieces of information:

  1. How many bytes have been received so far (you count this as chunks arrive)
  2. The total size of the response (from the Content-Length header)

The Basic Progress Pattern

async function fetchWithProgress(url) {
  const response = await fetch(url);

  // Step 1: Get the total size from the Content-Length header
  const contentLength = response.headers.get('Content-Length');

  if (!contentLength) {
    console.warn('Content-Length header is missing. Cannot track progress.');
  }

  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;

  // Step 2: Get the reader from the response body
  const reader = response.body.getReader();

  // Step 3: Read chunks and track progress
  let receivedBytes = 0;
  const chunks = [];

  while (true) {
    const { value, done } = await reader.read();

    if (done) break;

    chunks.push(value);
    receivedBytes += value.length;

    if (totalBytes) {
      const progress = (receivedBytes / totalBytes) * 100;
      console.log(`Progress: ${progress.toFixed(1)}% (${receivedBytes} / ${totalBytes} bytes)`);
    } else {
      console.log(`Received: ${receivedBytes} bytes`);
    }
  }

  // Step 4: Combine all chunks into a single Uint8Array
  const allChunks = new Uint8Array(receivedBytes);
  let position = 0;
  for (const chunk of chunks) {
    allChunks.set(chunk, position);
    position += chunk.length;
  }

  return allChunks;
}

Output (example):

Progress: 2.4% (16384 / 688128 bytes)
Progress: 4.8% (32768 / 688128 bytes)
Progress: 7.1% (49152 / 688128 bytes)
...
Progress: 97.6% (671744 / 688128 bytes)
Progress: 100.0% (688128 / 688128 bytes)

Step-by-Step Breakdown

Let us walk through each part of this pattern in detail.

Step 1: Getting the total size. The Content-Length header tells you the total number of bytes the server is sending. You retrieve it from the response headers:

const contentLength = response.headers.get('Content-Length');
const totalBytes = parseInt(contentLength, 10);

caution

The Content-Length header is not always available. It can be missing in several scenarios:

  • The server uses chunked transfer encoding (common with dynamically generated responses)
  • The response is compressed with Content-Encoding: gzip (the header reflects the compressed size, not the decompressed size your code receives)
  • CORS restrictions prevent reading the header (the server must include Content-Length in Access-Control-Expose-Headers)

When Content-Length is unavailable, you can still show the number of bytes received, but you cannot calculate a percentage.

Step 2: Obtaining the reader. The getReader() method locks the stream to this reader. Once locked, no other code can read from the stream. This means you cannot use response.json() or response.text() after calling getReader() because those methods also try to read the body stream.

const reader = response.body.getReader();
// response.body is now locked
// response.json() would throw: "body stream is locked"
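If you need byte counting and still want the convenience methods, one workaround is to tee() the body into two independent branches: consume one branch to count bytes while wrapping the other in a new Response. A minimal sketch (the helper names countBytes and jsonWithByteCount are illustrative, and a locally constructed Response stands in for a fetch() result so the snippet runs offline):

```javascript
async function countBytes(stream) {
  // Read one branch of the tee purely to count bytes
  const reader = stream.getReader();
  let received = 0;
  while (true) {
    const { value, done } = await reader.read();
    if (done) return received;
    received += value.length;
  }
}

async function jsonWithByteCount(response) {
  // tee() splits the body into two streams that can be read independently;
  // the original response.body becomes locked, but each branch is usable
  const [countStream, dataStream] = response.body.tee();
  const [bytes, json] = await Promise.all([
    countBytes(countStream),
    new Response(dataStream).json() // convenience methods still work here
  ]);
  return { bytes, json };
}

// A locally constructed Response stands in for a fetch() result
jsonWithByteCount(new Response('{"ok":true}')).then(({ bytes, json }) => {
  console.log(bytes, json.ok); // logs: 11 true
});
```

One caveat: tee() buffers chunks for whichever branch reads more slowly, so both branches should be consumed promptly.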

Step 3: Reading in a loop. Each call to reader.read() returns the next available chunk. The loop continues until done is true:

let receivedBytes = 0;
const chunks = [];

while (true) {
  const { value, done } = await reader.read();
  if (done) break;

  chunks.push(value);
  receivedBytes += value.length;
}

Step 4: Reassembling the data. After reading all chunks, you need to combine them into a single contiguous array. This is necessary because each chunk is a separate Uint8Array, and most APIs expect a single buffer or string:

const allChunks = new Uint8Array(receivedBytes);
let position = 0;
for (const chunk of chunks) {
  allChunks.set(chunk, position);
  position += chunk.length;
}

Converting the Result

Once you have the complete Uint8Array, you can convert it to whatever format you need:

// Convert to string (for JSON or text responses)
const text = new TextDecoder().decode(allChunks);
const json = JSON.parse(text);

// Convert to Blob (for files or images)
const blob = new Blob([allChunks]);

// Use the Uint8Array directly for binary processing
console.log(allChunks.byteLength);

Complete Function: Fetch JSON with Progress

Here is a self-contained function that fetches JSON data while reporting progress through a callback:

async function fetchJSONWithProgress(url, onProgress) {
  const response = await fetch(url);

  if (!response.ok) {
    throw new Error(`HTTP error: ${response.status}`);
  }

  const contentLength = response.headers.get('Content-Length');
  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;
  const reader = response.body.getReader();
  const chunks = [];
  let receivedBytes = 0;

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;

    chunks.push(value);
    receivedBytes += value.length;

    if (onProgress) {
      onProgress({
        receivedBytes,
        totalBytes,
        progress: totalBytes ? receivedBytes / totalBytes : null
      });
    }
  }

  // Reassemble and parse as JSON
  const allChunks = new Uint8Array(receivedBytes);
  let position = 0;
  for (const chunk of chunks) {
    allChunks.set(chunk, position);
    position += chunk.length;
  }

  const text = new TextDecoder().decode(allChunks);
  return JSON.parse(text);
}

// Usage
const data = await fetchJSONWithProgress(
  'https://jsonplaceholder.typicode.com/photos',
  ({ receivedBytes, totalBytes, progress }) => {
    if (progress !== null) {
      console.log(`${(progress * 100).toFixed(1)}%`);
    } else {
      console.log(`${(receivedBytes / 1024).toFixed(1)} KB received`);
    }
  }
);

console.log(`Loaded ${data.length} photos`);

Building a Progress Indicator

Let us build practical, visual progress indicators that you can use in real applications.

A Simple Progress Bar

<div id="progress-container" style="
  width: 400px;
  height: 24px;
  background: #ecf0f1;
  border-radius: 12px;
  overflow: hidden;
  position: relative;
  margin: 20px;
">
  <div id="progress-bar" style="
    width: 0%;
    height: 100%;
    background: linear-gradient(90deg, #3498db, #2ecc71);
    border-radius: 12px;
    transition: width 0.1s ease-out;
  "></div>
  <span id="progress-text" style="
    position: absolute;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    font-size: 12px;
    font-weight: bold;
    color: #2c3e50;
  ">0%</span>
</div>

<button id="downloadBtn">Download Large File</button>

<script>
const progressBar = document.getElementById('progress-bar');
const progressText = document.getElementById('progress-text');

function updateProgress(received, total) {
  if (total) {
    const percent = (received / total) * 100;
    progressBar.style.width = `${percent}%`;
    progressText.textContent = `${percent.toFixed(1)}%`;
  } else {
    progressText.textContent = `${(received / 1024).toFixed(0)} KB`;
  }
}

async function downloadWithProgress(url) {
  const response = await fetch(url);

  if (!response.ok) {
    throw new Error(`Download failed: ${response.status}`);
  }

  const contentLength = response.headers.get('Content-Length');
  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;
  const reader = response.body.getReader();
  const chunks = [];
  let receivedBytes = 0;

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;

    chunks.push(value);
    receivedBytes += value.length;
    updateProgress(receivedBytes, totalBytes);
  }

  // Combine chunks into a Blob
  return new Blob(chunks);
}

document.getElementById('downloadBtn').addEventListener('click', async () => {
  try {
    updateProgress(0, 1); // Reset
    const blob = await downloadWithProgress('https://example.com/large-file.zip');
    console.log(`Download complete: ${blob.size} bytes`);

    // Trigger file download
    const url = URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = 'large-file.zip';
    a.click();
    URL.revokeObjectURL(url);
  } catch (error) {
    progressText.textContent = `Error: ${error.message}`;
  }
});
</script>

A Reusable Progress Fetch Function

Here is a more polished, reusable implementation that separates the fetching logic from the UI:

async function fetchWithProgress(url, options = {}) {
  const { onProgress, signal } = options;

  const response = await fetch(url, { signal });

  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }

  // If no progress callback, just return the response as-is
  if (!onProgress) {
    return response;
  }

  const contentLength = response.headers.get('Content-Length');
  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;

  // If the body is empty or not streamable, return the response directly
  if (!response.body) {
    return response;
  }

  const reader = response.body.getReader();
  const chunks = [];
  let receivedBytes = 0;

  // Report initial state
  onProgress({
    phase: 'download',
    receivedBytes: 0,
    totalBytes,
    progress: 0
  });

  while (true) {
    const { value, done } = await reader.read();

    if (done) {
      onProgress({
        phase: 'complete',
        receivedBytes,
        totalBytes,
        progress: 1
      });
      break;
    }

    chunks.push(value);
    receivedBytes += value.length;

    onProgress({
      phase: 'download',
      receivedBytes,
      totalBytes,
      progress: totalBytes ? receivedBytes / totalBytes : null
    });
  }

  // Reconstruct a new Response from the collected data
  const blob = new Blob(chunks);
  return new Response(blob, {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers
  });
}

This function returns a new Response object, so you can use it as a drop-in replacement for fetch() and still call .json(), .text(), or .blob() on the result:

const response = await fetchWithProgress(
  'https://jsonplaceholder.typicode.com/photos',
  {
    onProgress: ({ phase, receivedBytes, totalBytes, progress }) => {
      if (phase === 'download' && progress !== null) {
        console.log(`Downloading: ${(progress * 100).toFixed(1)}%`);
      } else if (phase === 'download') {
        console.log(`Downloading: ${(receivedBytes / 1024).toFixed(1)} KB`);
      } else if (phase === 'complete') {
        console.log('Download complete!');
      }
    }
  }
);

// Use the response normally
const photos = await response.json();
console.log(`Loaded ${photos.length} photos`);

Image Loading with Progress

A common use case is showing progress while loading a large image:

async function loadImageWithProgress(imageUrl, imgElement, progressCallback) {
  const response = await fetch(imageUrl);

  if (!response.ok) {
    throw new Error(`Image load failed: ${response.status}`);
  }

  const contentLength = response.headers.get('Content-Length');
  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;
  const reader = response.body.getReader();
  const chunks = [];
  let receivedBytes = 0;

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;

    chunks.push(value);
    receivedBytes += value.length;

    if (progressCallback) {
      progressCallback(
        totalBytes ? receivedBytes / totalBytes : null,
        receivedBytes,
        totalBytes
      );
    }
  }

  // Create a blob URL and set it as the image source
  const blob = new Blob(chunks);
  const blobUrl = URL.createObjectURL(blob);
  imgElement.src = blobUrl;

  // Clean up the blob URL after the image loads
  imgElement.onload = () => URL.revokeObjectURL(blobUrl);

  return blob;
}

// Usage
const img = document.getElementById('hero-image');
const progressLabel = document.getElementById('image-progress');

await loadImageWithProgress(
  'https://example.com/high-res-photo.jpg',
  img,
  (progress, received, total) => {
    if (progress !== null) {
      progressLabel.textContent = `Loading image: ${(progress * 100).toFixed(0)}%`;
    } else {
      progressLabel.textContent = `Loading: ${(received / 1024).toFixed(0)} KB`;
    }
  }
);

progressLabel.textContent = 'Image loaded!';

Multiple Downloads with Combined Progress

When downloading multiple files, you might want a single progress bar that reflects the overall progress:

async function fetchMultipleWithProgress(urls, onTotalProgress) {
  // First, send HEAD requests to get file sizes
  const sizes = await Promise.all(
    urls.map(async (url) => {
      try {
        const head = await fetch(url, { method: 'HEAD' });
        const length = head.headers.get('Content-Length');
        return length ? parseInt(length, 10) : 0;
      } catch {
        return 0;
      }
    })
  );

  const totalSize = sizes.reduce((sum, size) => sum + size, 0);
  const received = new Array(urls.length).fill(0);

  function reportProgress() {
    const totalReceived = received.reduce((sum, r) => sum + r, 0);
    onTotalProgress({
      totalReceived,
      totalSize,
      progress: totalSize > 0 ? totalReceived / totalSize : null
    });
  }

  // Download all files in parallel with individual progress tracking
  const results = await Promise.all(
    urls.map(async (url, index) => {
      const response = await fetch(url);

      if (!response.ok) {
        throw new Error(`Failed to fetch ${url}: ${response.status}`);
      }

      const reader = response.body.getReader();
      const chunks = [];

      while (true) {
        const { value, done } = await reader.read();
        if (done) break;

        chunks.push(value);
        received[index] += value.length;
        reportProgress();
      }

      return new Blob(chunks);
    })
  );

  return results;
}

// Usage
const files = await fetchMultipleWithProgress(
  [
    '/files/document1.pdf',
    '/files/spreadsheet.xlsx',
    '/files/presentation.pptx'
  ],
  ({ totalReceived, totalSize, progress }) => {
    if (progress !== null) {
      document.getElementById('total-progress').style.width = `${progress * 100}%`;
      document.getElementById('total-label').textContent =
        `${(progress * 100).toFixed(1)}% (${formatBytes(totalReceived)} / ${formatBytes(totalSize)})`;
    }
  }
);

function formatBytes(bytes) {
  if (bytes < 1024) return `${bytes} B`;
  if (bytes < 1048576) return `${(bytes / 1024).toFixed(1)} KB`;
  return `${(bytes / 1048576).toFixed(1)} MB`;
}

Handling Edge Cases

When Content-Length Is Missing

As mentioned, the Content-Length header is often unavailable. Your progress indicator should handle this gracefully:

async function fetchAdaptiveProgress(url, onProgress) {
  const response = await fetch(url);
  const contentLength = response.headers.get('Content-Length');
  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;
  const reader = response.body.getReader();
  const chunks = [];
  let receivedBytes = 0;

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;

    chunks.push(value);
    receivedBytes += value.length;

    if (totalBytes) {
      // Determinate progress: show percentage
      onProgress({
        type: 'determinate',
        percent: (receivedBytes / totalBytes) * 100,
        receivedBytes,
        totalBytes
      });
    } else {
      // Indeterminate progress: show bytes received
      onProgress({
        type: 'indeterminate',
        receivedBytes
      });
    }
  }

  const allData = new Uint8Array(receivedBytes);
  let pos = 0;
  for (const chunk of chunks) {
    allData.set(chunk, pos);
    pos += chunk.length;
  }

  return allData;
}

The UI can then switch between a progress bar and a spinner:

fetchAdaptiveProgress('https://api.example.com/data', (info) => {
  const progressEl = document.getElementById('progress');

  if (info.type === 'determinate') {
    // Show actual progress bar
    progressEl.innerHTML = `
      <div class="progress-bar" style="width: ${info.percent}%"></div>
      <span>${info.percent.toFixed(1)}%</span>
    `;
  } else {
    // Show indeterminate indicator with byte count
    progressEl.innerHTML = `
      <div class="spinner"></div>
      <span>${formatBytes(info.receivedBytes)} received</span>
    `;
  }
});

Content-Length with Compressed Responses

A subtle but important issue: when the server sends a compressed response (e.g., Content-Encoding: gzip), the Content-Length header reflects the compressed size on the wire. However, the chunks you receive from the ReadableStream are the decompressed data, because the browser handles decompression transparently.

This means receivedBytes can actually exceed totalBytes:

// Server sends: Content-Length: 50000 (compressed)
// Decompressed data is actually 200000 bytes

// Your progress calculation would show:
// 50000 / 50000 = 100% ... but you're only 25% done reading!
// Then it goes to 150%, 200%, etc.

To handle this, clamp the progress value:

const rawProgress = receivedBytes / totalBytes;
const progress = Math.min(rawProgress, 1); // Never exceed 100%

tip

A more reliable approach for compressed responses is to not trust Content-Length for progress when the Content-Encoding header is present:

const isCompressed = response.headers.get('Content-Encoding');
const totalBytes = isCompressed ? null : parseInt(contentLength, 10);

This way, compressed responses fall back to the indeterminate progress indicator, which is more honest than a misleading percentage.

Cancelling a Download in Progress

Combine AbortController with stream reading to allow users to cancel ongoing downloads:

function createCancellableDownload(url, onProgress) {
  const controller = new AbortController();

  const promise = (async () => {
    const response = await fetch(url, { signal: controller.signal });

    if (!response.ok) {
      throw new Error(`HTTP ${response.status}`);
    }

    const contentLength = response.headers.get('Content-Length');
    const totalBytes = contentLength ? parseInt(contentLength, 10) : null;
    const reader = response.body.getReader();
    const chunks = [];
    let receivedBytes = 0;

    try {
      while (true) {
        const { value, done } = await reader.read();
        if (done) break;

        chunks.push(value);
        receivedBytes += value.length;

        if (onProgress) {
          onProgress({
            receivedBytes,
            totalBytes,
            progress: totalBytes ? Math.min(receivedBytes / totalBytes, 1) : null
          });
        }
      }
    } catch (error) {
      // An abort causes the pending reader.read() to reject with an AbortError
      if (error.name === 'AbortError') {
        reader.releaseLock(); // Release the reader lock before rethrowing
      }
      throw error;
    }

    return new Blob(chunks);
  })();

  return {
    promise,
    cancel: () => controller.abort()
  };
}

// Usage
const download = createCancellableDownload(
  'https://example.com/large-file.zip',
  ({ progress }) => {
    if (progress !== null) {
      console.log(`${(progress * 100).toFixed(1)}%`);
    }
  }
);

// Cancel after 5 seconds
setTimeout(() => {
  download.cancel();
  console.log('Download cancelled by user');
}, 5000);

try {
  const blob = await download.promise;
  console.log('Download complete:', blob.size);
} catch (error) {
  if (error.name === 'AbortError') {
    console.log('Download was cancelled');
  } else {
    console.error('Download failed:', error);
  }
}

A Complete Real-World Example

Here is a polished, production-ready download manager with visual progress, speed calculation, ETA estimation, and cancellation:

<!DOCTYPE html>
<html>
<head>
<style>
  .download-manager {
    font-family: system-ui, sans-serif;
    max-width: 500px;
    margin: 40px auto;
    padding: 24px;
    border: 1px solid #e0e0e0;
    border-radius: 12px;
  }

  .progress-track {
    height: 28px;
    background: #ecf0f1;
    border-radius: 14px;
    overflow: hidden;
    position: relative;
    margin: 16px 0;
  }

  .progress-fill {
    height: 100%;
    background: linear-gradient(90deg, #3498db, #2ecc71);
    border-radius: 14px;
    transition: width 0.15s ease-out;
    width: 0%;
  }

  .progress-label {
    position: absolute;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    font-size: 13px;
    font-weight: 600;
    color: #2c3e50;
  }

  .stats {
    display: flex;
    justify-content: space-between;
    font-size: 13px;
    color: #7f8c8d;
    margin-top: 8px;
  }

  .controls {
    display: flex;
    gap: 8px;
    margin-top: 16px;
  }

  .controls button {
    padding: 10px 20px;
    border: none;
    border-radius: 8px;
    font-size: 14px;
    cursor: pointer;
    font-weight: 500;
  }

  .btn-start {
    background: #3498db;
    color: white;
  }

  .btn-cancel {
    background: #e74c3c;
    color: white;
  }

  button:disabled {
    opacity: 0.5;
    cursor: not-allowed;
  }
</style>
</head>
<body>

<div class="download-manager">
  <h3>Download Manager</h3>
  <input type="text" id="urlInput" placeholder="Enter file URL"
         value="https://jsonplaceholder.typicode.com/photos"
         style="width: 100%; padding: 10px; box-sizing: border-box; border-radius: 8px; border: 1px solid #ddd;">

  <div class="progress-track">
    <div class="progress-fill" id="progressFill"></div>
    <span class="progress-label" id="progressLabel">Ready</span>
  </div>

  <div class="stats">
    <span id="statSize">--</span>
    <span id="statSpeed">--</span>
    <span id="statETA">--</span>
  </div>

  <div class="controls">
    <button class="btn-start" id="startBtn">Start Download</button>
    <button class="btn-cancel" id="cancelBtn" disabled>Cancel</button>
  </div>
</div>

<script>
const progressFill = document.getElementById('progressFill');
const progressLabel = document.getElementById('progressLabel');
const statSize = document.getElementById('statSize');
const statSpeed = document.getElementById('statSpeed');
const statETA = document.getElementById('statETA');
const startBtn = document.getElementById('startBtn');
const cancelBtn = document.getElementById('cancelBtn');
const urlInput = document.getElementById('urlInput');

let abortController = null;

function formatBytes(bytes) {
  if (bytes === 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB'];
  const i = Math.floor(Math.log(bytes) / Math.log(1024));
  return `${(bytes / Math.pow(1024, i)).toFixed(1)} ${units[i]}`;
}

function formatTime(seconds) {
  if (!isFinite(seconds) || seconds < 0) return '--';
  if (seconds < 60) return `${Math.ceil(seconds)}s`;
  const mins = Math.floor(seconds / 60);
  const secs = Math.ceil(seconds % 60);
  return `${mins}m ${secs}s`;
}

async function startDownload() {
  const url = urlInput.value.trim();
  if (!url) return;

  abortController = new AbortController();
  startBtn.disabled = true;
  cancelBtn.disabled = false;

  // Reset UI
  progressFill.style.width = '0%';
  progressLabel.textContent = 'Connecting...';

  // Speed calculation state
  let lastTime = performance.now();
  let lastBytes = 0;
  let speed = 0;

  try {
    const response = await fetch(url, { signal: abortController.signal });

    if (!response.ok) {
      throw new Error(`HTTP ${response.status}: ${response.statusText}`);
    }

    const contentLength = response.headers.get('Content-Length');
    const contentEncoding = response.headers.get('Content-Encoding');
    const totalBytes = (contentLength && !contentEncoding)
      ? parseInt(contentLength, 10)
      : null;

    const reader = response.body.getReader();
    const chunks = [];
    let receivedBytes = 0;
    const startTime = performance.now();

    while (true) {
      const { value, done } = await reader.read();
      if (done) break;

      chunks.push(value);
      receivedBytes += value.length;

      // Recalculate speed at most every 200ms
      const now = performance.now();
      const timeDelta = (now - lastTime) / 1000; // seconds

      if (timeDelta > 0.2) {
        speed = (receivedBytes - lastBytes) / timeDelta;
        lastTime = now;
        lastBytes = receivedBytes;
      }

      // Update UI
      if (totalBytes) {
        const progress = Math.min(receivedBytes / totalBytes, 1);
        progressFill.style.width = `${progress * 100}%`;
        progressLabel.textContent = `${(progress * 100).toFixed(1)}%`;

        const remainingBytes = totalBytes - receivedBytes;
        const eta = speed > 0 ? remainingBytes / speed : Infinity;
        statETA.textContent = `ETA: ${formatTime(eta)}`;
      } else {
        progressLabel.textContent = formatBytes(receivedBytes);
        statETA.textContent = '';
      }

      statSize.textContent = totalBytes
        ? `${formatBytes(receivedBytes)} / ${formatBytes(totalBytes)}`
        : formatBytes(receivedBytes);

      statSpeed.textContent = `${formatBytes(speed)}/s`;
    }

    // Download complete
    const totalTime = ((performance.now() - startTime) / 1000).toFixed(1);
    progressFill.style.width = '100%';
    progressLabel.textContent = 'Complete!';
    statETA.textContent = `Time: ${totalTime}s`;
    statSpeed.textContent = `Avg: ${formatBytes(receivedBytes / parseFloat(totalTime))}/s`;

    console.log(`Downloaded ${chunks.length} chunks, ${formatBytes(receivedBytes)} total`);

  } catch (error) {
    if (error.name === 'AbortError') {
      progressLabel.textContent = 'Cancelled';
    } else {
      progressLabel.textContent = `Error: ${error.message}`;
    }
    progressFill.style.background = '#e74c3c';
  } finally {
    startBtn.disabled = false;
    cancelBtn.disabled = true;
    abortController = null;
  }
}

startBtn.addEventListener('click', startDownload);
cancelBtn.addEventListener('click', () => {
  if (abortController) abortController.abort();
});
</script>

</body>
</html>

This example includes:

  • Percentage progress when Content-Length is available
  • Byte counter when Content-Length is missing
  • Download speed calculated from recent chunks
  • ETA estimation based on current speed and remaining bytes
  • Cancel support via AbortController
  • Compressed response detection to avoid misleading progress
  • Graceful error handling for network failures and HTTP errors

Streaming Without Buffering

In all the examples so far, we collected every chunk into an array and reassembled them at the end. This works well for reasonably sized downloads, but for very large files, keeping all chunks in memory can be wasteful. If you just need to save the file or pipe it somewhere, you can process chunks as they arrive without buffering:

async function streamToFile(url, onProgress) {
  const response = await fetch(url);
  const contentLength = response.headers.get('Content-Length');
  const totalBytes = contentLength ? parseInt(contentLength, 10) : null;
  const reader = response.body.getReader();
  let receivedBytes = 0;

  // Use a WritableStream or process chunks immediately
  const chunks = []; // In a Service Worker, you could write directly to Cache API

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;

    // Process immediately instead of buffering everything
    chunks.push(value); // or write to IndexedDB, or pipe elsewhere

    receivedBytes += value.length;

    if (onProgress) {
      onProgress(receivedBytes, totalBytes);
    }
  }

  return new Blob(chunks);
}

note

In environments that support the full Streams API (including WritableStream and TransformStream), you can use pipeTo() and pipeThrough() for even more efficient stream processing. However, for the common use case of showing download progress, the getReader() approach covered in this guide is the most practical and widely supported solution.
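As a rough sketch of that style, assuming TransformStream and pipeThrough() are available, a pass-through transform can count bytes as they flow by without hand-rolling the read loop (progressCounter and textWithProgress are illustrative names; a locally constructed Response stands in for a fetch() result so the snippet runs offline):

```javascript
function progressCounter(onBytes) {
  let received = 0;
  // A pass-through transform: forwards every chunk unchanged while
  // reporting the running byte total to the callback
  return new TransformStream({
    transform(chunk, controller) {
      received += chunk.length;
      onBytes(received);
      controller.enqueue(chunk);
    }
  });
}

async function textWithProgress(response, onBytes) {
  const counted = response.body.pipeThrough(progressCounter(onBytes));
  // Wrap the counted stream so .text()/.json() remain available downstream
  return new Response(counted).text();
}

// A locally constructed Response stands in for a fetch() result
textWithProgress(new Response('hello world'), n => console.log(`${n} bytes so far`))
  .then(text => console.log(text)); // final log is "hello world"
```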

Summary

Tracking download progress with the Fetch API revolves around the response.body ReadableStream. Instead of calling convenience methods like .json() or .text() (which read the entire body at once behind the scenes), you read the stream manually, chunk by chunk, counting bytes as they arrive.

The essential pattern is: get the Content-Length header for the total size, obtain a reader with response.body.getReader(), loop through chunks with reader.read(), accumulate the received byte count, and calculate progress as the ratio of received bytes to total bytes.

Key things to remember: Content-Length may be missing (use indeterminate progress), compressed responses report compressed size in the header but deliver decompressed chunks (clamp progress to 100% or skip percentage for compressed responses), the body can only be read once (either use the reader or use .json(), never both), and always handle cancellation with AbortController for a polished user experience.

For most real applications, wrap this logic in a reusable function that accepts a progress callback, and build your UI updates on top of it. The stream-based approach is slightly more code than the old XHR progress event, but it gives you a much more powerful and flexible foundation.