How to Remove Duplicates from an Array of Objects in JavaScript

A common data manipulation task is to "deduplicate" an array of objects based on a specific property, like an id. Unlike an array of primitive values (where you can just use a Set), removing duplicate objects requires a more deliberate approach to specify what makes an object "unique."

This guide will demonstrate the two most effective and modern methods for this task. We will cover the highly efficient and concise approach using a Map, and a more procedural but still very effective method using filter() with a Set.

The Core Problem: Objects are Compared by Reference

You cannot simply use new Set(arrayOfObjects) to remove duplicates. A Set determines uniqueness by comparing values. For primitive types (strings, numbers), this works as expected. For objects, however, it compares their reference (their address in memory), not their content.

An example of the problem:

// Problem: These two objects have the same content but are different in memory.
const obj1 = { id: 1, name: 'Alice' };
const obj2 = { id: 1, name: 'Alice' };

console.log(obj1 === obj2); // Output: false

const mySet = new Set([obj1, obj2]);
console.log(mySet.size); // Output: 2 (Both objects are kept)

To solve this, we must tell our code to check for uniqueness based on a specific property, like id.
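For contrast, a Set does deduplicate primitives directly by value; the property-based techniques below are only needed for objects:

```javascript
// A Set deduplicates primitives by value, so this works directly.
const ids = [1, 2, 1, 3];
const uniqueIds = [...new Set(ids)];

console.log(uniqueIds); // [1, 2, 3]
```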

The Map Constructor Method (Recommended)

This is the most modern, elegant, and efficient way to deduplicate an array of objects. It leverages the fact that Map keys must be unique.

The logic:

  1. Create a Map where the keys are the unique property (e.g., the id) and the values are the full objects.
  2. As you build the Map, if you encounter an id that is already in the Map, it will simply overwrite the previous entry. This automatically handles deduplication, keeping the last seen object for each unique id.
  3. Finally, convert the values of the Map back into an array.

The solution: this can all be done in a single, powerful line of code.

const employees = [
  { id: 1, name: 'Alice1' },
  { id: 2, name: 'Bob' },
  { id: 1, name: 'Alice2' }, // Duplicate ID
  { id: 3, name: 'Charlie' },
];

// This one-liner performs the entire operation.
const uniqueEmployees = Array.from(new Map(employees.map(emp => [emp.id, emp])).values());

console.log(uniqueEmployees);

Output:

[
  { id: 1, name: 'Alice2' }, // Note: The last one with id: 1 wins.
  { id: 2, name: 'Bob' },
  { id: 3, name: 'Charlie' }
]

The filter() and Set Method (Also Excellent)

This method is more imperative but can be easier to read for developers less familiar with Map constructors. It is also very performant because it only iterates through the array once.

The logic:

  1. Create an empty Set to keep track of the unique IDs you have already seen.
  2. Use Array.prototype.filter() to iterate over the array.
  3. For each object, check if its id is already in the Set.
  4. If the id is not in the Set, add it and keep the object in the filtered array.
  5. If the id is already in the Set, it's a duplicate, so filter it out.

The solution:

const employees = [
  { id: 1, name: 'Alice1' },
  { id: 2, name: 'Bob' },
  { id: 1, name: 'Alice2' }, // Duplicate ID
  { id: 3, name: 'Charlie' },
];

const seen = new Set();

const uniqueEmployees = employees.filter(emp => {
  const isDuplicate = seen.has(emp.id);
  seen.add(emp.id);
  return !isDuplicate;
});

console.log(uniqueEmployees);

Output:

[
  { id: 1, name: 'Alice1' }, // Note: The first one with id: 1 wins.
  { id: 2, name: 'Bob' },
  { id: 3, name: 'Charlie' }
]
Note: This method keeps the first object seen for each unique ID, whereas the Map method keeps the last.

How the Map Method Works

Let's break down the one-liner: Array.from(new Map(employees.map(emp => [emp.id, emp])).values()).

  1. employees.map(emp => [emp.id, emp]): This is the first step. It transforms the original array of objects into an array of [key, value] pairs.
    // Result of the .map() call:
    [
    [1, { id: 1, name: 'Alice1' }],
    [2, { id: 2, name: 'Bob' }],
    [1, { id: 1, name: 'Alice2' }], // Note the duplicate key: 1
    [3, { id: 3, name: 'Charlie' }]
    ]
  2. new Map(...): This creates a new Map from the array of pairs. Because Map keys must be unique, the second entry with the key 1 simply overwrites the first one.
    // The resulting Map:
    // Map(3) { 1 => {id: 1, name: 'Alice2'}, 2 => {id: 2, name: 'Bob'}, 3 => {id: 3, name: 'Charlie'} }
  3. .values(): This returns an iterator for the values (the objects) in the Map.
  4. Array.from(...): This converts the iterator back into a new array, giving us our final, deduplicated result.
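If the one-liner feels dense, the same steps can be written out with an explicit loop; this sketch is equivalent in behavior:

```javascript
const employees = [
  { id: 1, name: 'Alice1' },
  { id: 2, name: 'Bob' },
  { id: 1, name: 'Alice2' },
  { id: 3, name: 'Charlie' },
];

// Build the Map step by step instead of via the constructor.
const byId = new Map();
for (const emp of employees) {
  byId.set(emp.id, emp); // A repeated id overwrites the earlier entry.
}

const uniqueEmployees = Array.from(byId.values());
console.log(uniqueEmployees);
// [ { id: 1, name: 'Alice2' }, { id: 2, name: 'Bob' }, { id: 3, name: 'Charlie' } ]
```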

How to Deduplicate Based on Multiple Properties

Both methods can be easily adapted to check for uniqueness based on a combination of properties. You just need to create a unique composite key.

The solution, using filter() and a Set:

const users = [
  { firstName: 'John', lastName: 'Doe' },
  { firstName: 'Jane', lastName: 'Doe' },
  { firstName: 'John', lastName: 'Doe' }, // Duplicate
];

const seen = new Set();
const uniqueUsers = users.filter(user => {
  const compositeKey = `${user.firstName}-${user.lastName}`;
  if (seen.has(compositeKey)) {
    return false;
  } else {
    seen.add(compositeKey);
    return true;
  }
});
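The Map method adapts in the same way; one possible sketch, using the same composite key as the Map key:

```javascript
const users = [
  { firstName: 'John', lastName: 'Doe' },
  { firstName: 'Jane', lastName: 'Doe' },
  { firstName: 'John', lastName: 'Doe' }, // Duplicate
];

// Use the composite key as the Map key; a later duplicate overwrites the earlier entry.
const uniqueUsers = Array.from(
  new Map(users.map(u => [`${u.firstName}-${u.lastName}`, u])).values()
);

console.log(uniqueUsers.length); // 2
```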

Conclusion

Removing duplicate objects from an array is a common data cleaning task with clear, modern solutions in JavaScript.

  • The Map constructor method is the most concise and idiomatic solution. It's a powerful one-liner that is highly efficient, but note that it keeps the last seen object for each key.
  • The filter() with a Set method is also an excellent, highly readable, and performant alternative. It keeps the first seen object.

Choosing between them often comes down to which duplicate you want to keep (the first or the last) and which style of code you find more readable.
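If you need this in several places, both patterns generalize to a small helper. The `uniqueBy` function below is a hypothetical utility (not part of any standard library) that keeps the first match, following the filter()-and-Set approach:

```javascript
// Hypothetical helper: deduplicate by a caller-supplied key function.
// Keeps the first object seen for each key (filter + Set semantics).
function uniqueBy(array, keyFn) {
  const seen = new Set();
  return array.filter(item => {
    const key = keyFn(item);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const employees = [
  { id: 1, name: 'Alice1' },
  { id: 2, name: 'Bob' },
  { id: 1, name: 'Alice2' },
];

console.log(uniqueBy(employees, emp => emp.id));
// [ { id: 1, name: 'Alice1' }, { id: 2, name: 'Bob' } ]
```

Passing a key function rather than a property name keeps the helper flexible, since composite keys (like the first-name/last-name pair above) work without any changes.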