How to Find the Size of an Object in Python

Understanding how much memory your Python objects consume is essential for writing efficient code - especially when working with large datasets, optimizing performance, or debugging memory issues. Python provides the built-in sys.getsizeof() function to measure the memory size of any object in bytes.

This guide explains how to use sys.getsizeof(), how to interpret its results, and how to measure the total (deep) size of complex objects that contain references to other objects.

Using sys.getsizeof() for Basic Size Measurement

The sys.getsizeof() function is part of Python's sys module. It returns the size of an object in bytes, including the garbage collector overhead for objects managed by the cyclic garbage collector.

import sys

# Size of different object types
print(f"Integer (12): {sys.getsizeof(12)} bytes")
print(f"Float (3.14): {sys.getsizeof(3.14)} bytes")
print(f"String ('tutorial'): {sys.getsizeof('tutorial')} bytes")
print(f"Boolean (True): {sys.getsizeof(True)} bytes")
print(f"None: {sys.getsizeof(None)} bytes")

Output:

Integer (12):        28 bytes
Float (3.14):        24 bytes
String ('tutorial'): 57 bytes
Boolean (True):      28 bytes
None:                16 bytes
Note: You might expect an integer to occupy just 4 or 8 bytes (as in C or Java). In Python, integers are objects - they carry additional metadata like type information, reference count, and the actual value. This is why even a simple integer takes 28 bytes.

Understanding Python Object Sizes

Python objects have a base overhead plus a variable component that grows with the amount of data stored. Here's a reference table for common types (sizes may vary slightly between Python versions and platforms):

Object Type | Base Size (Bytes) | Growth Pattern
----------- | ----------------- | ---------------
int         | 28                | Grows for very large integers (arbitrary precision)
float       | 24                | Fixed size
bool        | 28                | Same as int (subclass of int)
None        | 16                | Fixed size
str         | 49                | +1 byte per additional character (for ASCII)
bytes       | 33                | +1 byte per additional byte
tuple       | 40 (empty)        | +8 bytes per item
list        | 56 (empty)        | +8 bytes per item
set         | 216 (empty)       | Grows in steps (216 → 728 → 2264...)
dict        | 64 (empty)        | Grows in steps (64 → 232 → 360...)
function    | 136               | Base size without attributes
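The step-wise growth of sets and dicts comes from their hash tables being resized in chunks rather than per item. You can observe the jumps directly (the exact thresholds and byte counts depend on the CPython version):

```python
import sys

# Record a dict's shallow size after every insertion, then print only
# the points where the hash table was resized.
d = {}
sizes = [sys.getsizeof(d)]
for i in range(20):
    d[i] = None
    sizes.append(sys.getsizeof(d))

previous = None
for count, size in enumerate(sizes):
    if size != previous:
        print(f"{count:>2} items: {size} bytes")
        previous = size
```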

Measuring Container Sizes

Let's examine how the size changes as containers grow:

import sys

# Empty vs. populated containers
print("--- Tuples ---")
print(f"Empty tuple: {sys.getsizeof(())} bytes")
print(f"5-item tuple: {sys.getsizeof(('a', 'b', 'c', 'd', 'e'))} bytes")

print("\n--- Lists ---")
print(f"Empty list: {sys.getsizeof([])} bytes")
print(f"5-item list: {sys.getsizeof(['a', 'b', 'c', 'd', 'e'])} bytes")

print("\n--- Sets ---")
print(f"Empty set: {sys.getsizeof(set())} bytes")
print(f"4-item set: {sys.getsizeof({1, 2, 3, 4})} bytes")

print("\n--- Dicts ---")
print(f"Empty dict: {sys.getsizeof({}) } bytes")
print(f"4-item dict: {sys.getsizeof({1: 'a', 2: 'b', 3: 'c', 4: 'd'})} bytes")

Output:

--- Tuples ---
Empty tuple: 40 bytes
5-item tuple: 80 bytes

--- Lists ---
Empty list: 56 bytes
5-item list: 104 bytes

--- Sets ---
Empty set: 216 bytes
4-item set: 216 bytes

--- Dicts ---
Empty dict: 64 bytes
4-item dict: 224 bytes

How Strings Grow

import sys

print(f"Empty string '': {sys.getsizeof('')} bytes")
print(f"1-char string 'a': {sys.getsizeof('a')} bytes")
print(f"5-char string 'hello': {sys.getsizeof('hello')} bytes")
print(f"10-char string: {sys.getsizeof('helloworld')} bytes")

Output:

Empty string '':       49 bytes
1-char string 'a':     50 bytes
5-char string 'hello': 54 bytes
10-char string:        59 bytes

Each additional ASCII character adds exactly 1 byte to the base size of 49 bytes.
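The "+1 byte per character" rule only holds for ASCII text. CPython stores each string with 1, 2, or 4 bytes per character depending on the widest code point it contains (PEP 393), so non-ASCII strings grow faster:

```python
import sys

# Four 10-character strings whose widest code points need 1, 1, 2, and
# 4 bytes per character respectively.
samples = [
    ("ASCII 'a'",        "a" * 10),
    ("Latin-1 '\u00e9'", "\u00e9" * 10),
    ("BMP '\u20ac'",     "\u20ac" * 10),
    ("Astral emoji",     "\U0001f600" * 10),
]
for label, s in samples:
    print(f"{label}: {sys.getsizeof(s)} bytes")
```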

Important Limitation: getsizeof() Is Shallow

sys.getsizeof() only measures the direct memory of the object - not the memory of objects it references. For a list of strings, it measures the list container (pointers), not the strings themselves:

import sys

my_list = ["hello", "world", "python"]

# This only counts the list overhead + 3 pointers (8 bytes each)
print(f"List (shallow): {sys.getsizeof(my_list)} bytes")

# The actual strings consume additional memory
total = sys.getsizeof(my_list) + sum(sys.getsizeof(s) for s in my_list)
print(f"List (with strings): {total} bytes")

Output:

List (shallow): 88 bytes
List (with strings): 251 bytes
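Since the shallow size only counts the pointer slots, it is completely insensitive to how big the referenced objects are. Two lists with the same number of elements report the same size:

```python
import sys

# Both lists hold exactly three pointers, so their shallow sizes match
# even though the second references ~3000 characters of string data.
short_strings = ["a", "b", "c"]
long_strings = ["a" * 1000, "b" * 1000, "c" * 1000]

print(f"List of short strings: {sys.getsizeof(short_strings)} bytes")
print(f"List of long strings:  {sys.getsizeof(long_strings)} bytes")
```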

Measuring Deep (Total) Size of Objects

For nested or complex objects, you need a recursive function that traverses all referenced objects. Here's a reusable deep size calculator:

import sys

def deep_getsizeof(obj, seen=None):
    """Recursively calculate the total memory size of an object and its contents."""
    if seen is None:
        seen = set()

    obj_id = id(obj)
    if obj_id in seen:
        return 0
    seen.add(obj_id)

    size = sys.getsizeof(obj)

    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen) for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)

    return size


# Compare shallow vs. deep size
data = {
    "name": "Alice",
    "scores": [95, 87, 92, 88],
    "details": {"city": "Delhi", "active": True}
}

print(f"Shallow size: {sys.getsizeof(data)} bytes")
print(f"Deep size: {deep_getsizeof(data)} bytes")

Output:

Shallow size: 184 bytes
Deep size: 976 bytes

The deep size is more than five times larger because it accounts for the strings, integers, nested list, and nested dictionary - all of which the shallow measurement ignores.
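The seen set does more than prevent infinite recursion on cyclic structures: it ensures each object is counted only once, even when it is referenced multiple times. The snippet below (repeating the deep_getsizeof helper so it runs standalone) contrasts a list of three references to one string with a list of three distinct strings:

```python
import sys

def deep_getsizeof(obj, seen=None):
    """Recursively total an object's size, counting each object once."""
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)
    return size

shared = "x" * 1000                 # one large string
aliased = [shared, shared, shared]  # three references, one object
distinct = ["a" * 1000, "b" * 1000, "c" * 1000]  # three separate objects

print(f"Aliased list:  {deep_getsizeof(aliased)} bytes")
print(f"Distinct list: {deep_getsizeof(distinct)} bytes")
```

The aliased list charges the shared string only once, so its deep size is roughly a third of the distinct list's.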

Comparing Sizes of Different Data Structures

Understanding relative sizes helps you choose the right data structure:

import sys

# Store the same 5 items in different containers
items_tuple = (1, 2, 3, 4, 5)
items_list = [1, 2, 3, 4, 5]
items_set = {1, 2, 3, 4, 5}
items_dict = {1: "a", 2: "b", 3: "c", 4: "d", 5: "e"}

print(f"Tuple (5 items): {sys.getsizeof(items_tuple):>6} bytes")
print(f"List (5 items): {sys.getsizeof(items_list):>6} bytes")
print(f"Set (5 items): {sys.getsizeof(items_set):>6} bytes")
print(f"Dict (5 items): {sys.getsizeof(items_dict):>6} bytes")

Output:

Tuple (5 items):     80 bytes
List (5 items):     104 bytes
Set (5 items):      472 bytes
Dict (5 items):     224 bytes

Key takeaways:

  • Tuples are the most memory-efficient sequence type (immutable, no over-allocation).
  • Lists use slightly more memory than tuples due to dynamic resizing overhead.
  • Sets and dicts use significantly more memory because of their hash table implementation - but they provide O(1) lookups.
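The "dynamic resizing overhead" of lists is easy to see: appending grows the underlying buffer in chunks, so the reported size stays flat for a while and then jumps (the exact growth pattern varies across CPython versions):

```python
import sys

# Record the list's shallow size after each append, then print only the
# points where the over-allocated buffer actually grew.
lst = []
sizes = [sys.getsizeof(lst)]
for i in range(16):
    lst.append(i)
    sizes.append(sys.getsizeof(lst))

previous = None
for count, size in enumerate(sizes):
    if size != previous:
        print(f"{count:>2} items: {size} bytes")
        previous = size
```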

Measuring the Size of Custom Objects

You can also measure the size of class instances:

import sys

class User:
    def __init__(self, name, age):
        self.name = name
        self.age = age

class UserSlots:
    __slots__ = ["name", "age"]
    def __init__(self, name, age):
        self.name = name
        self.age = age

user_regular = User("Alice", 30)
user_slots = UserSlots("Alice", 30)

print(f"Regular class instance: {sys.getsizeof(user_regular)} bytes")
print(f"__slots__ class instance: {sys.getsizeof(user_slots)} bytes")

Output:

Regular class instance: 56 bytes
__slots__ class instance: 48 bytes
Tip: While the instance sizes shown by getsizeof() may appear similar, regular instances also have a __dict__ attribute (not counted in the shallow size) that consumes additional memory. Using __slots__ eliminates __dict__, saving memory when you create many instances of the same class:

print(f"Instance __dict__: {sys.getsizeof(user_regular.__dict__)} bytes")
# user_slots has no __dict__, AttributeError if accessed
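To make the comparison fair, add the __dict__ size to the regular instance's total. A standalone sketch (repeating the two classes so it runs on its own):

```python
import sys

class User:
    def __init__(self, name, age):
        self.name = name
        self.age = age

class UserSlots:
    __slots__ = ("name", "age")
    def __init__(self, name, age):
        self.name = name
        self.age = age

user_regular = User("Alice", 30)
user_slots = UserSlots("Alice", 30)

# Include the per-instance __dict__ that the shallow size omits.
regular_total = sys.getsizeof(user_regular) + sys.getsizeof(user_regular.__dict__)
slots_total = sys.getsizeof(user_slots)

print(f"Regular instance + __dict__: {regular_total} bytes")
print(f"__slots__ instance:          {slots_total} bytes")
```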

Using pympler for Advanced Memory Profiling

For more accurate and detailed memory analysis, the third-party pympler library provides asizeof(), which automatically performs deep size measurement:

Install it first:

pip install pympler

Then:

from pympler import asizeof

data = {"name": "Alice", "scores": [95, 87, 92, 88]}

print(f"pympler deep size: {asizeof.asizeof(data)} bytes")
# Output: pympler deep size: 648 bytes

This gives you a single function call that handles all the recursive traversal automatically.
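If you'd rather stay in the standard library, the tracemalloc module takes a different angle: instead of sizing an existing object, it measures the memory actually allocated while a block of code runs. A minimal sketch:

```python
import tracemalloc

# Trace allocations made while building a data structure.
tracemalloc.start()
data = [str(i) * 10 for i in range(10_000)]
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"Current: {current / 1024:.1f} KiB, peak: {peak / 1024:.1f} KiB")
```

This is especially useful when the cost you care about is a whole workload rather than a single object.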

Quick Reference

Function                  | Measures                        | Depth            | Source
------------------------- | ------------------------------- | ---------------- | ------
sys.getsizeof(obj)        | Direct object memory            | Shallow only     | Standard library
Custom deep_getsizeof()   | Object + all referenced objects | Deep (recursive) | Manual implementation
pympler.asizeof.asizeof() | Object + all referenced objects | Deep (automatic) | Third-party (pympler)

Conclusion

Measuring object sizes in Python is straightforward with sys.getsizeof(), but understanding its limitations is critical:

  • sys.getsizeof() measures only the shallow size - the object's own memory footprint, excluding the objects it references.
  • For containers like lists, dicts, and nested structures, use a recursive deep size function or the pympler library to get the true total memory consumption.
  • Python objects are larger than their raw data because every object carries metadata (type, reference count, etc.).
  • Tuples are more memory-efficient than lists, and __slots__ classes are more efficient than regular classes when creating many instances.

Understanding these details helps you make informed decisions about data structures and optimize memory usage in your Python applications.