Python Pandas: How to Search for a Value Within a DataFrame Row

Searching for specific values within a DataFrame is one of the most fundamental operations in data analysis. Whether you're filtering customer records, finding specific transactions, or isolating outliers, Pandas provides powerful and flexible methods to search rows based on single or multiple conditions.

In this guide, you'll learn how to search for values within DataFrame rows using boolean indexing, multiple conditions, string matching, and other techniques.

Creating a Sample DataFrame

Let's create a DataFrame to use throughout all examples:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}

df = pd.DataFrame(data)
print(df)

Output:

      Name  Age      City Purchased  Amount
0    Alice   28  New York       Yes   250.0
1      Bob   35   Chicago        No     0.0
2  Charlie   42    Boston       Yes   180.0
3    Diana   31   Chicago       Yes   320.0
4      Eve   38  New York        No     0.0

Searching with a Single Condition

The most straightforward way to search is boolean indexing, where you compare a column to a value and use the resulting boolean mask to filter rows:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}

df = pd.DataFrame(data)

# Find all rows where Purchased is "Yes"
result = df[df['Purchased'] == 'Yes']
print(result)

Output:

      Name  Age      City Purchased  Amount
0    Alice   28  New York       Yes   250.0
2  Charlie   42    Boston       Yes   180.0
3    Diana   31   Chicago       Yes   320.0

How It Works

  1. df['Purchased'] == 'Yes' creates a boolean Series: [True, False, True, True, False].
  2. df[boolean_series] returns only the rows where the value is True.
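To make the intermediate step concrete, you can print the mask itself before using it as an index (a minimal sketch using just the Purchased column):

```python
import pandas as pd

df = pd.DataFrame({'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No']})

# Step 1: the comparison produces a boolean Series (the mask)
mask = df['Purchased'] == 'Yes'
print(mask.tolist())  # [True, False, True, True, False]

# Step 2: indexing with the mask keeps only the True rows
print(len(df[mask]))  # 3
```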

You can use any comparison operator:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)

# Find rows where Amount is greater than 200
result = df[df['Amount'] > 200]
print(result)

Output:

    Name  Age      City Purchased  Amount
0  Alice   28  New York       Yes   250.0
3  Diana   31   Chicago       Yes   320.0

Searching with Multiple Conditions

Combine conditions using & (AND), | (OR), and ~ (NOT). Each condition must be wrapped in parentheses.

AND Condition (&)

Find rows that satisfy all conditions:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)

# Find rows where Age is between 30 and 40 (inclusive)
result = df[(df['Age'] >= 30) & (df['Age'] <= 40)]
print(result)

Output:

    Name  Age      City Purchased  Amount
1    Bob   35   Chicago        No     0.0
3  Diana   31   Chicago       Yes   320.0
4    Eve   38  New York        No     0.0

OR Condition (|)

Find rows that satisfy any condition:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)

# Find rows where City is "Boston" OR Amount is greater than 300
result = df[(df['City'] == 'Boston') | (df['Amount'] > 300)]
print(result)

Output:

      Name  Age     City Purchased  Amount
2  Charlie   42   Boston       Yes   180.0
3    Diana   31  Chicago       Yes   320.0

NOT Condition (~)

Find rows that don't match a condition:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)

# Find rows where Purchased is NOT "Yes"
result = df[~(df['Purchased'] == 'Yes')]
print(result)

Output:

  Name  Age      City Purchased  Amount
1  Bob   35   Chicago        No     0.0
4  Eve   38  New York        No     0.0

Common Mistake: Using and/or Instead of &/|

Python's and and or keywords don't work with Pandas boolean Series. Always use &, |, and ~, and wrap each condition in parentheses:

# ❌ Wrong: raises ValueError
try:
    result = df[(df['Age'] > 30) and (df['City'] == 'Chicago')]
except ValueError as e:
    print(f"Error: {e}")

# ✅ Correct: use & with parentheses
result = df[(df['Age'] > 30) & (df['City'] == 'Chicago')]
print(result)

Searching with isin() for Multiple Values

When you want to match against a list of values, use isin():

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)

# Find rows where City is either "Chicago" or "Boston"
cities = ['Chicago', 'Boston']
result = df[df['City'].isin(cities)]
print(result)

Output:

      Name  Age     City Purchased  Amount
1      Bob   35  Chicago        No     0.0
2  Charlie   42   Boston       Yes   180.0
3    Diana   31  Chicago       Yes   320.0

To find rows that are not in the list, negate with ~:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)
cities = ['Chicago', 'Boston']

# Find rows where City is NOT "Chicago" or "Boston"
result = df[~df['City'].isin(cities)]
print(result)

Output:

    Name  Age      City Purchased  Amount
0  Alice   28  New York       Yes   250.0
4    Eve   38  New York        No     0.0

Searching with String Methods

Pandas provides .str accessor methods for pattern matching on string columns:

Exact Match

result = df[df['Name'] == 'Alice']
print(result)

Output:

    Name  Age      City Purchased  Amount
0  Alice   28  New York       Yes   250.0

Contains a Substring

# Find names containing "li"
result = df[df['Name'].str.contains('li', case=False)]
print(result)

Output:

      Name  Age      City Purchased  Amount
0    Alice   28  New York       Yes   250.0
2  Charlie   42    Boston       Yes   180.0
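One caveat worth noting: if the column contains missing values, str.contains returns NaN for those entries, and a mask containing NaN cannot be used for indexing. Passing na=False treats missing entries as non-matches. A minimal sketch on a standalone Series:

```python
import pandas as pd

# A column with a missing value
s = pd.Series(['Alice', None, 'Charlie'])

# na=False makes missing entries count as "no match" instead of NaN
mask = s.str.contains('li', na=False)
print(mask.tolist())  # [True, False, True]
```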

Starts With / Ends With

# Names starting with 'D'
result = df[df['Name'].str.startswith('D')]
print(result)

# Cities ending with 'go'
result = df[df['City'].str.endswith('go')]
print(result)

Output:

    Name  Age     City Purchased  Amount
3  Diana   31  Chicago       Yes   320.0

    Name  Age     City Purchased  Amount
1    Bob   35  Chicago        No     0.0
3  Diana   31  Chicago       Yes   320.0

Regular Expression Matching

# Find names that match a regex pattern (starts with A or E)
result = df[df['Name'].str.match(r'^[AE]')]
print(result)

Output:

    Name  Age      City Purchased  Amount
0  Alice   28  New York       Yes   250.0
4    Eve   38  New York        No     0.0
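Note that str.match only anchors the pattern at the start of the string; it does not require the whole string to match. If you need a full-string match, str.fullmatch (available since pandas 1.1) does that. A small comparison on a standalone Series:

```python
import pandas as pd

s = pd.Series(['Eve', 'Evelyn', 'Steve'])

# match: the pattern must match at the beginning of the string
print(s.str.match(r'Eve').tolist())      # [True, True, False]

# fullmatch: the pattern must match the entire string
print(s.str.fullmatch(r'Eve').tolist())  # [True, False, False]
```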

Using query() for Readable Filtering

The query() method lets you write filter conditions as strings, which can be more readable for complex queries:

# Equivalent to: df[(df['Age'] > 30) & (df['Purchased'] == 'Yes')]
result = df.query('Age > 30 and Purchased == "Yes"')
print(result)

Output:

      Name  Age     City Purchased  Amount
2  Charlie   42   Boston       Yes   180.0
3    Diana   31  Chicago       Yes   320.0

You can also reference variables using @:

min_age = 35
target_city = 'Chicago'

result = df.query('Age >= @min_age and City == @target_city')
print(result)

Output:

  Name  Age     City Purchased  Amount
1  Bob   35  Chicago        No     0.0

Tip: query() is especially useful for long, complex conditions that become hard to read with bracket notation. It also avoids the need to repeat df for every column reference.
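If a column name contains spaces or other characters that aren't valid in a Python identifier, query() supports backtick quoting. A small sketch (the 'Total Amount' column here is hypothetical, purely to illustrate the syntax):

```python
import pandas as pd

# Hypothetical column name with a space, for illustration only
df = pd.DataFrame({
    'Name': ['Alice', 'Bob'],
    'Total Amount': [250.0, 0.0]
})

# Backticks let query() reference non-identifier column names
result = df.query('`Total Amount` > 100')
print(result['Name'].tolist())  # ['Alice']
```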

Searching Across All Columns

To find rows where any column contains a specific value, use isin() on the entire DataFrame:

import pandas as pd

df = pd.DataFrame({
    'A': ['apple', 'banana', 'cherry'],
    'B': ['dog', 'apple', 'fish'],
    'C': ['red', 'blue', 'apple']
})

# Find rows where ANY column contains "apple"
result = df[df.isin(['apple']).any(axis=1)]
print(result)

Output:

        A      B      C
0   apple    dog    red
1  banana  apple   blue
2  cherry   fish  apple
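Keep in mind that isin() only matches whole cell values. If you need a substring search across every column, one approach (a sketch, assuming all columns hold strings) is to build a per-column mask with str.contains and then combine the masks row-wise:

```python
import pandas as pd

df = pd.DataFrame({
    'A': ['apple', 'banana', 'cherry'],
    'B': ['dog', 'apple pie', 'fish']
})

# Apply str.contains to each column, then keep rows where ANY column matched
mask = df.apply(lambda col: col.str.contains('apple', na=False)).any(axis=1)
print(df[mask].index.tolist())  # [0, 1]
```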

Using loc[] for Searching and Selecting Specific Columns

Combine searching with column selection using loc[]:

import pandas as pd

data = {
    'Name': ['Alice', 'Bob', 'Charlie', 'Diana', 'Eve'],
    'Age': [28, 35, 42, 31, 38],
    'City': ['New York', 'Chicago', 'Boston', 'Chicago', 'New York'],
    'Purchased': ['Yes', 'No', 'Yes', 'Yes', 'No'],
    'Amount': [250.0, 0.0, 180.0, 320.0, 0.0]
}
df = pd.DataFrame(data)

# Search for rows AND select only specific columns
result = df.loc[df['Purchased'] == 'Yes', ['Name', 'Amount']]
print(result)

Output:

      Name  Amount
0    Alice   250.0
2  Charlie   180.0
3    Diana   320.0

Summary

  - df[df['col'] == value]: single condition, e.g. df[df['Age'] == 30]
  - df[(cond1) & (cond2)]: multiple conditions (AND), e.g. df[(df['Age'] > 30) & (df['City'] == 'NY')]
  - df[(cond1) | (cond2)]: multiple conditions (OR), e.g. df[(df['Age'] < 25) | (df['Age'] > 40)]
  - df[df['col'].isin(list)]: match against a list, e.g. df[df['City'].isin(['NY', 'LA'])]
  - df[df['col'].str.contains(...)]: substring / pattern search, e.g. df[df['Name'].str.contains('li')]
  - df.query('expression'): readable string-based filtering, e.g. df.query('Age > 30')
  - df.loc[condition, cols]: search plus column selection, e.g. df.loc[df['Age'] > 30, ['Name']]

To search for values within a Pandas DataFrame, use boolean indexing with comparison operators for simple filters, combine conditions with & and | (always in parentheses), use isin() for matching against lists, and leverage .str methods for pattern matching on string columns. For complex queries, query() provides a more readable alternative.