Mean Filter in Python NumPy

While I was working on an image processing project, I needed to reduce noise in some satellite images of national parks in the USA. The images had grain that made feature extraction difficult. This is where the mean filter (also called an average filter) saved the day.

In this article, I’ll share multiple ways to implement mean filters in Python, focusing on practical applications that you can use in your projects.

Let’s dive in!

Python Mean Filter

A mean filter is one of the simplest and most common filtering techniques used in image processing and signal processing. It works by:

  1. Taking a window (kernel) of pixels around each pixel in the image
  2. Calculating the average of all pixels in that window
  3. Replacing the center pixel with the calculated average

This process effectively smooths out the image by reducing pixel-to-pixel variations, which helps eliminate noise.
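Before jumping into full images, here's a tiny sketch of the idea on a single made-up 3x3 neighborhood, so you can check the arithmetic by hand:

import numpy as np

# A made-up 3x3 neighborhood around one pixel
window = np.array([
    [10, 20, 30],
    [40, 50, 60],
    [70, 80, 90]
])

# The mean filter would replace the center value (50) with the window average
print(window.mean())  # 50.0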


Method 1: Implement a Mean Filter Using NumPy

Python NumPy makes implementing a mean filter quite simple. Here's a basic implementation:

import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

def mean_filter(image, kernel_size):
    # Get image dimensions
    height, width = image.shape

    # Create a padded version of the image
    k = kernel_size // 2
    padded_img = np.pad(image, ((k, k), (k, k)), mode='reflect')

    # Create output image
    filtered_img = np.zeros_like(image)

    # Apply filter
    for i in range(height):
        for j in range(width):
            filtered_img[i, j] = np.mean(padded_img[i:i+kernel_size, j:j+kernel_size])

    return filtered_img

# Example usage with a noisy image
# Load an image (grayscale)
img = np.array(Image.open('yosemite.jpg').convert('L'))

# Add some noise
noisy_img = img + 25 * np.random.randn(*img.shape)
noisy_img = np.clip(noisy_img, 0, 255).astype(np.uint8)

# Apply mean filter
filtered_img = mean_filter(noisy_img, 3)  # 3x3 kernel

# Display results
plt.figure(figsize=(12, 4))
plt.subplot(131), plt.imshow(img, cmap='gray'), plt.title('Original')
plt.subplot(132), plt.imshow(noisy_img, cmap='gray'), plt.title('Noisy')
plt.subplot(133), plt.imshow(filtered_img, cmap='gray'), plt.title('Filtered')
plt.tight_layout()
plt.show()

I executed the example code above; the plot shows the original, noisy, and mean-filtered images side by side.

This implementation manually applies the mean filter by calculating the average of each pixel’s neighborhood. While educational, it’s not the most efficient approach for larger images.
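If you want to stay in pure NumPy but skip the Python loops, one option (assuming NumPy 1.20 or newer for sliding_window_view) is to compute every neighborhood mean in a single vectorized step. This is only a sketch, but it mirrors the padded loop version above:

from numpy.lib.stride_tricks import sliding_window_view

def mean_filter_vectorized(image, kernel_size):
    # Pad the same way as the loop version so the output shape matches
    k = kernel_size // 2
    padded = np.pad(image, ((k, k), (k, k)), mode='reflect')

    # View every kernel_size x kernel_size neighborhood without copying data
    windows = sliding_window_view(padded, (kernel_size, kernel_size))

    # Average over the last two axes (the window dimensions)
    return windows.mean(axis=(-2, -1)).astype(image.dtype)

vectorized_img = mean_filter_vectorized(noisy_img, 3)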


Method 2: Use SciPy’s Uniform Filter

SciPy provides a much faster implementation through its ndimage module:

import numpy as np
import matplotlib.pyplot as plt
from scipy import ndimage
from skimage import data, util

# Load a sample image and add Gaussian noise
img = data.camera()
noisy_img = util.random_noise(img, mode='gaussian')
noisy_img = (noisy_img * 255).astype(np.uint8)

# Mean filter
def scipy_mean_filter(image, kernel_size):
    return ndimage.uniform_filter(image, size=kernel_size)

scipy_filtered = scipy_mean_filter(noisy_img, 3)

# Display
plt.figure(figsize=(12, 4))
plt.subplot(131), plt.imshow(img, cmap='gray'), plt.title('Original')
plt.subplot(132), plt.imshow(noisy_img, cmap='gray'), plt.title('Noisy')
plt.subplot(133), plt.imshow(scipy_filtered, cmap='gray'), plt.title('SciPy Filtered')
plt.tight_layout()
plt.show()

I executed this example as well; the plot shows the original, noisy, and SciPy-filtered images side by side.

The uniform_filter function is optimized and generally much faster than our manual implementation, especially for larger images or kernels.
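uniform_filter also lets you control how the image border is handled through its mode parameter. As a rough sketch (the exact correspondence between SciPy's border modes and np.pad is worth verifying for your own data):

# Default border handling ('reflect') repeats the edge pixel when mirroring
default_filtered = ndimage.uniform_filter(noisy_img, size=3)

# 'mirror' reflects about the edge itself, which is closer to the
# np.pad(..., mode='reflect') padding used in the manual version above
mirror_filtered = ndimage.uniform_filter(noisy_img, size=3, mode='mirror')

# Away from the border, the choice of mode makes no difference
print(np.array_equal(default_filtered[5:-5, 5:-5], mirror_filtered[5:-5, 5:-5]))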


Method 3: Mean Filter for Time Series Data

Mean filters aren’t just for images! I often use them to smooth time series data as well, such as weather patterns across California:

import pandas as pd

# Sample time series data (daily temperatures in San Francisco)
dates = pd.date_range('2023-01-01', periods=100)
temperatures = np.random.normal(60, 10, 100) + 10 * np.sin(np.linspace(0, 4*np.pi, 100))
df = pd.DataFrame({'Date': dates, 'Temperature': temperatures})

# Apply rolling mean (another name for mean filter in time series)
window_size = 7  # 7-day moving average
df['Smoothed'] = df['Temperature'].rolling(window=window_size, center=True).mean()

# Plot
plt.figure(figsize=(12, 6))
plt.plot(df['Date'], df['Temperature'], label='Original')
plt.plot(df['Date'], df['Smoothed'], label=f'{window_size}-day Moving Average', linewidth=2)
plt.legend()
plt.title('Temperature Smoothing with Mean Filter')
plt.xlabel('Date')
plt.ylabel('Temperature (°F)')
plt.grid(True)
plt.show()

This kind of filtering is perfect for revealing underlying patterns in noisy time series data.
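If you'd rather not bring in pandas, the same moving average can be sketched with plain NumPy using np.convolve and a uniform kernel. Keep in mind this simple version averages the end points against implicit zeros rather than producing NaN for incomplete windows the way rolling(center=True).mean() does by default:

window_size = 7
kernel = np.ones(window_size) / window_size

# 'same' keeps the output the same length as the input;
# the first and last few points are averaged against implicit zeros
smoothed = np.convolve(temperatures, kernel, mode='same')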


Method 4: Use OpenCV for Mean Filtering

OpenCV is a useful library for image processing and offers an efficient blur function:

import cv2

def opencv_mean_filter(image, kernel_size):
    return cv2.blur(image, (kernel_size, kernel_size))

# Apply OpenCV's mean filter
opencv_filtered = opencv_mean_filter(noisy_img, 3)

# Display results
plt.figure(figsize=(12, 4))
plt.subplot(131), plt.imshow(img, cmap='gray'), plt.title('Original')
plt.subplot(132), plt.imshow(noisy_img, cmap='gray'), plt.title('Noisy')
plt.subplot(133), plt.imshow(opencv_filtered, cmap='gray'), plt.title('OpenCV Filtered')
plt.tight_layout()
plt.show()

OpenCV’s implementation is highly optimized and is typically the best choice for real-time applications.
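For reference, cv2.blur is documented as a normalized box filter, so the same result can be produced with cv2.boxFilter, which also exposes a normalize flag (setting normalize=False gives you the neighborhood sum instead of the mean):

# ddepth=-1 keeps the output in the same depth as the input image
box_filtered = cv2.boxFilter(noisy_img, -1, (3, 3), normalize=True)

print(np.array_equal(box_filtered, opencv_filtered))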


Compare Mean Filter Performance

When I’m working on projects with large datasets like satellite imagery of the Grand Canyon or Yellowstone National Park, performance becomes crucial. Let’s compare the execution time of our different methods:

import time

# Create a larger image for timing comparison
large_img = np.random.randint(0, 256, size=(1000, 1000), dtype=np.uint8)

# Time our custom implementation
start = time.time()
mean_filter(large_img, 3)
custom_time = time.time() - start
print(f"Custom implementation: {custom_time:.4f} seconds")

# Time SciPy implementation
start = time.time()
scipy_mean_filter(large_img, 3)
scipy_time = time.time() - start
print(f"SciPy implementation: {scipy_time:.4f} seconds")

# Time OpenCV implementation
start = time.time()
opencv_mean_filter(large_img, 3)
opencv_time = time.time() - start
print(f"OpenCV implementation: {opencv_time:.4f} seconds")

The OpenCV and SciPy implementations are typically orders of magnitude faster than our custom implementation due to optimizations and potential parallel processing.
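Single time.time() measurements can be noisy. If you want more stable numbers, one option is to repeat each call with timeit and keep the best run (I leave the custom loop version out here because repeating it several times would take a while):

import timeit

# Best of 5 runs, one call per run, for the two fast implementations
scipy_best = min(timeit.repeat(lambda: scipy_mean_filter(large_img, 3), number=1, repeat=5))
opencv_best = min(timeit.repeat(lambda: opencv_mean_filter(large_img, 3), number=1, repeat=5))

print(f"SciPy best of 5:  {scipy_best:.4f} s")
print(f"OpenCV best of 5: {opencv_best:.4f} s")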


Adjust the Kernel Size

One important parameter when applying mean filters is the kernel size. Larger kernels provide more smoothing but can also blur important details. Let’s see the effect of different kernel sizes:

# Apply mean filters with different kernel sizes
kernel_sizes = [3, 5, 7, 9]
filtered_images = [opencv_mean_filter(noisy_img, k) for k in kernel_sizes]

# Display results
plt.figure(figsize=(15, 8))
plt.subplot(151), plt.imshow(noisy_img, cmap='gray'), plt.title('Noisy')

# Use a separate loop variable so the original image (img) isn't overwritten
for i, (filtered, k) in enumerate(zip(filtered_images, kernel_sizes)):
    plt.subplot(152 + i)
    plt.imshow(filtered, cmap='gray')
    plt.title(f'Kernel Size: {k}x{k}')

plt.tight_layout()
plt.show()

As you increase the kernel size, you’ll notice that the image becomes smoother, but details get increasingly blurred. Finding the right balance is key to effective filtering.
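One quick, informal way to put numbers on this trade-off is to measure how far each filtered image drifts from the clean original as the kernel grows. A smaller difference isn't automatically better, since heavy smoothing also erases real detail, but it helps you see the effect:

for k, filtered in zip(kernel_sizes, filtered_images):
    # Mean absolute difference between the filtered result and the clean original
    mad = np.mean(np.abs(filtered.astype(float) - img.astype(float)))
    print(f"{k}x{k} kernel: mean absolute difference from original = {mad:.2f}")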


When to Use Mean Filters

In my experience, mean filters work best when:

  1. You need to reduce random noise in an image
  2. You want to smooth out time series data
  3. You’re preprocessing data for feature detection
  4. You need a simple, fast smoothing algorithm

However, they’re not always the best choice when:

  1. You need to preserve edges or fine details
  2. Your noise is non-Gaussian (for example, salt-and-pepper noise)
  3. You’re working with very high-resolution images where detail preservation is critical

For those cases, I’d recommend exploring other filters like median filters or Gaussian filters.
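For comparison, here's a quick sketch of those two alternatives using SciPy's ndimage module. The median filter handles salt-and-pepper noise much better, and the Gaussian filter weights nearby pixels more heavily than distant ones:

# Median filter: replaces each pixel with the median of its neighborhood
median_filtered = ndimage.median_filter(noisy_img, size=3)

# Gaussian filter: a weighted average that falls off with distance
gaussian_filtered = ndimage.gaussian_filter(noisy_img, sigma=1)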


Mean Filter with Weights (Weighted Average)

Sometimes, I need more control over how the averaging happens. A weighted mean filter allows you to assign different importance to different pixels in the neighborhood:

def weighted_mean_filter(image, kernel):
    # Ensure kernel sums to 1 for proper averaging
    kernel = kernel / np.sum(kernel)
    return ndimage.convolve(image, kernel)

# Create a weighted kernel that emphasizes the center
weighted_kernel = np.array([
    [1, 2, 1],
    [2, 4, 2],
    [1, 2, 1]
])

# Apply weighted mean filter
weighted_filtered = weighted_mean_filter(noisy_img, weighted_kernel)

# Display results
plt.figure(figsize=(12, 4))
plt.subplot(131), plt.imshow(img, cmap='gray'), plt.title('Original')
plt.subplot(132), plt.imshow(noisy_img, cmap='gray'), plt.title('Noisy')
plt.subplot(133), plt.imshow(weighted_filtered, cmap='gray'), plt.title('Weighted Mean Filter')
plt.tight_layout()
plt.show()

The weighted mean filter can give you much more control over the smoothing process and can be adjusted for specific types of images or noise patterns.
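If you prefer OpenCV for this step, the same weighted kernel can be applied with cv2.filter2D. Incidentally, the 1-2-1 kernel above, once normalized, is a common 3x3 approximation of a Gaussian blur:

# Normalize so the weights sum to 1
weighted_kernel_norm = weighted_kernel / weighted_kernel.sum()

# ddepth=-1 keeps the output in the same depth as the input;
# filter2D performs correlation, which equals convolution for a symmetric kernel
cv2_weighted = cv2.filter2D(noisy_img, -1, weighted_kernel_norm)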

I hope you found this tutorial on mean filters in Python helpful. Whether you’re cleaning up satellite imagery of our national parks, smoothing economic data, or just getting started with image processing, mean filters are a fundamental tool you’ll want in your toolbox.
