Implementing In-Memory Cache with Time-to-Live in Python

Dec 05, 2025 · Programming

Keywords: Python | caching | TTL | multithreading

Abstract: This article shows how to implement an in-memory cache with time-to-live (TTL) in Python, particularly for multithreaded applications. It focuses on the expiringdict module, which provides an ordered dictionary with auto-expiring values, and addresses thread safety with locks. Alternatives such as functools.lru_cache with a TTL hash and cachetools' TTLCache are covered for comparison.

In Python applications, especially in multithreaded environments, implementing an in-memory cache with time-to-live (TTL) is a common requirement to avoid redundant work. This article explores various approaches to achieve this, with a focus on using the expiringdict module as the primary solution.

Using ExpiringDict for Caching

The expiringdict module provides an ExpiringDict class, which is an ordered dictionary with auto-expiring values, ideal for caching purposes. To use it, first install the module via pip.

pip install expiringdict

Then, in your code, you can create an instance and manage it with a lock for thread safety.

from expiringdict import ExpiringDict
import threading

# Entries are evicted once the cache holds 100 items (oldest first)
# or 20 seconds after insertion, whichever comes first.
cache = ExpiringDict(max_len=100, max_age_seconds=20)
lock = threading.Lock()

def add_to_cache(key, value):
    with lock:
        cache[key] = value

def get_from_cache(key):
    with lock:
        # Returns None if the key is missing or has expired.
        return cache.get(key)

Wrapping every cache access in a single lock ensures that multiple threads can read and write the cache without racing, and keeps compound operations such as check-then-set atomic.
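For context, the behavior ExpiringDict provides can be hand-rolled with the standard library alone. The sketch below (the class name and the eviction policy are my own choices, not part of expiringdict) stores an expiry timestamp alongside each value and drops expired entries lazily on read:

```python
import threading
import time

class SimpleTTLCache:
    """Minimal thread-safe TTL cache: each entry carries its expiry time."""

    def __init__(self, max_len=100, max_age_seconds=20):
        self._data = {}  # key -> (value, expires_at)
        self._max_len = max_len
        self._max_age = max_age_seconds
        self._lock = threading.Lock()

    def __setitem__(self, key, value):
        with self._lock:
            if key not in self._data and len(self._data) >= self._max_len:
                # Evict the entry closest to expiry (roughly the oldest).
                oldest = min(self._data, key=lambda k: self._data[k][1])
                del self._data[oldest]
            self._data[key] = (value, time.monotonic() + self._max_age)

    def get(self, key, default=None):
        with self._lock:
            item = self._data.get(key)
            if item is None:
                return default
            value, expires_at = item
            if time.monotonic() >= expires_at:
                # Lazily drop the expired entry on read.
                del self._data[key]
                return default
            return value
```

A hand-rolled version like this is mainly useful for understanding the mechanics; for production code a maintained library is preferable.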

Alternative Methods

Other approaches include combining functools.lru_cache with a time-derived "TTL hash" argument, or using the cachetools library, which provides a TTLCache class and a ttl_cache decorator.
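The lru_cache trick works by passing an extra argument whose value changes once per TTL window, so earlier cache entries become unreachable when the window rolls over. A minimal sketch (the function names here are illustrative, not from any library):

```python
import time
from functools import lru_cache

def ttl_hash(ttl_seconds=20):
    # Returns the same value within each `ttl_seconds` window; when the
    # window rolls over, the value changes and old entries stop matching.
    return round(time.time() / ttl_seconds)

@lru_cache(maxsize=100)
def expensive_lookup(key, _ttl_hash=None):
    # `_ttl_hash` only varies the cache key; the body ignores it.
    return key.upper()  # stand-in for real work

result = expensive_lookup("user-42", _ttl_hash=ttl_hash())
```

One drawback of this trick: stale entries are not freed when they expire; they linger in memory until lru_cache evicts them to make room.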

For example, with cachetools:

from cachetools import TTLCache

cache = TTLCache(maxsize=10, ttl=20)
cache['key'] = 'value'

Entries older than ttl seconds are treated as expired: indexing with cache['key'] then raises KeyError, while cache.get('key') returns None. Note that TTLCache itself is not thread-safe, so multithreaded code should guard it with a lock, as shown earlier for ExpiringDict.
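cachetools also ships a decorator form in cachetools.func that pairs a TTL cache with an lru_cache-style interface. A short sketch (fetch_config is a hypothetical expensive function, not part of the library):

```python
from cachetools.func import ttl_cache

@ttl_cache(maxsize=128, ttl=20)
def fetch_config(name):
    # Stand-in for an expensive lookup (database query, HTTP call, ...).
    return {"name": name}

# First call computes the value; repeated calls within 20 s hit the cache.
config = fetch_config("db")
```

Like functools.lru_cache, the decorated function exposes cache_info() and cache_clear(), which is convenient for testing and monitoring.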

Conclusion

While expiringdict is a straightforward solution, cachetools provides more features and is actively maintained. For multithreaded applications, always consider synchronization mechanisms like locks to ensure data integrity.

Copyright Notice: All rights in this article are reserved by the operators of DevGex. Reasonable sharing and citation are welcome; any reproduction, excerpting, or re-publication without prior permission is prohibited.