functools provides tools for working with functions as first-class objects. import functools. lru_cache: @functools.lru_cache(maxsize=128) — memoize with LRU eviction; fn.cache_info(); fn.cache_clear(). cache: @functools.cache (Python 3.9+) — unbounded memoization (equivalent to lru_cache(maxsize=None)). cached_property: @functools.cached_property — computed once, stored per instance. partial: double = functools.partial(operator.mul, 2) — bind leading positional args. partialmethod: class C: method = functools.partialmethod(base_method, arg=val). reduce: functools.reduce(lambda acc, x: acc + x, items, initializer) — left fold. wraps: @functools.wraps(fn) — preserves __name__, __doc__, __wrapped__. total_ordering: @functools.total_ordering + define __eq__ plus one of __lt__/__gt__/__le__/__ge__ — auto-generates the rest. singledispatch: @functools.singledispatch; @fn.register(int); @fn.register(str) — function overloading by type. singledispatchmethod: the method variant for classes. cmp_to_key: sorted(items, key=functools.cmp_to_key(cmp_fn)) — convert an old-style comparator. update_wrapper: functools.update_wrapper(wrapper, wrapped) — manual version of @wraps. WRAPPER_ASSIGNMENTS/WRAPPER_UPDATES: tuples of attribute names copied/updated by wraps. Claude Code generates memoized helpers, decorator factories, dispatch registries, and pipeline composers.
CLAUDE.md for functools
## functools Stack
- Stdlib: import functools, operator
- Memo: @functools.lru_cache(maxsize=N) | @functools.cache (unbounded, 3.9+)
- Lazy: @functools.cached_property — compute on first access, cache on instance
- Bind: functools.partial(fn, *args, **kwargs) — create specialized callables
- Decorator: @functools.wraps(fn) inside every wrapper — preserves metadata
- Dispatch: @functools.singledispatch + @fn.register(Type) — type-based overloads
- Order: @functools.total_ordering — define __eq__ + one comparison method
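Two helpers named in the summary but not exercised in the pipeline file below are cmp_to_key and update_wrapper. A minimal sketch (the comparator and function names here are illustrative, not part of the pipeline):

```python
import functools

# Old-style comparator: returns negative / zero / positive, like C's strcmp.
def by_length_then_alpha(a: str, b: str) -> int:
    if len(a) != len(b):
        return len(a) - len(b)
    return (a > b) - (a < b)  # alphabetical tie-break

# cmp_to_key converts the comparator into a key function for sorted().
words = sorted(["pear", "fig", "apple", "kiwi"],
               key=functools.cmp_to_key(by_length_then_alpha))
# → ["fig", "kiwi", "pear", "apple"]

# update_wrapper is the imperative form of @functools.wraps.
def shout(fn):
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs).upper()
    functools.update_wrapper(wrapper, fn)  # copies __name__/__doc__, sets __wrapped__
    return wrapper

@shout
def greet(name: str) -> str:
    """Return a greeting."""
    return f"hello {name}"
```

After decoration, greet.__name__ is still "greet" and greet.__wrapped__ gives back the undecorated function — the same metadata preservation @wraps provides inside every wrapper in the file below.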
## functools Higher-Order Function Pipeline
# app/hof.py — lru_cache, cached_property, partial, reduce, wraps, singledispatch
from __future__ import annotations
import functools
import hashlib
import logging
import operator
import time
from dataclasses import dataclass
from typing import Any, Callable, Iterable, TypeVar
log = logging.getLogger(__name__)
T = TypeVar("T")
R = TypeVar("R")
# ─────────────────────────────────────────────────────────────────────────────
# 1. Memoization
# ─────────────────────────────────────────────────────────────────────────────
@functools.lru_cache(maxsize=512)
def fibonacci(n: int) -> int:
"""
Classic memoized Fibonacci.
Example:
fibonacci(50) # fast
fibonacci.cache_info() # CacheInfo(hits=48, misses=51, ...)
"""
if n < 2:
return n
return fibonacci(n - 1) + fibonacci(n - 2)
def memoize(maxsize: int = 128):
"""
Decorator factory: @memoize() or @memoize(maxsize=256).
Example:
@memoize(maxsize=64)
def expensive(x: int) -> int:
return x ** 2
"""
def decorator(fn: Callable) -> Callable:
cached = functools.lru_cache(maxsize=maxsize)(fn)
@functools.wraps(fn)
def wrapper(*args, **kwargs):
return cached(*args, **kwargs)
wrapper.cache_info = cached.cache_info # type: ignore[attr-defined]
wrapper.cache_clear = cached.cache_clear # type: ignore[attr-defined]
return wrapper
return decorator
def memoize_ttl(ttl: float = 60.0):
"""
Memoize with a time-to-live expiry per call signature.
Example:
@memoize_ttl(ttl=300.0)
def get_remote_config(env: str) -> dict:
return api.fetch_config(env)
"""
def decorator(fn: Callable) -> Callable:
cache: dict[Any, tuple[float, Any]] = {}
@functools.wraps(fn)
def wrapper(*args, **kwargs):
key = (args, tuple(sorted(kwargs.items())))
now = time.monotonic()
if key in cache:
expire, value = cache[key]
if now < expire:
return value
value = fn(*args, **kwargs)
cache[key] = (now + ttl, value)
return value
def invalidate():
cache.clear()
wrapper.invalidate = invalidate # type: ignore[attr-defined]
return wrapper
return decorator
# ─────────────────────────────────────────────────────────────────────────────
# 2. cached_property
# ─────────────────────────────────────────────────────────────────────────────
@dataclass
class TextDocument:
"""
Document where expensive derived properties are computed lazily and cached.
Example:
doc = TextDocument("Hello world hello world")
print(doc.word_count) # 4 — computed once
print(doc.unique_words) # {"hello", "world"} — computed once
print(doc.checksum) # sha256 hex — computed once
"""
text: str
@functools.cached_property
def words(self) -> list[str]:
return self.text.lower().split()
@functools.cached_property
def word_count(self) -> int:
return len(self.words)
@functools.cached_property
def unique_words(self) -> set[str]:
return set(self.words)
@functools.cached_property
def checksum(self) -> str:
return hashlib.sha256(self.text.encode()).hexdigest()
@functools.cached_property
def frequency(self) -> dict[str, int]:
from collections import Counter
return dict(Counter(self.words))
# ─────────────────────────────────────────────────────────────────────────────
# 3. partial
# ─────────────────────────────────────────────────────────────────────────────
def make_adder(n: float) -> Callable[[float], float]:
"""
Create a function that adds n to its argument.
Example:
add5 = make_adder(5)
add5(10) # 15
"""
return functools.partial(operator.add, n)
def make_multiplier(n: float) -> Callable[[float], float]:
"""
Create a function that multiplies its argument by n.
Example:
double = make_multiplier(2)
double(7) # 14
"""
return functools.partial(operator.mul, n)
def with_defaults(fn: Callable, **defaults) -> Callable:
"""
Return a copy of fn with keyword defaults pre-filled.
Example:
json_compact = with_defaults(json.dumps, separators=(",", ":"))
json_compact({"a": 1}) # '{"a":1}'
"""
return functools.partial(fn, **defaults)
# ─────────────────────────────────────────────────────────────────────────────
# 4. Decorator utilities
# ─────────────────────────────────────────────────────────────────────────────
def retry(
max_attempts: int = 3,
exceptions: tuple[type[Exception], ...] = (Exception,),
delay: float = 0.5,
backoff: float = 2.0,
):
"""
Decorator: retry on specified exceptions with exponential backoff.
Example:
@retry(max_attempts=4, exceptions=(IOError, TimeoutError), delay=1.0)
def unstable_api_call(url: str) -> dict:
return requests.get(url).json()
"""
def decorator(fn: Callable) -> Callable:
@functools.wraps(fn)
def wrapper(*args, **kwargs):
wait = delay
for attempt in range(1, max_attempts + 1):
try:
return fn(*args, **kwargs)
except exceptions as exc:
if attempt == max_attempts:
raise
log.warning(
"Retry %d/%d for %s: %s — waiting %.1fs",
attempt, max_attempts, fn.__name__, exc, wait,
)
time.sleep(wait)
wait *= backoff
return wrapper
return decorator
def timed(fn: Callable) -> Callable:
"""
Decorator: log execution time of any function.
Example:
@timed
def train_model(data): ...
"""
@functools.wraps(fn)
def wrapper(*args, **kwargs):
t0 = time.perf_counter()
result = fn(*args, **kwargs)
elapsed = time.perf_counter() - t0
log.debug("%s completed in %.3fs", fn.__qualname__, elapsed)
return result
return wrapper
def once(fn: Callable) -> Callable:
"""
Decorator: run function at most once; return same value on repeat calls.
Example:
@once
def initialize_db():
connect_and_migrate()
"""
sentinel = object()
result = [sentinel]
    import threading  # stdlib; imported locally so this decorator stays self-contained
    lock = threading.Lock()
@functools.wraps(fn)
def wrapper(*args, **kwargs):
if result[0] is sentinel:
with lock:
if result[0] is sentinel:
result[0] = fn(*args, **kwargs)
return result[0]
return wrapper
def deprecated(message: str = ""):
"""
Decorator: emit a DeprecationWarning when function is called.
Example:
@deprecated("Use new_fn() instead")
def old_fn(): ...
"""
import warnings
def decorator(fn: Callable) -> Callable:
msg = message or f"{fn.__qualname__} is deprecated"
@functools.wraps(fn)
def wrapper(*args, **kwargs):
warnings.warn(msg, DeprecationWarning, stacklevel=2)
return fn(*args, **kwargs)
return wrapper
return decorator
# ─────────────────────────────────────────────────────────────────────────────
# 5. total_ordering
# ─────────────────────────────────────────────────────────────────────────────
@functools.total_ordering
class Version:
"""
Comparable semantic version using total_ordering.
Example:
v1 = Version("2.1.0")
v2 = Version("2.0.3")
assert v1 > v2
assert sorted([v2, v1]) == [v2, v1] # ascending
"""
def __init__(self, version_str: str) -> None:
parts = [int(p) for p in version_str.split(".", 2)]
self.major = parts[0] if len(parts) > 0 else 0
self.minor = parts[1] if len(parts) > 1 else 0
self.patch = parts[2] if len(parts) > 2 else 0
def _tuple(self) -> tuple[int, int, int]:
return (self.major, self.minor, self.patch)
def __eq__(self, other: object) -> bool:
if not isinstance(other, Version):
return NotImplemented
return self._tuple() == other._tuple()
def __lt__(self, other: object) -> bool:
if not isinstance(other, Version):
return NotImplemented
return self._tuple() < other._tuple()
def __repr__(self) -> str:
return f"Version({self.major}.{self.minor}.{self.patch})"
def __hash__(self) -> int:
return hash(self._tuple())
# ─────────────────────────────────────────────────────────────────────────────
# 6. singledispatch
# ─────────────────────────────────────────────────────────────────────────────
@functools.singledispatch
def serialize(obj: Any) -> str:
"""
Generic serializer — dispatch by type.
Example:
serialize(42) # "42"
serialize(3.14) # "3.140000"
serialize([1, 2, 3]) # "[1, 2, 3]"
serialize({"a": 1}) # '{"a": 1}'
serialize(True) # "true"
"""
return repr(obj)
@serialize.register(int)
def _serialize_int(obj: int) -> str:
return str(obj)
@serialize.register(float)
def _serialize_float(obj: float) -> str:
return f"{obj:f}"
@serialize.register(bool)
def _serialize_bool(obj: bool) -> str:
return "true" if obj else "false"
@serialize.register(list)
@serialize.register(tuple)
def _serialize_seq(obj) -> str:
items = ", ".join(serialize(v) for v in obj)
return f"[{items}]"
@serialize.register(dict)
def _serialize_dict(obj: dict) -> str:
import json
return json.dumps(obj)
# ─────────────────────────────────────────────────────────────────────────────
# 7. reduce / fold
# ─────────────────────────────────────────────────────────────────────────────
def compose(*fns: Callable) -> Callable:
"""
Function composition: compose(f, g, h)(x) == f(g(h(x))).
Example:
pipeline = compose(str.upper, str.strip, lambda s: s.replace("-", " "))
pipeline(" hello-world ") # "HELLO WORLD"
"""
def apply(acc, fn):
return fn(acc)
def composed(x):
return functools.reduce(apply, reversed(fns), x)
return composed
def pipe(*fns: Callable) -> Callable:
"""
Left-to-right composition: pipe(f, g, h)(x) == h(g(f(x))).
Example:
clean = pipe(str.strip, str.lower, lambda s: s.replace(" ", "_"))
clean(" Hello World ") # "hello_world"
"""
def apply(acc, fn):
return fn(acc)
def piped(x):
return functools.reduce(apply, fns, x)
return piped
def fold(fn: Callable[[R, T], R], items: Iterable[T], initial: R) -> R:
"""
Left fold with explicit initial value.
Example:
product = fold(operator.mul, [1,2,3,4,5], 1) # 120
joined = fold(lambda a, s: a + ", " + s, ["a","b","c"], "")
"""
return functools.reduce(fn, items, initial)
# ─────────────────────────────────────────────────────────────────────────────
# Demo
# ─────────────────────────────────────────────────────────────────────────────
if __name__ == "__main__":
print("=== functools demo ===")
print("\n--- lru_cache (fibonacci) ---")
for n in [10, 20, 30]:
print(f" fib({n}) = {fibonacci(n)}")
info = fibonacci.cache_info()
print(f" cache_info: hits={info.hits}, misses={info.misses}, size={info.currsize}")
print("\n--- memoize_ttl ---")
call_count = [0]
@memoize_ttl(ttl=5.0)
def slow_fn(x: int) -> int:
call_count[0] += 1
return x * x
slow_fn(5)
slow_fn(5) # cache hit
slow_fn(6)
print(f" actual calls: {call_count[0]} (expected 2, not 3)")
print("\n--- cached_property ---")
doc = TextDocument("The quick brown fox jumps over the lazy dog")
print(f" words: {doc.word_count} unique: {len(doc.unique_words)}")
print(f" checksum: {doc.checksum[:12]}...")
print("\n--- partial ---")
add10 = make_adder(10.0)
triple = make_multiplier(3.0)
print(f" add10(5) = {add10(5.0)}")
print(f" triple(7) = {triple(7.0)}")
print("\n--- timed decorator ---")
@timed
def slow(n: int) -> int:
time.sleep(0.05)
return n * n
logging.basicConfig(level=logging.DEBUG)
result = slow(7)
print(f" slow(7) = {result}")
print("\n--- total_ordering ---")
versions = [Version("2.0.3"), Version("1.5.0"), Version("2.1.0"), Version("1.5.0")]
print(f" sorted: {sorted(versions)}")
print(f" 2.1.0 > 2.0.3: {Version('2.1.0') > Version('2.0.3')}")
print("\n--- singledispatch ---")
values: list[Any] = [42, 3.14, True, [1, 2, 3], {"k": "v"}]
for v in values:
print(f" serialize({v!r}) = {serialize(v)!r}")
print("\n--- compose / pipe ---")
normalize = pipe(str.strip, str.lower, lambda s: s.replace(" ", "_"))
print(f" pipe: ' Hello World ' → {normalize(' Hello World ')!r}")
pipeline = compose(str.upper, str.strip)
print(f" compose: ' hello ' → {pipeline(' hello ')!r}")
print("\n--- fold ---")
product = fold(operator.mul, [1, 2, 3, 4, 5], 1)
print(f" product([1..5]) = {product}")
print("\n=== done ===")
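The summary at the top also names singledispatchmethod and partialmethod, which the pipeline file above does not exercise. A minimal self-contained sketch (the Formatter and Cell classes are illustrative, not part of app/hof.py):

```python
import functools

class Formatter:
    # singledispatchmethod: dispatch on the type of the first argument after self.
    @functools.singledispatchmethod
    def render(self, value) -> str:
        return repr(value)  # generic fallback

    @render.register
    def _(self, value: int) -> str:
        return f"int:{value}"

    @render.register
    def _(self, value: list) -> str:
        return "[" + ", ".join(self.render(v) for v in value) + "]"

class Cell:
    def set_state(self, state: bool) -> None:
        self.state = state

    # partialmethod: bind arguments of another method at class-definition time.
    set_alive = functools.partialmethod(set_state, True)
    set_dead = functools.partialmethod(set_state, False)
```

Formatter().render(5) returns "int:5" while unregistered types fall through to repr(); Cell().set_alive() calls set_state(True) with no arguments at the call site.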
For the toolz alternative — toolz (PyPI) provides a comprehensive functional programming toolkit: curry, compose, pipe, memoize, juxt, frequencies, groupby, partition_all, and 60+ more. Python's stdlib functools covers the most essential higher-order tools (lru_cache, partial, reduce, wraps, singledispatch) without external dependencies. Use toolz (or its faster Cython variant cytoolz) when building data processing pipelines where point-free functional style matters; use functools for decorators, memoization, and dispatch in production libraries where minimizing dependencies is important.

For the attrs alternative — attrs provides class definition via @attr.define, ordered comparisons via @attr.s(order=True), and per-field validators — features that partly overlap with functools.total_ordering and, in spirit, with functools.cached_property's per-instance caching. functools is stdlib-only and function-oriented rather than class-oriented. Use attrs to define data-centric classes with rich equality, ordering, and validation in one decorator; use functools when you need to decorate arbitrary functions or add capabilities to existing classes without changing their definition.

The Claude Skills 360 bundle includes functools skill sets covering fibonacci/memoize()/memoize_ttl() caching, TextDocument cached_property patterns, make_adder()/make_multiplier()/with_defaults() partial binding, retry()/timed()/once()/deprecated() decorator factories, Version total_ordering, serialize() singledispatch registry, and compose()/pipe()/fold() functional combinators. Start with the free tier to try higher-order functions and functools decorator pipeline code generation.