
Claude Code for aiocache: Async Caching in Python

Published: April 19, 2028
Read time: 5 min
By: Claude Skills 360

aiocache provides async cache backends behind a unified API. Install with pip install aiocache.

- Backends: memory — from aiocache import Cache; cache = Cache(); Redis — Cache(Cache.REDIS, endpoint="localhost", port=6379); Memcached — Cache(Cache.MEMCACHED, endpoint="localhost")
- Core operations: await cache.get("key") | await cache.set("key", value, ttl=60) | await cache.delete("key") | await cache.exists("key") | await cache.increment("counter", delta=1)
- Bulk: await cache.multi_get(["k1", "k2"]) | await cache.multi_set([("k1", v1), ("k2", v2)], ttl=60) | await cache.clear()
- TTL: set(..., ttl=60) — in seconds
- Namespace: Cache(namespace="myapp") — prefixes all keys
- Decorator: from aiocache import cached; @cached(ttl=120, key="fixed_key"); dynamic keys via @cached(ttl=60, key_builder=lambda fn, *a, **kw: f"{fn.__name__}:{a}")
- Serializers: from aiocache.serializers import JsonSerializer; Cache(serializer=JsonSerializer()); also PickleSerializer and MsgPackSerializer
- Plugins: from aiocache.plugins import TimingPlugin, HitMissRatioPlugin
- Config: caches.set_config({...}) | caches.get("default") | @cached(alias="default")
- Cleanup: await cache.close()

Claude Code generates aiocache decorators, async cache layers, and FastAPI response caching.

CLAUDE.md for aiocache

## aiocache Stack
- Version: aiocache >= 0.12 | pip install aiocache[redis,memcached]
- Memory: Cache() | Redis: Cache(Cache.REDIS, endpoint="host", port=6379)
- Get/Set: await cache.get(key) | await cache.set(key, val, ttl=60)
- Decorator: @cached(ttl=60) | @cached(key_builder=lambda fn,*a,**kw: ...)
- Serializer: JsonSerializer | PickleSerializer | MsgPackSerializer
- Namespace: Cache(namespace="prefix") — auto-prefixes all keys

aiocache Async Cache Pipeline

# app/cache.py — aiocache get/set, @cached decorator, TTL, serializers, cache-aside
from __future__ import annotations

import asyncio
import functools
import time
from typing import Any, Callable

from aiocache import Cache, cached
from aiocache.serializers import JsonSerializer


# ─────────────────────────────────────────────────────────────────────────────
# 1. Cache factory
# ─────────────────────────────────────────────────────────────────────────────

def make_memory_cache(
    ttl: int = 300,
    namespace: str = "",
    max_size: int | None = None,
) -> Cache:
    """
    Create an in-process memory cache.
    max_size: if set, use a custom LRU-like eviction (aiocache doesn't enforce
    this directly; use for documentation purposes with a wrapper).
    """
    kwargs: dict[str, Any] = {
        "serializer": JsonSerializer(),
        "ttl": ttl,
    }
    if namespace:
        kwargs["namespace"] = namespace
    return Cache(**kwargs)  # default is SimpleMemoryCache


def make_redis_cache(
    host: str = "localhost",
    port: int = 6379,
    db: int = 0,
    password: str | None = None,
    ttl: int = 300,
    namespace: str = "",
) -> Cache:
    """
    Create a Redis-backed aiocache instance.
    Requires: pip install aiocache[redis]
    """
    kwargs: dict[str, Any] = {
        "endpoint": host,
        "port": port,
        "db": db,
        "serializer": JsonSerializer(),
        "ttl": ttl,
    }
    if password:
        kwargs["password"] = password
    if namespace:
        kwargs["namespace"] = namespace
    return Cache(Cache.REDIS, **kwargs)


# ─────────────────────────────────────────────────────────────────────────────
# 2. Core cache operations
# ─────────────────────────────────────────────────────────────────────────────

async def get(cache: Cache, key: str, default: Any = None) -> Any:
    """Get a cached value, returning default if missing."""
    result = await cache.get(key)
    return result if result is not None else default


async def set(cache: Cache, key: str, value: Any, ttl: int | None = None) -> bool:
    """Set a value. Returns True on success."""
    kwargs: dict[str, Any] = {}
    if ttl is not None:
        kwargs["ttl"] = ttl
    return await cache.set(key, value, **kwargs)


async def get_or_set(
    cache: Cache,
    key: str,
    loader: Callable,
    ttl: int | None = None,
) -> Any:
    """
    Get from cache; if missing, call loader() and cache the result.

    Example:
        user = await get_or_set(cache, f"user:{uid}", lambda: fetch_user(uid), ttl=60)
    """
    value = await cache.get(key)
    if value is None:
        value = loader()
        if asyncio.iscoroutine(value):  # loader may be sync or async
            value = await value
        await set(cache, key, value, ttl=ttl)
    return value


async def invalidate(cache: Cache, *keys: str) -> None:
    """Delete one or more cache keys."""
    for key in keys:
        await cache.delete(key)


async def invalidate_prefix(cache: Cache, prefix: str, known_keys: list[str]) -> int:
    """
    Invalidate all keys starting with prefix (from a known set).
    Returns count deleted.
    """
    count = 0
    for key in known_keys:
        if key.startswith(prefix):
            await cache.delete(key)
            count += 1
    return count


async def mget(cache: Cache, keys: list[str], default: Any = None) -> list[Any]:
    """Get multiple keys; missing keys return default."""
    values = await cache.multi_get(keys)
    return [v if v is not None else default for v in values]


async def mset(cache: Cache, mapping: dict[str, Any], ttl: int | None = None) -> None:
    """Set multiple key-value pairs."""
    pairs = list(mapping.items())
    kwargs: dict[str, Any] = {}
    if ttl is not None:
        kwargs["ttl"] = ttl
    await cache.multi_set(pairs, **kwargs)


# ─────────────────────────────────────────────────────────────────────────────
# 3. Decorator helpers
# ─────────────────────────────────────────────────────────────────────────────

def cache_result(
    ttl: int = 60,
    key_prefix: str = "",
    namespace: str = "",
):
    """
    Decorator: cache the return value of an async function.
    Cache key = "{key_prefix}{fn.__name__}:{args_hash}".

    Usage:
        @cache_result(ttl=300, key_prefix="api:")
        async def fetch_user(user_id: int) -> dict:
            return await db.get_user(user_id)
    """
    def key_builder(fn, *args, **kwargs):
        arg_part = ":".join(str(a) for a in args)
        kwarg_part = ":".join(f"{k}={v}" for k, v in sorted(kwargs.items()))
        parts = [key_prefix + fn.__name__, arg_part, kwarg_part]
        return ":".join(p for p in parts if p)

    def decorator(fn: Callable) -> Callable:
        ns_kwargs: dict[str, Any] = {}
        if namespace:
            ns_kwargs["namespace"] = namespace
        return cached(ttl=ttl, key_builder=key_builder, **ns_kwargs)(fn)

    return decorator


def invalidate_on_call(cache_obj: Cache, *key_templates: str):
    """
    Decorator: invalidate specific cache keys when the decorated function is called.

    Usage:
        @invalidate_on_call(cache, "user:{0}", "user_list")
        async def update_user(user_id: int, data: dict):
            ...
        # Calling update_user(42, data) will delete keys "user:42" and "user_list"
    """
    def decorator(fn: Callable) -> Callable:
        @functools.wraps(fn)
        async def wrapper(*args, **kwargs):
            result = await fn(*args, **kwargs)
            for template in key_templates:
                try:
                    key = template.format(*args, **kwargs)
                except (IndexError, KeyError):
                    key = template
                await cache_obj.delete(key)
            return result
        return wrapper
    return decorator


# ─────────────────────────────────────────────────────────────────────────────
# 4. Cache-aside pattern
# ─────────────────────────────────────────────────────────────────────────────

class CacheAside:
    """
    Cache-aside pattern: read from cache, fallback to data source, write-through.

    Usage:
        store = CacheAside(cache, ttl=120)
        user = await store.get("user:42", loader=lambda: fetch_user(42))
        await store.set("user:42", updated_user)
        await store.invalidate("user:42")
    """

    def __init__(self, cache: Cache, ttl: int = 60, namespace: str = ""):
        self._cache = cache
        self._ttl = ttl
        self._ns = namespace

    def _key(self, key: str) -> str:
        return f"{self._ns}:{key}" if self._ns else key

    async def get(self, key: str, loader: Callable | None = None) -> Any:
        k = self._key(key)
        value = await self._cache.get(k)
        if value is None and loader is not None:
            value = loader()
            if asyncio.iscoroutine(value):  # loader may be sync or async
                value = await value
            if value is not None:
                await self._cache.set(k, value, ttl=self._ttl)
        return value

    async def set(self, key: str, value: Any, ttl: int | None = None) -> None:
        await self._cache.set(self._key(key), value, ttl=ttl or self._ttl)

    async def invalidate(self, *keys: str) -> None:
        for key in keys:
            await self._cache.delete(self._key(key))

    async def get_many(self, keys: list[str]) -> dict[str, Any]:
        full_keys = [self._key(k) for k in keys]
        values = await self._cache.multi_get(full_keys)
        return {k: v for k, v in zip(keys, values) if v is not None}


# ─────────────────────────────────────────────────────────────────────────────
# Demo
# ─────────────────────────────────────────────────────────────────────────────

async def demo():
    print("=== In-memory cache basics ===")
    cache = make_memory_cache(ttl=60, namespace="demo")

    await set(cache, "greeting", "Hello, World!")
    value = await get(cache, "greeting")
    print(f"get: {value!r}")

    missing = await get(cache, "nothing", default="N/A")
    print(f"missing key: {missing!r}")

    print("\n=== Multi get/set ===")
    await mset(cache, {"a": 1, "b": 2, "c": 3}, ttl=30)
    results = await mget(cache, ["a", "b", "c", "d"])
    print(f"mget: {results}")

    print("\n=== get_or_set ===")
    call_count = [0]

    async def expensive_loader():
        call_count[0] += 1
        await asyncio.sleep(0)  # simulate async work
        return {"computed": True, "ts": time.time()}

    v1 = await get_or_set(cache, "computed", expensive_loader, ttl=60)
    v2 = await get_or_set(cache, "computed", expensive_loader, ttl=60)
    print(f"loader called: {call_count[0]} times (should be 1)")
    print(f"same result: {v1 == v2}")

    print("\n=== @cached decorator ===")
    @cached(ttl=60, serializer=JsonSerializer())
    async def fetch_user(user_id: int) -> dict:
        return {"id": user_id, "name": f"User {user_id}"}

    u1 = await fetch_user(42)
    u2 = await fetch_user(42)
    print(f"user: {u1}")

    print("\n=== @cache_result decorator ===")
    call_n = [0]

    @cache_result(ttl=120, key_prefix="calc:")
    async def compute(x: int, y: int) -> int:
        call_n[0] += 1
        return x * y + x + y

    r1 = await compute(3, 4)
    r2 = await compute(3, 4)
    r3 = await compute(5, 6)
    print(f"compute(3,4)={r1}, calls={call_n[0]}")

    print("\n=== CacheAside ===")
    store = CacheAside(cache, ttl=60, namespace="users")
    db_calls = [0]

    async def load_user(uid):
        db_calls[0] += 1
        return {"id": uid, "name": "Alice", "email": "[email protected]"}

    u = await store.get("42", loader=lambda: load_user(42))
    u2 = await store.get("42", loader=lambda: load_user(42))
    print(f"DB calls: {db_calls[0]} (should be 1)")
    print(f"user: {u}")

    await store.invalidate("42")
    u3 = await store.get("42", loader=lambda: load_user(42))
    print(f"After invalidate, DB calls: {db_calls[0]} (should be 2)")

    await cache.close()


if __name__ == "__main__":
    asyncio.run(demo())

For the cachetools alternative — cachetools provides sync LRU/TTL/LFU caches in memory only, with function decorators for sync code. aiocache is built for asyncio with await cache.get(key) semantics, multiple backends (memory, Redis, Memcached) behind a unified API, and async-safe @cached decorators for coroutines.

For the redis / aioredis direct alternative — using Redis directly gives you full control but means writing key management, serialization, and TTL logic yourself. aiocache wraps these in a consistent API with pluggable serializers, and backends can be swapped without changing application code.

The Claude Skills 360 bundle includes aiocache skill sets covering the make_memory_cache()/make_redis_cache() factories, the get/set/get_or_set/invalidate/mget/mset helpers, the @cached decorator, the cache_result() key-building decorator, invalidate_on_call() write-through invalidation, the CacheAside pattern with loader/invalidate, JsonSerializer/PickleSerializer, and an async demo with call-count verification. Start with the free tier to try async caching code generation.
