
Claude Code for contextlib: Context Manager Utilities in Python

Published: July 2, 2028
Read time: 5 min
By: Claude Skills 360

contextlib provides utilities for creating and working with with-statement context managers: from contextlib import contextmanager, asynccontextmanager, ExitStack, suppress, redirect_stdout, nullcontext.

- contextmanager: @contextmanager def cm(): setup; try: yield value; finally: teardown
- asynccontextmanager: @asynccontextmanager async def acm(): setup; try: yield value; finally: await teardown
- ExitStack: with ExitStack() as stack: f = stack.enter_context(cm()); stack.callback(fn) — dynamic cleanup
- AsyncExitStack: the async equivalent of ExitStack
- suppress: with suppress(FileNotFoundError, KeyError): risky_op() — swallow specific exceptions
- redirect_stdout: with redirect_stdout(buf): fn() — capture stdout
- redirect_stderr: with redirect_stderr(io.StringIO()) as buf: fn()
- closing: with closing(urllib.request.urlopen(url)) as r: data = r.read() — calls .close() on exit
- nullcontext: with (real_cm() if cond else nullcontext(val)): — no-op placeholder
- chdir: with contextlib.chdir("/tmp"): os.getcwd() (Python 3.11+)
- AbstractContextManager: base class providing default __enter__/__exit__
- aclosing: async with aclosing(agen) as it: — calls aclose() on an async generator
- ContextDecorator: inherit it (or use @contextmanager) to make a context manager usable as a decorator

Claude Code generates resource managers, test fixtures, output capture, and multi-cleanup stacks.
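To make the cheat-sheet concrete, here is a minimal stdlib-only sketch (the `tag` helper is illustrative) exercising @contextmanager, redirect_stdout, suppress, and ExitStack together:

```python
import io
from contextlib import ExitStack, contextmanager, redirect_stdout, suppress

@contextmanager
def tag(name):
    # setup runs before the block; teardown runs after, even on error
    print(f"<{name}>")
    try:
        yield name
    finally:
        print(f"</{name}>")

buf = io.StringIO()
with redirect_stdout(buf):       # capture everything printed in the block
    with tag("body"):
        print("hello")
print(buf.getvalue())            # "<body>\nhello\n</body>\n"

with suppress(KeyError):         # swallow just this exception type
    {}["missing"]

with ExitStack() as stack:       # enter a dynamic number of CMs
    names = [stack.enter_context(tag(n)) for n in ("a", "b")]
```

Note the ExitStack unwinds in reverse order on exit, exactly like equivalently nested with-statements.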

CLAUDE.md for contextlib

## contextlib Stack
- Stdlib: from contextlib import contextmanager, asynccontextmanager, ExitStack, suppress
- Simple CM: @contextmanager def cm(): setup; try: yield val; finally: teardown
- Async CM: @asynccontextmanager async def acm(): setup; try: yield val; finally: await teardown
- Dynamic: with ExitStack() as stack: stack.enter_context(cm) | stack.callback(fn)
- Suppress: with suppress(ExcType): risky()  — preferred over bare except: pass
- Capture: with redirect_stdout(io.StringIO()) as out: fn(); out.getvalue()
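As a quick illustration of the "Suppress" bullet, a hedged sketch (the `stale.lock` filename is hypothetical) showing why suppress reads better than try/except/pass:

```python
import os
from contextlib import suppress

# Instead of:
#     try:
#         os.remove("stale.lock")
#     except FileNotFoundError:
#         pass
# write:
with suppress(FileNotFoundError):
    os.remove("stale.lock")      # hypothetical file; missing is fine

# suppress accepts multiple exception types, like an except clause
with suppress(KeyError, IndexError):
    value = {"a": 1}["b"]        # raises KeyError, silently swallowed
```

Unlike a bare `except: pass`, only the named exception types are swallowed; anything else still propagates.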

contextlib Resource Pipeline

# app/resources.py — contextmanager, ExitStack, suppress, redirect, timer, temp
from __future__ import annotations

import io
import logging
import os
import sys
import tempfile
import threading
import time
from contextlib import (
    AbstractContextManager,
    AsyncExitStack,
    ExitStack,
    asynccontextmanager,
    contextmanager,
    nullcontext,
    redirect_stderr,
    redirect_stdout,
    suppress,
)
from pathlib import Path
from typing import AsyncIterator, Generator

log = logging.getLogger(__name__)


# ─────────────────────────────────────────────────────────────────────────────
# 1. Timer context managers
# ─────────────────────────────────────────────────────────────────────────────

@contextmanager
def timer(name: str = "", logger=None) -> Generator[dict, None, None]:
    """
    Measure elapsed time; yield a dict updated on exit.

    Example:
        with timer("database query") as t:
            rows = db.fetchall(sql)
        print(f"Query took {t['elapsed']:.3f}s")
    """
    _log  = logger or log
    info: dict = {"start": None, "elapsed": None}
    info["start"] = time.perf_counter()
    try:
        yield info
    finally:
        info["elapsed"] = time.perf_counter() - info["start"]
        if name:
            _log.debug("%s: %.3fs", name, info["elapsed"])


class Stopwatch(AbstractContextManager):
    """
    Reusable timer that records laps.

    Example:
        sw = Stopwatch()
        with sw:
            do_phase_1()
            sw.lap("phase 1")
            do_phase_2()
            sw.lap("phase 2")
        print(sw.laps)           # [("phase 1", 0.12), ("phase 2", 0.34)]
        print(sw.elapsed)        # 0.46
    """

    def __init__(self) -> None:
        self._start = 0.0
        self._laps: list[tuple[str, float]] = []

    def __enter__(self) -> Stopwatch:
        self._start = time.perf_counter()
        self._laps  = []
        return self

    def __exit__(self, *_) -> None:
        pass  # elapsed calculated lazily

    def lap(self, name: str = "") -> float:
        elapsed = time.perf_counter() - self._start
        self._laps.append((name, elapsed))
        return elapsed

    @property
    def elapsed(self) -> float:
        return time.perf_counter() - self._start

    @property
    def laps(self) -> list[tuple[str, float]]:
        return list(self._laps)


# ─────────────────────────────────────────────────────────────────────────────
# 2. Output capture
# ─────────────────────────────────────────────────────────────────────────────

@contextmanager
def capture_output() -> Generator[dict[str, str], None, None]:
    """
    Capture stdout and stderr as strings.

    Example:
        with capture_output() as out:
            print("hello")
            print("error", file=sys.stderr)
        print(out["stdout"])  # "hello\n"
        print(out["stderr"])  # "error\n"
    """
    stdout_buf = io.StringIO()
    stderr_buf = io.StringIO()
    output: dict[str, str] = {}
    with redirect_stdout(stdout_buf), redirect_stderr(stderr_buf):
        try:
            yield output
        finally:
            output["stdout"] = stdout_buf.getvalue()
            output["stderr"] = stderr_buf.getvalue()


@contextmanager
def silence() -> Generator[None, None, None]:
    """
    Suppress all stdout and stderr output within the block.

    Example:
        with silence():
            noisy_library_call()
    """
    # Let the with-statement own devnull: it stays open for the whole
    # redirect and closes even if entering either redirect fails.
    with open(os.devnull, "w") as devnull, redirect_stdout(devnull), redirect_stderr(devnull):
        yield


# ─────────────────────────────────────────────────────────────────────────────
# 3. Temporary file/directory helpers
# ─────────────────────────────────────────────────────────────────────────────

@contextmanager
def temp_file(
    suffix: str = "",
    prefix: str = "tmp_",
    content: str | bytes | None = None,
    encoding: str = "utf-8",
) -> Generator[Path, None, None]:
    """
    Create a named temp file; yield its Path; delete on exit.

    Example:
        with temp_file(".json", content='{"key":"value"}') as p:
            data = json.loads(p.read_text())
    """
    mode = "wb" if isinstance(content, bytes) else "w"
    kwargs: dict = {} if isinstance(content, bytes) else {"encoding": encoding}
    fd, path = tempfile.mkstemp(suffix=suffix, prefix=prefix)
    p = Path(path)
    try:
        if content is not None:
            with open(fd, mode, **kwargs) as f:
                f.write(content)
        else:
            os.close(fd)
        yield p
    finally:
        p.unlink(missing_ok=True)


@contextmanager
def temp_dir(prefix: str = "tmp_") -> Generator[Path, None, None]:
    """
    Create a temp directory; yield its Path; remove tree on exit.

    Example:
        with temp_dir() as d:
            (d / "output.csv").write_text("a,b\n1,2\n")
    """
    with tempfile.TemporaryDirectory(prefix=prefix) as d:
        yield Path(d)


@contextmanager
def atomic_write(path: str | Path, mode: str = "w", encoding: str = "utf-8") -> Generator:
    """
    Write to a temporary file then replace target atomically.

    Example:
        with atomic_write("config.json") as f:
            json.dump(config, f, indent=2)
    """
    p   = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    tmp = p.with_suffix(p.suffix + ".tmp")
    is_binary = "b" in mode
    kwargs = {} if is_binary else {"encoding": encoding}
    try:
        with open(tmp, mode, **kwargs) as f:
            yield f
        tmp.replace(p)
    except Exception:
        tmp.unlink(missing_ok=True)
        raise


# ─────────────────────────────────────────────────────────────────────────────
# 4. ExitStack patterns
# ─────────────────────────────────────────────────────────────────────────────

def open_files(paths: list[str | Path], mode: str = "r") -> ExitStack:
    """
    Open multiple files, guaranteed to all close even on error.
    Returns an ExitStack; use it as a context manager.

    Example:
        with open_files(["a.csv", "b.csv"]):
            ...  # not typical usage — see open_all()
    """
    with ExitStack() as stack:
        for p in paths:
            stack.enter_context(open(p, mode))
        # Transfer ownership to a fresh stack so the files outlive this
        # with-block; if any open() above raised, the files opened so far
        # were already closed by the enclosing ExitStack.
        return stack.pop_all()


@contextmanager
def open_all(
    paths: list[str | Path],
    mode: str = "r",
    encoding: str = "utf-8",
) -> Generator[list, None, None]:
    """
    Open all files; yield the list of file handles; close all on exit.

    Example:
        with open_all(["in1.txt","in2.txt"]) as handles:
            for h in handles:
                process(h.read())
    """
    with ExitStack() as stack:
        handles = [
            stack.enter_context(open(p, mode, encoding=encoding))
            for p in paths
        ]
        yield handles


@contextmanager
def managed_resources(*cms) -> Generator[list, None, None]:
    """
    Enter multiple context managers; yield their values; exit all.

    Example:
        with managed_resources(open("a.txt"), open("b.txt"), lock) as (fa, fb, _):
            fa.write(fb.read())
    """
    with ExitStack() as stack:
        yield [stack.enter_context(cm) for cm in cms]


# ─────────────────────────────────────────────────────────────────────────────
# 5. Async context managers
# ─────────────────────────────────────────────────────────────────────────────

@asynccontextmanager
async def async_timer(name: str = "") -> AsyncIterator[dict]:
    """
    Async version of timer().

    Example:
        async with async_timer("fetch") as t:
            data = await client.get("/api/data")
        print(f"took {t['elapsed']:.3f}s")
    """
    info: dict = {"elapsed": None}
    t0 = time.perf_counter()
    try:
        yield info
    finally:
        info["elapsed"] = time.perf_counter() - t0
        if name:
            log.debug("async %s: %.3fs", name, info["elapsed"])


@asynccontextmanager
async def multi_async(*cms) -> AsyncIterator[list]:
    """
    Enter multiple async context managers via AsyncExitStack.

    Example:
        async with multi_async(aiofiles.open("a"), aiofiles.open("b")) as (fa, fb):
            await fa.write(await fb.read())
    """
    async with AsyncExitStack() as stack:
        results = []
        for cm in cms:
            results.append(await stack.enter_async_context(cm))
        yield results


# ─────────────────────────────────────────────────────────────────────────────
# 6. Convenience wrappers
# ─────────────────────────────────────────────────────────────────────────────

@contextmanager
def env_override(**env_vars: str) -> Generator[None, None, None]:
    """
    Temporarily set environment variables; restore on exit.

    Example:
        with env_override(DATABASE_URL="sqlite:///:memory:", DEBUG="1"):
            run_tests()
    """
    original = {k: os.environ.get(k) for k in env_vars}
    os.environ.update(env_vars)
    try:
        yield
    finally:
        for k, v in original.items():
            if v is None:
                os.environ.pop(k, None)
            else:
                os.environ[k] = v


@contextmanager
def chdir(path: str | Path) -> Generator[Path, None, None]:
    """
    Temporarily change the working directory (like contextlib.chdir, Python 3.11+).

    Example:
        with chdir("/repo"):
            subprocess.run(["make", "test"])
    """
    before = Path.cwd()
    os.chdir(path)
    try:
        yield Path(path)
    finally:
        os.chdir(before)


def optional_cm(condition: bool, cm):
    """
    Use a real context manager if condition is True, else nullcontext.

    Example:
        with optional_cm(verbose, timer("load")) as t:
            data = load_file(path)
        if verbose:
            print(f"loaded in {t['elapsed']:.3f}s")
    """
    return cm if condition else nullcontext()


# ─────────────────────────────────────────────────────────────────────────────
# Demo
# ─────────────────────────────────────────────────────────────────────────────

if __name__ == "__main__":
    import json

    print("=== contextlib demo ===")

    print("\n--- timer ---")
    with timer("sleep test") as t:
        time.sleep(0.05)
    print(f"  elapsed: {t['elapsed']:.3f}s")

    print("\n--- Stopwatch ---")
    sw = Stopwatch()
    with sw:
        time.sleep(0.02)
        sw.lap("phase 1")
        time.sleep(0.03)
        sw.lap("phase 2")
    print(f"  laps: {[(n, f'{e:.3f}s') for n,e in sw.laps]}")

    print("\n--- capture_output ---")
    with capture_output() as out:
        print("hello stdout")
        print("hello stderr", file=sys.stderr)
    print(f"  stdout: {out['stdout']!r}")
    print(f"  stderr: {out['stderr']!r}")

    print("\n--- temp_file ---")
    with temp_file(".json", content='{"ok": true}') as p:
        data = json.loads(p.read_text())
        print(f"  read: {data}  exists: {p.exists()}")
    print(f"  after: {p.exists()}")

    print("\n--- temp_dir ---")
    with temp_dir() as d:
        (d / "test.txt").write_text("hello")
        print(f"  files: {[f.name for f in d.iterdir()]}")
    print(f"  dir exists after: {d.exists()}")

    print("\n--- atomic_write ---")
    with temp_dir() as d:
        target = d / "config.json"
        with atomic_write(target) as f:
            json.dump({"env": "test"}, f)
        print(f"  written: {json.loads(target.read_text())}")

    print("\n--- suppress ---")
    with suppress(FileNotFoundError):
        Path("/nonexistent/path/file.txt").read_text()
    print("  suppress worked — no crash")

    print("\n--- env_override ---")
    os.environ.pop("TEST_VAR", None)
    with env_override(TEST_VAR="hello"):
        print(f"  inside: TEST_VAR={os.environ.get('TEST_VAR')!r}")
    print(f"  after:  TEST_VAR={os.environ.get('TEST_VAR')!r}")

    print("\n=== done ===")
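The overview mentions that context managers can double as decorators, but the pipeline above never implements one. Here is a hedged sketch (the `logged` class and `run_job` function are illustrative) using contextlib.ContextDecorator; note that managers built with @contextmanager get this behaviour automatically:

```python
from contextlib import ContextDecorator

class logged(ContextDecorator):
    """A context manager that also works as a decorator."""

    def __init__(self, name: str) -> None:
        self.name = name
        self.events: list[str] = []

    def __enter__(self) -> "logged":
        self.events.append(f"enter {self.name}")
        return self

    def __exit__(self, *exc) -> bool:
        self.events.append(f"exit {self.name}")
        return False             # never swallow exceptions

trace = logged("job")

@trace                           # decorator form: wraps every call
def run_job() -> int:
    return 42

run_job()
with trace:                      # plain with-statement form works too
    pass

print(trace.events)              # two enter/exit pairs recorded
```

The decorator form saves a level of indentation whenever an entire function body would otherwise sit inside one with-block.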

For the trio alternative: trio is a third-party async library built around structured concurrency, with native support for async context managers, nurseries (which replace many AsyncExitStack patterns), and cancel scopes (which replace manual timeout management). Python's stdlib contextlib works with asyncio and any other async framework, and requires no additional dependencies. Use trio when building async applications where structured concurrency and nursery-based task lifetimes are first-class concerns; use contextlib.AsyncExitStack plus asynccontextmanager when adding async cleanup to existing asyncio code.

For the anyio alternative: anyio provides a compatibility layer across asyncio and trio, including its own create_task_group() (structured concurrency) and CancelScope; contextlib.suppress and asynccontextmanager work regardless of async backend. Use anyio when writing async library code that must be backend-agnostic, and stdlib contextlib for application-level resource management where the backend choice is fixed.

The Claude Skills 360 bundle includes contextlib skill sets covering timer()/Stopwatch() elapsed timing, capture_output()/silence() output redirection, temp_file()/temp_dir()/atomic_write() temporary resources, open_all()/managed_resources() ExitStack patterns, async_timer()/multi_async() async context managers, and env_override()/chdir()/optional_cm() convenience wrappers. Start with the free tier to try resource lifecycle management and contextlib pipeline code generation.
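To ground the comparison, here is a minimal stdlib-only sketch (the `connection` resource is a stand-in for a socket, DB pool, or client session) showing AsyncExitStack with asynccontextmanager on plain asyncio; the same code would run unchanged under anyio's asyncio backend:

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

opened: list[str] = []
closed: list[str] = []

@asynccontextmanager
async def connection(name: str):
    # stand-in for a real async resource with async setup/teardown
    opened.append(name)
    try:
        yield name
    finally:
        closed.append(name)

async def main() -> list[str]:
    # AsyncExitStack unwinds in reverse order, like nested async-with blocks
    async with AsyncExitStack() as stack:
        a = await stack.enter_async_context(connection("a"))
        b = await stack.enter_async_context(connection("b"))
        return [a, b]

result = asyncio.run(main())
print(result, opened, closed)    # ['a', 'b'] ['a', 'b'] ['b', 'a']
```

In trio the lifetimes of the two connections would instead be scoped to a nursery; AsyncExitStack gives similar deterministic teardown without leaving the stdlib.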

