Claude Code for structlog: Structured Logging in Python — Claude Skills 360 Blog
Blog / AI / Claude Code for structlog: Structured Logging in Python
AI

Claude Code for structlog: Structured Logging in Python

Published: January 13, 2028
Read time: 5 min
By: Claude Skills 360

structlog emits structured, context-rich logs. Install it with pip install structlog, then configure once at startup: import structlog; structlog.configure(processors=[...], wrapper_class=structlog.BoundLogger, context_class=dict, logger_factory=structlog.PrintLoggerFactory()). Get a logger with log = structlog.get_logger() and log with keyword arguments: log.info("user_created", user_id=42, email="[email protected]").

Binding: log = log.bind(request_id="abc123") returns a new logger whose subsequent calls all include request_id. log.unbind("request_id") removes the key; log.try_unbind("x") raises no error if the key is absent. Processors live in structlog.processors: from structlog.processors import JSONRenderer, TimeStamper, add_log_level, format_exc_info, UnicodeDecoder.

stdlib integration: structlog.configure(logger_factory=structlog.stdlib.LoggerFactory(), wrapper_class=structlog.stdlib.BoundLogger) makes structlog wrap the standard library. structlog.stdlib.filter_by_level honours the stdlib log level; structlog.stdlib.add_logger_name adds the logger name.

contextvars: from structlog.contextvars import bind_contextvars, clear_contextvars, merge_contextvars. In FastAPI middleware, call bind_contextvars(request_id=str(uuid4())) at the start of a request and clear_contextvars() when it finishes.

Async: log = structlog.get_logger(); await log.ainfo("msg", key=val). The structlog.stdlib.AsyncBoundLogger wrapper class also exists but is deprecated in favour of the a-prefixed async methods.

Testing: from structlog.testing import capture_logs. Inside with capture_logs() as cap:, a call like log.info("hello", x=1) yields cap[0]["event"] == "hello" and cap[0]["x"] == 1.

Also covered: ProcessorFormatter for Django/Flask stdlib integration, ExceptionRenderer for a structured exception dict instead of a traceback string, and CallsiteParameterAdder, which adds filename, lineno, and function name automatically. Claude Code generates structlog configurations, middleware context propagation, and capture_logs test assertions.

CLAUDE.md for structlog

## structlog Stack
- Version: structlog >= 24.1 | pip install structlog
- Configure: structlog.configure(processors=[...]) — call once at app startup
- Logger: log = structlog.get_logger().bind(service="api") — immutable context
- Context: bind_contextvars(request_id=...) — automatic ContextVar propagation
- Production: JSONRenderer() processor — machine-readable one-line JSON
- Dev: ConsoleRenderer() — colorised human-readable output
- Test: with capture_logs() as cap: ... assert cap[0]["event"] == "..."

structlog Logging Pipeline

# app/logging_setup.py — structlog configuration
from __future__ import annotations

import logging
import sys
import uuid

import structlog
from structlog.contextvars import (
    bind_contextvars,
    clear_contextvars,
    merge_contextvars,
)
from structlog.processors import (
    CallsiteParameter,
    CallsiteParameterAdder,
    ExceptionRenderer,
    JSONRenderer,
    TimeStamper,
    UnicodeDecoder,
    add_log_level,
)
from structlog.stdlib import add_logger_name, filter_by_level
from structlog.testing import capture_logs


# ─────────────────────────────────────────────────────────────────────────────
# 1. Configure structlog — call once at app startup
# ─────────────────────────────────────────────────────────────────────────────

def configure_logging(
    log_level: str = "INFO",
    json_output: bool = True,
) -> None:
    """
    Set up structlog with a stdlib bridge so legacy `logging.getLogger()`
    calls also emit structured JSON.

    Processor order matters:
      1. filter_by_level   — drop below-threshold log calls early (fast)
      2. add_logger_name   — from stdlib, adds "logger" key
      3. add_log_level     — adds "level" key
      4. TimeStamper       — adds "timestamp" ISO key
      5. merge_contextvars — merge ContextVar-bound fields (request_id etc.)
      6. CallsiteParameterAdder — adds filename, lineno, func_name
      7. ExceptionRenderer — convert exc_info to structured dict
      8. UnicodeDecoder    — ensure strings
      9. JSONRenderer      — final step: render to JSON string
    """
    shared_processors = [
        filter_by_level,
        add_logger_name,
        add_log_level,
        TimeStamper(fmt="iso"),
        merge_contextvars,
        CallsiteParameterAdder(
            [
                CallsiteParameter.FILENAME,
                CallsiteParameter.LINENO,
                CallsiteParameter.FUNC_NAME,
            ]
        ),
        ExceptionRenderer(),
        UnicodeDecoder(),
    ]

    renderer = JSONRenderer() if json_output else structlog.dev.ConsoleRenderer(colors=True)

    structlog.configure(
        processors=shared_processors + [renderer],
        wrapper_class=structlog.stdlib.BoundLogger,
        context_class=dict,
        logger_factory=structlog.stdlib.LoggerFactory(),
        cache_logger_on_first_use=True,
    )

    # Emit through a stdlib handler on stdout; "%(message)s" passes the
    # already-rendered JSON string through unchanged.
    logging.basicConfig(
        format="%(message)s",
        stream=sys.stdout,
        level=getattr(logging, log_level.upper()),
    )
    for name in logging.root.manager.loggerDict:
        logging.getLogger(name).setLevel(getattr(logging, log_level.upper()))


# Development configuration — pretty console output
def configure_dev_logging() -> None:
    configure_logging(log_level="DEBUG", json_output=False)


# Production configuration — JSON for log aggregators
def configure_prod_logging() -> None:
    configure_logging(log_level="INFO", json_output=True)


# ─────────────────────────────────────────────────────────────────────────────
# 2. Module-level logger usage
# ─────────────────────────────────────────────────────────────────────────────

log = structlog.get_logger(__name__)


class UserService:

    def __init__(self) -> None:
        # Bind service context — all log calls include service="users"
        self._log = structlog.get_logger().bind(service="users")

    def create_user(self, email: str, role: str = "user") -> dict:
        self._log.info("creating_user", email=email, role=role)
        try:
            user = {"id": 1, "email": email, "role": role}
            self._log.info("user_created", user_id=user["id"], email=email)
            return user
        except Exception:
            self._log.exception("user_creation_failed", email=email)
            raise

    def delete_user(self, user_id: int) -> None:
        bound = self._log.bind(user_id=user_id)
        bound.info("deleting_user")
        bound.info("user_deleted")

    def get_user(self, user_id: int) -> dict | None:
        self._log.debug("fetching_user", user_id=user_id)
        if user_id == 1:
            return {"id": 1, "email": "[email protected]"}
        self._log.warning("user_not_found", user_id=user_id)
        return None


# ─────────────────────────────────────────────────────────────────────────────
# 3. contextvars — automatic request context propagation
# ─────────────────────────────────────────────────────────────────────────────

def process_request(path: str, method: str) -> None:
    """
    bind_contextvars() stores context in a ContextVar — no need to pass the
    logger around. Every log call in this request automatically includes
    request_id and path.
    """
    clear_contextvars()
    bind_contextvars(
        request_id=str(uuid.uuid4()),
        http_method=method,
        http_path=path,
    )
    try:
        log.info("request_started")
        # ... handle request ...
        log.info("request_completed", status_code=200)
    except Exception:
        log.exception("request_failed", status_code=500)
    finally:
        clear_contextvars()


# ─────────────────────────────────────────────────────────────────────────────
# 4. FastAPI / Starlette middleware
# ─────────────────────────────────────────────────────────────────────────────

try:
    import time

    from fastapi import FastAPI, Request, Response

    demo_app = FastAPI()

    @demo_app.middleware("http")
    async def structlog_middleware(request: Request, call_next) -> Response:
        """
        Bind request context for every log call during this request.
        clear_contextvars() in the finally ensures no leakage between requests.
        """
        clear_contextvars()
        bind_contextvars(
            request_id=request.headers.get("X-Request-Id", str(uuid.uuid4())),
            http_method=request.method,
            http_path=str(request.url.path),
            http_host=request.headers.get("host", ""),
        )
        start_ns = time.perf_counter_ns()
        try:
            response = await call_next(request)
            bind_contextvars(
                status_code=response.status_code,
                duration_ms=round((time.perf_counter_ns() - start_ns) / 1e6, 2),
            )
            log.info("http_request")
            return response
        except Exception:
            log.exception("http_request_error")
            raise
        finally:
            clear_contextvars()

    @demo_app.get("/health")
    async def health() -> dict:
        log.debug("health_check_called")
        return {"status": "ok"}

    @demo_app.get("/users/{user_id}")
    async def get_user(user_id: int) -> dict:
        svc = UserService()
        user = svc.get_user(user_id)
        if user is None:
            from fastapi import HTTPException
            raise HTTPException(status_code=404, detail="Not found")
        return user

except ImportError:
    demo_app = None   # type: ignore[assignment]


# ─────────────────────────────────────────────────────────────────────────────
# 5. Testing — capture_logs for assertion-friendly log checking
# ─────────────────────────────────────────────────────────────────────────────

def test_example_capture() -> None:
    """
    capture_logs() intercepts log calls and returns them as a list of dicts.
    No configuration or sink setup needed — works independently of configure().
    """
    svc = UserService()

    with capture_logs() as cap:
        svc.create_user("[email protected]", role="admin")

    # cap is a list of dicts — one entry per log call
    create_events = [e for e in cap if e["event"] == "creating_user"]
    assert len(create_events) == 1
    assert create_events[0]["email"] == "[email protected]"
    assert create_events[0]["role"] == "admin"

    created_events = [e for e in cap if e["event"] == "user_created"]
    assert len(created_events) == 1

    print(f"Captured {len(cap)} log entries:")
    for entry in cap:
        print(f"  [{entry.get('log_level','?')}] {entry['event']} {dict(entry)}")


def test_warning_on_missing_user() -> None:
    svc = UserService()

    with capture_logs() as cap:
        result = svc.get_user(999)

    assert result is None
    warns = [e for e in cap if e.get("log_level") == "warning"]
    assert len(warns) == 1
    assert warns[0]["user_id"] == 999


def test_context_variables_in_logs() -> None:
    """
    capture_logs() swaps out the configured processor chain, so
    merge_contextvars never runs and ContextVar-bound fields do NOT
    appear in captured entries. Bind context on the logger itself when
    you need to assert on it.
    """
    bound = log.bind(request_id="req-test-001", service="test")

    with capture_logs() as cap:
        bound.info("test_event", key="value")

    assert cap[0]["request_id"] == "req-test-001"
    assert cap[0]["key"] == "value"


# ─────────────────────────────────────────────────────────────────────────────
# Demo
# ─────────────────────────────────────────────────────────────────────────────

if __name__ == "__main__":
    configure_dev_logging()

    svc = UserService()
    svc.create_user("[email protected]", role="admin")
    svc.get_user(999)   # triggers warning

    bind_contextvars(request_id="req-demo-001")
    log.info("demo_complete", status="ok")
    clear_contextvars()

    # Tests
    test_example_capture()
    test_warning_on_missing_user()
    test_context_variables_in_logs()
    print("All structlog tests passed.")

For the logging.getLogger() alternative: the stdlib logging module formats messages as unstructured strings. logging.info("User %s created with role %s", user_id, role) produces a text line that a log aggregator must parse with a regex, while log.info("user_created", user_id=user_id, role=role) with structlog's JSONRenderer produces a JSON object like {"event":"user_created","user_id":42,"role":"admin","timestamp":"...","level":"info"} that Datadog, Loki, or Splunk can index, filter, and alert on without parsing.

For the loguru alternative: loguru's logger.info("msg {key}", key=val) format-string API is convenient, but it does not produce structured dicts for downstream processing unless you enable JSON serialisation on a sink (logger.add(sink, serialize=True)). structlog's processor pipeline, by contrast, is composable: merge_contextvars automatically injects the request's ContextVar-bound fields (request_id, user_id, trace_id) into every log call in that async task without passing any object around.

The Claude Skills 360 bundle includes structlog skill sets covering configure() processor chains, JSONRenderer vs ConsoleRenderer, TimeStamper/add_log_level/add_logger_name, merge_contextvars and bind_contextvars for request context, ExceptionRenderer for structured exceptions, CallsiteParameterAdder for source location, FastAPI middleware integration, the stdlib logging bridge, capture_logs for unit testing, and BoundLogger.bind for service-level context. Start with the free tier to try structured logging code generation.

Keep Reading

AI

Claude Code for email.contentmanager: Python Email Content Accessors

Read and write EmailMessage body content with Python's email.contentmanager module and Claude Code — email contentmanager ContentManager for the class that maps content types to get and set handler functions allowing EmailMessage to support get_content and set_content with type-specific behaviour, email contentmanager raw_data_manager for the ContentManager instance that handles raw bytes and str payloads without any conversion, email contentmanager content_manager for the standard ContentManager instance used by email.policy.default that intelligently handles text plain text html multipart and binary content types, email contentmanager get_content_text for the handler that returns the decoded text payload of a text-star message part as a str, email contentmanager get_content_binary for the handler that returns the raw decoded bytes payload of a non-text message part, email contentmanager get_data_manager for the get-handler lookup used by EmailMessage get_content to find the right reader function for the content type, email contentmanager set_content text for the handler that creates and sets a text part correctly choosing charset and transfer encoding, email contentmanager set_content bytes for the handler that creates and sets a binary part with base64 encoding and optional filename Content-Disposition, email contentmanager EmailMessage get_content for the method that reads the message body using the registered content manager handlers, email contentmanager EmailMessage set_content for the method that sets the message body and MIME headers in one call, email contentmanager EmailMessage make_alternative make_mixed make_related for the methods that convert a simple message into a multipart container, email contentmanager EmailMessage add_attachment for the method that attaches a file or bytes to a multipart message, and email contentmanager integration with email.message and email.policy and email.mime and io for building high-level email readers attachment extractors text body accessors HTML readers and policy-aware MIME construction pipelines.

5 min read Feb 12, 2029
AI

Claude Code for email.charset: Python Email Charset Encoding

Control header and body encoding for international email with Python's email.charset module and Claude Code — email charset Charset for the class that wraps a character set name with the encoding rules for header encoding and body encoding describing how to encode text for that charset in email messages, email charset Charset header_encoding for the attribute specifying whether headers using this charset should use QP quoted-printable encoding BASE64 encoding or no encoding, email charset Charset body_encoding for the attribute specifying the Content-Transfer-Encoding to use for message bodies in this charset such as QP or BASE64, email charset Charset output_codec for the attribute giving the Python codec name used to encode the string to bytes for the wire format, email charset Charset input_codec for the attribute giving the Python codec name used to decode incoming bytes to str, email charset Charset get_output_charset for returning the output charset name, email charset Charset header_encode for encoding a header string using the charset's header_encoding method, email charset Charset body_encode for encoding body content using the charset's body_encoding, email charset Charset convert for converting a string from the input_codec to the output_codec, email charset add_charset for registering a new charset with custom encoding rules in the global charset registry, email charset add_alias for adding an alias name that maps to an existing registered charset, email charset add_codec for registering a codec name mapping for use by the charset machinery, and email charset integration with email.message and email.mime and email.policy and email.encoders for building international email senders non-ASCII header encoders Content-Transfer-Encoding selectors charset-aware message constructors and MIME encoding pipelines.

5 min read Feb 11, 2029
AI

Claude Code for email.utils: Python Email Address and Header Utilities

Parse and format RFC 2822 email addresses and dates with Python's email.utils module and Claude Code — email utils parseaddr for splitting a display-name plus angle-bracket address string into a realname and email address tuple, email utils formataddr for combining a realname and address string into a properly quoted RFC 2822 address with angle brackets, email utils getaddresses for parsing a list of raw address header strings each potentially containing multiple comma-separated addresses into a list of realname address tuples, email utils parsedate for parsing an RFC 2822 date string into a nine-tuple compatible with time.mktime, email utils parsedate_tz for parsing an RFC 2822 date string into a ten-tuple that includes the UTC offset timezone in seconds, email utils parsedate_to_datetime for parsing an RFC 2822 date string into an aware datetime object with timezone, email utils formatdate for formatting a POSIX timestamp or the current time as an RFC 2822 date string with optional usegmt and localtime flags, email utils format_datetime for formatting a datetime object as an RFC 2822 date string, email utils make_msgid for generating a globally unique Message-ID string with optional idstring and domain components, email utils decode_rfc2231 for decoding an RFC 2231 encoded parameter value into a tuple of charset language and value, email utils encode_rfc2231 for encoding a string as an RFC 2231 encoded parameter value, email utils collapse_rfc2231_value for collapsing a decoded RFC 2231 tuple to a Unicode string, and email utils integration with email.message and email.headerregistry and datetime and time for building address parsers date formatters message-id generators header extractors and RFC-compliant email construction utilities.

5 min read Feb 10, 2029

Put these ideas into practice

Claude Skills 360 gives you production-ready skills for everything in this article — and 2,350+ more. Start free or go all-in.

Back to Blog

Get 360 skills free