Claude Code for json: JSON Serialization in Python — Claude Skills 360 Blog

Claude Code for json: JSON Serialization in Python

Published: July 7, 2028
Read time: 5 min read
By: Claude Skills 360

Python’s json module handles JSON encoding and decoding (import json).

- dumps: json.dumps(obj) → str  |  loads: json.loads(s) → Python object
- dump/load: json.dump(obj, f)  |  obj = json.load(f) — file variants
- indent: json.dumps(obj, indent=2) — pretty-print
- separators: json.dumps(obj, separators=(",", ":")) — compact output
- sort_keys: json.dumps(obj, sort_keys=True) — deterministic key order
- default: json.dumps(obj, default=str) — fallback serializer for non-JSON-native types
- JSONEncoder: class MyEncoder(json.JSONEncoder): def default(self, o): ...
- JSONDecoder: json.loads(s, cls=MyDecoder)
- object_hook: json.loads(s, object_hook=fn) — called on each decoded dict
- object_pairs_hook: json.loads(s, object_pairs_hook=OrderedDict) — receives ordered key/value pairs
- parse_float: json.loads(s, parse_float=Decimal) — custom float parsing
- parse_int: json.loads(s, parse_int=str) — preserve large ints
- ensure_ascii: json.dumps(obj, ensure_ascii=False) — embed Unicode directly
- allow_nan: json.dumps(obj, allow_nan=False) — strict IEEE JSON (rejects NaN/Infinity)
- cls: json.dumps(obj, cls=MyEncoder) — custom encoder class
- Streaming: readline + json.loads per line (JSON Lines)
- Errors: except json.JSONDecodeError as e: — e.msg, e.lineno, e.colno; JSONDecodeError is a subclass of ValueError
- json.encoder.FLOAT_REPR — deprecated
- Literals: json.loads("null") → None; json.loads("true") → True

Claude Code generates API serializers, config readers, event loggers, and data exchange helpers.
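The core round-trip can be sketched in a few lines — a minimal, stdlib-only example of the parameters listed above:

```python
import json
from datetime import datetime
from decimal import Decimal

record = {"name": "café", "count": 3}

# Pretty-print with sorted keys; ensure_ascii=False embeds Unicode directly
pretty = json.dumps(record, indent=2, sort_keys=True, ensure_ascii=False)

# Compact form for wire transfer or JSON Lines
compact = json.dumps(record, separators=(",", ":"), ensure_ascii=False)

# Round-trip back to a Python object
assert json.loads(compact) == record

# parse_float=Decimal avoids binary floating-point precision loss
price = json.loads('{"price": 9.99}', parse_float=Decimal)["price"]

# default=str is a quick fallback for non-JSON-native types
stamped = json.dumps({"ts": datetime(2024, 1, 15)}, default=str)
print(stamped)  # {"ts": "2024-01-15 00:00:00"}
```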

CLAUDE.md for json

## json Stack
- Stdlib: import json
- Encode: json.dumps(obj, indent=2, default=str)  |  json.dump(obj, f, indent=2)
- Decode: json.loads(s)  |  json.load(f)
- Custom: JSONEncoder.default() for types that aren't JSON-native (datetime, Decimal, dataclass)
- object_hook: for deserializing dicts into typed objects on load
- Errors: except json.JSONDecodeError as e  — shows line/col for debugging
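The object_hook line above can be sketched as follows — a minimal example with a hypothetical Point dataclass as the target type:

```python
import json
from dataclasses import dataclass

@dataclass
class Point:
    # Hypothetical target type for illustration
    x: float
    y: float

def as_point(d: dict):
    # object_hook is called bottom-up on every decoded JSON object;
    # promote dicts whose keys match, pass everything else through
    if d.keys() == {"x", "y"}:
        return Point(**d)
    return d

doc = '{"start": {"x": 0.0, "y": 0.0}, "end": {"x": 3.0, "y": 4.0}}'
parsed = json.loads(doc, object_hook=as_point)
print(parsed["end"])  # Point(x=3.0, y=4.0)
```

Because the hook runs bottom-up, the inner dicts are already Points by the time the outer dict is seen, so only the shape-matching dicts get promoted.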

json Serialization Pipeline

# app/serialization.py — JSONEncoder, JSONDecoder, object_hook, streaming, validators
from __future__ import annotations

import dataclasses
import io
import json
import uuid
from dataclasses import dataclass, field, fields, is_dataclass
from datetime import date, datetime
from decimal import Decimal
from enum import Enum
from pathlib import Path
from typing import Any, Iterator, TypeVar

T = TypeVar("T")


# ─────────────────────────────────────────────────────────────────────────────
# 1. Extended JSON encoder
# ─────────────────────────────────────────────────────────────────────────────

class ExtendedEncoder(json.JSONEncoder):
    """
    JSON encoder that handles types not supported by stdlib:
    datetime, date, Decimal, Path, UUID, Enum, dataclasses, bytes.

    Example:
        json.dumps(obj, cls=ExtendedEncoder)
        json.dumps(obj, cls=ExtendedEncoder, indent=2)
    """

    def default(self, obj: Any) -> Any:
        if isinstance(obj, datetime):
            return obj.isoformat()
        if isinstance(obj, date):
            return obj.isoformat()
        if isinstance(obj, Decimal):
            return str(obj)
        if isinstance(obj, Path):
            return str(obj)
        if isinstance(obj, uuid.UUID):
            return str(obj)
        if isinstance(obj, Enum):
            return obj.value
        if isinstance(obj, bytes):
            return obj.hex()
        if is_dataclass(obj) and not isinstance(obj, type):
            return dataclasses.asdict(obj)
        if hasattr(obj, "__dict__"):
            return obj.__dict__
        return super().default(obj)


def dumps(obj: Any, **kwargs) -> str:
    """
    Serialize to JSON string using ExtendedEncoder.

    Example:
        s = dumps({"ts": datetime.now(), "amount": Decimal("9.99")})
    """
    kwargs.setdefault("cls", ExtendedEncoder)
    return json.dumps(obj, **kwargs)


def dump(obj: Any, fp, **kwargs) -> None:
    """Serialize to file using ExtendedEncoder."""
    kwargs.setdefault("cls", ExtendedEncoder)
    json.dump(obj, fp, **kwargs)


def pretty(obj: Any) -> str:
    """
    Pretty-print JSON with 2-space indent.

    Example:
        print(pretty({"nested": {"key": [1, 2, 3]}}))
    """
    return dumps(obj, indent=2, sort_keys=True)


def compact(obj: Any) -> str:
    """
    Compact JSON with no extra whitespace.

    Example:
        line = compact(event_dict)   # for JSON Lines logging
    """
    return dumps(obj, separators=(",", ":"))


# ─────────────────────────────────────────────────────────────────────────────
# 2. Safe loading
# ─────────────────────────────────────────────────────────────────────────────

def loads(s: str | bytes, **kwargs) -> Any:
    """
    Deserialize JSON string; raise json.JSONDecodeError with context on failure.

    Example:
        data = loads('{"key": "value"}')
    """
    try:
        return json.loads(s, **kwargs)
    except json.JSONDecodeError as exc:
        raise json.JSONDecodeError(
            f"{exc.msg} (line {exc.lineno}, col {exc.colno})",
            exc.doc, exc.pos,
        ) from None


def loads_safe(s: str | bytes, default: Any = None) -> Any:
    """
    Deserialize JSON; return default instead of raising on bad input.

    Example:
        data = loads_safe(response_text, default={})
    """
    try:
        return json.loads(s)
    except (json.JSONDecodeError, TypeError, ValueError):
        return default


def load_file(path: str | Path, default: Any = None) -> Any:
    """
    Load JSON from file; return default if file missing or invalid.

    Example:
        config = load_file("config.json", default={})
    """
    p = Path(path)
    if not p.exists():
        return default
    try:
        return json.loads(p.read_text(encoding="utf-8"))
    except (json.JSONDecodeError, OSError):
        return default


def save_file(obj: Any, path: str | Path, indent: int = 2) -> Path:
    """
    Save object as JSON file atomically (write temp → rename).

    Example:
        save_file(config_dict, "config.json")
    """
    p   = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    tmp = p.with_suffix(p.suffix + ".tmp")
    tmp.write_text(dumps(obj, indent=indent), encoding="utf-8")
    tmp.replace(p)
    return p


# ─────────────────────────────────────────────────────────────────────────────
# 3. Decimal-preserving loader
# ─────────────────────────────────────────────────────────────────────────────

def loads_decimal(s: str | bytes) -> Any:
    """
    Deserialize JSON with floats parsed as Decimal (no fp precision loss).

    Example:
        data = loads_decimal('{"price": 9.99, "tax": 0.895}')
        type(data["price"])   # Decimal
    """
    return json.loads(s, parse_float=Decimal)


# ─────────────────────────────────────────────────────────────────────────────
# 4. Dataclass round-trip
# ─────────────────────────────────────────────────────────────────────────────

def dataclass_to_json(obj: Any, **kwargs) -> str:
    """
    Serialize a dataclass instance (recursively) to JSON.

    Example:
        @dataclass class Point: x: float; y: float
        s = dataclass_to_json(Point(1.0, 2.0))
    """
    return dumps(dataclasses.asdict(obj), **kwargs)


def dataclass_from_json(cls: type[T], s: str | bytes | dict) -> T:
    """
    Construct a flat dataclass from a JSON string or dict.
    Unknown keys are silently ignored.

    Example:
        p = dataclass_from_json(Point, '{"x": 1.0, "y": 2.0, "extra": "ignored"}')
    """
    if isinstance(s, (str, bytes)):
        data = json.loads(s)
    else:
        data = s
    valid  = {f.name for f in fields(cls)}
    kwargs = {k: v for k, v in data.items() if k in valid}
    return cls(**kwargs)


# ─────────────────────────────────────────────────────────────────────────────
# 5. JSON Lines (JSONL) streaming
# ─────────────────────────────────────────────────────────────────────────────

def iter_jsonl(path: str | Path | io.IOBase) -> Iterator[Any]:
    """
    Lazily parse a JSON Lines file (one JSON object per line).

    Example:
        for record in iter_jsonl("events.jsonl"):
            process(record)
    """
    if isinstance(path, (str, Path)):
        with open(path, encoding="utf-8") as f:
            for lineno, line in enumerate(f, 1):
                line = line.strip()
                if line and not line.startswith("#"):
                    try:
                        yield json.loads(line)
                    except json.JSONDecodeError as exc:
                        raise json.JSONDecodeError(
                            f"Line {lineno}: {exc.msg}", exc.doc, exc.pos
                        ) from None
    else:
        for line in path:
            line = line.strip() if isinstance(line, str) else line.strip().decode()
            if line:
                yield json.loads(line)


def write_jsonl(records: Any, path: str | Path) -> int:
    """
    Write iterable of objects to a JSON Lines file.
    Returns number of records written.

    Example:
        n = write_jsonl(events, "events.jsonl")
    """
    p = Path(path)
    p.parent.mkdir(parents=True, exist_ok=True)
    count = 0
    with open(p, "w", encoding="utf-8") as f:
        for record in records:
            f.write(compact(record) + "\n")
            count += 1
    return count


# ─────────────────────────────────────────────────────────────────────────────
# 6. Deep merge / patch
# ─────────────────────────────────────────────────────────────────────────────

def deep_merge(base: dict, override: dict) -> dict:
    """
    Deep-merge two dicts; override values win on conflicts.
    Nested dicts are merged recursively; non-dict values are replaced.

    Example:
        base     = {"a": 1, "b": {"x": 1, "y": 2}}
        override = {"b": {"y": 99, "z": 3}, "c": 4}
        result   = {"a": 1, "b": {"x": 1, "y": 99, "z": 3}, "c": 4}
    """
    result = dict(base)
    for key, val in override.items():
        if key in result and isinstance(result[key], dict) and isinstance(val, dict):
            result[key] = deep_merge(result[key], val)
        else:
            result[key] = val
    return result


def json_patch(original: str | dict, patch: str | dict) -> dict:
    """
    Apply a JSON patch (deep merge) and return the merged dict.

    Example:
        patched = json_patch(base_config_json, override_json)
    """
    base = json.loads(original) if isinstance(original, str) else dict(original)
    ovrd = json.loads(patch)    if isinstance(patch,    str) else dict(patch)
    return deep_merge(base, ovrd)


# ─────────────────────────────────────────────────────────────────────────────
# Demo
# ─────────────────────────────────────────────────────────────────────────────

if __name__ == "__main__":
    import tempfile

    print("=== json demo ===")

    # 1. Extended types
    @dataclass
    class Order:
        id:       uuid.UUID
        created:  datetime
        amount:   Decimal
        tags:     list[str] = field(default_factory=list)

    order = Order(
        id=uuid.UUID("550e8400-e29b-41d4-a716-446655440000"),
        created=datetime(2024, 1, 15, 10, 30),
        amount=Decimal("49.99"),
        tags=["premium", "annual"],
    )

    print("\n--- ExtendedEncoder ---")
    s = dumps(order)
    print(f"  {s}")

    print("\n--- pretty ---")
    print(pretty({"id": str(order.id), "amount": str(order.amount), "tags": order.tags}))

    print("\n--- loads_safe ---")
    good = loads_safe('{"k": 1}')
    bad = loads_safe("not json", default={"error": True})
    print(f"  good: {good}")
    print(f"  bad:  {bad}")

    print("\n--- loads_decimal ---")
    data = loads_decimal('{"price": 9.99}')
    print(f"  type={type(data['price']).__name__}  value={data['price']}")

    print("\n--- dataclass round-trip ---")
    @dataclass
    class Point:
        x: float
        y: float

    p   = Point(3.0, 4.0)
    s2  = dataclass_to_json(p)
    p2  = dataclass_from_json(Point, s2)
    print(f"  original: {p}  serialized: {s2!r}  restored: {p2}")

    print("\n--- JSON Lines ---")
    records = [{"id": i, "val": i**2} for i in range(5)]
    with tempfile.TemporaryDirectory() as td:
        path = f"{td}/data.jsonl"
        n = write_jsonl(records, path)
        print(f"  wrote {n} records")
        loaded = list(iter_jsonl(path))
        print(f"  read back: {loaded}")

    print("\n--- deep_merge ---")
    base  = {"a": 1, "b": {"x": 1, "y": 2}}
    patch = {"b": {"y": 99, "z": 3}, "c": 4}
    merged = deep_merge(base, patch)
    print(f"  merged: {merged}")

    print("\n--- save_file / load_file ---")
    with tempfile.TemporaryDirectory() as td:
        cfg_path = f"{td}/config.json"
        save_file({"host": "localhost", "port": 8080}, cfg_path)
        cfg = load_file(cfg_path)
        print(f"  loaded: {cfg}")
        missing = load_file(f"{td}/nonexistent.json", default={})
        print(f"  missing: {missing}")

    print("\n=== done ===")

For the orjson alternative — orjson (PyPI) is a Rust-backed JSON library that is roughly 10–20x faster than stdlib json.dumps/json.loads, serializes datetime, numpy arrays, dataclasses, and UUID natively without a custom encoder, and produces bytes rather than str. Stdlib json has zero external dependencies, is always available, and is fast enough wherever JSON is not the throughput bottleneck. Use orjson in hot paths (high-frequency API serialization, log pipelines) where benchmarks show json is the bottleneck; use stdlib json everywhere else.

For the msgspec alternative — msgspec provides JSON and MessagePack encoding/decoding with schema validation via Struct classes (similar to TypedDict, but Rust-backed), is roughly 20–30x faster than stdlib json, and supports zero-copy decoding; stdlib json requires explicit JSONEncoder/object_hook plumbing for typed deserialization. Use msgspec.Struct for high-performance typed JSON in microservices; use stdlib json with dataclasses wherever external dependencies add friction.

The Claude Skills 360 bundle includes json skill sets covering ExtendedEncoder (datetime/Decimal/UUID/Path/Enum/dataclass), dumps()/dump()/pretty()/compact() encoding helpers, loads()/loads_safe()/load_file()/save_file() decoding helpers, loads_decimal() precision float loading, dataclass_to_json()/dataclass_from_json() round-trips, iter_jsonl()/write_jsonl() JSON Lines streaming, and deep_merge()/json_patch() config merging. Start with the free tier to try JSON serialization and data exchange pipeline code generation.
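As a quick stdlib-only illustration of the gap these libraries close: types like datetime and UUID need an explicit fallback with stdlib json, whereas orjson serializes them natively and returns bytes. A minimal sketch (the orjson call is shown only as a comment, since it is an external dependency):

```python
import json
from datetime import datetime
from uuid import UUID

event = {
    "id": UUID("550e8400-e29b-41d4-a716-446655440000"),
    "ts": datetime(2024, 1, 15, 10, 30),
}

# stdlib: non-native types require a fallback serializer
wire = json.dumps(event, default=str, separators=(",", ":"))
print(wire)

# With orjson the same call would be simply:
#   orjson.dumps(event)   # returns bytes; datetime and UUID handled natively
```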
