
Claude Code for aiofiles: Async File I/O in Python

Published: April 2, 2028
Read time: 5 min
By: Claude Skills 360

aiofiles provides async-compatible file I/O for Python asyncio applications; install it with pip install aiofiles. The API mirrors the built-in open(): read with async with aiofiles.open("f.txt") as f: text = await f.read(), write with async with aiofiles.open("f.txt", "w") as f: await f.write("hello"), and use the same mode strings for binary (aiofiles.open("img.png", "rb")) and append (aiofiles.open("log.txt", "a")). Line-oriented access works as expected: await f.readline(), await f.readlines() (returns a list), or async for line in f: to iterate. Positioning is awaitable too: await f.seek(0), await f.tell(), await f.truncate(0). Under the hood each call is the corresponding blocking method (e.g. on io.BufferedReader for binary files) dispatched to a thread pool, so asyncio.gather(read(f1), read(f2)) runs both reads concurrently.

The aiofiles.os module offers async versions of filesystem calls: aiofiles.os.path.exists(p), aiofiles.os.stat(p), aiofiles.os.remove(p), aiofiles.os.rename(src, dst), aiofiles.os.makedirs(p, exist_ok=True), and aiofiles.os.listdir(p) (returns a list). aiofiles.tempfile provides async temp-file context managers (async with aiofiles.tempfile.NamedTemporaryFile() as tmp: await tmp.write(b"data")) plus TemporaryDirectory, and aiofiles.threadpool.wrap(sync_file_obj) adapts an existing synchronous file object.

Common patterns: parse JSON with json.loads(await f.read()); for CSV, read the text and pass it through io.StringIO to the csv module; in FastAPI, await UploadFile.read() and persist with aiofiles.open(path, "wb"). Claude Code generates aiofiles async file utilities, parallel batch readers, and FastAPI upload/download handlers.
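
A minimal quick-start sketch of those core calls — a write, two concurrent reads via asyncio.gather, and a JSON parse; the file names are placeholders:

# quickstart.py — sketch of the core aiofiles patterns described above
import asyncio
import json

import aiofiles


async def read(path: str) -> str:
    async with aiofiles.open(path, encoding="utf-8") as f:
        return await f.read()


async def main() -> None:
    # Write two files ("w" mode, exactly as with the built-in open()).
    async with aiofiles.open("a.txt", "w") as f:
        await f.write("hello")
    async with aiofiles.open("b.json", "w") as f:
        await f.write(json.dumps({"debug": True}))

    # Read both concurrently — each blocking call runs in a worker thread.
    text, raw = await asyncio.gather(read("a.txt"), read("b.json"))
    print(text, json.loads(raw))


asyncio.run(main())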

CLAUDE.md for aiofiles

## aiofiles Stack
- Version: aiofiles >= 23.2 | pip install aiofiles
- Read: async with aiofiles.open(path) as f: text = await f.read()
- Write: async with aiofiles.open(path, "w") as f: await f.write(text)
- Binary: aiofiles.open(path, "rb") / "wb" — same API as text mode
- Parallel: asyncio.gather(read_file(a), read_file(b)) — concurrent reads
- OS ops: aiofiles.os.path.exists() | .stat() | .remove() | .makedirs()
- Temp: async with aiofiles.tempfile.NamedTemporaryFile() as tmp: ...

aiofiles Async File I/O Pipeline

# app/files.py — aiofiles async file read/write, batch ops, and FastAPI helpers
from __future__ import annotations

import asyncio
import json
from collections import deque
from pathlib import Path
from typing import Any, AsyncIterator, Callable

import aiofiles
import aiofiles.os  # also exposes aiofiles.os.path (async os.path functions)
import aiofiles.tempfile


# ─────────────────────────────────────────────────────────────────────────────
# 1. Core read/write helpers
# ─────────────────────────────────────────────────────────────────────────────

async def read_text(
    path: str | Path,
    encoding: str = "utf-8",
    errors: str = "replace",
) -> str:
    """Read an entire text file asynchronously."""
    async with aiofiles.open(str(path), encoding=encoding, errors=errors) as f:
        return await f.read()


async def write_text(
    path: str | Path,
    content: str,
    encoding: str = "utf-8",
    mode: str = "w",
) -> int:
    """Write text to a file. Returns bytes written."""
    async with aiofiles.open(str(path), mode=mode, encoding=encoding) as f:
        return await f.write(content)


async def read_bytes(path: str | Path) -> bytes:
    """Read a binary file."""
    async with aiofiles.open(str(path), "rb") as f:
        return await f.read()


async def write_bytes(path: str | Path, data: bytes, mode: str = "wb") -> int:
    """Write bytes to a file."""
    async with aiofiles.open(str(path), mode) as f:
        return await f.write(data)


async def append_text(path: str | Path, line: str, encoding: str = "utf-8") -> None:
    """Append a line to a text file."""
    async with aiofiles.open(str(path), "a", encoding=encoding) as f:
        await f.write(line if line.endswith("\n") else line + "\n")


async def read_lines(
    path: str | Path,
    encoding: str = "utf-8",
    strip: bool = True,
) -> list[str]:
    """Read all lines from a text file."""
    async with aiofiles.open(str(path), encoding=encoding) as f:
        lines = await f.readlines()
    return [line.rstrip("\n") for line in lines] if strip else lines


# ─────────────────────────────────────────────────────────────────────────────
# 2. JSON helpers
# ─────────────────────────────────────────────────────────────────────────────

async def read_json(path: str | Path) -> Any:
    """Parse a JSON file asynchronously."""
    text = await read_text(path)
    return json.loads(text)


async def write_json(
    path: str | Path,
    data: Any,
    indent: int = 2,
    ensure_ascii: bool = False,
) -> None:
    """Write data as JSON to a file."""
    await write_text(path, json.dumps(data, indent=indent, ensure_ascii=ensure_ascii))


async def update_json(path: str | Path, update_fn: Callable[[Any], Any]) -> Any:
    """
    Read, modify, and write a JSON file within a single coroutine. Not atomic
    across tasks — guard with an asyncio.Lock if several tasks may update the
    same file.
    update_fn: callable(data) → new_data
    """
    try:
        data = await read_json(path)
    except (FileNotFoundError, json.JSONDecodeError):
        data = {}
    new_data = update_fn(data)
    await write_json(path, new_data)
    return new_data


# ─────────────────────────────────────────────────────────────────────────────
# 3. Batch / parallel operations
# ─────────────────────────────────────────────────────────────────────────────

async def read_many(
    paths: list[str | Path],
    encoding: str = "utf-8",
    concurrency: int = 20,
) -> list[str]:
    """
    Read multiple files concurrently.
    Returns list of text content in same order as paths.
    Limits concurrency to avoid opening too many file handles.
    """
    sem = asyncio.Semaphore(concurrency)

    async def _read(path: str | Path) -> str:
        async with sem:
            return await read_text(path, encoding=encoding)

    return await asyncio.gather(*(_read(p) for p in paths))


async def write_many(
    files: dict[str | Path, str],
    encoding: str = "utf-8",
    concurrency: int = 20,
) -> None:
    """Write multiple files concurrently."""
    sem = asyncio.Semaphore(concurrency)

    async def _write(path: str | Path, content: str) -> None:
        async with sem:
            await write_text(path, content, encoding=encoding)

    await asyncio.gather(*(_write(p, c) for p, c in files.items()))


async def copy_file(src: str | Path, dst: str | Path) -> None:
    """Async file copy."""
    data = await read_bytes(src)
    await write_bytes(dst, data)


async def copy_many(
    pairs: list[tuple[str | Path, str | Path]],
    concurrency: int = 10,
) -> None:
    """Copy multiple files concurrently."""
    sem = asyncio.Semaphore(concurrency)

    async def _copy(src, dst):
        async with sem:
            await copy_file(src, dst)

    await asyncio.gather(*(_copy(s, d) for s, d in pairs))


# ─────────────────────────────────────────────────────────────────────────────
# 4. Line-by-line streaming
# ─────────────────────────────────────────────────────────────────────────────

async def stream_lines(
    path: str | Path,
    encoding: str = "utf-8",
) -> AsyncIterator[str]:
    """
    Async generator that yields lines one at a time — good for large files.

    Usage:
        async for line in stream_lines("big.log"):
            process(line)
    """
    async with aiofiles.open(str(path), encoding=encoding, errors="replace") as f:
        async for line in f:
            yield line.rstrip("\n")


async def tail(
    path: str | Path,
    n: int = 20,
    encoding: str = "utf-8",
) -> list[str]:
    """Return the last n lines of a file."""
    lines: list[str] = []
    async for line in stream_lines(path, encoding):
        lines.append(line)
    return lines[-n:]


# ─────────────────────────────────────────────────────────────────────────────
# 5. Async filesystem helpers
# ─────────────────────────────────────────────────────────────────────────────

async def exists(path: str | Path) -> bool:
    """Check if a path exists."""
    return await aiofiles.os.path.exists(str(path))


async def file_size(path: str | Path) -> int:
    """Return file size in bytes."""
    stat = await aiofiles.os.stat(str(path))
    return stat.st_size


async def remove(path: str | Path) -> None:
    """Remove a file."""
    await aiofiles.os.remove(str(path))


async def rename(src: str | Path, dst: str | Path) -> None:
    """Rename/move a file."""
    await aiofiles.os.rename(str(src), str(dst))


async def makedirs(path: str | Path, exist_ok: bool = True) -> None:
    """Create directory tree."""
    await aiofiles.os.makedirs(str(path), exist_ok=exist_ok)


async def listdir(path: str | Path) -> list[str]:
    """List directory contents."""
    return await aiofiles.os.listdir(str(path))


# ─────────────────────────────────────────────────────────────────────────────
# 6. Temp file helpers
# ─────────────────────────────────────────────────────────────────────────────

async def write_temp(
    data: bytes | str,
    suffix: str = "",
    prefix: str = "tmp_",
    delete: bool = False,
) -> str:
    """
    Write data to a temp file; return its path.
    delete=False (the default) keeps the file after the context exits —
    with delete=True the file is already gone when the path is returned.
    """
    mode = "wb" if isinstance(data, bytes) else "w"
    async with aiofiles.tempfile.NamedTemporaryFile(
        mode=mode,
        suffix=suffix,
        prefix=prefix,
        delete=delete,
    ) as tmp:
        await tmp.write(data)
        path = tmp.name
    return path


# ─────────────────────────────────────────────────────────────────────────────
# 7. FastAPI / aiohttp helpers
# ─────────────────────────────────────────────────────────────────────────────

async def save_upload(
    upload_file,
    destination: str | Path,
    chunk_size: int = 1024 * 64,
) -> int:
    """
    Save a FastAPI UploadFile to disk asynchronously.
    Returns total bytes written.

    Usage (FastAPI):
        @app.post("/upload")
        async def upload(file: UploadFile = File(...)):
            size = await save_upload(file, f"uploads/{file.filename}")
            return {"bytes": size}
    """
    total = 0
    async with aiofiles.open(str(destination), "wb") as out:
        while chunk := await upload_file.read(chunk_size):
            await out.write(chunk)
            total += len(chunk)
    return total


async def stream_file_response(path: str | Path, chunk_size: int = 1024 * 64):
    """
    Async generator for streaming a file — use as FastAPI StreamingResponse body.

    Usage:
        @app.get("/download/{filename}")
        async def download(filename: str):
            return StreamingResponse(
                stream_file_response(f"files/{filename}"),
                media_type="application/octet-stream",
            )
    """
    async with aiofiles.open(str(path), "rb") as f:
        while chunk := await f.read(chunk_size):
            yield chunk


# ─────────────────────────────────────────────────────────────────────────────
# 8. Async log writer
# ─────────────────────────────────────────────────────────────────────────────

class AsyncLogWriter:
    """
    Write log lines to a file asynchronously via an asyncio.Queue.
    Avoids blocking the event loop on every log write.

    Usage:
        writer = AsyncLogWriter("app.log")
        await writer.start()
        await writer.log("Starting up")
        ...
        await writer.stop()
    """

    def __init__(self, path: str | Path, encoding: str = "utf-8"):
        self._path = str(path)
        self._encoding = encoding
        self._queue: asyncio.Queue[str | None] = asyncio.Queue()
        self._task: asyncio.Task | None = None

    async def start(self) -> None:
        self._task = asyncio.create_task(self._worker())

    async def stop(self) -> None:
        await self._queue.put(None)  # sentinel
        if self._task:
            await self._task

    async def log(self, message: str) -> None:
        await self._queue.put(message + "\n")

    async def _worker(self) -> None:
        async with aiofiles.open(self._path, "a", encoding=self._encoding) as f:
            while True:
                item = await self._queue.get()
                if item is None:
                    break
                await f.write(item)


# ─────────────────────────────────────────────────────────────────────────────
# Demo
# ─────────────────────────────────────────────────────────────────────────────

async def demo():
    import tempfile

    with tempfile.TemporaryDirectory() as tmpdir:
        base = Path(tmpdir)

        print("=== Write + Read ===")
        await write_text(base / "hello.txt", "Hello, aiofiles!\nLine 2\nLine 3\n")
        text = await read_text(base / "hello.txt")
        print(f"  Read: {text.strip()!r}")

        print("\n=== JSON round-trip ===")
        data = {"name": "MyApp", "version": "1.0.0", "debug": True}
        await write_json(base / "config.json", data)
        loaded = await read_json(base / "config.json")
        print(f"  JSON: {loaded}")

        print("\n=== Parallel write + read ===")
        files = {base / f"f{i}.txt": f"Content of file {i}" for i in range(5)}
        await write_many(files)
        texts = await read_many(list(files.keys()))
        print(f"  Read {len(texts)} files in parallel: {texts[:2]}")

        print("\n=== Stream lines ===")
        lines = []
        async for line in stream_lines(base / "hello.txt"):
            lines.append(line)
        print(f"  Streamed {len(lines)} lines: {lines}")

        print("\n=== File size ===")
        size = await file_size(base / "config.json")
        print(f"  config.json: {size} bytes")

        print("\n=== AsyncLogWriter ===")
        log_path = base / "app.log"
        writer = AsyncLogWriter(log_path)
        await writer.start()
        for i in range(5):
            await writer.log(f"Log line {i}")
        await writer.stop()
        log_lines = await read_lines(log_path)
        print(f"  Log lines: {log_lines}")


if __name__ == "__main__":
    asyncio.run(demo())
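
Because every helper above is a plain coroutine, testing is straightforward. A minimal sketch assuming pytest with the pytest-asyncio plugin and the module saved as app/files.py (per the header comment):

# test_files.py — pytest + pytest-asyncio round-trip tests (sketch)
import pytest

from app.files import read_many, read_text, write_text


@pytest.mark.asyncio
async def test_roundtrip(tmp_path):
    path = tmp_path / "x.txt"
    assert await write_text(path, "hello") == 5  # characters written
    assert await read_text(path) == "hello"


@pytest.mark.asyncio
async def test_read_many_preserves_order(tmp_path):
    paths = [tmp_path / f"{i}.txt" for i in range(3)]
    for i, p in enumerate(paths):
        await write_text(p, str(i))
    assert await read_many(paths) == ["0", "1", "2"]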

For the asyncio.to_thread alternative: asyncio.to_thread(sync_fn) runs any synchronous I/O in a thread pool executor, which works for file operations that don’t have a native async API. aiofiles uses the same thread pool mechanism underneath, but provides a native-feeling async file object with await f.read(), async for line in f:, and await f.write() — more idiomatic and easier to test than wrapping every stdlib call with to_thread.

For the anyio alternative: anyio’s anyio.open_file() is the equivalent for projects using anyio’s backend-agnostic async I/O (it works with both asyncio and trio). aiofiles is asyncio-specific but has wider adoption and a more complete API, including aiofiles.os.* and aiofiles.tempfile.

The Claude Skills 360 bundle includes aiofiles skill sets covering aiofiles.open() read/write/append/binary modes, read_text()/write_text()/read_bytes()/write_bytes() helpers, read_json()/write_json()/update_json(), read_many()/write_many() concurrent batch ops, the stream_lines() async generator, tail() last-N-lines, aiofiles.os.path.exists/stat/remove/makedirs, write_temp() temporary files, save_upload() FastAPI uploads, stream_file_response() FastAPI downloads, and the AsyncLogWriter queue-based async logger. Start with the free tier to try async file I/O code generation.
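
A minimal side-by-side sketch of the alternatives discussed above — asyncio.to_thread around a blocking helper, aiofiles, and anyio.open_file (requires pip install anyio); the path notes.txt is a placeholder:

# alternatives.py — three ways to read a file without blocking the event loop
import asyncio

import aiofiles
import anyio  # backend-agnostic alternative (asyncio and trio)


def read_sync(path: str) -> str:
    # Plain blocking read — to_thread pushes this onto the default executor.
    with open(path, encoding="utf-8") as f:
        return f.read()


async def main() -> None:
    # 1. Stdlib only: wrap the sync call per operation.
    via_thread = await asyncio.to_thread(read_sync, "notes.txt")

    # 2. aiofiles: a native-feeling async file object.
    async with aiofiles.open("notes.txt", encoding="utf-8") as f:
        via_aiofiles = await f.read()

    # 3. anyio: open_file() is itself awaitable, then used as a context manager.
    async with await anyio.open_file("notes.txt", encoding="utf-8") as f:
        via_anyio = await f.read()

    assert via_thread == via_aiofiles == via_anyio


asyncio.run(main())

All three read the same bytes; the difference is ergonomics and backend portability.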
