
In the previous lesson you learned that synchronous file operations are safe only at startup. For everything else (request handlers, background jobs, API responses) you need the asynchronous versions, which let the event loop keep running while your I/O is in progress. Think of it like placing an order at a restaurant: the non-blocking version lets the waiter take other tables' orders while the kitchen is preparing your food. The synchronous version would have the waiter stand frozen at your table, staring into the middle distance until your food arrives.

Node.js offers three different APIs for async file operations: the legacy callback style, the slightly newer fs.promises namespace, and the cleanest option, a dedicated fs/promises import. Always use the last one.
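To make the comparison concrete, here is the same read expressed in all three styles. This is a sketch: the temp file and its contents are invented for the demo, and in real code you would pick one style, not mix them.

```javascript
import { writeFileSync, readFile as readFileCb, promises as fsp } from 'fs';
import { readFile } from 'fs/promises';
import { tmpdir } from 'os';
import { join } from 'path';

// A throwaway file so every variant has something to read
const path = join(tmpdir(), 'styles-demo.txt');
writeFileSync(path, 'hello');

// 1. Legacy callback style: error-first callback, nothing to await
readFileCb(path, 'utf-8', (err, data) => {
  if (err) throw err;
  console.log('callback:', data);
});

// 2. The fs.promises namespace hanging off the classic fs module
const viaNamespace = await fsp.readFile(path, 'utf-8');

// 3. The dedicated fs/promises import - same call, cleanest to read
const viaImport = await readFile(path, 'utf-8');

console.log('promise styles:', viaNamespace, viaImport);
```

All three hit the same underlying system call; only the way the result comes back to you differs.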

Reading and writing with async/await

The fs/promises module mirrors the synchronous API almost exactly: the method names are the same, you just drop the Sync suffix and add await:

import { readFile, writeFile } from 'fs/promises';

async function updateConfig() {
  try {
    // Read - yields control to the event loop while disk I/O happens
    const raw = await readFile('./config.json', 'utf-8');
    const config = JSON.parse(raw);

    // Mutate in memory
    config.version = '2.0.0';
    config.updatedAt = new Date().toISOString();

    // Write - yields again while the file is being written
    await writeFile('./config.json', JSON.stringify(config, null, 2));

    console.log('Config updated successfully');
  } catch (error) {
    console.error('Failed to update config:', error.message);
  }
}

await updateConfig();

The flow is conceptually identical to the synchronous version, but Node.js is free to handle other work during each await. Two awaited I/O operations in this function mean two opportunities for the event loop to serve other requests.


Reading multiple files concurrently

One of the biggest advantages of the async API is that you can kick off multiple operations simultaneously and wait for all of them together. Promise.all() is your tool for this:

import { readFile } from 'fs/promises';

async function loadTranslations() {
  // All three reads start at the same time
  const [en, fr, es] = await Promise.all([
    readFile('./locales/en.json', 'utf-8'),
    readFile('./locales/fr.json', 'utf-8'),
    readFile('./locales/es.json', 'utf-8')
  ]);

  return {
    en: JSON.parse(en),
    fr: JSON.parse(fr),
    es: JSON.parse(es)
  };
}

Compare this to running them one after the other with three separate await calls. If each file takes 10ms to read, the sequential approach takes 30ms total. The Promise.all() approach takes roughly 10ms, since all three reads happen concurrently.
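You can observe the difference directly. This sketch substitutes an artificial 50ms delay for real disk reads (fakeRead is an invented stand-in, not a real fs function), so the timing contrast is easy to see:

```javascript
import { setTimeout as sleep } from 'timers/promises';

// Stand-in for a file read that takes ~50ms
const fakeRead = async (name) => {
  await sleep(50);
  return `contents of ${name}`;
};

// Sequential: each await blocks the next one (~150ms total)
let start = Date.now();
await fakeRead('en.json');
await fakeRead('fr.json');
await fakeRead('es.json');
const sequentialMs = Date.now() - start;

// Concurrent: all three delays start immediately (~50ms total)
start = Date.now();
await Promise.all([
  fakeRead('en.json'),
  fakeRead('fr.json'),
  fakeRead('es.json')
]);
const concurrentMs = Date.now() - start;

console.log({ sequentialMs, concurrentMs });
```

The sequential version pays the full cost of every operation added together; the concurrent version pays roughly the cost of the slowest one.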


Managing directories

The async directory methods work the same way as their sync counterparts, just awaited:

import { mkdir, readdir, rm } from 'fs/promises';

async function setupProjectStructure() {
  // Create nested folder structure in one call
  await mkdir('./project/src/components', { recursive: true });
  await mkdir('./project/src/utils', { recursive: true });
  await mkdir('./project/public/images', { recursive: true });

  // List directory contents with type information
  const entries = await readdir('./project', { withFileTypes: true });

  for (const entry of entries) {
    const type = entry.isDirectory() ? 'dir ' : 'file';
    console.log(`[${type}] ${entry.name}`);
  }

  // Remove a directory and all its contents
  await rm('./project/temp', { recursive: true, force: true });
}

The withFileTypes: true option is worth highlighting. Without it, readdir() returns an array of plain strings (file names). With it, you get Dirent objects that know whether each entry is a file or directory, saving you an extra stat() call per entry when you need to know.
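To see what that saves, here is both approaches side by side on a small throwaway directory (the temp tree, file names, and the typed/typed2 result shapes are all invented for the demo):

```javascript
import { mkdtemp, mkdir, writeFile, readdir, stat } from 'fs/promises';
import { join } from 'path';
import { tmpdir } from 'os';

// Build a tiny throwaway tree: one subdirectory, one file
const root = await mkdtemp(join(tmpdir(), 'dirent-demo-'));
await mkdir(join(root, 'sub'));
await writeFile(join(root, 'notes.txt'), 'hi');

// Without withFileTypes: plain strings, so each entry costs a stat() call
const names = await readdir(root);
const typed = [];
for (const name of names) {
  const info = await stat(join(root, name)); // extra syscall per entry
  typed.push({ name, isDir: info.isDirectory() });
}

// With withFileTypes: the Dirent objects already know their type
const entries = await readdir(root, { withFileTypes: true });
const typed2 = entries.map((e) => ({ name: e.name, isDir: e.isDirectory() }));

console.log(typed, typed2);
```

Both loops produce the same information, but the second does it with one readdir() call instead of one readdir() plus N stat() calls.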


Handling errors properly

File system errors are inevitable: files get deleted, permissions change, disks fill up. Node.js attaches an error.code property to file system errors so you can respond to different failure modes differently:

import { readFile, mkdir } from 'fs/promises';

async function safeReadFile(filePath) {
  try {
    return await readFile(filePath, 'utf-8');
  } catch (error) {
    if (error.code === 'ENOENT') {
      // File does not exist - this might be expected
      return null;
    }
    if (error.code === 'EACCES') {
      throw new Error(`Permission denied reading ${filePath}`);
    }
    // Anything else is unexpected - re-throw so callers know
    throw error;
  }
}

async function ensureDir(dirPath) {
  try {
    await mkdir(dirPath);
  } catch (error) {
    // Directory already exists - that's fine
    if (error.code !== 'EEXIST') throw error;
  }
}

Always re-throw errors you do not explicitly handle. Silently swallowing unknown errors makes debugging extraordinarily difficult: a disk-full error looks exactly like a missing-file error if you catch everything and return null.
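As a usage sketch, the null return from this pattern makes "default if missing" logic straightforward. Here loadSettings, the settings path, and the default theme object are all hypothetical:

```javascript
import { readFile } from 'fs/promises';
import { tmpdir } from 'os';
import { join } from 'path';

// Same shape as safeReadFile above: null for a missing file, rethrow otherwise
async function safeReadFile(filePath) {
  try {
    return await readFile(filePath, 'utf-8');
  } catch (error) {
    if (error.code === 'ENOENT') return null;
    throw error;
  }
}

// A missing settings file is expected on first run - fall back to defaults.
// Any other failure (permissions, disk errors) still propagates loudly.
async function loadSettings(path) {
  const raw = await safeReadFile(path);
  return raw === null ? { theme: 'light' } : JSON.parse(raw);
}

const settings = await loadSettings(join(tmpdir(), 'missing-settings-demo.json'));
console.log(settings);
```

The caller never needs its own try/catch for the expected case, yet unexpected errors still surface.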

Processing large files with streams

If you read a 2GB log file with readFile(), Node.js loads the entire 2GB into memory at once. For large files, you want streams: a mechanism that processes data chunk by chunk as it arrives off the disk.

import { createReadStream, createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';
import { createGzip } from 'zlib';

async function compressLogFile(inputPath, outputPath) {
  await pipeline(
    createReadStream(inputPath),      // Read from disk in chunks
    createGzip(),                     // Compress each chunk
    createWriteStream(outputPath)     // Write compressed chunks to disk
  );

  console.log(`Compressed to ${outputPath}`);
}

await compressLogFile('./app.log', './app.log.gz');

pipeline() from stream/promises handles cleanup automatically: if any stage in the pipeline throws, it tears down all the other streams and rejects the promise. This was a common source of resource leaks in older Node.js code.
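A quick roundtrip makes the behavior easy to check: compress a file, decompress it again, and confirm the bytes survived. This sketch writes everything to a temporary directory with invented file names:

```javascript
import { createReadStream, createWriteStream } from 'fs';
import { writeFile, readFile, mkdtemp } from 'fs/promises';
import { pipeline } from 'stream/promises';
import { createGzip, createGunzip } from 'zlib';
import { join } from 'path';
import { tmpdir } from 'os';

const dir = await mkdtemp(join(tmpdir(), 'gzip-demo-'));
const src = join(dir, 'app.log');
const gz = join(dir, 'app.log.gz');
const out = join(dir, 'restored.log');

await writeFile(src, 'line one\nline two\n');

// Each pipeline() resolves only once its final stream has flushed to disk
await pipeline(createReadStream(src), createGzip(), createWriteStream(gz));
await pipeline(createReadStream(gz), createGunzip(), createWriteStream(out));

const restored = await readFile(out, 'utf-8');
console.log(restored);
```

At no point does the whole file need to fit in memory; the same code works unchanged on a multi-gigabyte log.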


Common error codes

| Code | Meaning | Common cause |
| --- | --- | --- |
| ENOENT | No such file or directory | Path typo, file deleted |
| EACCES | Permission denied | Wrong file permissions |
| EISDIR | Is a directory | Passed a folder path where a file was expected |
| ENOTDIR | Not a directory | Passed a file path where a folder was expected |
| EEXIST | File already exists | Creating a file or dir that is already there |
| ENOSPC | No space left on device | Disk is full |
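These codes can be branched on directly. One sketch of that idea is a small lookup that turns any fs error into a readable message; the describeFsError helper and its message strings are invented for illustration:

```javascript
import { readFile } from 'fs/promises';

// Human-readable descriptions for the common codes above
const FRIENDLY = {
  ENOENT: 'No such file or directory',
  EACCES: 'Permission denied',
  EISDIR: 'Expected a file but got a directory',
  ENOTDIR: 'Expected a directory but got a file',
  EEXIST: 'Already exists',
  ENOSPC: 'Disk is full'
};

function describeFsError(error) {
  return FRIENDLY[error.code] ?? `Unexpected error: ${error.message}`;
}

// Reading a path that does not exist produces ENOENT
let message;
try {
  await readFile('./definitely-missing-file.json', 'utf-8');
} catch (error) {
  message = describeFsError(error);
}
console.log(message);
```

Unknown codes fall through to the raw error message, which keeps the "re-throw or surface what you don't recognize" rule from earlier intact.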