
Posts from March 2024

"Hang loose" is for surfers, not developers [Why I pin dependency versions in Node.js packages]

A few days ago, I posted a response to a question I get asked about open-source project management. Here we go again - this time the topic is dependency versioning.

What is a package dependency?

In the Node.js ecosystem, packages (a.k.a. projects) can make use of other packages by declaring them as a dependency in package.json and specifying the range of supported versions. When a package gets installed, the package manager (typically npm) makes sure appropriate versions of all dependencies are included.

How are dependency versions specified?

The Node community uses semantic versioning of the form major.minor.patch. There are many ways to specify a version range, but the most common is to identify a specific version (typically the most recent) and prefix it with a tilde or caret: a tilde (~1.2.3) also accepts later versions that differ only at the patch level, while a caret (^1.2.3) also accepts later versions that differ at the minor or patch level. This is what is meant by "loose" versioning.
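To make the range semantics concrete, here is a minimal sketch of how exact, tilde, and caret ranges match a version. It is an illustration only, not npm's real implementation - npm's actual logic lives in the semver package, which additionally handles pre-release tags, the special caret behavior for 0.x versions, wildcard ranges, and more:

```javascript
// Sketch of how npm version ranges match a version string.
// Illustration only; real range matching is done by the "semver" package.
function parse(version) {
  const [major, minor, patch] = version.split(".").map(Number);
  return { major, minor, patch };
}

function satisfies(version, range) {
  const v = parse(version);
  const r = parse(range.replace(/^[~^]/, ""));
  const atLeast =
    v.major > r.major ||
    (v.major === r.major &&
      (v.minor > r.minor || (v.minor === r.minor && v.patch >= r.patch)));
  if (range.startsWith("~")) {
    // ~1.2.3 allows patch-level updates only: >=1.2.3 <1.3.0
    return atLeast && v.major === r.major && v.minor === r.minor;
  }
  if (range.startsWith("^")) {
    // ^1.2.3 allows minor- and patch-level updates: >=1.2.3 <2.0.0
    return atLeast && v.major === r.major;
  }
  // A pinned range like "1.2.3" matches only that exact version.
  return v.major === r.major && v.minor === r.minor && v.patch === r.patch;
}
```

For example, satisfies("1.9.0", "^1.2.3") is true, but satisfies("1.9.0", "~1.2.3") is false because the minor version differs.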

Why does the community use "loose" versioning?

The intent of loose versioning is to automatically benefit from bug fixes and non-breaking changes to a package's dependencies. Any time an install or update is run, the latest allowable version of each such dependency is included, and the user seamlessly benefits from any bug fixes made since the named version was published.

What is a "pinned" dependency version?

A pinned dependency version specifies a particular major.minor.patch version and does not include any modifiers. The only version that satisfies this range is the exact version listed. For example: 1.2.3. Bug fixes to such package dependencies will not be used until a new version of the package that references them is published (with updated references).
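In package.json, the difference is simply the presence or absence of a range prefix. A minimal sketch (the package names and versions here are hypothetical):

```json
{
  "dependencies": {
    "loose-caret-example": "^1.2.3",
    "loose-tilde-example": "~1.2.3",
    "pinned-example": "1.2.3"
  }
}
```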

Why is pinning a better versioning strategy?

Pinning ensures that users only run a package with the set of dependencies it has been tested with. While this doesn't rule out the possibility of bugs, it's far safer and more predictable than loose versioning, which allows users to run with an unpredictable set of dependencies. In the loose versioning worst case, every install of a package could have a different set of dependencies. This is a nightmare for quality and reliability. With pinning, behavior changes only show up when the user decides to update versions. If anything breaks, the upgrade can be skipped while the issue is investigated. Loose versioning doesn't allow "undo"; when something breaks, you're stuck until a fix gets published.

What's so bad about running untested configurations?

As much as developers may try to ensure consistent behavior across minor- and patch-level version updates, any change - no matter how small - has the possibility of altering behavior and causing failures. Worse, such behavior changes show up unexpectedly and unpredictably and can be difficult to track down, especially for users who may not even realize the broken package was being used. I've had to investigate such issues on multiple occasions and think it is a waste of time for users and package maintainers alike.

Are popular projects safer to version loosely?

Well-run projects with thorough testing are probably less likely to cause problems than single-person hobby projects. But the underlying issue is the same: any change to dependency code can change runtime behavior and cause problems.

What about missing out on security bug fixes due to pinning?

While the urgency to include a security bug fix may be higher than that of a normal bug fix, the same challenges apply. There's no general-purpose way to distinguish a security fix from a normal fix or from a breaking change.

Could pinning lead to larger install sizes?

Yes, because the package manager doesn't have as much freedom to choose among package versions that are shared by multiple dependencies. However, this is a speculative optimization with limited benefit in practice as disk space is comparatively inexpensive. Correctness and predictability are far more important.

Isn't pinning pointless if dependent packages version loosely?

No, though it's less effective because those transitive dependencies can change/break at any time. My opinion is that every package should use pinning, but I can only enforce that policy for my own packages. (But maybe by setting a good example, I can be the change I want to see in the world...)

Is there a way to force a dependency update for a pinned package?

Yes, by updating a project's package.json to use overrides (npm) or resolutions (yarn). This means users who are worried about a specific dependency version can make sure that version is used in their scenario - and any resulting problems are their responsibility to deal with.
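As a sketch, an application that needs to force a newer version of a transitive dependency could add an overrides section to its own package.json (the package names and versions here are hypothetical; yarn's resolutions field works similarly):

```json
{
  "dependencies": {
    "pinned-tool": "1.0.0"
  },
  "overrides": {
    "vulnerable-transitive-dep": "2.4.1"
  }
}
```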

Does pinning versions create more work for a maintainer?

No, maintainers should already be updating package dependencies as part of each release. This can be done manually or automatically through the use of a tool like Dependabot.
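For example, a minimal .github/dependabot.yml that checks a project's npm dependencies for updates might look like the following (the weekly schedule is illustrative):

```yaml
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"
```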


"DRINK ME" [Why I do not include npm-shrinkwrap.json in Node.js tool packages]

I maintain a few open source projects and get asked some of the same questions from time to time. I wrote the explanation below in August of 2023 and posted it as a GitHub Gist; I am capturing it here for easier reference.


For historical purposes and possible future reference, here are my notes on why I backed out a change to use npm-shrinkwrap.json in markdownlint-cli2.

The basic problem is that npm will include platform-specific packages in npm-shrinkwrap.json. Specifically, if one generates npm-shrinkwrap.json on Mac, it may include components (like fsevents) that are only supported on Mac. Attempts to use a published package with such an npm-shrinkwrap.json on a different platform like Linux or Windows fail with EBADPLATFORM. This seems (to me, currently) like a fundamental and fatal flaw with the way npm implements npm-shrinkwrap.json. And while there are ways npm might address this problem, the current state of things seems unusably broken.

To make this concrete, I ran rm npm-shrinkwrap.json && npm install && npm shrinkwrap for this project on macOS. The resulting npm-shrinkwrap.json includes fsevents, an optional Mac-only dependency. Including it is not wrong per se, but it sets the stage for failure, as reproduced via GitHub Codespaces:

@DavidAnson > /workspaces/temp (main) $ ls
@DavidAnson > /workspaces/temp (main) $ node --version
@DavidAnson > /workspaces/temp (main) $ npm --version
@DavidAnson > /workspaces/temp (main) $ npm install markdownlint-cli2@v0.9.0
npm WARN deprecated date-format@0.0.2: 0.x is no longer supported. Please upgrade to 4.x or higher.

added 442 packages in 4s

9 packages are looking for funding
  run `npm fund` for details
@DavidAnson > /workspaces/temp (main) $ npm clean-install
npm ERR! notsup Unsupported platform for fsevents@2.3.3: wanted {"os":"darwin"} (current: {"os":"linux"})
npm ERR! notsup Valid os:  darwin
npm ERR! notsup Actual os: linux

npm ERR! A complete log of this run can be found in: /home/codespace/.npm/_logs/2023-08-27T18_24_58_585Z-debug-0.log
@DavidAnson > /workspaces/temp (main) $

Note that the initial package install succeeded, but the subsequent attempt to use clean-install failed due to the platform mismatch. This is a basic scenario and the user is completely blocked at this point.

Because this is a second-level failure, it is not caught by most reasonable continuous integration configurations, which work from the current project directory instead of installing and testing via the packed .tgz file. However, attempts to reproduce this failure in CI via .tgz were unsuccessful: From what I can tell, npm install of a local .tgz file is handled differently than when that same (identical) file is installed from the package registry.

While there are some efforts to test the .tgz scenario better, improved testing does not solve the fundamental problem that npm-shrinkwrap.json is a platform-specific file that npm uses in a cross-platform manner.

Unrelated, but notable: npm installs ALL package dependencies when npm-shrinkwrap.json is present - even in a context where it would normally NOT install devDependencies. Contrast the 442 packages installed above vs. the 40 when --omit=dev is used explicitly:

@DavidAnson > /workspaces/temp (main) $ npm install markdownlint-cli2@v0.9.0 --omit=dev

added 40 packages in 1s

9 packages are looking for funding
  run `npm fund` for details
@DavidAnson > /workspaces/temp (main) $

By contrast, the default behavior of installing a dependency in this manner is not to include devDependencies, as seen when installing a version of this package that does not ship npm-shrinkwrap.json:

@DavidAnson > /workspaces/temp (main) $ npm install markdownlint-cli2@v0.9.2

added 35 packages in 2s

7 packages are looking for funding
  run `npm fund` for details
@DavidAnson > /workspaces/temp (main) $