Or just use ISO8601 standard notation (e.g. "P1D" for one day)
fzeindl 3 hours ago [-]
ISO8601 durations should be used, like PT3M.
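For reference, a few duration literals in that notation (P introduces the date part, T the time part):

  P1D        one day
  PT3M       three minutes
  P3M        three months (the T matters)
  P1DT2H30M  one day, two hours, thirty minutes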
mort96 25 minutes ago [-]
Oh wow, never looked at ISO8601 durations before and I had no idea they were this ugly. Please, no, don't make me deal with ISO8601. I'd rather write a number of seconds or a format like 'X weeks' or 'Y hours Z minutes'. ISO8601 looks exclusively like a data interchange format.
aa-jv 3 hours ago [-]
Should be easy, just add the ISO8601-duration package to your project ..
/s
postepowanieadm 4 hours ago [-]
If everyone is going to wait 3 days before installing the latest version of a compromised package, it will take more than 3 days to detect an incident.
anematode 3 hours ago [-]
A lot of people will still use npm, so they'll be the canaries in the coal mine :)
More seriously, automated scanners seem to do a good job already of finding malicious packages. It's a wonder that npm themselves haven't already deployed an automated countermeasure.
> It started with a cryptic build failure in our CI/CD pipeline, which my colleague noticed
> This seemingly minor error was the first sign of a sophisticated supply chain attack. We traced the failure to a small dependency, error-ex. Our package-lock.json specified the stable version 1.3.2 or newer, so it installed the latest version 1.3.3, which got published just a few minutes earlier.
vasachi 3 hours ago [-]
If only there was a high-ranking official at Microsoft, who could prioritize security[1]! /s
Not really, app sec companies scan npm constantly for updated packages to check for malware. Many attacks get caught that way.
e.g. the debug + chalk supply chain attack was caught like this: https://www.aikido.dev/blog/npm-debug-and-chalk-packages-com...
[1] https://blogs.microsoft.com/blog/2024/05/03/prioritizing-sec...
blamestross 55 minutes ago [-]
1) Checks and audits will still happen (if they are happening at all)
2) Real chances for owners to notice they have been compromised
3) Adopt early before that commons is fully tragedy-ed.
the_mitsuhiko 2 hours ago [-]
I think uv should get some credit for being an early supporter of this. They originally added it as a hidden way to create stable fixtures for their own tests, but it has become a pretty popular flag to use.
This for instance will only install packages that are older than 14 days:
  uv sync --exclude-newer $(date -u -v-14d '+%Y-%m-%dT%H:%M:%SZ')
It's great to see this kind of stuff being adopted in more places.
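(Aside: `-v-14d` in the command above is BSD/macOS date syntax; with GNU coreutils the equivalent would be `date -u -d '14 days ago' '+%Y-%m-%dT%H:%M:%SZ'`.)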
mcintyre1994 2 hours ago [-]
Nice, but I think the config file is a much better implementation for protecting against supply chain attacks, particularly those targeting developers rather than runtime. You don’t want to rely on every developer passing a flag every time they install. This does suffer from the risk of using `npm install` instead of `pnpm install` though.
It would also be nice to have this as a flag so you can use it on projects that haven't configured it, though. I wonder if that could be added too.
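For reference, a minimal sketch of the config-file approach, assuming pnpm's minimumReleaseAge setting (which, as I understand it, takes minutes) in pnpm-workspace.yaml:

  # pnpm-workspace.yaml: don't install anything published less than 3 days ago
  minimumReleaseAge: 4320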
OskarS 3 hours ago [-]
I have a question: when I’ve seen people discussing this setting, people talk about using like ”3 days” or ”7 days” as the timeout, which seems insanely short to me for production use. As a C++ developer, I would be hesitant to use any dependency in the first six months of release in production, unless there’s some critical CVE or something (then again, we make client side applications with essentially no networking, so security isn’t as critical for us, stability is much more important).
Does the JS ecosystem really move so fast that you can’t wait a month or two before updating your packages?
dtech 1 minute ago [-]
Waiting 6 months to upgrade a dependency seems crazy; that's definitely not a thing in other languages, though maybe it is at some companies. (It might happen due to prioritization, but not due to some rule of thumb.)
In the JVM ecosystem it's quite common to have Dependabot or Renovate automatically create PRs for dependency upgrades within a few hours of a release. If it's manual, it's highly irregular and depends on the company.
patwolf 3 minutes ago [-]
It's common to have npm auditing enabled, which means your CI/CD will force you to update to a brand new version of a package because a security vulnerability was reported in an older one.
I've also had cases where I've found a bug in a package, submitted a bug report or PR, and then immediately pulled in the new version as soon as it was fixed. Things move fast in the JavaScript/npm/GitHub ecosystem.
codemonkey-zeta 16 minutes ago [-]
I think the surface area for bugs in a C++ dependency is way bigger than a JS one. Pulling in a new node module is not going to segfault my app, for example.
creesch 2 hours ago [-]
> Does the JS ecosystem really move so fast that you can’t wait a month or two before updating your packages?
Really depends on the context and where the code is being used. As others have pointed out, most JS packages use semantic versioning. For the patch releases (the last of the three numbers), for code that is exposed to the outside world you generally want to apply those rather quickly, as they will contain hotfixes, including ones for CVEs.
For the major and minor releases it really depends on what sort of dependencies you are using and how stable they are.
The issue isn't really unique to the JavaScript ecosystem either. A bigger Java project (certainly one with a lot of Spring-related dependencies) will also see a lot of movement.
That isn't to say that the tropes about the JavaScript ecosystem being extremely volatile are entirely untrue. But in this case I do think the context is the bigger difference.
> then again, we make client side applications with essentially no networking, so security isn’t as critical for us, stability is much more important)
By its nature, most JavaScript will be network connected in some fashion in environments with plenty of bad actors.
ozim 3 hours ago [-]
NPM packages follow semantic versioning, so minor versions should be fine to auto-update. (There is still the issue that what the package maintainer considers minor might not be minor for you, but let's stick to an ideal world here.)
I don't think people update major versions every month; it's more like every 6 months or once a year.
I guess the problem is that people think auto-updating minor versions in the CI/CD pipeline will keep them more secure, since bug fixes should land in minor versions, but in reality we see that is not the case and attackers use exactly that habit to spread malware.
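For illustration, here is how the common npm range operators resolve (the lines are alternatives for the same key, not literal JSON; error-ex is borrowed from the incident quoted above):

  "error-ex": "1.3.2"     exact pin: never moves on its own
  "error-ex": "~1.3.2"    patch range: >=1.3.2 <1.4.0
  "error-ex": "^1.3.2"    minor+patch range: >=1.3.2 <2.0.0, silently picks up 1.3.3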
progx 3 hours ago [-]
Yes, but this is not only a JS thing; in PHP (Composer) it's the same.
Normally old major or minor versions don't get updates, only the latest.
E.g. 4.1.47 (no update), 4.2.1 (gets the update).
So if the problem is in 4.1 you must "upgrade" to 4.2.
With "perfect" semver this shouldn't be a problem, since 4.2 only adds new features... but back to reality, the world is not perfect.
pandemic_region 2 hours ago [-]
> Does the JS ecosystem really move so fast that you can’t wait a month or two before updating your packages?
In 2 months, a typical js framework goes through the full Gartner Hype Cycle and moves to being unmaintained with an archived git repo and dozens of virus infected forks with similar names.
There's an open discussion about adding something similar to bun as well^
minimumReleaseAge doesn't seem to be a bulletproof solution, so there's still some research/testing to be done in this area.
keraf 2 hours ago [-]
I might be naive, but why isn't any package manager (npm, pnpm, bun, yarn, ...) pushing for a permission system, where packages have to define in their package.json which permissions they want to use? À la Deno, but scoped to dependencies, or like mobile apps do with their manifest.
I know it would take time for packages to adopt this, but it could be implemented as parameters when installing a new dependency, like `npm i ping --allow-net`. I wouldn't give a library like chalk access to I/O, processes, or the network.
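Purely as a hypothetical sketch (npm's package.json has no such `permissions` field today), the manifest side of that could look like:

  {
    "dependencies": {
      "ping": "^1.0.0",
      "chalk": "^5.0.0"
    },
    "permissions": {
      "ping": ["net"],
      "chalk": []
    }
  }

chalk getting an empty list is the point: a terminal-color library has no business talking to the network.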
IanCal 1 hour ago [-]
I feel like that would require work from the language side, or at least runtimes. Is there a way of stopping code in one package from, say, hitting the network?
You might be able to do this around install scripts, though disk writing is likely needed for all (but perhaps locations could be controlled).
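For comparison, Deno's existing flags already do this at the process level (not per-package); hostnames and paths here are illustrative:

  # everything not explicitly allowed is denied
  deno run --allow-net=api.example.com --allow-read=./data main.ts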
Filligree 1 hours ago [-]
We've seen a lot of stunningly incompetent attacks that nevertheless get to a lot of people.
Yeah, it needs work from the language runtime, but I think even a hacky, leaky 'security' abstraction would be helpful, because the majority of malware developers probably aren't able to break out of a language-level sandbox, even if the language still allows you to do unsafe array access.
Then we can iterate.
__MatrixMan__ 52 minutes ago [-]
It's not a bad idea; it might help in certain cases.
But the real solution to this kind of attack is to stop resolving packages by name and instead resolve them by hash, then bind a name to that hash for local use.
That would of course be a whole different, mostly unexplored world, but there's just no getting around the fact that blindly accepting updated versions of something based on its name will always create a juicy attack surface around the resolution of that name to some bits.
mort96 21 minutes ago [-]
The problem here isn't, "someone introduced malware into an existing version of a package". The problem is, "people want to stay up to date, so when a new patch version is released, everyone upgrades to that new patch version".
frankdejonge 36 minutes ago [-]
Resolving by hash is a half solution at best. Not having automated dependency upgrades also has severe security downsides. Apart from that, lock files basically already do what you describe: they contain the hashes, the resolution is based on the name, and the hash ensures the integrity of the resolved package. The real problem is upgrade automation and supply chain scanning. The biggest issue there is that scanning is not done where the vulnerability is introduced, because there is no money for it.
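For concreteness, a package-lock.json entry already binds name to hash roughly like this (digest elided):

  "node_modules/error-ex": {
    "version": "1.3.2",
    "resolved": "https://registry.npmjs.org/error-ex/-/error-ex-1.3.2.tgz",
    "integrity": "sha512-..."
  }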
mirekrusin 46 minutes ago [-]
Name + version are immutable; you can't republish a package to npm under an existing version.
You can only unpublish.
Content hash integrity is verified via lockfiles.
The problem is dependencies using semver ranges, especially wide ones like "debug": "*".
Initiatives like provenance statements [0] / code signing are also a good complement to delayed dependency updates.
Also, not running postinstall scripts by default / whitelisting them is a good default in pnpm.
Modifying (especially adding) keys on npmjs.org should be behind dedicated 2FA (as should changing the 2FA settings themselves).
[0] https://docs.npmjs.com/generating-provenance-statements
'Delayed dependency updates' is a response to supply chain attacks in the JavaScript world, but it aptly describes how I have come to approach technology broadly.
Large tech companies, as with most industry, have realized most people will pay with their privacy and data long before they'll pay with money. We live in a time of the Attention Currency, after all.
But you don't need to be a canary to live a technology-enabled life. Much software that you pay for with your privacy and data has free or cheap open-source alternatives that approach the same or higher quality. When you orient your way of consuming toward 'eh, I can wait till the version that respects me is built', life becomes more enjoyable in myriad ways.
I don't take this to absolute levels. I pay for fancy pants LLM's, currently. But I look forward to the day not too far away where I can get today's quality for libre in my homelab.
Good to see some OSS alternatives showing up!
A better (not perfect) solution: every package should be analysed by AI on update, before it becomes publicly available, to detect dangerous code and assign a rating.
A rating threshold would be defined in package.json: when the remote package is below that value it can be updated; if it is higher, a warning should appear.
This will cost money, but I hope companies like GitHub etc. will allow package repositories to use their services for free. Or we should find a way to distribute this service across us (the users and devs), like a BOINC client.
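As a hypothetical sketch of that idea (the `maxRiskRating` field is invented for illustration and does not exist in npm today):

  {
    "dependencies": {
      "error-ex": "^1.3.2"
    },
    "maxRiskRating": 3
  }

Updates whose assigned rating exceeds 3 would warn instead of installing.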
jonkoops 3 hours ago [-]
Ah, yes! The universal and uncheatable LLM! Surely nothing can go wrong.
NitpickLawyer 2 hours ago [-]
Perfect is the enemy of good. Current LLM systems + "traditional tools" for scanning can get you pretty far into detecting the low hanging fruit. Hell, I bet even a semantic search with small embedding models could give you a good insight into "what's in the release notes matches what's in the code". Simply flag it for being delayed a few hours, till a human can view it. Or run additional checks.
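A minimal sketch of that embedding idea, where `embed` is a stand-in for whatever small embedding model you would actually call:

  // flag a release when its notes and its code diff don't "talk about" the same thing
  declare function embed(text: string): Promise<number[]>; // assumed: any small embedding model

  function cosine(a: number[], b: number[]): number {
    let dot = 0, na = 0, nb = 0;
    for (let i = 0; i < a.length; i++) {
      dot += a[i] * b[i];
      na += a[i] * a[i];
      nb += b[i] * b[i];
    }
    return dot / (Math.sqrt(na) * Math.sqrt(nb));
  }

  // high score = notes and diff diverge; delay the release and flag it for review
  async function suspicionScore(releaseNotes: string, codeDiff: string): Promise<number> {
    const [n, d] = await Promise.all([embed(releaseNotes), embed(codeDiff)]);
    return 1 - cosine(n, d);
  }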
progx 3 hours ago [-]
I can't wait to read about your solution.
orphea 2 hours ago [-]
You don't need to be a chef to tell that the soup is too salty.
progx 3 hours ago [-]
As I wrote, "not perfect". But better than anything else, or than nothing.
[0] https://en.wikipedia.org/wiki/Politician's_syllogism
I thought we were discussing problems and possible solutions here.
My fault.