Thoughts

mental health break
I think I just fully disagree with the popular application of Knuth's "premature optimization is the root of all evil."
nvm is a 4,600+ line sh file that runs on shell startup. Search for "nvm slow" and you'll find thousands of hits. When I benchmarked it in 2020, it took almost 700ms to run. I benchmarked it with the intention of finding the slow part and making it faster. This is the approach Knuth suggests:

> Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

> Yet we should not pass up our opportunities in that critical 3%. A good programmer will not be lulled into complacency by such reasoning, he will be wise to look carefully at the critical code; but only after that code has been identified.

So I looked into nvm to figure out which part was slow. And the answer was: all of it, or at least all the important parts. There's no tight inner loop, no unnecessary recalculation of the same thing multiple times, no 3% that would magically fix all of the performance problems. Figuring out the latest node version takes 100ms. Loading the config takes 200ms. Checking whether the latest node version is installed takes 70ms. And so on.

=> https://github.com/nvm-sh/nvm/issues/2334

Since 2020, there have been huge advances in processor technology (I've moved from an i5 to an M3 Pro), and nvm itself has seen performance improvements. But I'm sure, without even checking, that it still takes 200ms at shell start, because it fundamentally has not changed. It still calls functions by creating subshells and doing string interpolation, because that's how the shell works.

=> https://github.com/nvm-sh/nvm/blob/master/nvm.sh#L1387

Meanwhile, I've been using Volta. Volta is written in Rust and designed to be fast. It takes, drumroll please, 0ms at shell startup.
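That subshell cost is easy to demonstrate. Here's a minimal sketch in bash; the function names are made up for illustration (not taken from nvm.sh), and the absolute timings will vary by machine, but the relative gap is the point:

```shell
#!/usr/bin/env bash
# Two ways for a shell function to "return" a string.

nvm_style() {
  # The pattern used throughout nvm.sh: the caller captures stdout
  # with $(...), which forks a subshell for every single call.
  echo "v20.11.0"
}

direct_style() {
  # Same result via a variable: no fork, no pipe, stays in-process.
  REPLY="v20.11.0"
}

# Each $(nvm_style) pays for a fork; 1000 calls makes the cost visible.
time for _ in {1..1000}; do
  v="$(nvm_style)"
done

# The variable version runs entirely in the current process.
time for _ in {1..1000}; do
  direct_style
  v="$REPLY"
done
```

If you want to measure your own shell startup directly, a benchmarking tool like hyperfine works well, e.g. `hyperfine 'zsh -i -c exit'`.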
Because it doesn't run any code at shell startup.

> Volta setup runs once and takes 26ms. nvm runs every time you open your shell and takes 600ms.

=> https://thoughts.learnerpages.com/?show=1ca38752-f52d-450a-a62c-2c0f1ea16972

This matches my experience making websites and Rails apps. Most good engineers are able to do what Knuth proposes without being told: they avoid implementing subsets of the application in a dumb way that is far too slow for no reason. Most of the time there isn't a 3%; instead, performance dies by papercuts. There are a thousand places with unnecessary if statements and object loads and allocations, and those inefficiencies compound. Every unnecessary branch pushes a necessary branch out of the branch predictor's cache. Every unnecessary page load pushes a necessary page out of the page cache.

> The secret is that Rails apps aren't slow or hard to scale by default - they die a slow death by a thousand papercuts.

=> https://www.railsspeed.com/

Rails apps and nvm work. They're not unusably slow, and there's something to be said for that. Rails trades performance for iteration speed; nvm trades performance for platform compatibility. But if you want actually fast software, you can't expect a single hotspot to solve your performance problems, and you can't dismiss optimizations as premature. You have to architect for performance.

> Think about performance from the outset, from the beginning. The best time to solve performance, to get the huge 1000x wins, is in the design phase, which is precisely when we can't measure or profile.

=> https://github.com/tigerbeetle/tigerbeetle/blob/main/docs/TIGER_STYLE.md#performance

=> https://thoughts.learnerpages.com/?show=cbab325a-4d63-4eef-aac3-066462c66c59
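The architectural difference shows up directly in what each tool puts in your shell rc file. Roughly (check each project's README for the exact current lines, these are from memory):

```shell
# ~/.zshrc additions, approximately as each installer writes them.

# nvm: sources the entire 4,600-line nvm.sh on every shell start.
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"

# Volta: just a PATH entry pointing at a directory of shim binaries.
# The real work happens later, inside the shim, only when you actually
# run node/npm -- so shell startup pays nothing.
export VOLTA_HOME="$HOME/.volta"
export PATH="$VOLTA_HOME/bin:$PATH"
```

Deferring the work into a shim is the design decision that makes the 0ms startup possible; no amount of hotspot-hunting inside nvm.sh can get there, because sourcing the file at all is the cost.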
2:44 p.m. Nov 10, 2025 UTC-5