Summarized from https://jerf.org/iri/post/2025/the_uselessness_of_fast/

  1. Programming uniquely spans vast orders of magnitude, from nanoseconds to months of computation, making terms like “fast” and “slow” ambiguous.
  2. Software engineers frequently work across a 19-order-of-magnitude range of operational speeds, a scope uncommon in most other disciplines.
  3. Neither human intuition nor everyday terminology can meaningfully distinguish performance across such a broad spectrum.
  4. Developers often fixate on “requests per second” benchmarks when choosing a web framework, overlooking that their own application code, not the framework, is typically the bottleneck.
  5. The performance needs of most web applications are well within the capabilities of commodity frameworks, making raw framework speed an irrelevant selection criterion for the majority of projects.
  6. For extremely performance-critical systems like commercial databases, language choice and optimization at the nanosecond level are crucial.
  7. The concept of “premature optimization” is complex, but understanding the rough order of magnitude of operations helps prioritize where optimization efforts are truly beneficial.
  8. When a system spans that many orders of magnitude, the “little things” (nanosecond-scale savings) rarely add up to anything significant next to the larger bottlenecks (millisecond-scale calls); see the arithmetic sketch after this list.
  9. Vague terms like “fast” and “slow” in software discussions lead to misunderstandings and project delays, as different teams may hold vastly different performance expectations.
  10. Replacing general performance descriptors with specific metrics, benchmarks, and orders of magnitude makes communication clearer and decisions better informed in software engineering (the benchmark sketch below shows one way to obtain such numbers).
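
To make point 8 concrete, here is a minimal arithmetic sketch, not from the original post, using assumed numbers: one 5 ms database round trip per request alongside ten thousand 50 ns in-process operations. Even deleting all of the nanosecond-scale work recovers under 10% of the request time.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed, illustrative numbers: one database round trip per request,
	// plus a generous count of nanosecond-scale in-process operations.
	dbCall := 5 * time.Millisecond
	perOp := 50 * time.Nanosecond
	opsPerRequest := 10_000

	smallWork := time.Duration(opsPerRequest) * perOp // 0.5 ms in total
	total := dbCall + smallWork

	fmt.Printf("database call:   %v\n", dbCall)
	fmt.Printf("10,000 small ops: %v\n", smallWork)
	fmt.Printf("small ops' share of request: %.1f%%\n",
		float64(smallWork)/float64(total)*100)
	// Prints roughly 9%: eliminating every small op outright saves far less
	// than one database round trip's worth of time.
}
```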
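
And for point 10, a sketch of what a specific number can look like in practice: a standard Go micro-benchmark (the package, file, and function names here are invented for illustration) whose `ns/op` output can stand in for the word “fast” in a design discussion.

```go
// pathbench_test.go -- the file name must end in _test.go so that
// `go test -bench=.` picks it up.
package pathbench

import (
	"strings"
	"testing"
)

// buildPath stands in for an arbitrary piece of application code; its name
// and behavior are made up purely for this example.
func buildPath(parts []string) string {
	return "/" + strings.Join(parts, "/")
}

// BenchmarkBuildPath reports a concrete nanoseconds-per-operation figure,
// the kind of number that can replace "fast" or "slow" in a discussion.
func BenchmarkBuildPath(b *testing.B) {
	parts := []string{"api", "v1", "users", "42"}
	for i := 0; i < b.N; i++ {
		_ = buildPath(parts)
	}
}
```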