The Uselessness of Fast and Slow in Programming
Summarized from https://jerf.org/iri/post/2025/the_uselessness_of_fast/
- Programming uniquely spans vast orders of magnitude, from nanoseconds to months of computation, making terms like “fast” and “slow” ambiguous.
- Software engineers routinely work across a 19-order-of-magnitude range of operation timescales, a breadth few other disciplines encounter.
- Human brains and current terminology struggle to meaningfully differentiate performance across such a broad spectrum.
- Developers often overemphasize “requests per second” benchmarks when choosing web frameworks, overlooking that their own application code, not the framework, is typically the bottleneck.
- Most web applications’ performance needs fall well within what commodity frameworks can handle, making raw speed an irrelevant selection criterion for many projects.
- For extremely performance-critical systems like commercial databases, language choice and optimization at the nanosecond level are crucial.
- The concept of “premature optimization” is complex, but understanding the rough order of magnitude of operations helps prioritize where optimization efforts are truly beneficial.
- In systems spanning such vast magnitude differences, the “little things” (nanosecond savings) often do not add up to meaningful gains next to the larger bottlenecks (millisecond operations); the back-of-the-envelope sketch after this list illustrates why.
- Vague terms like “fast” and “slow” in software discussions lead to misunderstandings and project delays, as different teams may hold vastly different performance expectations.
- Replacing general performance descriptors with specific metrics, benchmarks, and orders of magnitude makes communication and decision-making in software engineering concrete and checkable; the second sketch below shows what that looks like in practice.
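
To make the “little things don’t add up” point concrete, here is a minimal back-of-the-envelope sketch. The specific numbers (a 20 ns saving per call, 10,000 calls per request, a 5 ms database query) are illustrative assumptions, not figures from the post:

```python
# Back-of-the-envelope: does a nanosecond-level micro-optimization matter
# when the request is dominated by a millisecond-level database call?
# All numbers below are illustrative assumptions, not measurements.

NS = 1e-9  # one nanosecond, in seconds
MS = 1e-3  # one millisecond, in seconds

saved_per_call = 20 * NS    # assumed saving from a micro-optimization
calls_per_request = 10_000  # assumed number of optimized calls per request
db_query = 5 * MS           # assumed cost of one database query per request

saving = saved_per_call * calls_per_request  # 0.0002 s = 0.2 ms
request = db_query                           # 5 ms, ignoring everything else

print(f"total saving per request:    {saving * 1e3:.3f} ms")
print(f"database query per request:  {request * 1e3:.3f} ms")
print(f"request time reduced by about {saving / request:.1%}")
# 20 ns * 10,000 "little things" recover ~0.2 ms, roughly 4% of a single
# 5 ms query; removing one query dwarfs the entire micro-optimization.
```

The point is not that nanoseconds never matter (see the database example above), but that the arithmetic only works out when the surrounding operations live in a nearby order of magnitude.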
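
As a sketch of what “specific metrics instead of fast/slow” can look like, the snippet below reports median and 99th-percentile latency in explicit units for a hypothetical handler; the handler and its simulated 2 ms of work are placeholders, not anything from the post:

```python
# Instead of calling a handler "fast", report concrete numbers:
# median and p99 latency in explicit units over a stated number of runs.
import statistics
import time

def handler() -> None:
    time.sleep(0.002)  # stand-in for ~2 ms of real work (e.g. a DB query)

def measure(fn, runs: int = 200) -> list[float]:
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return samples

samples = measure(handler)
p50 = statistics.median(samples)
p99 = statistics.quantiles(samples, n=100)[98]  # 99th percentile cut point
print(f"p50 = {p50 * 1e3:.2f} ms, p99 = {p99 * 1e3:.2f} ms over {len(samples)} runs")
# A claim like "p99 under 5 ms at the expected request rate" can be verified
# or refuted; "the handler is fast" cannot.
```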