The Laws of Computer Science
Check out the implications of some of the laws in computer science, like Moore's Law, Amdahl's Law, Gustafson's Law, and Wirth's Law.
According to Wikipedia, a law is a set of rules decided by a particular state meant for the purpose of keeping the peace and security of society.
A different type of law is a law of physics: a stated fact that has been deduced and derived from empirical observations. The world around us works in a certain way, and physical laws are a way of classifying that “way of working.”
But what is the meaning of laws in computer science? Are they a way of keeping peace in the software society or facts deduced based on certain observations?
This article is an attempt to list and classify the most popular laws in computer science.
Moore's Law
Moore's Law refers to Gordon Moore's observation that the number of transistors on a microchip doubles roughly every two years, while the cost of computers is halved. In other words, we can expect the speed and capability of our computers to increase every couple of years, and to pay less for them. We are accustomed to thinking that computer speed doubles every 18 months, as predicted by Moore's Law, and for the past 50 years that was indeed the case. However, Moore's Law is coming to an end due to hard physical limits. As Herb Sutter stated in a very popular article written in 2005:
Moore’s Law predicts exponential growth, and clearly exponential growth can’t continue forever before we reach hard physical limits.
The key difference, which is at the heart of this article, is that the performance gains are going to be accomplished in fundamentally different ways for at least the next couple of processor generations. And most current applications will no longer benefit from the free ride without significant redesign. If you want your application to benefit from the continued exponential throughput advances in new processors, it will need to be a well-written concurrent (usually multithreaded) application. And that’s easier said than done, because not all problems are inherently parallelizable and because concurrent programming is hard, which brings us to Amdahl's Law.
Amdahl's Law
Amdahl's Law is a formula used to find the maximum improvement possible by improving a particular part of a system. In parallel computing, Amdahl's Law is mainly used to predict the theoretical maximum speedup for program processing using multiple processors.
S_latency = 1 / ((1 - p) + p / s)

where:
- S_latency is the theoretical speedup of the execution of the whole task;
- s is the speedup of the part of the task that benefits from improved system resources;
- p is the proportion of execution time that the part benefiting from improved resources originally occupied.
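To make the formula concrete, here is a minimal Python sketch (my own illustration, not taken from the article) that evaluates Amdahl's Law for a program whose parallel part speeds up linearly with the number of processors:

```python
def amdahl_speedup(p, s):
    """Theoretical speedup of the whole task per Amdahl's Law.

    p -- proportion of execution time that benefits from the improvement
    s -- speedup of that part (e.g. the number of processors)
    """
    return 1.0 / ((1.0 - p) + p / s)


# Suppose 95% of a program is parallelizable (p = 0.95) -- an assumed value.
for processors in (2, 4, 8, 16, 1024):
    print(processors, round(amdahl_speedup(0.95, processors), 2))

# Even with 1024 processors the speedup stays below 1 / (1 - p) = 20.
```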
Gustafson's Law
Gustafson's Law gives the theoretical speedup in latency of the execution of a task at fixed execution time that can be expected of a system whose resources are improved. Gustafson estimated the speedup S gained by using N processors (instead of just one) for a task with a serial fraction s (which does not benefit from parallelism) as follows:

S = N + (1 - N) * s
Using different variables, Gustafson's Law can be formulated the following way:

S_latency = 1 - p + s * p

where:
- S_latency is the theoretical speedup in latency of the execution of the whole task;
- s is the speedup in latency of the execution of the part of the task that benefits from the improvement of the resources of the system;
- p is the percentage of the execution workload of the whole task that corresponds to the part benefiting from the improved resources, measured before the improvement.
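The same kind of minimal Python sketch (again my own illustration, with an assumed value for p) shows how the scaled speedup behaves under Gustafson's Law:

```python
def gustafson_speedup(p, s):
    """Theoretical scaled speedup of the whole task per Gustafson's Law.

    p -- fraction of the (scaled) workload that benefits from the improvement
    s -- speedup of that part (e.g. the number of processors)
    """
    return (1.0 - p) + s * p


# Suppose 95% of the scaled workload is parallel (p = 0.95) -- an assumed value.
for processors in (2, 4, 8, 16, 1024):
    print(processors, round(gustafson_speedup(0.95, processors), 2))

# Unlike Amdahl's prediction, the scaled speedup keeps growing
# almost linearly with the number of processors.
```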
Gustafson's Law addresses the shortcomings of Amdahl's Law, which assumes a fixed problem size, that is, an execution workload that does not change as the resources improve. Gustafson's Law instead proposes that programmers tend to increase the size of problems to fully exploit the computing power that becomes available as the resources improve. Therefore, if faster equipment is available, larger problems can be solved within the same time.
The impact of Gustafson's Law was to shift research goals to select or reformulate problems so that solving a larger problem in the same amount of time would be possible. In a way, the law redefines efficiency, due to the possibility that limitations imposed by the sequential part of a program may be countered by increasing the total amount of computation.
Basically, the difference between Amdahl's Law and Gustafson's Law lies in the objective: achieve the same execution time with an increasing problem size, or decrease the execution time for the same problem size. In other words, the goal of optimization may be to make a program run faster with the same workload (reflected in Amdahl's Law) or to run a program in the same time with a larger workload (reflected in Gustafson's Law).
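A small illustrative calculation (the numbers below are assumptions, not from the article) makes the contrast visible: for the same parallel fraction and processor count, Amdahl's Law bounds the speedup of a fixed-size problem, while Gustafson's Law predicts a much larger scaled speedup because the problem grows with the machine:

```python
# Illustrative comparison for p = 0.9 and n = 100 processors (assumed values).
p, n = 0.9, 100

amdahl = 1.0 / ((1.0 - p) + p / n)   # fixed problem size
gustafson = (1.0 - p) + n * p        # problem size scaled with n

print(f"Amdahl:    {amdahl:.1f}x speedup")     # about 9.2x
print(f"Gustafson: {gustafson:.1f}x speedup")  # about 90.1x
```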
Wirth's Law
Wirth's Law states that software is getting slower more rapidly than hardware is becoming faster. The law was restated in 2009 and attributed to Larry Page, co-founder of Google; in that form it has also been referred to as Page's Law. Other common forms use the names of the leading hardware and software companies of the 1990s (Intel and Microsoft) or their CEOs (Andy Grove and Bill Gates): "What Intel giveth, Microsoft taketh away" and "What Andy giveth, Bill taketh away."
All the laws discussed in this article are based on observations, experiments, and math in general. It turns out they are more similar to physical laws than to the rules meant for keeping the peace and security of society. As a matter of fact, computer science also demands laws of that prescriptive kind, and some of them are very popular: the SOLID principles, the GRASP principles, and various architectural patterns and good practices. They are meant to keep programs manageable, stable, and adaptable to change.
Further Reading
SOLID, GRASP, and Other Basic Principles of Object-Oriented Design