
The Laws of Computer Science

Check out the implications of some of the laws in computer science, like Moore's Law, Amdahl's Law, Gustafson's Law, and Wirth's Law.

By Daniela Kolarova · Feb. 04, 20 · Analysis · 15.8K Views


According to Wikipedia, a law is a set of rules, decided by a particular state, meant to keep the peace and security of society.

A different type of law is a law of physics: a stated fact, deduced and derived from empirical observations. The world around us works in a certain way, and physical laws are a way of classifying that “way of working.”

But what is the meaning of laws in computer science? Are they a way of keeping peace in the software society or facts deduced based on certain observations?

This article is an attempt to list and classify the most popular laws in computer science.


Moore's Law

Moore's Law refers to Gordon Moore's observation that the number of transistors on a microchip doubles about every two years, while the cost of computers is halved. In other words, we can expect the speed and capability of our computers to increase every couple of years, and to pay less for them. We are accustomed to thinking that computer speed doubles every 18 months, as predicted by Moore's Law, and for the past 50 years that was indeed the case. However, Moore's Law is coming to an end due to physical and technical obstacles. As Herb Sutter stated in his very popular 2005 article "The Free Lunch Is Over":

Moore’s Law predicts exponential growth, and clearly exponential growth can’t continue forever before we reach hard physical limits.

The key difference, which is at the heart of this article, is that the performance gains are going to be accomplished in fundamentally different ways for at least the next couple of processor generations. And most current applications will no longer benefit from the free ride without significant redesign. If you want your application to benefit from the continued exponential throughput advances in new processors, it will need to be a well-written concurrent (usually multithreaded) application. And that’s easier said than done, because not all problems are inherently parallelizable and because concurrent programming is hard, which brings us to Amdahl's Law.

Amdahl's Law

Amdahl's Law is a formula used to find the maximum improvement possible by improving a particular part of a system. In parallel computing, Amdahl's Law is mainly used to predict the theoretical maximum speedup for program processing using multiple processors.

Amdahl's Law can be stated as

S_latency(s) = 1 / ((1 - p) + p / s)

where

  • S_latency is the theoretical speedup of the execution of the whole task;
  • s is the speedup of the part of the task that benefits from improved system resources;
  • p is the proportion of execution time that the part benefiting from improved resources originally occupied.
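The formula is straightforward to evaluate directly. A minimal Python sketch (the function name and the sample numbers are illustrative, not from the original article):

```python
def amdahl_speedup(p, s):
    """Theoretical overall speedup (S_latency) under Amdahl's Law.

    p: proportion of execution time that benefits from the improvement
    s: speedup of the improved part
    """
    return 1.0 / ((1.0 - p) + p / s)

# If 90% of a program parallelizes across 8 processors (p = 0.9, s = 8),
# the overall speedup stays well below 8:
print(round(amdahl_speedup(0.9, 8), 2))  # 4.71

# Even with effectively unlimited processors, the 10% serial
# part caps the speedup at 10x:
print(round(amdahl_speedup(0.9, 1e9), 2))  # 10.0
```

The second call shows the law's pessimistic core: however many processors you add, the serial fraction sets a hard ceiling on the achievable speedup.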

Gustafson's Law

Gustafson's Law gives the theoretical speedup in latency of the execution of a task at fixed execution time that can be expected of a system whose resources are improved. Gustafson estimated the speedup S gained by using N processors (instead of just one) for a task with a serial fraction s (which does not benefit from parallelism) as follows:

S = N + (1 - N)s
Using different variables, Gustafson's Law can be formulated the following way:


S_latency(s) = 1 - p + sp


where

  • S_latency is the theoretical speedup in latency of the execution of the whole task;
  • s is the speedup in latency of the part of the task that benefits from the improved system resources;
  • p is the proportion of the whole task's execution workload taken up, before the improvement, by the part that benefits from the improved resources.
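In code, using Gustafson's first formulation with N processors and a serial fraction s of the workload (a sketch; the names and numbers are illustrative):

```python
def gustafson_speedup(n, s):
    """Scaled speedup S = N + (1 - N)s for n processors and
    a serial fraction s (the part that does not parallelize)."""
    return n + (1 - n) * s

# With 8 processors and a 10% serial fraction, the scaled
# (fixed-time, growing-workload) speedup is 7.3:
print(round(gustafson_speedup(8, 0.1), 2))  # 7.3
```

Note the edge cases: with a serial fraction of 0 the speedup is exactly N, and with a serial fraction of 1 it collapses to 1, as expected.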


Gustafson's Law addresses a shortcoming of Amdahl's Law, which assumes a fixed problem size, that is, an execution workload that does not change as the system's resources improve. Gustafson's Law instead proposes that programmers tend to grow the problem size to fully exploit the computing power that becomes available as the resources improve. Therefore, if faster equipment is available, larger problems can be solved within the same time.

The impact of Gustafson's Law was to shift research goals to select or reformulate problems so that solving a larger problem in the same amount of time would be possible. In a way, the law redefines efficiency, due to the possibility that limitations imposed by the sequential part of a program may be countered by increasing the total amount of computation.

Basically, the difference between Amdahl's Law and Gustafson's Law lies in the objective: make a program run faster with the same workload (Amdahl's Law), or run a program in the same time with a larger workload (Gustafson's Law).
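The contrast can be made concrete: the same machine gives two different answers depending on which question is asked. A self-contained sketch (the 8-processor, 10%-serial scenario is illustrative):

```python
def amdahl_speedup(p, s):
    # Fixed workload: how much faster does the same program run?
    return 1.0 / ((1.0 - p) + p / s)

def gustafson_speedup(n, s):
    # Fixed time: how much more work fits in the same wall-clock time?
    return n + (1 - n) * s

# 8 processors, 10% of the work is serial:
print(round(amdahl_speedup(0.9, 8), 2))     # 4.71 (same problem, less time)
print(round(gustafson_speedup(8, 0.1), 2))  # 7.3  (same time, bigger problem)
```

Neither number is "wrong"; they simply measure different goals, which is exactly the disagreement between the two laws.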


Wirth's Law

Wirth's Law states that software is getting slower more rapidly than hardware is becoming faster. The law was restated in 2009 and attributed to Larry Page, co-founder of Google, so it has also been referred to as Page's Law. Other common forms use the names of the leading hardware and software companies of the 1990s (Intel and Microsoft) or of their CEOs (Andy Grove and Bill Gates): "What Intel giveth, Microsoft taketh away" and "What Andy giveth, Bill taketh away."


All the laws discussed in this article are based on observations, experiments, and mathematics in general. It turns out they are more similar to physical laws than to those meant for keeping the peace and security of society. As a matter of fact, computer science also demands laws of that prescriptive kind, and some of them are very popular: the SOLID principles, the GRASP principles, and various architectural patterns and good practices. They are meant to keep programs manageable, stable, and adaptable to change.

Further Reading

SOLID, GRASP, and Other Basic Principles of Object-Oriented Design




Opinions expressed by DZone contributors are their own.
