Load balancing (performed by a load balancer) is a technique for distributing workloads across a group of networked servers so that computing resources are used as efficiently as possible.
In computing, extensive calculations or large volumes of requests are handled by distributing the load across several concurrent systems. Such load distribution can take very different forms. A simple form occurs, for example, on computers with multiple processors: each process can run on its own processor. How processes are assigned to processors can have a major impact on overall system performance, since, for example, each processor has its own local cache.
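As a minimal sketch of this idea, the following Python snippet distributes independent computations across a pool of worker processes, so that each task can run on its own processor. The task function and task sizes are illustrative, not from the original text.

```python
from multiprocessing import Pool

def heavy_task(n):
    # Stand-in for an expensive computation.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # The pool distributes the workload across available cores;
    # each worker process may be scheduled on its own processor.
    with Pool(processes=4) as pool:
        results = pool.map(heavy_task, [10_000, 20_000, 30_000, 40_000])
    print(results)
```

Note that the operating system's scheduler, not this code, decides the actual process-to-processor mapping, which is why placement and cache locality can still affect performance.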
Another approach is found in computer clusters. Here, several computers are joined into a group that, to the outside, largely behaves as a single system. This is implemented with server load balancing. Possible methods include installing a dedicated computer that distributes incoming requests, or using DNS with round-robin.
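The round-robin idea can be sketched in a few lines: requests are assigned to the servers of the cluster in rotating order, much as DNS round-robin rotates through the address records for one name. The server names here are illustrative.

```python
from itertools import cycle

# Hypothetical pool of servers in the cluster (names are illustrative).
servers = ["server-a", "server-b", "server-c"]
rotation = cycle(servers)

def dispatch(request):
    """Assign each incoming request to the next server in turn."""
    return next(rotation)
```

Round-robin needs no information about server load, which makes it simple but blind to uneven request costs.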
Load distribution is also used in large server farms that, for example, answer HTTP requests. Upstream systems (front-end servers) distribute the individual requests to back-end servers according to specified criteria. Information from the HTTP request can be used to route all packets belonging to one user's session to the same server. This is also important when SSL is used to encrypt the communication, so that a new SSL handshake does not have to be performed for every request.
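One common way to keep a session on the same back-end server is to hash a session identifier taken from the HTTP request (for example a cookie value) and use the hash to pick a server. The following is a sketch under that assumption; the back-end names and the choice of SHA-256 are illustrative.

```python
import hashlib

# Illustrative back-end servers behind the front-end.
backends = ["backend-1", "backend-2", "backend-3"]

def pick_backend(session_id):
    # Hash the session identifier so every request carrying the
    # same identifier lands on the same back-end server.  This
    # also allows an established SSL session to be reused instead
    # of performing a new handshake per request.
    digest = hashlib.sha256(session_id.encode()).digest()
    index = int.from_bytes(digest[:8], "big") % len(backends)
    return backends[index]
```

Because the mapping is deterministic, the front-end server needs no per-session state to achieve this "sticky" behaviour.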
A good load-distribution implementation requires information about how utilized the target systems are.
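A simple example of using such utilization information is a least-connections policy: the next request goes to the server that currently has the fewest active connections. The connection counts below are made-up sample data; a real balancer would collect them from the target systems.

```python
# Illustrative utilization data gathered from the target systems.
active_connections = {"backend-1": 12, "backend-2": 3, "backend-3": 7}

def least_loaded(connections):
    # Route the next request to the server with the fewest
    # active connections.
    return min(connections, key=connections.get)
```

Compared with round-robin, this policy adapts to requests of very different cost, at the price of having to monitor the servers.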
In the broadest sense, load balancing is also understood as a mechanism for fault tolerance: by building a cluster and distributing queries across its individual systems, reliability is increased, provided that the failure of a system is detected and its requests are automatically transferred to another system.
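The failover behaviour described above can be sketched as follows: servers are tried in order, and any server whose health check fails is skipped, so requests are transparently moved to a surviving system. The health-check callback is a placeholder for whatever failure detection the cluster uses.

```python
def route(servers, is_healthy):
    """Return the first server that passes the health check,
    skipping failed systems so requests fail over automatically."""
    for server in servers:
        if is_healthy(server):
            return server
    raise RuntimeError("no healthy server available")
```

In practice the health check might be a periodic heartbeat or a TCP probe; here it is left abstract on purpose.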