Friday, August 29, 2008

Walking the Network Tightrope made easier... With Load Balancers


Load balancing is a process and technology that distributes site traffic among several servers using a network-based device. This device intercepts traffic destined for a site and redirects that traffic to the various servers behind it.
It is a technique for spreading work between two or more computers, network links, CPUs, hard drives, or other resources in order to get optimal resource utilization, throughput, or response time. Using multiple components with load balancing, instead of a single component, can also increase reliability through redundancy. The balancing service is usually provided by a dedicated program or hardware device (such as a multilayer switch). It is commonly used to mediate internal communications in computer clusters, especially high-availability clusters. The whole process is completely transparent to the end user.
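
To make the distribution idea concrete, here is a minimal sketch of the simplest balancing policy, round robin, where each new request simply goes to the next server in the pool. The backend addresses and function names are purely illustrative; real load balancers implement this (and smarter policies such as least-connections) inside the device or proxy itself.

# A minimal sketch of the round-robin distribution decision a load balancer
# makes for each incoming request. The backend addresses are hypothetical.
import itertools

# Pool of backend servers the balancer fronts (placeholder addresses).
BACKENDS = ["10.0.0.11:8080", "10.0.0.12:8080", "10.0.0.13:8080"]

# itertools.cycle yields the backends in a repeating round-robin order.
_rotation = itertools.cycle(BACKENDS)

def pick_backend():
    """Return the next backend in rotation for the incoming request."""
    return next(_rotation)

if __name__ == "__main__":
    # Simulate six incoming requests; each goes to the next server in turn.
    for request_id in range(6):
        print(f"request {request_id} -> {pick_backend()}")
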

Benefits of Load Balancing:

- Optimal resource utilization
- Better throughput and response time
- Increases reliability through redundancy
- Streamlining of data communication
- Ensures a response to every request
- Reduces dropping of requests and data.
- Offers content-aware distribution by doing things such as reading URLs, intercepting cookies, and parsing XML (see the URL-routing sketch after this list).
- Maintains a watch on the servers and ensures that they respond to traffic; if a server stops responding, it is taken out of rotation (see the health-check sketch after this list).
- Priority activation: when the number of available servers drops below a certain threshold, or load gets too high, standby servers can be brought online.
- SSL offload and acceleration reduce the burden on the web servers so that performance does not degrade for end users.
- Distributed Denial of Service (DDoS) attack protection through features such as SYN cookies and delayed binding, which mitigate SYN flood attacks and generally offload work from the servers to a more efficient platform.
- HTTP compression: reduces the amount of data to be transferred for HTTP objects by utilizing the gzip compression available in all modern web browsers.
- TCP buffering: the load balancer can buffer responses from the server and spoon-feed the data out to slow clients, allowing the server to move on to other tasks.
- HTTP caching: the load balancer can store static content so that some requests can be handled without contacting the web servers.
- Content Filtering: some load balancers can arbitrarily modify traffic on the way through.
- HTTP security: some load balancers can hide HTTP error pages, remove server identification headers from HTTP responses, and encrypt cookies so end users can't manipulate them.
- Priority queuing: also known as rate shaping, the ability to give different priority to different traffic.
- Client authentication: authenticate users against a variety of authentication sources before allowing them access to a website.
- Firewall: direct connections to backend servers are prevented, for security reasons.
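
As promised above, here is a minimal sketch of content-aware distribution: the balancer looks at the request URL and picks a backend pool accordingly. The pool names and path rules are illustrative assumptions, not any particular product's configuration.

# A minimal sketch of content-aware (URL-based) routing.
from urllib.parse import urlparse

POOLS = {
    "images":  ["10.0.1.11:8080", "10.0.1.12:8080"],  # static/image servers
    "api":     ["10.0.2.11:8080"],                     # application servers
    "default": ["10.0.0.11:8080", "10.0.0.12:8080"],   # everything else
}

def choose_pool(url):
    """Route by URL path prefix, the way a content-aware balancer might."""
    path = urlparse(url).path
    if path.startswith("/images/"):
        return POOLS["images"]
    if path.startswith("/api/"):
        return POOLS["api"]
    return POOLS["default"]

if __name__ == "__main__":
    for url in ("http://example.com/images/logo.png", "http://example.com/api/v1/users"):
        print(url, "->", choose_pool(url))
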
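And here is a minimal sketch of health checking combined with priority activation, assuming a simple TCP connect probe; real load balancers offer configurable HTTP or TCP health checks. All addresses, ports, and thresholds below are illustrative.

# A minimal health-check and priority-activation sketch.
import socket

ACTIVE_POOL = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]  # normal backends
STANDBY_POOL = ["10.0.0.21", "10.0.0.22"]              # brought online on demand
MIN_HEALTHY = 2                                         # priority-activation threshold
PORT = 8080

def is_healthy(host, port=PORT, timeout=1.0):
    """Probe a backend with a TCP connect; a failure takes it out of rotation."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def build_rotation():
    """Return the list of backends that should receive traffic right now."""
    healthy = [h for h in ACTIVE_POOL if is_healthy(h)]
    # Priority activation: if too few active servers respond, add standbys.
    if len(healthy) < MIN_HEALTHY:
        healthy += [h for h in STANDBY_POOL if is_healthy(h)]
    return healthy

if __name__ == "__main__":
    print("servers currently in rotation:", build_rotation())
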

References: Server Load Balancing by Tony Bourke; Wikipedia

Image Reference: http://images.newsfactor.com/images/id/4443/story-data-012.jpg
