An Introduction to DigitalOcean Load Balancers
DigitalOcean Load Balancers are a fully-managed, highly available service that distributes traffic to pools of Droplets. This lightens the burden placed on each Droplet and makes it easy to alter your backend infrastructure without affecting the availability of your services. Load Balancers provide a stable interface with automatic failover, simplifying the way you set up scalable infrastructure and react to changing demands.
When Should I Use a Load Balancer?
The generic term load balancer refers to a traffic-directing infrastructure component that accepts network requests and distributes them among a pool of interchangeable backend servers. This shares the processing workload among a group of machines rather than relying on a single server to handle every request.
The main benefits of placing services behind a load balancer are:
- Availability: Load balancing can help decouple service health from the health of a single machine. If an application or web server crashes on a single machine, the load balancer can direct traffic elsewhere until service has been restored. When the load balancer itself has a built-in failover mechanism, the chances of service interruption are further reduced.
- Performance: Dividing incoming traffic among a group of backend servers can help prevent any one machine from being overwhelmed by requests. Each backend server only receives a portion of incoming requests, meaning that more resources are available on each machine.
- Flexibility: Using a load balancer as a gateway gives you flexibility to change the backend infrastructure at will. This can help with anything from rolling out deployments seamlessly to large architecture redesigns. You can also easily scale your infrastructure by adjusting the number of backend servers.
DigitalOcean's Load Balancer service provides the above advantages in a fully managed environment. Users can modify the Load Balancer behavior to suit their needs without the burden of managing the operational complexities.
DigitalOcean Load Balancers at a Glance
- Price: $20 per month. No additional bandwidth charges apply.
- Regional availability: Load Balancers are available in every region.
- Supported protocols: HTTP, HTTPS, HTTP/2, and TCP.
- Balancing algorithms: Round robin and least connections.
- Backend management: Manual Droplet selection or tag-based management.
- Backend membership requirements: All backends must be in a single region.
- Load Balancer type: Network.
While the terms are used loosely in the industry, an application load balancer typically can direct traffic to specific backends based on URLs, cookies, HTTP headers, and similar fields. A network load balancer does not use these fields when making load distribution decisions.
- Private networking: The Load Balancer will connect to the backend over the private network if it is enabled on the Droplet in question when it is added to the Load Balancer. If private networking is disabled, the Load Balancer will contact the Droplet using its public IP address.
Load Balancers can be created and managed through the DigitalOcean Control Panel or using the DigitalOcean API.
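As a sketch of what an API-driven workflow looks like, the snippet below builds a Load Balancer creation payload for the `POST /v2/load_balancers` endpoint. The name, region, ports, and Droplet IDs are placeholder values; substitute your own.

```python
import json

# Placeholder values for illustration; substitute your own name,
# region, forwarding rules, and Droplet IDs.
payload = {
    "name": "example-lb",
    "region": "nyc3",
    "forwarding_rules": [
        {
            "entry_protocol": "http",
            "entry_port": 80,
            "target_protocol": "http",
            "target_port": 80,
        }
    ],
    "droplet_ids": [101, 102],
}

# To create the Load Balancer, POST this JSON to the API with your token:
#   curl -X POST "https://api.digitalocean.com/v2/load_balancers" \
#        -H "Authorization: Bearer $DO_TOKEN" \
#        -H "Content-Type: application/json" \
#        -d '<the JSON printed below>'
print(json.dumps(payload))
```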
Load Balancer Features
Aside from basic traffic directing functionality, DigitalOcean Load Balancers offer the following advantages.
On May 8, 2018, we released a significant update to the Load Balancer service, including support for HTTP/2, free SSL certificates using Let's Encrypt, and improved performance. The back-end enhancements are automatically available.
In addition, we made some adjustments to the creation flow. Rather than creating a Load Balancer and adding Droplets on the same screen, users now create the Load Balancer first and may add Droplets immediately after the Load Balancer has been created and is fully functional.
A DigitalOcean Load Balancer monitors backend Droplets to ensure that each service is operating normally. Users can define health check endpoints and set the parameters around what constitutes a healthy response. The Load Balancer will automatically remove machines from rotation that fail health checks until those health checks indicate that service has been restored.
While this helps ensure the health of the backend pool, the Load Balancer itself must also be responsive to failures. DigitalOcean Load Balancers are configured with automatic failover in order to maintain availability even when failures occur at the balancing layer. Internally, the active balancing component is monitored and fails over to a standby if necessary. You can grow your infrastructure without introducing a new single point of failure.
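The remove-and-restore behavior described above can be sketched as a small state machine. The threshold values, class, and method names below are illustrative assumptions, not the service's actual internals:

```python
# Sketch: a backend is removed after `unhealthy_threshold` consecutive
# failed checks and re-added after `healthy_threshold` consecutive passes.
class BackendHealth:
    def __init__(self, unhealthy_threshold=3, healthy_threshold=5):
        self.unhealthy_threshold = unhealthy_threshold
        self.healthy_threshold = healthy_threshold
        self.healthy = True
        self._fails = 0
        self._passes = 0

    def record(self, check_passed):
        if check_passed:
            self._passes += 1
            self._fails = 0
            if not self.healthy and self._passes >= self.healthy_threshold:
                self.healthy = True      # re-add to rotation
        else:
            self._fails += 1
            self._passes = 0
            if self.healthy and self._fails >= self.unhealthy_threshold:
                self.healthy = False     # remove from rotation

b = BackendHealth()
for result in [False, False, False]:   # three failed checks: removed
    b.record(result)
print(b.healthy)  # False
for result in [True] * 5:              # five passing checks: restored
    b.record(result)
print(b.healthy)  # True
```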
Flexible Multi-Protocol Routing
A single DigitalOcean Load Balancer can be configured to handle multiple protocols and ports.
Traffic routing is controlled with configurable rules that specify the ports and protocols that the Load Balancer should listen on, as well as the way that it should select and forward requests to the backend servers.
Standard HTTP balancing directs requests based on standard HTTP mechanisms. The Load Balancer sets the X-Forwarded-For, X-Forwarded-Port, and X-Forwarded-Proto headers to give the backend servers information about the original request. If user sessions depend on the client always connecting to the same backend, a cookie can be sent to the client to enable sticky sessions.
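Cookie-based sticky sessions can be sketched as follows: on a client's first request the balancer picks a backend and sets a cookie, and subsequent requests carrying that cookie go to the same backend. The cookie name and backend addresses are made up for illustration.

```python
import random

# Illustrative backend pool and cookie name (not the real defaults).
backends = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

def route(cookies):
    sticky = cookies.get("LB-STICKY")
    if sticky in backends:
        return sticky, cookies                # existing session: same backend
    choice = random.choice(backends)          # new session: pick a backend
    cookies = dict(cookies, **{"LB-STICKY": choice})
    return choice, cookies                    # cookie pins future requests

backend, cookies = route({})                  # first request sets the cookie
assert route(cookies)[0] == backend           # follow-ups hit the same backend
```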
Encrypted Balancing with HTTPS and HTTP/2
SSL passthrough allows the Load Balancer to accept and forward HTTPS encrypted traffic to your backend Droplets. This is a good option if you need end-to-end encryption and want to spread the SSL decryption overhead out among your various machines, but requires you to manage the SSL certificates yourself.
With SSL termination, or SSL offloading, the Load Balancer handles SSL decryption and sends plain HTTP traffic to the backend machines. In this configuration, you'll add your SSL certificate and private key to the Load Balancer itself. These secrets are placed in a secure, encrypted storage system and are not accessible to anyone, including DigitalOcean staff.
When the Load Balancer forwards requests to the backend, the server logs will display the IP address of the load balancer. To help you identify the IP address, port, and protocol the client used when connecting, the load balancer sends these headers:
- X-Forwarded-For provides the originating IP address of the client.
- X-Forwarded-Port identifies the port the client used to connect to the load balancer.
- X-Forwarded-Proto lets you know the protocol the client used to connect to the load balancer.
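On the backend, recovering the client's connection details from these headers looks roughly like this. The header values are example data:

```python
# Example forwarded headers as a backend might receive them.
headers = {
    "X-Forwarded-For": "203.0.113.7",
    "X-Forwarded-Port": "443",
    "X-Forwarded-Proto": "https",
}

# X-Forwarded-For may hold a comma-separated chain of proxies;
# the first entry is the originating client.
client_ip = headers.get("X-Forwarded-For", "").split(",")[0].strip()
client_port = int(headers.get("X-Forwarded-Port", "80"))
client_proto = headers.get("X-Forwarded-Proto", "http")

print(client_ip, client_port, client_proto)  # 203.0.113.7 443 https
```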
Both HTTPS and HTTP/2 have the same balancing features as HTTP, but HTTP/2 will generally provide faster and safer encrypted website browsing. When you use SSL termination, your Load Balancer can also act as a gateway between HTTP/2 client traffic and HTTP/1.0 or HTTP/1.1 backend applications.
Additionally, you can configure Load Balancers to redirect HTTP traffic on port 80 to HTTPS or HTTP/2 on port 443. This way, the Load Balancer can listen on both ports but redirect unencrypted traffic to the encrypted port for better security.
Finally, TCP balancing is available for applications that do not speak HTTP. For example, deploying a Load Balancer in front of a database cluster like Galera would allow you to spread requests across all available machines.
There are two different ways to define backend Droplets for a Load Balancer. The first is to explicitly add the desired Droplets to the Load Balancer by name using the Control Panel or API.
However, a more powerful way of managing backends is to select by tag. Instead of selecting individual Droplets, a single tag is used as the selection criteria. The Load Balancer evaluates tags at runtime, meaning that whenever a tag is added to or removed from a Droplet, the Load Balancer adjusts its routing accordingly without further configuration.
This makes it far simpler to scale the backend by adding or removing tags from Droplets. You can configure rolling deployments by removing a tag from a Droplet, applying the update, and re-applying the tag. Once the Droplet passes its health check, the next server can be updated in the same way.
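The tag-based selection and rolling-update pattern above can be sketched in a few lines. The Droplet names and tags are made-up example data:

```python
# Example inventory: the backend pool is whatever Droplets carry the tag.
droplets = [
    {"name": "web-1", "tags": {"web"}},
    {"name": "web-2", "tags": {"web"}},
    {"name": "db-1",  "tags": {"database"}},
]

def pool(tag):
    """Backends currently in rotation for a given selector tag."""
    return [d["name"] for d in droplets if tag in d["tags"]]

print(pool("web"))                  # ['web-1', 'web-2']
droplets[0]["tags"].discard("web")  # remove tag to update web-1 out of rotation
print(pool("web"))                  # ['web-2']
```

After the update, re-applying the tag would return `web-1` to the pool once it passes its health check.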
Creating a New Load Balancer
Setting up a Load Balancer is a two step process: creating the Load Balancer and then adding backend Droplets.
Step 1 — Creating the Load Balancer
You can create a Load Balancer using the Create menu at any time or use the Create Load Balancer button on the Load Balancers overview page.
On the creation page, you will:
- Choose a datacenter region. Your Load Balancer and Droplets need to be in the same datacenter, so choose the one where your Droplets are or will be located.
- Add forwarding rules. You need at least one rule to create a Load Balancer. You can either create new rules now by clicking the New Rule drop down, or accept the default rule (HTTP port 80 on the Load Balancer to HTTP port 80 on the backend Droplets) and configure the rules you want after the Load Balancer has been created.
The left side of each rule defines the listening port and protocol on the Load Balancer itself, and the right side defines where and how the requests will be routed to the backends. You can change the protocols using the drop down menus. If you use HTTPS or HTTP/2, you will also be asked to choose an existing SSL certificate, create a free certificate using Let's Encrypt, provide custom certificate files, or use SSL passthrough.
- Set advanced settings. Advanced settings allow you to modify some additional parameters for the Load Balancer. The defaults work well for most cases, but the options are:
- Algorithm: The default round robin algorithm sends requests to each available backend in turn. The alternative least connections algorithm sends requests to the backend with the least number of active connections.
- Sticky sessions: If your application's sessions rely on connecting to the same backend for each request, you can enable sticky sessions. This sets a cookie with a configurable name and TTL (to define how long the cookie is valid) so that the Load Balancer can send future requests to the same server.
- Health checks: The Load Balancer will only forward requests to healthy servers. You can modify the criteria the Load Balancer uses to remove and re-add servers as well as the endpoint it checks for a response.
- SSL Redirect: You can redirect HTTP requests on port 80 to HTTPS port 443 by enabling this option.
- Choose a name and create. Load Balancer names must be unique and contain alphanumeric characters, dashes, and periods only. Once created, you can change the name at any time by clicking on the existing name on the Load Balancer page.
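The two algorithm options above can be illustrated with a toy sketch. The backend names and connection counts are made up for illustration:

```python
from itertools import cycle

# Illustrative backend pool mapping name -> active connection count.
backends = {"app-1": 2, "app-2": 0, "app-3": 5}

# Round robin: hand out backends in a fixed rotation, ignoring load.
rr = cycle(backends)
print([next(rr) for _ in range(4)])  # ['app-1', 'app-2', 'app-3', 'app-1']

# Least connections: pick the backend with the fewest active connections.
def least_connections(pool):
    return min(pool, key=pool.get)

print(least_connections(backends))   # app-2
```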
Step 2 — Choosing Droplets
Your choice of Droplets will be limited to the region where the Load Balancer was created. You can add individual Droplets or select by using a tag. When you use a tag, only the tagged Droplets in the same region as the Load Balancer will be part of its pool. Only one tag can be used per Load Balancer.
When you've selected the tag or the Droplets, click Add Droplets. The Load Balancer will check the health of the backend Droplets. Once the backends have passed the health check the required number of times, they will be marked healthy and the Load Balancer will begin forwarding requests to them.
Managing Existing Load Balancers
You can manage your existing Load Balancers by going to the Load Balancer index page.
Click Networking on the top menu and then click Load Balancers. All of your existing Load Balancers will be displayed:
Click on an individual Load Balancer name to view the Droplets currently attached to that Load Balancer:
Clicking on a Droplet name takes you to the Droplet's detail page. If you are managing backend Droplets by name, you can add additional Droplets by clicking the Add Droplets button. If you are managing by tag, you will instead have an Edit Tag button to change the selector tag.
Click the Graphs tab to get a visual representation of traffic patterns and infrastructure health:
The Frontend section holds graphs related to requests to the Load Balancer itself, while the Droplets section beneath it provides insight into the traffic that each Droplet handles.
Clicking Settings gives you the opportunity to modify the way that the Load Balancer functions:
You will be able to modify almost all of the settings you selected during the creation process. Additionally, you can delete the Load Balancer here if you no longer need it.
Managing SSL Certificates
When using your Load Balancer for SSL termination, the SSL certificate, private key, and certificate chain must all be uploaded to your DigitalOcean account. These secrets are placed in a secure, encrypted storage system and are not accessible to anyone, including DigitalOcean staff.
Let's Encrypt Certificates
Detailed directions for creating and managing Let's Encrypt Certificates are available in How to Use Let's Encrypt with DigitalOcean Load Balancers. Like the custom SSL Certificates described next, Let's Encrypt certificates are displayed in your account in Settings on the Security tab.
From here, you can view the fingerprint or delete the certificate:
Custom SSL Certificates
For custom certificates, you can add the required SSL files during the Load Balancer creation process or ahead of time.
To view and create custom certificates, first click on your user icon in the upper-right corner and select Settings. On the settings page, select Security from the left-hand menu:
In the TLS/SSL certificates section, you can see your existing certificates' names and SHA1 fingerprints. You can find the fingerprint of your certificates to compare it with the value in the Control Panel by running the following command on the machine where the certificate is located:
openssl x509 -noout -sha1 -fingerprint -in certificate_file.pem
To add a new certificate ahead of time, click Add Certificate. You will be prompted to choose a name and then enter the certificate, private key, and certificate chain to continue. These files must be entered in PEM format to be accepted.
To delete a certificate, click More and then Delete from the certificate list:
The certificate will be removed from your account.
For more specific information about how to integrate Load Balancers with your infrastructure, check out the guides below:
- Best Practices for Performance on DigitalOcean Load Balancers
- How To Create Your First DigitalOcean Load Balancer
- How To Configure SSL Passthrough on DigitalOcean Load Balancers
- How To Configure SSL Termination on DigitalOcean Load Balancers
- How To Balance TCP Traffic with DigitalOcean Load Balancers