Hey, we’d like to run our REST API service in multiple locations to lower the request latency. However we’d like to keep the endpoint URLs the same.
I wonder if we can utilize a DigitalOcean load balancer to route traffic to the “right” server by deciding based on either:
If not, does anybody know whether this is something that will be implemented in the near future?
Hey!
That’s a great question, and it’s great to see that you’re considering advanced load-balancing strategies for your REST API service.
Regarding latency-based routing: DigitalOcean’s load balancers operate primarily at layer 4 and support distribution methods like round robin, least connections, and source IP hashing. They don’t natively support latency-based routing. That feature is more commonly found in global DNS services, which can direct traffic based on the user’s geographical location, thereby reducing latency. One option here would be to use Cloudflare.
Regarding content-based routing: routing based on URL path or hostname is supported by DigitalOcean load balancers. However, routing based on specific parameters within the request body or HTTP headers isn’t supported natively.
The best thing to do to get your voice heard regarding this would be to head over to our Product Ideas board and post a new idea, including as much information as possible for what you’d like to see implemented.
An alternative option for you would be to use a self-hosted load balancer like Nginx or HAProxy where you will have full control over the configuration:
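As a rough sketch, an Nginx reverse-proxy setup in front of your API servers could look something like this (the upstream IPs, port, and hostname below are placeholders, not real values from your setup):

```nginx
# /etc/nginx/conf.d/api.conf -- illustrative only; IPs and hostname are placeholders
upstream api_backend {
    least_conn;                  # send each request to the backend with the fewest active connections
    server 10.10.0.2:8080;       # API droplet in region A (private IP)
    server 10.10.0.3:8080;       # API droplet in region B (private IP)
}

server {
    listen 80;
    server_name api.example.com; # placeholder hostname

    location / {
        proxy_pass http://api_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

With a self-hosted setup like this you can also add `location` blocks or `map` directives to route on paths or headers, which is the kind of content-based routing the managed load balancer doesn’t offer.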
Hope that helps!
- Bobby.