I have a Kubernetes cluster set up that hosts a GraphQL API server (Hasura, https://hasura.io). The cluster is exposed to “the world” via DigitalOcean’s Load Balancer service.

After reading through the Load Balancer limitations documentation, I am concerned about two limits in particular: the 60-second keep-alive (idle timeout) and the maximum of 10,000 concurrent connections.

Since my GraphQL API server exposes subscriptions over WebSockets and powers a mobile game that relies on this live data, these limits could be reached relatively quickly.

Hence, I wanted to clarify whether I understand these limits correctly and whether they are indeed a concern for my use case. If so, are there any alternatives to Load Balancers that I could utilise in this scenario?
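In case it matters, my current idea for the 60-second idle timeout is to have Hasura send periodic keep-alive frames over the subscription WebSockets so the connection never sits idle that long. A rough sketch of what I mean is below (it assumes Hasura's HASURA_GRAPHQL_WEBSOCKET_KEEPALIVE setting and default port 8080; the exact variable name and the image tag should be checked against the Hasura docs):

```yaml
# Sketch only: Hasura Deployment with a WebSocket keep-alive interval
# shorter than the load balancer's 60s idle timeout.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hasura
spec:
  replicas: 1
  selector:
    matchLabels:
      app: hasura
  template:
    metadata:
      labels:
        app: hasura
    spec:
      containers:
        - name: hasura
          image: hasura/graphql-engine:v2.36.0  # example tag, not prescriptive
          ports:
            - containerPort: 8080
          env:
            # Send a keep-alive frame on subscription WebSockets every 20s,
            # well under the 60s idle timeout (variable name assumed -- verify
            # against the Hasura server config reference).
            - name: HASURA_GRAPHQL_WEBSOCKET_KEEPALIVE
              value: "20"
```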

1 answer

As user needs have increased, we’ve released additional Load Balancer sizes, so you can now create larger load balancers that support up to 40,000 concurrent connections. You can find more information here:

https://www.digitalocean.com/docs/networking/load-balancers/
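If the Load Balancer is provisioned from Kubernetes, the larger size can be requested through a Service annotation handled by DigitalOcean's cloud controller manager. A minimal sketch, assuming the do-loadbalancer-size-unit annotation and Hasura listening on port 8080 (verify the annotation name and allowed values against the current digitalocean-cloud-controller-manager docs):

```yaml
# Sketch only: Service that asks DigitalOcean for a larger Load Balancer.
apiVersion: v1
kind: Service
metadata:
  name: hasura
  annotations:
    # More load balancer nodes => higher concurrent connection ceiling
    # (annotation name/value assumed -- check the DO CCM documentation).
    service.beta.kubernetes.io/do-loadbalancer-size-unit: "3"
spec:
  type: LoadBalancer
  selector:
    app: hasura
  ports:
    - name: http
      port: 80
      targetPort: 8080  # Hasura's default listen port
```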