This is a great question, though I regret it is also one that cannot be answered cleanly. Capacity depends heavily on factors that cannot always be predicted. To give you a rough idea of what I mean:
Let’s say /page1 uses 30% more resources than /page2 to load, perhaps because it performs more database queries to build the page. This means /page1 effectively has less capacity than /page2. But how can you predict the resources required for every page, and which pages users will be visiting simultaneously? You honestly can’t, at least not reasonably or reliably.
Every piece of software is unique in how it performs, and traffic from users and crawlers is difficult to predict. The best you can do, I would suggest, is get a rough idea by using something like loadimpact.com to simulate traffic, then comparing that traffic level against resource usage while watching something like “htop” over SSH. This will only be a rough estimate, and should not be treated as reliable for capacity calculations.
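If you want to roll a quick simulation yourself rather than use a hosted service, something like the following sketch can fire concurrent requests and report latency numbers while you watch htop on the server. This is a minimal illustration, not a real load-testing tool (k6, ApacheBench, etc. are far better for serious work); the URL, request count, and concurrency below are placeholder assumptions, and the demo spins up a throwaway local server purely so the example is self-contained.

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """Fetch a URL once and return the elapsed time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def load_test(url, total_requests=50, concurrency=10):
    """Fire `total_requests` at `url` using `concurrency` workers
    and return simple latency statistics."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        timings = sorted(pool.map(fetch, [url] * total_requests))
    return {
        "requests": len(timings),
        "avg_s": sum(timings) / len(timings),
        "p95_s": timings[int(len(timings) * 0.95) - 1],
    }

# Demo against a throwaway local server (placeholder target --
# point `load_test` at your own site in practice).
class QuietHandler(http.server.SimpleHTTPRequestHandler):
    def log_message(self, *args):
        pass  # suppress per-request logging noise

server = http.server.HTTPServer(("127.0.0.1", 0), QuietHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

stats = load_test(f"http://127.0.0.1:{server.server_port}/",
                  total_requests=20, concurrency=5)
server.shutdown()
print(stats)
```

While this runs against your real server, watch htop (or `top`, `vmstat`) over SSH to see how CPU and memory respond as you raise the concurrency; the point at which latency or resource usage climbs sharply gives you that rough capacity estimate.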