Hi,
I have a requirement to rate limit per target (backend), which I know can be done with the concurrent rate limiting policy.
I also need to enforce this rate limit across multiple proxies: multiple proxies connecting to the same backend should share a common rate limit. I'm not sure whether the concurrent rate limit policy (with a target identifier) covers this.
I think this is a very common requirement for preventing backend overload. Any alternative or better implementation approach would be appreciated.
Thanks.
You can do the rate limiting in the proxies and the backend by adding a mediation policy (for example, a Quota policy) at the appropriate step of your proxy configuration.
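As a starting point, a minimal sketch of such a Quota policy, attached to the target endpoint's request flow. The policy name, allowed count, and start time are placeholders you would adapt; `Distributed`/`Synchronous` make the counter shared and checked centrally rather than per message processor, and the `Identifier` keys the counter on the target URL so each backend gets its own bucket:

```xml
<!-- Hypothetical Quota policy sketch; adjust name, count, and StartTime -->
<Quota name="Quota-PerBackend" type="calendar">
  <!-- Allow at most 100 calls per minute per identifier -->
  <Allow count="100"/>
  <Interval>1</Interval>
  <TimeUnit>minute</TimeUnit>
  <!-- Maintain one central counter instead of per-message-processor counters -->
  <Distributed>true</Distributed>
  <!-- Update the central counter synchronously on each call -->
  <Synchronous>true</Synchronous>
  <!-- Key the counter on the backend being called -->
  <Identifier ref="target.url"/>
  <StartTime>2024-01-01 00:00:00</StartTime>
</Quota>
```

Note that `Distributed` shares the counter across message processors within one proxy; by itself it does not make the counter span multiple proxies.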
The challenge is having a mediation policy that counts across proxies, not just within a single proxy. I'm not sure what construct (if any) Apigee provides for counting target calls across all proxies.
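One common workaround for this (hedged as a sketch, not the only approach): route every proxy through a single "facade" proxy that owns the backend connection and carries the Quota policy. Because only the facade calls the backend, its quota counter is effectively shared by all callers. Using proxy chaining, each upstream proxy's target endpoint would look roughly like this, where `backend-facade` is a hypothetical proxy name:

```xml
<!-- Sketch: chain to a shared facade proxy instead of calling the backend directly -->
<TargetEndpoint name="default">
  <LocalTargetConnection>
    <!-- "backend-facade" is a placeholder; it holds the Quota policy
         and the real HTTPTargetConnection to the backend -->
    <APIProxy>backend-facade</APIProxy>
    <ProxyEndpoint>default</ProxyEndpoint>
  </LocalTargetConnection>
</TargetEndpoint>
```

The trade-off is an extra hop and a single proxy becoming a shared dependency, but it gives you exactly one place to enforce the backend's limit.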