Why attach the Concurrent Rate Limit policy in the RESPONSE PreFlow?

The Apigee docs say the Concurrent Rate Limit policy can be attached to the target endpoint request PreFlow or response PreFlow. I can understand the request PreFlow, where it prevents too many concurrent requests from reaching the target server.

But why the RESPONSE PreFlow? What use cases does that cover, or is it a documentation error?

1 ACCEPTED SOLUTION


It needs to be in both flows.

The policy is expected to throttle concurrent requests based on the active request count, which means it needs to track requests in flight.

The simplest way to implement such a policy is to increment the request count when a request arrives and decrement it once the response goes out. I hope this makes it clear why the policy needs to be in both the request and the response flow.

Please note: what I have explained here is a simplification of the implementation.
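As a rough illustration, the dual attachment could look like the target endpoint sketch below. The policy name ConnectionThrottler and the backend URL are hypothetical placeholders for this example; the ConcurrentRatelimit policy itself (allowed connection count, TTL, and so on) is defined separately in the proxy bundle.

  <TargetEndpoint name="default">
    <PreFlow name="PreFlow">
      <Request>
        <!-- Counts the request in before it is sent to the backend -->
        <Step>
          <Name>ConnectionThrottler</Name>
        </Step>
      </Request>
      <Response>
        <!-- Counts the request out once the backend response arrives -->
        <Step>
          <Name>ConnectionThrottler</Name>
        </Step>
      </Response>
    </PreFlow>
    <HTTPTargetConnection>
      <URL>https://backend.example.com</URL>
    </HTTPTargetConnection>
  </TargetEndpoint>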




@sriki77, does that mean I always have to attach the Concurrent Rate Limit policy to both the request and response flows for it to work?

@asagar You need to have it in the fault handler too, so that either a fault or a successful response maintains the count.
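For illustration, the fault-path attachment on the same hypothetical target endpoint could look like the sketch below. AlwaysEnforce makes the default fault rule run even when another fault rule has already executed, so the active-request count is decremented on error responses as well.

  <TargetEndpoint name="default">
    <!-- Request/response PreFlow attachments as shown above -->
    <DefaultFaultRule name="fault-rule">
      <!-- Decrements the count when the target call ends in a fault -->
      <Step>
        <Name>ConnectionThrottler</Name>
      </Step>
      <AlwaysEnforce>true</AlwaysEnforce>
    </DefaultFaultRule>
  </TargetEndpoint>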

Got it. Thank you @sriki77 and @Maruti Chand.