Multiple target endpoints

Hello,

If we have multiple target URLs for the same proxy, which is the better option: using multiple target endpoints, or using a JavaScript policy to route to the specific URL based on conditions, all in one piece of code?

Solved

6 REPLIES

Multiple target endpoints are better. With JavaScript there is always overhead: the JS execution steps away from the main flow.

Thanks.

@ganadurai  Agree with you, but if we have more than 5 target endpoints, we must add RouteRules in the default XML for each target. On the other hand, if we use a JS policy we add these rules inside the JS; the number of policies is reduced and the XML is reduced too.
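For reference, conditional routing across multiple target endpoints looks roughly like this in the ProxyEndpoint XML — a sketch only; the endpoint names, paths, and conditions here are hypothetical:

```xml
<!-- Sketch of RouteRules in a ProxyEndpoint (names and conditions are hypothetical). -->
<RouteRule name="orders-route">
  <Condition>proxy.pathsuffix MatchesPath "/orders/**"</Condition>
  <TargetEndpoint>orders-target</TargetEndpoint>
</RouteRule>
<RouteRule name="customers-route">
  <Condition>proxy.pathsuffix MatchesPath "/customers/**"</Condition>
  <TargetEndpoint>customers-target</TargetEndpoint>
</RouteRule>
<!-- A RouteRule with no Condition acts as the default and should come last. -->
<RouteRule name="default-route">
  <TargetEndpoint>default-target</TargetEndpoint>
</RouteRule>
```

Each additional backend means another RouteRule block plus another TargetEndpoint definition, which is the growth in XML the comment above is pointing at.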


That's probably true. There is a point beyond which the number of discrete targets gets unwieldy.

@dchiesa1 , do you have exact numbers? By what percentage is proxy performance reduced?


@Rohit_Kancharla wrote:

do you have the exact number and by what percentage proxy performance is reduced?


Oh my goodness, no. That would be very hard to predict.

In general, my philosophy is: use the target endpoint if you can. You get all kinds of cool analytics and controls with the target endpoint. If you send outbound requests from a JS callout, you don't get that. So that alone, irrespective of the performance of the thing, would push ME towards using target endpoints.

There is a middle path that I don't think we suggested: use a JS step to set a dynamic URL for the target. Use a single TargetEndpoint, but override the URL, based on a JS step that sets target.url. Then you get the benefits of the TargetEndpoint, with the flexibility of dynamic URLs for your target, determined by JS.
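A minimal sketch of that middle path. In an Apigee JavaScript policy the `context` object is provided by the runtime; the stub below exists only so the sketch runs standalone. The URLs, path prefixes, and the `resolveTargetUrl` helper are hypothetical, not part of any Apigee API:

```javascript
// Stub `context` so this sketch runs outside Apigee; inside a JavaScript
// policy, the runtime-provided `context` is used instead.
var context = (typeof context !== "undefined" && context) ? context : {
  vars: { "proxy.pathsuffix": "/orders" },
  getVariable: function (name) { return this.vars[name]; },
  setVariable: function (name, value) { this.vars[name] = value; }
};

// Hypothetical mapping of path prefixes to backend base URLs.
function resolveTargetUrl(pathSuffix) {
  if (pathSuffix.indexOf("/orders") === 0) {
    return "https://orders.example.com/v1" + pathSuffix;
  }
  if (pathSuffix.indexOf("/customers") === 0) {
    return "https://customers.example.com/v1" + pathSuffix;
  }
  return "https://default.example.com/v1" + pathSuffix;
}

// Overriding target.url is what makes the single TargetEndpoint dynamic.
context.setVariable("target.url",
    resolveTargetUrl(context.getVariable("proxy.pathsuffix")));
```

With this in place, the proxy keeps one TargetEndpoint (and all its analytics and controls), and only the URL string is computed in JS.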

In this middle path, regarding the performance: sure, you're using JS, but only for string manipulation. In my experience, the dominant factor in the latency of APIs is most often not the steps in your proxy, but the behavior and performance characteristics of the upstream systems. If your upstream responds in 200ms and the proxy does its work (including JS) in 15ms, then worrying about the ~1ms you might spend in JavaScript is probably not the best use of optimization effort.

Yes, it does reduce the size of the XML definition of the proxy source. But when it comes to the performance of the proxy (which matters the most), avoiding JavaScript is better wherever possible.

Thanks.