What are the performance concerns for a JSON-to-XML policy when the response payload JSON is large (in this case, around 15MB)?

 
2 ACCEPTED SOLUTIONS

Not applicable

Network is a concern because it will take only about 3 TPS to saturate a 1 Gbps connection. If you have a big enough pipe, say 10 Gbps, your network could theoretically squeeze through thirty or so transactions per second, but you will need a lot of CPU resources to transform that much JSON to XML at that rate, so CPU is a concern too. In this scenario there are several things competing to be the biggest bottleneck; I'd have to know more about the specifics to venture a guess as to which one would win out.
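
To make those figures concrete, here is a quick back-of-envelope sketch (not a benchmark). It assumes the 15 MB JSON crosses the gateway's network interface inbound and a roughly 2x-larger XML body crosses it outbound, so each transaction moves on the order of 45 MB over the same link; the 2x expansion factor is only an assumption, and protocol overhead is ignored.

// Theoretical line-rate ceiling on transactions per second for a given
// JSON payload size, assumed XML expansion factor, and link speed.
// Real throughput will be lower once protocol overhead and the CPU cost
// of the transform are counted.
function maxTps(jsonMB, xmlExpansion, linkGbps) {
  var bytesPerTxn = jsonMB * (1 + xmlExpansion) * 1e6; // JSON in + XML out
  return (linkGbps * 1e9) / (bytesPerTxn * 8);         // link bits/sec divided by bits/txn
}

console.log(maxTps(15, 2, 1).toFixed(1));   // ~2.8 TPS on a 1 Gbps link
console.log(maxTps(15, 2, 10).toFixed(1));  // ~27.8 TPS on a 10 Gbps link

Under those assumptions you land right around the 3 TPS and thirty-or-so figures above; a smaller XML expansion factor raises the ceiling proportionally.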


Not applicable

As Chris mentioned above, it's true that large payloads demand a great deal of resources when you try to process a big file in a single shot. However, with Node.js you can leverage the power of streams and pipes, which let you process large streams of text data in flight, in smaller chunks, instead of waiting for all of the data and loading it into memory for further processing/transformation by other policies. This document provides more insight into the benefits of this feature; it's worth a read. So, to deal with a large target response that needs to be transformed into XML, you could, for instance, use JSONStream to parse the incoming data stream from the target and pipe the transformed output back to the consumer as the response stream. That way the entire payload never has to be held in memory, and the CPU can handle the scenario more effectively. I understand this doesn't directly answer your question about the JSON-to-XML policy, but it's an alternative to that pattern. Please let me know if you want to discuss further.
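
For illustration, here is a minimal sketch of what that could look like as a Node.js target script. It is not the JSON-to-XML policy itself; the backend URL, the "items" field, and the XML element names are hypothetical, and it assumes the target returns a JSON body shaped like { "items": [ ... ] }.

var http = require('http');
var Transform = require('stream').Transform;
var JSONStream = require('JSONStream'); // npm module mentioned above

// Turn one parsed JSON object into a small XML fragment.
// (A real implementation would also escape XML special characters.)
function toXmlFragment(obj) {
  var fields = Object.keys(obj).map(function (k) {
    return '<' + k + '>' + String(obj[k]) + '</' + k + '>';
  }).join('');
  return '<item>' + fields + '</item>';
}

http.createServer(function (req, res) {
  // Hypothetical backend returning { "items": [ ...thousands of objects... ] }
  http.get('http://backend.example.com/large-report', function (upstream) {
    res.writeHead(200, { 'Content-Type': 'application/xml' });
    res.write('<items>');

    var xmlify = new Transform({
      writableObjectMode: true,           // receives parsed objects from JSONStream
      transform: function (obj, enc, cb) {
        cb(null, toXmlFragment(obj));     // emits XML text chunks downstream
      }
    });

    upstream
      .pipe(JSONStream.parse('items.*'))  // emit one object per array element
      .pipe(xmlify)
      .pipe(res, { end: false });         // keep res open so we can close the root tag

    xmlify.on('end', function () {
      res.end('</items>');
    });
  });
}).listen(9000);

Because each array element flows through the pipeline and out to the client as soon as it is parsed, memory use stays roughly constant whether the response is 15 MB or 150 MB.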

