Streaming behavior with multiple responses

We are looking for a streaming pattern where a request made to the backend service results in a response sent in multiple batches, one after another, until the complete payload is delivered. The payload is around 80 MB of data.

The client is able to aggregate the data, and the backend service is able to send the response data in batches. I do not want Apigee to close the connection until all batches are sent out to the client. The data size could increase over time, so the parameter configuration should remain valid as the payload grows.

When tested, we see irregular behavior: `connect.timeout.millis` and `io.timeout.millis` govern the processing of the response batches, and the result is not always a success.

I want to know:

1. Is this how streaming ideally works?

2. Is setting Apigee as a front end for this type of backend service a valid approach?

3. If yes, what is the best way to handle this scenario so as to avoid any type of timeout (the data size could increase over time)?
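For context, the configuration we are experimenting with looks roughly like the sketch below. It enables response streaming on the target endpoint so Apigee passes batches through without buffering the full 80 MB payload, and sets the timeout properties mentioned above. The URL and the timeout values here are placeholders for illustration; my understanding is that `io.timeout.millis` applies to the gap between successive reads rather than to the total transfer time, but please correct me if that is wrong.

```xml
<TargetEndpoint name="default">
  <HTTPTargetConnection>
    <Properties>
      <!-- Stream the response through without buffering it in Apigee
           (note: payload-inspecting policies cannot run on a streamed response) -->
      <Property name="response.streaming.enabled">true</Property>
      <!-- Time allowed to establish the TCP connection to the backend -->
      <Property name="connect.timeout.millis">10000</Property>
      <!-- Assumed to be an idle/read timeout between batches,
           not a cap on the whole transfer; value is a placeholder -->
      <Property name="io.timeout.millis">120000</Property>
    </Properties>
    <!-- Placeholder backend URL -->
    <URL>https://backend.example.com/stream</URL>
  </HTTPTargetConnection>
</TargetEndpoint>
```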
