How to Conditionally Enable Streaming in Apigee

Introduction

Streaming can be enabled for API proxies that handle large payloads and do not require message routing based on payload content. This optimizes memory usage in the Router and the Message Processor, providing better performance at the API gateway. In the Apigee API platform, streaming can be enabled at two levels: first, between client applications and API proxies, on the request flow, the response flow, or both; and second, between Apigee and the backend services, again on the request flow, the response flow, or both.

Once streaming is enabled, Apigee skips buffering payload messages and writes incoming content directly to the backend services, and vice versa, depending on the configuration. It is important to note that this approach forces the target endpoints to receive messages at the same rate they arrive from the client applications, since no buffering is performed.

Enabling Streaming in API Proxies

<ProxyEndpoint name="default">
   <HTTPProxyConnection>
      <BasePath>/target/endpoint</BasePath>
      <Properties>
         <Property name="response.streaming.enabled">true</Property>
         <Property name="request.streaming.enabled">true</Property>
      </Properties>
   </HTTPProxyConnection>
</ProxyEndpoint>

As shown in the example above, streaming between the client applications and Apigee can be enabled on the request and response message flows using these two properties.

Enabling Streaming in Target Endpoints

<TargetEndpoint name="default">
   <HTTPTargetConnection>
      <URL>http://target-host/target/endpoint</URL>
      <Properties>
         <Property name="response.streaming.enabled">true</Property>
         <Property name="request.streaming.enabled">true</Property>
      </Properties>
   </HTTPTargetConnection>
</TargetEndpoint>

As shown in the example above, streaming between Apigee and the backend services can be enabled on the request and response message flows using these two properties.

Conditionally Enabling Streaming in Target Endpoints

<RouteRule name="StreamingRoute">
   <Condition>request.queryparam.streamToTarget="true"</Condition>
   <TargetEndpoint>TargetEndpointWithStreaming</TargetEndpoint>
</RouteRule>

<RouteRule name="DefaultRoute">  
   <TargetEndpoint>DefaultTargetEndpoint</TargetEndpoint>
</RouteRule>

<TargetEndpoint name="TargetEndpointWithStreaming">
   <HTTPTargetConnection>  
      <URL>http://target-host/target/endpoint</URL>
      <Properties>
         <Property name="response.streaming.enabled">true</Property>
         <Property name="request.streaming.enabled">true</Property>
      </Properties>
   </HTTPTargetConnection>
</TargetEndpoint>

<TargetEndpoint name="DefaultTargetEndpoint">
   <HTTPTargetConnection>
      <URL>http://target-host/target/endpoint</URL>
      <Properties>
         <Property name="response.streaming.enabled">false</Property>
         <Property name="request.streaming.enabled">false</Property>
      </Properties>
   </HTTPTargetConnection>
</TargetEndpoint>

Streaming in target endpoints can be conditionally enabled by using route rules with multiple target endpoints, as shown in the example above. In this approach, routing conditions must be based on query parameters or HTTP headers, not on the message payload, since the payload is not read by the Message Processor when streaming is enabled.
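For instance, a similar route rule could match on an HTTP header instead of a query parameter; the header name X-Stream-To-Target below is illustrative:

<RouteRule name="StreamingRouteByHeader">
   <Condition>request.header.X-Stream-To-Target = "true"</Condition>
   <TargetEndpoint>TargetEndpointWithStreaming</TargetEndpoint>
</RouteRule>

A header-based condition may be preferable when clients cannot easily modify the request URL.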

It is important to note that a similar route rule cannot be applied at the API proxy level to conditionally enable streaming, because the streaming configuration is applied in the proxy endpoint itself before any policies are engaged. If required, multiple API proxies with different streaming configurations can be used for this purpose.
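As a sketch of that alternative, two proxies could expose different base paths with different streaming settings; the base paths below are illustrative:

<ProxyEndpoint name="default">
   <HTTPProxyConnection>
      <BasePath>/v1/stream</BasePath>
      <Properties>
         <Property name="response.streaming.enabled">true</Property>
         <Property name="request.streaming.enabled">true</Property>
      </Properties>
   </HTTPProxyConnection>
</ProxyEndpoint>

<ProxyEndpoint name="default">
   <HTTPProxyConnection>
      <BasePath>/v1/buffered</BasePath>
   </HTTPProxyConnection>
</ProxyEndpoint>

Here the first proxy streams in both directions, while the second (deployed as a separate API proxy) uses the default buffered behavior; clients choose between the two by calling the appropriate base path.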

Conclusion

API proxies that handle large message payloads can either increase the buffer limits in the Router and the Message Processor or use streaming. Increasing buffer limits may lower performance at the API gateway due to high memory utilization, so enabling streaming is often the better option, depending on the throughput that the backend services and client applications can handle. Streaming can be enabled in Apigee at two levels: in the API proxy configuration and in the target endpoints. Of these two, streaming at the target endpoints can be conditionally controlled using route rules; the same cannot be done at the API proxy level.

Comments
JL04

Hi @imesh ,

We are getting the below error for one of the APIs, as the response is huge (more than 10 MB) and Apigee is restricting it:
{"fault": { "faultstring": "Body buffer overflow","detail": {"errorcode": "protocol.http.TooBigBody"}}}
We want to enable response streaming for a specific endpoint/API only in Apigee. Can you please help with how to enable streaming for one endpoint and not the whole application?

It would be great if you can help us out. Thanks in advance.

Version history
Last update: 10-08-2018 10:08 PM