Response Cache policy fails to cache API responses larger than 512 KB?

Hi, to improve performance we have a requirement to cache API responses, and all of our API responses are larger than 1.5 MB. Can anyone suggest the best way to implement caching in Apigee?

Currently we are using the Response Cache policy with an expiry of 1 day. But even within that day the policy fails to retrieve the data from the cache, so the request goes to the backend, which takes longer to return the response to the client and causes the business to breach their SLAs. From the Apigee documentation, I see that caching is only supported for responses up to a maximum of 512 KB, which does not meet our business requirements. Can anyone suggest a workaround?

Note: a back-end cache implementation is out of scope for us.


Not applicable

@Nagaraju Is this on cloud or private cloud?

Unfortunately, the maximum size of a cached payload is 512 KB, and even if that limit could be increased it would impact performance, so you would end up in the same situation of not meeting the SLA.

If you don't need the whole response to be cached, you can parse the response from the backend up front (see whether it can be reduced to under 512 KB) and cache only the payload you really need, using the PopulateCache and LookupCache policies, as sketched below.
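For illustration, here is a minimal sketch of the trimming step, assuming a JavaScript policy runs on the response flow before a PopulateCache policy. The field names (items, id, name) and the flow variable trimmedPayload are hypothetical placeholders; only response.content and context.getVariable/setVariable are standard Apigee constructs.

// JavaScript policy (response flow): keep only the fields the client needs,
// then expose the reduced payload in a flow variable that a PopulateCache
// policy can cache. Field names and the variable name are hypothetical.
var full = JSON.parse(context.getVariable('response.content'));

// keep a small projection of each item instead of the full 1.5 MB payload
var trimmed = full.items.map(function(item) {
  return { id: item.id, name: item.name };
});

context.setVariable('trimmedPayload', JSON.stringify(trimmed));

A PopulateCache policy could then reference that variable in its <Source> element, and a LookupCache policy on later requests would serve the reduced payload without calling the backend.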

Ignore the above workaround if you need the whole response, which is larger than 512 KB.

How many items are you caching, at 1.5 MB each?

You could write some Node.js code to do the caching for you. Rather than using an HTTPTargetConnection, use a ScriptTarget, and something like the node-cache module to manage the cached items. Have your Node.js logic look in the cache and return the result if it is present; call the backend only if there is no cache hit. Like this (assuming Express):

var express = require('express');
var NodeCache = require('node-cache');
var httprequest = require('request');

var app = express();
// entries expire after stdTTL seconds; checkperiod controls how often
// expired entries are purged
var myCache = new NodeCache({ stdTTL: 100, checkperiod: 120 });

app.get('/objects/:resourceId', function(request, response) {
  response.header('Content-Type', 'application/json');
  var cacheKey = 'resource-' + request.params.resourceId;
  myCache.get(cacheKey, function(e, value) {
    if (!e) {
      if (value === undefined) {
        // key not present, need to call the backend here
        var options = {
          timeout: 66000, // in ms
          uri: 'https://mybackend/whatever/' + request.params.resourceId,
          method: 'GET',
          headers: {
            'authorization': 'maybe something here',
            'accept': 'application/json',
            'user-agent': 'my-nodejs-code'
          }
        };
        httprequest(options, function(e, httpResp, body) {
          if (e) {
            response.status(500).send({ error: 'backend' });
          }
          else {
            // store the backend response in the cache, then return it
            myCache.set(cacheKey, body, function(e, success) {
              // a failed set is not fatal; log or handle the error here
              response.status(200).send(body);
            });
          }
        });
      }
      else {
        // we have a cached value, send it back
        response.status(200).send(value);
      }
    }
    else {
      response.status(500).send({ error: 'cache' });
    }
  });
});

app.listen(process.env.PORT || 9000);

I haven't tried this, but it seems like it would work. You do need to be careful, though: if you plan to cache thousands of items at 1.5 MB each, you're not gonna have a good time, because you will exceed the memory of your system. On the other hand, if you are caching 100 items at 1.5 MB, that's probably feasible.

It is also possible to use something like Apigee Edge BaaS as a nearby cache. It is not integrated into the Cache policy, and it does not have an auto-expire capability, but you could still use it for that purpose. This would be appropriate if you have lots and lots of objects, each of which is very large. Usergrid allows you to attach "assets" to an item, and an asset can be binary and quite large. You would need to write the code that reaps stale entries yourself, along the lines of the sketch below.
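A minimal sketch of such a reaper, assuming the standard Usergrid-style REST API where every entity carries a "modified" timestamp in epoch milliseconds. The base URL, org/app/collection names, token, and age limit below are hypothetical placeholders, not anything confirmed in this thread.

// Reaper sketch: delete BaaS entities older than MAX_AGE_MS.
// All names and URLs here are hypothetical placeholders.
var httprequest = require('request');

var BAAS_BASE = 'https://api.usergrid.com/my-org/my-app'; // hypothetical
var COLLECTION = 'cached-responses';                       // hypothetical
var ACCESS_TOKEN = 'your-app-token-here';                  // hypothetical
var MAX_AGE_MS = 24 * 60 * 60 * 1000;                      // one day

function reapStaleEntries() {
  var cutoff = Date.now() - MAX_AGE_MS;
  httprequest({
    uri: BAAS_BASE + '/' + COLLECTION,
    method: 'GET',
    qs: { ql: 'select * where modified < ' + cutoff, limit: 100 },
    headers: { authorization: 'Bearer ' + ACCESS_TOKEN },
    json: true
  }, function(e, resp, body) {
    if (e || !body || !body.entities) { return; }
    body.entities.forEach(function(entity) {
      // delete each stale entity individually
      httprequest({
        uri: BAAS_BASE + '/' + COLLECTION + '/' + entity.uuid,
        method: 'DELETE',
        headers: { authorization: 'Bearer ' + ACCESS_TOKEN }
      }, function(delErr) {
        // log delErr if needed
      });
    });
  });
}

// run the reaper periodically, e.g. every 10 minutes
setInterval(reapStaleEntries, 10 * 60 * 1000);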

I prefer the nodejs approach, myself. If it's appropriate for your purposes, it's cleaner.

Hi @Nagaraju / @Dino-at-Google,

We have the same business requirement. Did the Node.js approach help achieve this? Please share a sample proxy or GitHub link, if any, for caching data larger than 512 KB for a specified time period.

Thanks

Maivizhi A

Sure, it works, but it's very simplistic. If you want a heavier-duty cache, you can try Google Cloud Memorystore.
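For reference, a minimal sketch of what a Memorystore-backed cache might look like from Node.js. Memorystore for Redis speaks the Redis protocol, so the standard redis client works; the instance IP, key prefix, backend URL, and 1-day TTL below are assumptions for illustration only.

// Sketch: caching large payloads in Google Cloud Memorystore (Redis).
// Host address, key prefix, backend URL, and TTL are hypothetical.
const { createClient } = require('redis'); // node-redis v4
const httprequest = require('request');

const client = createClient({ url: 'redis://10.0.0.3:6379' }); // Memorystore IP (hypothetical)

async function getWithCache(resourceId) {
  if (!client.isOpen) { await client.connect(); }
  const cacheKey = 'resource-' + resourceId;

  const cached = await client.get(cacheKey);
  if (cached !== null) {
    return cached; // cache hit
  }

  // cache miss: call the backend, then store the body with a 1-day TTL
  const body = await new Promise(function(resolve, reject) {
    httprequest('https://mybackend/whatever/' + resourceId, function(e, resp, body) {
      if (e) { reject(e); } else { resolve(body); }
    });
  });
  await client.set(cacheKey, body, { EX: 86400 });
  return body;
}

Unlike node-cache, the cached items live outside the Node.js process, so memory pressure and TTL handling are offloaded to Redis.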

You'll need to do your own evaluation according to your specific requirements!