Cache orchestration or prepopulation

Hi all, my use case is as follows: our client has a backend that is very slow and returns responses only after 30-40 seconds, which makes it hard for the front end to load the dashboard page, where several calls go to that backend. We have decided to use Apigee Edge to proxy those API calls to the backend, but we plan to run the calls in the background and use their responses to populate a cache. The front end would then call the Apigee proxy and be served the pre-populated response from the cache. I would like your suggestions on the best way to achieve this cache pre-population. Thanks.


Hi @Mohammad Ilyas Shah, so if I understand correctly, you need the following:

- one API to trigger fetching the resource and populate it into the cache (but not return the resource)

- one API to fetch the resource from cache

One concern I have with your idea is using Apigee as a dedicated data cache. Apigee's cache has limits associated with it which you might want to review first. Moreover, it feels like an anti-pattern to build an API that relies 100% on Apigee's cache, with no fallback. So for proactive caching (part 1), I would recommend implementing a custom backend. For example, you could write an app to poll the slow backend and store the data in Google Cloud Storage (https://cloud.google.com/storage). Apigee can then fetch the data from GCS using extensions (beta).
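For the Apigee side of that design, the (beta) ExtensionCallout policy can read the pre-computed object back out of GCS. Just as a rough sketch - the connector name, action, and input fields below are assumptions, so check them against the GCS extension's reference before using:

```xml
<!-- Hypothetical sketch: read a pre-built dashboard object from GCS via the extension.
     The connector name, action, and fileName are placeholders/assumptions. -->
<ConnectorCallout name="EC-Read-Dashboard-From-GCS" continueOnError="false">
  <Connector>gcs-dashboard-extension</Connector>
  <Action>downloadFile</Action>
  <Input><![CDATA[{ "fileName": "dashboard.json" }]]></Input>
  <Output>gcsDashboardData</Output>
</ConnectorCallout>
```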

Apigee provides the ResponseCache policy, which lets you implement read-through caching in your API proxies. That is, when a resource is requested, the cached value is returned if it is present; otherwise the proxy calls the backend, caches the backend's response, and returns it to the API client. The benefit is reduced load on the backend and lower latency.

Remember, do not cache errors. Also, FYI, data stored in Apigee's cache is not encrypted at rest.
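For reference, a minimal ResponseCache sketch along those lines could look like the following; the cache resource name, key fragment, and TTL are placeholders, and ExcludeErrorResponse keeps error responses out of the cache. The same policy gets attached once in the request flow (lookup) and once in the response flow (populate).

```xml
<!-- Minimal read-through cache sketch; the cache name, key, and TTL are placeholders -->
<ResponseCache name="RC-Dashboard">
  <CacheResource>dashboard_cache</CacheResource>
  <Scope>Exclusive</Scope>
  <CacheKey>
    <!-- Key entries by request path so each dashboard resource gets its own entry -->
    <KeyFragment ref="request.uri"/>
  </CacheKey>
  <ExpirySettings>
    <TimeoutInSec>300</TimeoutInSec>
  </ExpirySettings>
  <!-- Only cache success responses, per the note above about not caching errors -->
  <ExcludeErrorResponse>true</ExcludeErrorResponse>
</ResponseCache>
```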

Please check the populate response cache video.

Hi @Mohammad Ilyas Shah,

Were you able to set up a configuration in Apigee to resolve your issue?

We are facing the same problem; we are thinking of implementing two caches, but without success so far.


The first thing I would like to know is how often your backend response changes, and whether the response varies per request.

If the backend response does not vary, or changes infrequently, you can use the ResponseCache policy.

Alternatively, you can use the PopulateCache and LookupCache policies. If the data size exceeds Apigee's cache limits, you can go for an external caching system.
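If you go the PopulateCache/LookupCache route, a rough sketch could look like the pair below: one policy in the background flow that stores the slow backend's response (here assumed to have been captured into a variable called dashboardResponse by a ServiceCallout), and one in the front-end-facing proxy that reads it back. The cache name, key, TTL, and variable names are placeholders.

```xml
<!-- Background flow: store the slow backend's response in the cache -->
<PopulateCache name="PC-Dashboard">
  <CacheResource>dashboard_cache</CacheResource>
  <Scope>Exclusive</Scope>
  <CacheKey>
    <KeyFragment>dashboard</KeyFragment>
  </CacheKey>
  <ExpirySettings>
    <TimeoutInSec>600</TimeoutInSec>
  </ExpirySettings>
  <Source>dashboardResponse.content</Source>
</PopulateCache>

<!-- Front-end-facing proxy: read the pre-populated value back -->
<LookupCache name="LC-Dashboard">
  <CacheResource>dashboard_cache</CacheResource>
  <Scope>Exclusive</Scope>
  <CacheKey>
    <KeyFragment>dashboard</KeyFragment>
  </CacheKey>
  <AssignTo>cachedDashboard</AssignTo>
</LookupCache>
```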

If the response varies with each request, then caching is not an option. In that case, ask the backend team to implement pagination, or enable streaming in the Apigee proxy.
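On the streaming option, Apigee Edge turns on HTTP streaming through endpoint properties rather than a policy; a minimal target endpoint sketch (the backend URL is a placeholder):

```xml
<TargetEndpoint name="default">
  <HTTPTargetConnection>
    <URL>https://slow-backend.example.com</URL>
    <Properties>
      <!-- Stream request and response payloads instead of buffering them in Apigee -->
      <Property name="request.streaming.enabled">true</Property>
      <Property name="response.streaming.enabled">true</Property>
    </Properties>
  </HTTPTargetConnection>
</TargetEndpoint>
```

Keep in mind that with streaming enabled, policies that need to read the payload (including the caching policies) will not have access to it.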

If you always hit the backend to refresh the cache, it will add extra traffic on the backend and can also impact the performance of other proxies.