Alternatives to Apigee caching due to the 2.5-million-item limit per cache

Some of our services use a DB to store/retrieve identifiers.

We want to implement a caching mechanism to limit the number of reads on the DB and also reduce the latency of those reads.

Initially we wanted to use the Apigee built-in caching policies, but noticed the "2.5 million items in each cache" limit:

https://docs.apigee.com/api-platform/reference/limits#cloud-private-cloud

The cache would potentially have to handle/store more than 10 million items.

Any suggestions for an approach we can use for this, e.g. extensions or other built-in integrations?

ACCEPTED SOLUTION

Good question.

How do you connect from Apigee to the Database?

The preferred architecture is to use a service (microservice?) that presents an HTTP facade to Apigee, and queries the backend database. If I understand your requirements, that service layer would be a good layer in which to build caching.

You could use Memcached, Redis, or other external systems for caching. Or use a caching library: if you're using Java for the microservice, you could use Ehcache or Google's Guava cache to cache results.
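As a minimal sketch of what a cache in that service layer could look like, here is a small read-through LRU cache in plain Java (standard library only). The class and method names (`IdentifierCache`, the `loader` that stands in for the DB query) are illustrative assumptions, not from any of the libraries mentioned; Guava's `CacheBuilder` or Ehcache would give you the same pattern with eviction, expiry, and statistics built in.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

// Illustrative sketch: a bounded read-through cache in front of a DB lookup.
// The loader function is only invoked on a cache miss.
public class IdentifierCache {
    private final Map<String, String> cache;
    private final Function<String, String> loader;

    public IdentifierCache(int maxEntries, Function<String, String> loader) {
        this.loader = loader;
        // accessOrder=true makes this LinkedHashMap evict the
        // least-recently-used entry once maxEntries is exceeded.
        this.cache = new LinkedHashMap<>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                return size() > maxEntries;
            }
        };
    }

    public synchronized String get(String key) {
        String value = cache.get(key);
        if (value == null) {
            value = loader.apply(key); // hit the DB only on a miss
            cache.put(key, value);
        }
        return value;
    }
}
```

For 10+ million items shared across several service instances, an external store like Redis keeps the cache consistent between instances, whereas an in-process cache like this one is per instance but has no network hop.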


2 REPLIES


We do use a microservice to query the DB, so we'll have a look at those options. Thanks mate.