Get information about a cache API - cache size doesn't work

Hi,

I have populated my cache with at least 2 MB of data. Now, when I run the curl command below against the management server, I am not able to get the cache's current size. It comes back as zero.

curl -X GET http://localhost:8080/v1/organizations/my-org/environments/my-env/caches/myAPICache -u username:password

Response

{
  "description" : "myAPICache",
  "diskSizeInMB" : 0,
  "distributed" : true,
  "expirySettings" : {
    "expiryDate" : {
      "value" : "08-31-2017"
    },
    "valuesNull" : false
  },
  "inMemorySizeInKB" : 0,
  "maxElementsInMemory" : 0,
  "maxElementsOnDisk" : 0,
  "name" : "myAPICache",
  "overflowToDisk" : false,
  "persistent" : false
}

Am I missing something?

BTW, I am able to retrieve data from the cache.

Thanks,

Krish


@Dino, @Anil Sagar, @Mohammed Zuber, can you help me with this?

Hi Krish, yes maybe I can clarify.

The properties you are seeing in the response are not designed to reflect the current, dynamic state of the cache. Instead, they are the properties the cache was initialized with.

When you create a cache administratively, you can set limits on the in-memory size, the maximum number of elements to store in memory, and so on. Apparently, for the cache called myAPICache, you have not set these parameters. So when you query the cache, the administrative API is telling you that those parameters have not been set.
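For example, a cache with explicit limits might be created with a call like the following. This is only a sketch: the field names mirror the ones in the GET response above, but the exact endpoint and payload shape should be verified against the Edge management API reference, and the org, env, and numeric values are placeholders.

```shell
# Sketch only: create a cache named myAPICache with explicit size limits.
# Field names mirror those in the GET response; verify the endpoint and
# payload shape against the Edge management API documentation.
curl -X POST "http://localhost:8080/v1/organizations/my-org/environments/my-env/caches" \
  -u username:password \
  -H "Content-Type: application/json" \
  -d '{
    "name" : "myAPICache",
    "description" : "myAPICache",
    "inMemorySizeInKB" : 8192,
    "maxElementsInMemory" : 3000,
    "maxElementsOnDisk" : 10000,
    "overflowToDisk" : true
  }'
```

With limits like these set at creation time, the same GET call would echo them back instead of zeros.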

I do not believe it is possible to query the current in-memory size of a cache "right now." I'm also not sure what you would do with that information.

Does this help?

If you have the on-prem version of Apigee Edge you may be able to connect a JMX client to the message processor to determine the size of the cache, but that seems to be a very low-level detail. Are you sure you need that information?
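If you do go the JMX route, the general shape is just attaching a standard JMX client to the message processor. The host and port below are placeholders; the actual JMX port has to be looked up in your message processor's configuration.

```shell
# Placeholder sketch: attach a JMX client (jconsole ships with the JDK)
# to a message processor. <mp-host> and <jmx-port> are placeholders you
# must replace with values from your own installation.
jconsole <mp-host>:<jmx-port>
```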

Hi @Dino,

I was away for a couple of weeks and couldn't reply to you. Sorry!

My intention is to find out how much caching capacity we can handle. In our on-premises environment, we haven't enabled response caching for any API yet. Before we open the gates for response caching, we want to see how much capacity we can handle. That's why we are running a performance test at 1000 TPS while continuously putting data into the cache, monitoring the cache size alongside the performance test results. If we see performance start to deteriorate at, say, 5 TB of response cache, we know that 5 TB is the caching capacity we can handle. Once we have that number, we can distribute the cache capacity across different APIs.
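To make the splitting step concrete, here is a small sketch of the arithmetic: once a total capacity has been found by load testing, allocate it to each API in proportion to its traffic share. All the API names, TPS figures, and the capacity total are made-up illustrations.

```shell
#!/bin/sh
# Hypothetical numbers: a total cache capacity (in GB) found by load
# testing, and each API's share of traffic (in TPS). Names are made up.
TOTAL_GB=5120            # ~5 TB, the hypothetical point where performance degraded
TPS_ORDERS=600
TPS_CATALOG=300
TPS_USERS=100
TOTAL_TPS=$((TPS_ORDERS + TPS_CATALOG + TPS_USERS))   # 1000 TPS overall

# Allocate cache capacity proportionally to each API's traffic share.
GB_ORDERS=$((TOTAL_GB * TPS_ORDERS / TOTAL_TPS))
GB_CATALOG=$((TOTAL_GB * TPS_CATALOG / TOTAL_TPS))
GB_USERS=$((TOTAL_GB * TPS_USERS / TOTAL_TPS))

echo "orders:  ${GB_ORDERS} GB"
echo "catalog: ${GB_CATALOG} GB"
echo "users:   ${GB_USERS} GB"
```

A proportional split like this is only a starting point; in practice you would also weight by response size and cache-hit ratio per API.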

Does this idea make sense?

Thanks,

Krish