Is it possible to use External Caching framework like Redis within Apigee Edge?


Yes, assuming you're OK not using the built-in policies for caching. You'd basically need to interact with Redis either by putting an HTTP interface like Webdis in front of it, or by using a programmatic callout to talk to Redis and put the results into your flow variables.
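For illustration, here's a minimal sketch (TypeScript, Node 18+ for the global fetch) of what talking to Redis through Webdis's HTTP command interface could look like. The host, port, key handling, and TTL are assumptions for the example, not anything Apigee or Webdis requires:

```typescript
// Minimal sketch: reading and writing Redis through Webdis's HTTP interface.
// WEBDIS_URL is a hypothetical internal endpoint; adjust to your deployment.
const WEBDIS_URL = process.env.WEBDIS_URL ?? "http://webdis.internal:7379";

// Store a (possibly large) response body under a key, then give it a TTL.
// Webdis accepts the last command argument as the PUT body, which avoids
// URL-length limits for large values.
async function cachePut(key: string, value: string): Promise<void> {
  await fetch(`${WEBDIS_URL}/SET/${encodeURIComponent(key)}`, {
    method: "PUT",
    body: value,
  });
  await fetch(`${WEBDIS_URL}/EXPIRE/${encodeURIComponent(key)}/300`);
}

// Look up a key; Webdis answers with JSON keyed by the command name,
// so a miss comes back as { "GET": null }.
async function cacheGet(key: string): Promise<string | null> {
  const res = await fetch(`${WEBDIS_URL}/GET/${encodeURIComponent(key)}`);
  const json = (await res.json()) as { GET: string | null };
  return json.GET;
}
```

Whatever you read back this way would then be assigned to flow variables (for example from a JavaScript callout) so downstream policies can use it.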

Why would you want to do this? What problem are you trying to solve? Does the Apigee Edge built-in cache not satisfy your requirements? If so, why not?

There might be simpler paths to satisfaction.

Dino,

We need to cache responses that are bigger than 512 KB. What are the options to do that in Apigee?

Thanks,

Surabhi


@Dino,

Some of the API responses are larger than 512 KB. As far as I am aware, that is the limit for Apigee caching, so I need to find an alternative solution.

What would you recommend?


@Binaya Kumar lenka

Using an external caching layer probably isn't a good idea, especially since Apigee provides a sophisticated cache implementation along with policies for response caching and for populating, looking up, and invalidating the cache. Also, by going out to an external caching service over HTTP, I am not sure how much benefit you would get, because it's almost as if you are going out to an external target. So I would suggest you expand a bit more on your use case and let us guide you to the most appropriate implementation. Without knowing the problem statement, it would be hard to make a recommendation.

But to your original question: you can definitely interact with any external caching framework as long as it is accessible via an HTTP endpoint. You can use the Service Callout policy to interact with it, but I think the management overhead of the cache would be too high. Better to stick with the built-in cache implementation.
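To make that trade-off concrete, the proxy ends up re-implementing the usual cache-aside flow itself. A rough sketch of that orchestration, with hypothetical cache and target helpers:

```typescript
// Sketch of the cache-aside orchestration a proxy flow would need to perform
// when the cache lives behind an external HTTP endpoint (helpers hypothetical).
type Cache = {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, ttlSeconds: number): Promise<void>;
};

async function handleRequest(
  cache: Cache,
  cacheKey: string,
  callTarget: () => Promise<string>,
): Promise<string> {
  // 1. Look up the cache (one extra network hop on every request).
  const hit = await cache.get(cacheKey);
  if (hit !== null) return hit;

  // 2. On a miss, call the real target and populate the cache.
  const response = await callTarget();
  await cache.put(cacheKey, response, 300); // hypothetical 5-minute TTL
  return response;
}
```

Every request pays at least one extra network round trip to the external cache, which is part of the overhead mentioned above.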

@Binaya Kumar Lenka,

We are using Apigee's cache implementation, but for some of the URIs the JSON response size is more than 512 KB. What I have read so far is that 512 KB is the maximum cached-object size.

So that's why I am looking for an alternative to cache responses larger than 512 KB.

Did you see Chris Latimer's response? And? Have you tried that approach?

We are exploring that option as well, but I was wondering whether that is Apigee's recommendation for caching larger objects, or if there is anything else that can be done before we introduce a new entity in between.

Ahh, I understand. There is nothing in Apigee Edge itself right now that is appropriate for use as a caching layer for responses that large.

I suggest that you evaluate a SaaS like Google Cloud MemoryStore. https://cloud.google.com/memorystore/

This would probably be a better option than managing your own Redis cluster.

Seems like a good option, but it says it's in beta. Also, do you know if an extension will be added for this in Apigee?

Yes, still beta. I don't know the schedule for GA right now.

As for an Extension - we can ask the extensions team @Noah Dietz. But even if that is not in the near-term plan, you could easily connect to MemoryStore via a Hosted Target that you implement yourself. The lack of an extension should not be an impediment to you.
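For what it's worth, here is a minimal sketch of what such a self-implemented Hosted Target could look like: a small Node.js app fronting a Redis-compatible Memorystore instance. The express and ioredis dependencies, environment variables, and routes are all assumptions for illustration, not an Apigee-provided API:

```typescript
import express from "express";
import Redis from "ioredis";

// Hypothetical connection details for the Memorystore (Redis) instance;
// a Hosted Target would read these from environment variables.
const redis = new Redis({
  host: process.env.REDIS_HOST ?? "10.0.0.3",
  port: Number(process.env.REDIS_PORT ?? 6379),
});

const app = express();
app.use(express.text({ type: "*/*", limit: "5mb" })); // allow bodies well above 512 KB

// Read a cached value; 404 on a miss so the proxy can fall through to the target.
app.get("/cache/:key", async (req, res) => {
  const value = await redis.get(req.params.key);
  if (value === null) {
    res.status(404).end();
    return;
  }
  res.type("application/json").send(value);
});

// Store a value with a TTL supplied by the caller (defaults to 300 seconds).
app.put("/cache/:key", async (req, res) => {
  const ttl = Number(req.query.ttl ?? 300);
  await redis.set(req.params.key, req.body, "EX", ttl);
  res.status(204).end();
});

app.listen(process.env.PORT ?? 9000);
```

The proxy could then hit these /cache routes from a Service Callout (or via its target endpoint) in place of the built-in cache policies.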

We don't currently have an extension for caching services on the roadmap. I think I'm a little out of the loop here, but if someone can clearly define a use case for us and identify the tech, we can open a request and track it. We aren't focused on extension development at the moment, but these are the use cases we need to expand on in the future!

Thanks Dino. I'll look into this. This seems like the same kind of solution that MS has on Azure as Azure Redis. Will see how it goes.

@Surabhi.gupta

Can you please let us know what solution you built for caching data larger than 512 KB?

Have you used an external caching framework like Redis?

We also have a similar use case where our cached data is greater than 512 KB.