How to design for large data caching needs?


Hi All

We have a requirement wherein we have to call multiple data sources and fetch data from them. The data from these sources is in different schemas. At the Apigee layer we need to merge and transform these different data sets and then provide the ready-to-serve data to our client apps over APIs.

These are the options we are considering. Please advise which option suits us best, and whether there is any other approach we could take.

OPTION A:

Have a job running on local servers which will pull data from all these sources and, after merging and transforming, will save the data in the local file system. APIs will then read data from these files and return it in responses to clients.

Questions:

1. What is the best way to read from and write to the file system in Apigee Edge?
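To make Option A concrete, here is a minimal sketch of what the scheduled job could look like. The source payloads, field names, and the output file name are all assumptions for illustration; in the real job the two record sets would come from HTTP calls to the actual data sources.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical source payloads in two different schemas (assumptions);
# in the real job these would be fetched from the actual data sources.
source_a = [{"cust_id": 1, "full_name": "Alice"}]
source_b = [{"id": 1, "email": "alice@example.com"}]

def normalize_a(rec):
    # Map source A's schema onto the common, ready-to-serve schema.
    return {"id": rec["cust_id"], "name": rec["full_name"]}

def normalize_b(rec):
    # Source B already uses "id"; just pick the fields we serve.
    return {"id": rec["id"], "email": rec["email"]}

def merge(a_records, b_records):
    # Join the two normalized record sets on the shared "id" key.
    merged = {}
    for rec in map(normalize_a, a_records):
        merged.setdefault(rec["id"], {}).update(rec)
    for rec in map(normalize_b, b_records):
        merged.setdefault(rec["id"], {}).update(rec)
    return list(merged.values())

def run_job(out_dir):
    # The job writes the merged data to the local file system;
    # the serving API only ever reads this file.
    out_file = Path(out_dir) / "ready_to_serve.json"
    out_file.write_text(json.dumps(merge(source_a, source_b)))
    return out_file

with tempfile.TemporaryDirectory() as d:
    data = json.loads(run_job(d).read_text())
```

The key design point is that merging happens offline in the job, so the read path (the API) stays a cheap file read.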

OPTION B:

Create proxies in Apigee which will fetch data from the different sources and store it in Apigee BaaS. Create the ready-to-serve data by merging and transforming the data received from all sources, then insert this ready-to-serve data into BaaS again, in separate collections. APIs will return this data to clients.

Questions:

1. What is the minimum number of nodes required to install and set up Apigee BaaS in our existing physical architecture?
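For Option B, Apigee BaaS (built on Usergrid) exposes collections over REST, so inserting a ready-to-serve record is a JSON POST to a collection URL. The org, app, collection, token, and record fields below are all assumptions for the sketch; this only builds the request rather than sending it, since sending requires network access and real credentials.

```python
import json
from urllib.request import Request

# Assumed org/app/collection names; BaaS collections are typically
# addressed as /{org}/{app}/{collection} (assumption for this sketch).
BAAS_URL = "https://api.usergrid.com/my-org/my-app/ready_to_serve"

def to_entity(merged_record):
    # A BaaS entity is a JSON object; "name" acts as a lookup key.
    return {"name": f"customer-{merged_record['id']}", **merged_record}

def build_insert(merged_record, token="ASSUMED_OAUTH_TOKEN"):
    # Build (but do not send) the POST that would insert the entity.
    body = json.dumps(to_entity(merged_record)).encode()
    return Request(
        BAAS_URL,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
    )

req = build_insert({"id": 1, "email": "alice@example.com"})
```

Keeping the raw per-source data and the merged data in separate collections, as the option describes, lets you rebuild the ready-to-serve collection without re-calling the sources.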

OPTION C:

Fetch data from all sources and store it in Apigee cache objects. After merging and transforming, save the ready-to-serve data in new cache objects, and delete the raw source data stored in the cache.

Questions:

1. Is the cache provided by Apigee Edge capable of storing data up to 200 MB in size?
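One thing worth checking for Option C is that caches typically enforce a per-entry size limit, so a 200 MB data set would likely need to be split across multiple cache entries. Below is a minimal sketch of that chunking idea; the dict standing in for the cache, the key naming scheme, and the 512 KB per-entry limit are all assumptions (in Edge the put/get/delete would be done with the cache policies, not Python).

```python
# Assumed per-entry size limit, in bytes (check the published Edge limits).
CHUNK_SIZE = 512 * 1024

# In-memory dict standing in for the Apigee cache (assumption for the sketch).
cache = {}

def cache_put_large(key, payload: bytes):
    # Split a payload larger than the per-entry limit across numbered keys.
    chunks = [payload[i:i + CHUNK_SIZE]
              for i in range(0, len(payload), CHUNK_SIZE)]
    cache[f"{key}:count"] = len(chunks)
    for n, chunk in enumerate(chunks):
        cache[f"{key}:{n}"] = chunk

def cache_get_large(key) -> bytes:
    # Reassemble the payload from its numbered chunks.
    return b"".join(cache[f"{key}:{n}"]
                    for n in range(cache[f"{key}:count"]))

def cache_delete_large(key):
    # Delete the raw source data once the ready-to-serve copy exists.
    for n in range(cache.pop(f"{key}:count")):
        del cache[f"{key}:{n}"]

payload = b"x" * (2 * CHUNK_SIZE + 1)   # just over two entries' worth
cache_put_large("raw:sourceA", payload)
roundtrip = cache_get_large("raw:sourceA")
cache_delete_large("raw:sourceA")
```

The ready-to-serve data would be written under its own key prefix before the raw chunks are deleted, matching the flow the option describes.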

Thanks

Akhil
