Google Cloud Storage

fikri
New Member

Hi Apigee experts

I have various services built on GCP, including GCS and BigQuery. We have another service, with its data exposed via an API, on AWS. How can I interface between these two platforms if I would like to send/push my data regularly to Google Cloud Storage?

Thanks for help.

1 ACCEPTED SOLUTION

Hi Fikri, Apigee provides a platform for developing and managing APIs. If you have services on GCP and AWS, you could use Apigee to create API proxies for those services and let client applications and other services talk to each other via the proxies, while getting security, rate limiting, quotas, analytics, and many other API management features.

This would also apply to Google Cloud Storage. You could expose Google Cloud Storage API via an API proxy through Apigee if needed.

In this design, you may first want to consider whether you really need API management features for these APIs. If not, you could use the existing APIs directly without creating API proxies for them.

Hope this helps. If you could share more specifics of your requirements, we might be able to provide a better answer. Thanks!
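To make the proxy idea concrete, here is a minimal sketch of pushing an object to GCS through an Apigee proxy. Everything here is an assumption for illustration: the proxy base URL, the `x-apikey` header name, and the path layout (modeled on the GCS JSON API upload path) all depend on how you configure your proxy.

```python
import urllib.parse
import urllib.request


def gcs_proxy_url(proxy_base, bucket, object_name):
    """Build the upload URL for a hypothetical Apigee proxy fronting the
    GCS JSON API. The path layout mirrors GCS's media-upload endpoint,
    but your proxy's base path is whatever you define in Apigee."""
    return (f"{proxy_base}/storage/v1/b/{bucket}/o"
            f"?uploadType=media&name={urllib.parse.quote(object_name, safe='')}")


def upload_via_proxy(proxy_base, bucket, object_name, data, api_key):
    """POST raw bytes through the proxy; the proxy would verify the API key,
    apply quotas/analytics, then forward the call to Cloud Storage."""
    req = urllib.request.Request(
        gcs_proxy_url(proxy_base, bucket, object_name),
        data=data,
        headers={"x-apikey": api_key,  # header name is proxy-specific
                 "Content-Type": "application/octet-stream"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Placeholder proxy URL, bucket, and key -- replace with your own values.
    upload_via_proxy("https://api.example.com/gcs", "my-bucket",
                     "reports/2024.csv", b"a,b\n1,2\n", "MY_API_KEY")
```

The only thing the client sees is the proxy URL and its credential, which is exactly where the security and quota policies mentioned above would be enforced.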


5 REPLIES


Hi Imesh,

We have API management using Apigee on AWS, where our applications expose interfaces to external users and consumers.

What we want to do is simply transfer data regularly or continuously, via API, from the applications or services on AWS to Google Cloud Storage. This could also run in the reverse direction, where we collect data from GCP using the Apigee API on AWS.

Any suggestion on the correct and most efficient way to do this?

Thank you!

There are a bunch of options.

This page lays out some of them.

https://cloud.google.com/storage/docs/interoperability
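One option described on that interoperability page is that the GCS XML API accepts S3-style requests, so existing S3 tooling such as boto3 can write to GCS directly once you create an HMAC key for a service account. A minimal sketch, assuming placeholder credentials and bucket names:

```python
def gcs_s3_endpoint():
    # GCS's XML API accepts S3-compatible requests at this endpoint.
    return "https://storage.googleapis.com"


def make_interop_client(hmac_access_id, hmac_secret):
    """Return a boto3 S3 client pointed at GCS. Requires a GCS HMAC key
    (created under Cloud Storage > Settings > Interoperability)."""
    import boto3  # imported lazily so the helper above stays dependency-free
    return boto3.client(
        "s3",
        endpoint_url=gcs_s3_endpoint(),
        aws_access_key_id=hmac_access_id,
        aws_secret_access_key=hmac_secret,
    )


if __name__ == "__main__":
    # Placeholder HMAC credentials and bucket -- substitute your own.
    client = make_interop_client("GOOG1E_ACCESS_ID", "HMAC_SECRET")
    client.upload_file("export.csv", "my-gcs-bucket", "export.csv")
```

This path needs no API management at all, which is the point of the comment above: if you can reach the storage layer directly, a plain S3-compatible client is enough.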

I'm not sure you need API Management to make the data transfer happen.

Thank you for the link!

I am aware of this interoperability option, but due to restrictions imposed by the organization and different managed areas, we're not allowed to pull raw data directly from storage on AWS; we can only use processed data, which is available via API.

The same applies to access from external consumers querying processed data. Any suggestion on how this can be done effectively?

Hi Fikri, given your explanation and the constraints you have on fetching raw data from AWS S3 buckets, I think you could do the following:

  1. Expose data available on AWS S3 via an API proxy by doing required filtering/processing
  2. Secure above API proxy appropriately: https://docs.apigee.com/api-platform/security/api-security
  3. Create a job on GCP Cloud Scheduler for fetching processed data from the above API proxy and writing that to GCS periodically: https://cloud.google.com/scheduler/
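The scheduled fetch-and-write step (3) can be sketched as a small Cloud Function that Cloud Scheduler invokes on a cron. The proxy URL, API key header, and bucket name below are placeholders, and the handler shape is one common pattern rather than the only way to wire it up:

```python
import datetime
import urllib.request

PROXY_URL = "https://api.example.com/processed-data"  # hypothetical Apigee proxy
BUCKET = "my-gcs-bucket"                              # placeholder bucket name


def object_name(now):
    """Timestamped object name so each scheduled run lands in its own file."""
    return f"processed/{now:%Y/%m/%d}/data-{now:%H%M%S}.json"


def fetch_processed(api_key):
    """Pull the already-processed data from the secured API proxy on AWS."""
    req = urllib.request.Request(PROXY_URL, headers={"x-apikey": api_key})
    with urllib.request.urlopen(req) as resp:
        return resp.read()


def handler(request=None):
    """Entry point for a Cloud Scheduler-triggered Cloud Function (sketch)."""
    from google.cloud import storage  # lazy import keeps helpers stdlib-only
    data = fetch_processed(api_key="MY_API_KEY")  # placeholder credential
    blob = storage.Client().bucket(BUCKET).blob(
        object_name(datetime.datetime.utcnow()))
    blob.upload_from_string(data, content_type="application/json")
    return "ok"
```

Cloud Scheduler would then simply hit this function's HTTP trigger on whatever cadence you need; the function holds the proxy credential, so external consumers never see it.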