What is the best practice for facet search API design?

Hi,

I have a requirement to write a reports-type microservice that connects to different databases and returns data as result sets or downloadable file output, with sorting and many (dynamic) filters. I have gone through some of the recommendations in the e-books on the Apigee site, but I am still wondering what the best way to achieve this would be.

I am considering two approaches:

a- I have a large and growing number of filters (parameters), which does not look good in the query string if I go with GET.

b- Is a POST request a good idea? For facet search (selecting any field, column, or row and sending the combined query to the backend), is it secure to allow POST and send all the parameters in the request body?

Hybrid way - I am thinking of having two APIs: first a POST request that stores the serialized parameters in a cache and returns the cache id (or ref) in the response, and then a GET request that sends that cached reference as a query parameter or an additional header. The backend would first fetch the parameters via the cache ref and pass them to the controller (see the sketch below). Isn't that an overly complex solution?
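Purely to illustrate that hybrid flow, here is a minimal sketch assuming a Flask service with a plain in-memory dict as the cache; the endpoint names and the filterRef parameter are hypothetical, and a real service would use something like Redis with expiry instead of a dict:

```
import uuid

from flask import Flask, jsonify, request

app = Flask(__name__)
filter_cache = {}  # hypothetical in-memory cache; swap for Redis etc. in practice


@app.post("/report-filters")
def store_filters():
    # POST: the client sends the full (possibly large) filter set in the body.
    ref = str(uuid.uuid4())
    filter_cache[ref] = request.get_json()
    return jsonify({"filterRef": ref}), 201


@app.get("/reports")
def run_report():
    # GET: the client passes only the short cache reference as a query parameter.
    ref = request.args.get("filterRef")
    filters = filter_cache.get(ref)
    if filters is None:
        return jsonify({"error": "unknown filterRef"}), 404
    # ...pass `filters` to the query builder / backend here...
    return jsonify({"filters": filters})
```

The appeal is that the GET stays short, cacheable, and bookmarkable because it carries only a reference, while the POST absorbs the unwieldy parameter set.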

The UI lives on the dev portal, but the reports have to be served by microservices.

Any suggestions would be helpful.

ACCEPTED SOLUTION

I don't see that using the POST body is any more or less secure than using GET - both carry SQL injection risks that you should take care to address, either in your API proxy or in the backend. (Unless your query parameters might include passwords or PII, in which case you probably want POST, since GET query strings tend to end up in server logs and browser history.)
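Whichever verb you pick, the core mitigation is the same: bind user-supplied filter values instead of concatenating them into SQL. A minimal sketch using Python's DB-API placeholders (the orders table and its columns are hypothetical):

```
import sqlite3


def fetch_report(conn: sqlite3.Connection, region: str, status: str):
    # The ? placeholders let the driver bind values safely, defusing SQL
    # injection; identifier names (tables/columns picked by a facet) can't
    # be bound this way and must be checked against an allow-list instead.
    sql = "SELECT id, amount FROM orders WHERE region = ? AND status = ?"
    return conn.execute(sql, (region, status)).fetchall()
```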

Depending on the size of the data you're returning, you may find something along the lines of the BigQuery jobs API useful: https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/insert

Basically, the client submits the query with a POST, which returns immediately with an id and creates a 'job' that runs in the background. The client then uses GET requests (https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/get / https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/getQueryResults) to fetch the status / results of the query.
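Translated to your reports service, the client side of that pattern could look something like this sketch; the /jobs endpoints, the DONE status value, and the response shapes are all hypothetical, not the real BigQuery API:

```
import time

import requests

BASE = "https://reports.example.com"  # hypothetical reports service

# Submit the query; the service responds immediately with a job id.
job = requests.post(f"{BASE}/jobs", json={"filters": {"region": "EU"}}).json()
job_id = job["id"]

# Poll until the background job finishes, then fetch the results.
while requests.get(f"{BASE}/jobs/{job_id}").json()["status"] != "DONE":
    time.sleep(2)

results = requests.get(f"{BASE}/jobs/{job_id}/results").json()
```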

One advantage of this is that it avoids HTTP timeout issues with long-running queries, which may or may not be a concern for you.


That makes sense, thanks. I will explore this further and let you know if I run into more cases.