Lock API key to domain via referrer

I'd like to use our own API from a JavaScript frontend, which means our API key will be fully exposed to the public. To prevent someone simply copying and pasting it into their own site, I'd like to verify that the request came from ours by checking the HTTP Referer header.

My theory at the moment is that I could store a custom attribute against our app in Apigee. When the proxy checks the key, it also checks whether the custom attribute is present and, if so, verifies that the domain matches the referrer.
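One way to wire this up (a rough sketch, untested; the policy name `Verify-API-Key` and attribute name `allowed_referer` are my own placeholder names): VerifyAPIKey exposes custom attributes stored on the developer app as flow variables named `verifyapikey.{policy-name}.{attribute-name}`, which a small JavaScript callout can compare against the Referer header:

```xml
<!-- Runs after the VerifyAPIKey step named Verify-API-Key -->
<Javascript name="JS-Check-Referer" timeLimit="200">
    <ResourceURL>jsc://check-referer.js</ResourceURL>
</Javascript>
```

```javascript
// check-referer.js -- hypothetical names throughout.
// "allowed_referer" is a custom attribute stored on the developer app;
// it is only set for apps that should be referrer-locked.
var allowed = context.getVariable('verifyapikey.Verify-API-Key.allowed_referer');
var referer = context.getVariable('request.header.Referer') || '';
if (allowed && referer.indexOf(allowed) === -1) {
  // A later RaiseFault step could check this variable and reject the call.
  context.setVariable('referer.valid', 'false');
}
```

Apps without the attribute pass through unchanged, so other consumers of the API are unaffected.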

Obviously this can be faked with a very simple cURL command, but since our website is public anyway, they could get the data from there if they tried hard enough. I'm not looking for 100% blockage, just enough friction that it would make other website devs register for their own key.

Is this sensible or is there a smarter alternative?

Update

I added the following condition to the step in the PreFlow that checks the API key:

<Step>
    <Condition>!(request.header.Referer ~~ ".*test.com.*")</Condition>
    <Name>Flow-Callout-CheckKey</Name>
</Step>

It's not super secure, but since getting an API key is free anyway, it's only meant to encourage people to do that instead of screen-scraping our free site.

Solved
1 ACCEPTED SOLUTION

If you're expecting the unwanted traffic to come from other websites copying your API key and using it for their site I would say this is a sensible approach, probably combined with CORS.
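For the CORS part, a minimal sketch (assuming the site lives at https://test.com, matching the condition in the question) would be an AssignMessage policy attached to the response flow:

```xml
<AssignMessage name="AM-Add-CORS">
    <Set>
        <Headers>
            <Header name="Access-Control-Allow-Origin">https://test.com</Header>
        </Headers>
    </Set>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
    <AssignTo createNew="false" transport="http" type="response"/>
</AssignMessage>
```

Note that, like the Referer check, CORS only constrains well-behaved browsers, not curl.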

If you're worried about unwanted traffic from people copying your API key in general, I would probably suggest adding an IP-address-based quota policy as well to catch high-volume scraping with curl or similar tools.
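A per-IP quota could look something like this (a sketch; the 100-per-hour limit is an arbitrary example, and `client.ip` may resolve to a load balancer's address if one sits in front of Apigee):

```xml
<Quota name="Quota-Per-IP">
    <Identifier ref="client.ip"/>
    <Allow count="100"/>
    <Interval>1</Interval>
    <TimeUnit>hour</TimeUnit>
</Quota>
```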

