Property sets are not returning any values

anuprai
Participant IV

I am using property sets in Apigee Hybrid to store key-value pairs. When I try to read the keys, I get a null value or an error.

File Name - myprops.properties
# myProps.properties file
foo=bar

Using ExtractVariables policy -

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ExtractVariables name="ExtractVariables-1">
    <DisplayName>ExtractVariables-1</DisplayName>
    <Source>request</Source>
    <Variable name="myVar">
        <Pattern>{propertyset.myprops.properties.foo}</Pattern>
    </Variable>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</ExtractVariables>

This variable returns an empty value.

Using AssignMessage policy - This returns an error response saying the key is invalid.

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<AssignMessage async="false" continueOnError="false" enabled="true" name="Assign-Message-1">
    <DisplayName>Assign Message-1</DisplayName>
    <Properties/>
    <Set>
        <Payload content-type="application/json">{
                "message": "{myVar}",
                "value": "{propertyset.myprops.properties.foo}"
            }
        </Payload>
    </Set>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
    <AssignTo createNew="false" transport="http" type="response"/>
</AssignMessage>

Using JavaScript - This also returns an invalid key error

var property = context.getVariable('propertyset.myprops.properties.foo');
print("property is - " + property);

Is there anything I am missing here?

Solved
1 ACCEPTED SOLUTION

I'm happy to report that the Property Set issue at the proxy scope is now fixed.

Attached is an example proxy showing usage for Assign Message, Extract Variables and JavaScript policies.

property-set-proxy-test-rev3-2021-02-26.zip


REPLIES

Can you try using {propertyset.myprops.foo} instead?

Also, I'm assuming you have already used the Management API to update the property set, and also to view and verify it?

https://cloud.google.com/apigee/docs/api-platform/cache/property-sets#create-api

https://cloud.google.com/apigee/docs/api-platform/cache/property-sets#view-api

I have tried with both -

{propertyset.myprops.foo} - This gives a null value.

{propertyset.myprops.properties.foo} - This gives an error response.

I have added the properties file through the GUI and deployed a new revision.

The docs are wrong; internal bug filed: b/178508219

Should be:

<ExtractVariables name="ExtractVariables-1">
   <DisplayName>Extract a property value</DisplayName>
   <Source>request</Source>
   <Variable name="propertyset.MyPropSet.foo">
      <Pattern>{myVar}</Pattern>
   </Variable>
   <VariablePrefix>foobar</VariablePrefix>
   <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</ExtractVariables>

The property to read is identified in the Variable name attribute, and its value is assigned to the variable named in Pattern.

You can also do the same using Assign Message policy:

    <AssignVariable>
        <Name>myVar</Name>
        <PropertySetRef>MyPropSet.foo</PropertySetRef>
    </AssignVariable>

I tried the above but am still not getting the values.

Extract Variable policy -

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ExtractVariables name="ExtractVariables-1">
    <DisplayName>ExtractVariables-1</DisplayName>
    <Source>request</Source>
    <Variable name="propertyset.myprops.properties.foo">
        <Pattern>{myVar}</Pattern>
    </Variable>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</ExtractVariables>

This gives the error below -

{
    "fault": {
        "faultstring": "Illegal propertyset key {} [myprops.properties.foo]",
        "detail": {
            "errorcode": "Bad Request"
        }
    }
}

I tried the below after removing .properties, but it gives an empty value -

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ExtractVariables name="ExtractVariables-1">
    <DisplayName>ExtractVariables-1</DisplayName>
    <Source>request</Source>
    <Variable name="propertyset.myprops.foo">
        <Pattern>{myVar}</Pattern>
    </Variable>
    <IgnoreUnresolvedVariables>true</IgnoreUnresolvedVariables>
</ExtractVariables>

Same is the case with the AssignMessage policy -

<AssignVariable>
    <Name>myVar1</Name>
    <PropertySetRef>myprops.properties.foo</PropertySetRef>
</AssignVariable>

This gives a 400 Bad Request error.

Below gives empty values -

<AssignVariable>
    <Name>myVar1</Name>
    <PropertySetRef>myprops.foo</PropertySetRef>
</AssignVariable>

I was able to access a property set attached to my API proxy by using the following pattern:

{propertyset.<propertyset file name>.<property name>}

e.g. if my property set file is named testprop.properties and it contains a property foo:

{propertyset.testprop.foo}

e.g.

<Set> <Payload>{propertyset.testprop.foo}</Payload> </Set>

If this doesn't work, I suspect that your property set resource isn't created as expected.

Hi @dane knezic

I have attached my proxy here. Can you please check if I am missing something? sampletest.zip

Your example works for me; it returns a response with message "bar".

The same proxy is returning an empty value for me. Is there a way we can check whether the property set resource file is created properly in the hybrid runtime?

I was able to reproduce your situation in my Apigee X (formerly NG SaaS).

It's not working when the Property Set is at the API Proxy scope.

It does work when the Property Set is at the environment scope.

Hi @Kurt Googler Kanaskie,

Is there a way to resolve this issue?

You could create a Property Set at the environment level based on the name of the proxy.

Thanks @Kurt Googler Kanaskie, I am able to retrieve the values from environment level property set.

But we wanted to use a proxy-level property set, as we have a limitation that only 10 environment-scoped property sets can be created. And if we start creating a property set at the environment level for each proxy, we will soon reach this limit.

How can we fix this Proxy level property set issue?

In Apigee X, if you use a proxy-scoped property set, there is a bug where you will not be able to retrieve the values if the property set resource has a .properties suffix.

If you add the propertyset resource without the suffix, then at runtime you will be able to retrieve the properties.

Let me give a further explanation.

Suppose your apiproxy bundle looks like this:

  Length      Date    Time    Name
---------  ---------- -----   ----
        0  02-18-2021 12:41   apiproxy/
        0  02-18-2021 12:43   apiproxy/resources/
        0  02-18-2021 13:19   apiproxy/resources/properties/
       67  02-18-2021 13:08   apiproxy/resources/properties/set2
       67  02-18-2021 13:08   apiproxy/resources/properties/set1.properties
      360  02-18-2021 12:41   apiproxy/propset-demo1.xml
        0  02-18-2021 13:12   apiproxy/policies/
      421  02-18-2021 13:11   apiproxy/policies/...
        0  02-18-2021 13:11   apiproxy/proxies/
     1458  02-18-2021 13:11   apiproxy/proxies/endpoint1.xml
---------                     -------
     4145                     XX files

Then, at runtime, your proxy WILL be able to retrieve properties from set2, and will not be able to retrieve properties from set1.

This is a bug, ref: b/179371811

We expect to fix this soon. In that case you will be able to retrieve from either set1 or set2.

There is a second factor: if you use the UI to create your proxy and add the propertyset, it always adds the .properties suffix. This means, for now, you need to use the API to import your API proxy, or to add the propertyset to the API proxy, in order to be able to retrieve them at runtime.

Thanks Dino-at-Google for the response. Yes, this looks like a bug, and it is fixed in Apigee hybrid 1.3.5.

I am able to read the property sets in this version. I will try this approach in older versions also.

I'm happy to report that the Property Set issue at the proxy scope is now fixed.

Attached is an example proxy showing usage for Assign Message, Extract Variables and JavaScript policies.

property-set-proxy-test-rev3-2021-02-26.zip

Whoo-hoo! Thanks for the followup, Kurt. And the nice example!

@Anup Rai FYI

This thread helped me to get started on property sets. However, I discovered that this capability does not extend to the shared flow level (adding a property set to a shared flow is possible, but it doesn't work).

For example, say I want to create a shared flow that builds target.url at runtime across API proxies, taking its configuration from a .properties file within that shared flow, where the file contains target endpoint config details for all API proxies in the environment.

Also, though the documentation says that API proxy scoped property sets can be created and edited via the Apigee API, I am not able to find the relevant API at the given link: https://cloud.google.com/apigee/docs/reference/apis/apigee/rest#rest-resource:-v1.organizations.envi... (all the APIs listed there are for the environment scope, I believe).


Scope: API proxy
Runtime behavior: Properties are available only to the revision of the API proxy that contains the property set resource. No other API proxy, or other revision of the same proxy, can access that particular property set.
Administration: Administrators can use the /resourcefiles Apigee API or the UI to create and edit property sets. Saving the API proxy in the UI will create a new revision, and the modified property set will be associated with that revision only.

Scope: Environment
Runtime behavior: Properties are available to all revisions of all API proxies within that environment. API proxies within other environments cannot access that property set.
Administration: Administrators must use the /resourcefiles Apigee API to create, view, update or delete environment-scoped property sets. These property sets are not displayed and cannot be edited in the Apigee UI.

Interesting point: property sets do not work in Shared Flows. I hadn't considered that, and you're right, I can specify a property set in a Shared Flow but it doesn't work. I'm not sure why, since other resources (e.g. JavaScript) do work.

Regarding API-scoped resource files, there's no API to directly modify those, since they are part of the API proxy definition. To modify those values you need to either use the UI or, using the API, download the API proxy bundle, unbundle it, edit the resource file contents, then re-bundle and import the zip file.

Property Sets: https://cloud.google.com/apigee/docs/api-platform/cache/property-sets#create-api

Is there a fix or a feature request to resolve the "property sets do not work in Shared Flows" issue?

I'm using a shared flow for an error handler, where I have a JavaScript file trying to read data from a properties file attached to the shared flow.


shared_flow_properties_issue.png 

Not AFAIK; it's by design only in API proxies. But yes, it'd be great to have this working in Shared Flows.

@shrenikkumar-s OK, thanks for the response. I was able to progress by scoping my PropertySet at the environment level (using the management APIs), after which I was able to use it in my Shared Flows.

However, this solution defeats the purpose of a self-sufficient SharedFlow bundle that can be shared independently across environments. Hopefully the restriction is removed so a PropertySet can be part of the SharedFlow itself.


@SidDas wrote:

However, this solution defeats the purpose of a self-sufficient SharedFlow bundle that can be shared independently across environments.


If that's the case, why bother at all with property sets?

I'm assuming you're mapping codes (e.g. "B2_500_02") to some descriptive text in your property set. You could achieve the same using a JS object. For example:


const codes_to_descriptions = {
  P1_400_01:"Bad Request",
  P1_400_02:"Invalid content type",
  B2_500_01:"Internal Error",
  B2_500_02:"Backend Error"
};
// testing
var code = "B2_500_02";
var description = codes_to_descriptions[code] ? codes_to_descriptions[code] : "Unknown Error";
var error_message = {
  "code":code,
  "description":description,
  "info":"https://developers.exco.com/errors#"+code
};


@kurtkanaskie Many thanks for your suggestion; it can indeed work and will also meet the objective of a fully independent SharedFlow for handling errors. However, adding new error maps (when new backends and their respective errors are introduced) will require a code change and hence testing, etc.

My objective was to have this as a dynamic and customizable error handler which operations can maintain and add/change without the need for code change/testing/deployment etc.

I also considered a JAR file with a CSV included in the same, but changes to the CSV will again require the creation of a new JAR and hence re-deployment of the package.

So, to keep the error mapping data decoupled from the error handling framework, I could either go with hosted data accessible via an API or keep it within Apigee using property sets.

Even with proxy-scoped (or, if they were supported, shared flow-scoped) property sets, it would still be a code change to the property set resource and a redeploy.

To support configuration decoupling you have options:

  1. Use a Property Set at the environment level (not supported at Org level).
  2. Use a shared JavaScript resource at the environment level for the mapping (not supported at Org level).
  3. Use Key Value Map at the Org level - an entry can be a single JSON object (escaped) or individual entries for each code mapping.

Given your cross-environment requirement, I recommend using a KVM; entries can now be managed via the API. You would use a KVM policy to get the entry and then access it via JavaScript.

For example, if you created a single JSON entry for the "codes_to_descriptions" mapping via the Apigee API (note the use of escaped JSON):

curl -X POST "https://apigee.googleapis.com/v1/organizations/$ORG/keyvaluemaps/error-handling-v1/entries" \
--header "Authorization: Bearer $TOKEN" \
--header 'Content-Type: application/json' \
--data-raw '{
  "name" : "codes_to_descriptions",
  "value" : "{ \"400.001\":\"Bad Request\", \"401.001\":\"Invalid API Key or Token\" }"
}'

You would get it via the KVM policy (note the Get is not using an index attribute):

<KeyValueMapOperations name="KV-CodesToDescriptions" mapIdentifier="error-handling-v1">
    <ExclusiveCache>false</ExclusiveCache>
    <ExpiryTimeInSecs>300</ExpiryTimeInSecs>
    <Get assignTo="codes_to_descriptions">
        <Key>
            <Parameter>codes_to_descriptions</Parameter>
        </Key>
    </Get>
    <Scope>organization</Scope>
</KeyValueMapOperations>

And access in JavaScript using:

var codes_to_descriptions = JSON.parse(context.getVariable('codes_to_descriptions'));
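Putting the Get policy and the JavaScript together, the lookup-with-fallback logic can be sketched in plain JavaScript. This is only a sketch: the small `context` stub below stands in for Apigee's real context object so the logic is runnable outside a proxy, and the variable contents mirror the KVM entry created above.

```javascript
// Stub for Apigee's context object, for illustration only; inside a
// JavaScript policy the real context is provided by the runtime.
var context = {
  vars: {
    // Value as populated by the KV-CodesToDescriptions Get policy: a JSON string.
    codes_to_descriptions: '{ "400.001":"Bad Request", "401.001":"Invalid API Key or Token" }'
  },
  getVariable: function (name) { return this.vars[name]; }
};

// Parse the KVM entry once, then look up codes with a fallback.
var codes_to_descriptions = JSON.parse(context.getVariable('codes_to_descriptions'));

function describe(code) {
  return codes_to_descriptions[code] || 'Unknown Error';
}
```

For example, describe('400.001') yields "Bad Request", and any unknown code falls back to "Unknown Error".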


Excellent! Thanks for the detailed response. Some more thoughts on the options you provided for decoupling the config (error map) from the code (shared flow for error handling):

  1. Property Set at environment level: I like the fact that I can have the config in a file (which can be version controlled) that is easy to read, update and maintain. However, I'm not sure I can iterate over the name/value pairs in JavaScript. It looks like the only access option is by key, i.e. context.getVariable('propertyset.MyPropSet.foo');
  2. Shared JS resource: This is a good self-contained option, but the config will be difficult to maintain in code, as the error codes can easily go up to 100+ (considering different backend error descriptions for the same code), and it will require code changes and deployment to add/update.
  3. KVM with JSON data: This is a good option, as it can be maintained at the proxy, env or org level. Also, since the entire content can be read as a single KVM key, I can iterate over the code mappings and match the correct one. The only gripe I see with this option is adding/updating the KVM with all the escaped content; it will be very messy to load the JSON data for 100 mappings. It would be perfect if we could load a KVM value from the contents of a JSON file - will this be possible? I.e. reference a JSON file in the curl command for the keyvaluemaps management API that you have shown above.

Is a CSV resource file possible? This would have been the best option to store the config error map, as it is very easy to update by opening it in a spreadsheet. If there were a way to store this at an env/org level, i.e. set the CSV data into a KVM by loading it from a file using curl and then access the KVM content from shared code, that would be the perfect solution.

Apologies for the longish response, but your thoughts have been very helpful. Thanks in advance.
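As a rough, hypothetical sketch of the CSV idea from the previous reply: the spreadsheet rows could be converted to a JSON array with a few lines of JavaScript before being stored as a single KVM entry value. The column names here are made up, and the naive split assumes no quoted commas inside fields.

```javascript
// Convert simple CSV text (header row first, no quoted commas) into an
// array of objects, one per row, keyed by the header names.
function csvToObjects(csvText) {
  var lines = csvText.trim().split('\n');
  var headers = lines[0].split(',').map(function (h) { return h.trim(); });
  return lines.slice(1).map(function (line) {
    var cells = line.split(',');
    var row = {};
    headers.forEach(function (h, i) { row[h] = (cells[i] || '').trim(); });
    return row;
  });
}

// The serialized result is what would go into the KVM entry's "value" field.
var csv =
  'code,description\n' +
  '400.001,Bad Request\n' +
  '401.001,Invalid API Key or Token\n';
var kvmValue = JSON.stringify(csvToObjects(csv));
```

Real error descriptions contain commas, so a production version would need a proper CSV parser rather than this naive split.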

@kurtkanaskie I tried the raw data option you mentioned for adding the JSON content as a KVM entry, and it works. Here is the command:


curl -X POST 'https://apigee.googleapis.com/v1/organizations/app-modernization-istio/environments/prod/keyvaluemaps/error-handler/entries' \
-H "Authorization: Bearer $TOKEN" \
-H "Content-type: application/json" \
--data-raw '{
	 "name" : "errormap",
  	 "value" : "{\"errorcode\":\"500\",\"errors\":[{\"target_identifier\":\"T01\",\"error_map\":[{\"original_error_desc\":\"TheJDBCconnectorhadatimeoutissuewhileconnectingtoOracleserverat3.20.40.60\",\"transformed_error_desc\":\"Ourservercouldnotprocessthedatainyourrequest.Pleasecontactsupport@mycompany.comwiththistransactionidentifierforfurtherdetails\",\"transformed_error_text\":\"Theserverencounteredanunexpectedconditionthatpreventeditfromfulfillingtherequest.\",\"transformed_error_code\":\"500\"}]},{\"target_identifier\":\"T02\",\"error_map\":[{\"original_error_desc\":\"NGINXtimeoutoccuredandrequestcouldnotbeprocessed\",\"transformed_error_desc\":\"Apologies,atransientnetworkerrorhasoccured,pleaseretrytherequest.Pleasecontactsupport@mycompany.comwiththistransactionidentifierifthisissuecontinues\",\"transformed_error_text\":\"Transientnetworkfailurehasstoppedthisrequestfrombeingprocessed\",\"transformed_error_code\":\"500\"}]}]}"
}'


However, this will become unwieldy when there are 100 error mappings instead of two, so I tested reading the JSON data from a file into a KVM entry but am getting an error. Any thoughts on what content in the JSON file is incorrect? I validated the content online as well. Below are the JSON file and the commands I'm using to create the KVM entry, along with the error.

1. CURL Command:


curl -X POST 'https://apigee.googleapis.com/v1/organizations/app-modernization-istio/environments/prod/keyvaluemaps/error-map/entries' \
-H "Authorization: Bearer $TOKEN" \
-H "Content-type: application/json" \
-F file=@/home/s_das/error_dictionary.json


2. Error I'm getting:


{
  "error": {
    "code": 400,
    "message": "Invalid JSON payload received. Unable to parse number.\n--------------------\n^",
    "status": "INVALID_ARGUMENT"
  }
}


3. The contents of the JSON file that I tried. I also tried the escaped version as well as the no-space version of the JSON content:


{
	"erorr code": "500",
	"errors": [{
			"target_identifier": "T01",
			"error_map": [{
				"original_error_desc": "The JDBC connector had a timeout issue while connecting to Oracle server at 3.20.40.60",
				"transformed_error_desc": "Our server could not process the data in your request. Please contact support@mycompany.com with this transaction identifier for further details",
				"transformed_error_text": "The server encountered an unexpected condition that prevented it from fulfilling the request.",
				"transformed_error_code": "500"
			}]
		},
		{
			"target_identifier": "T02",
			"error_map": [{
				"original_error_desc": "NGINX timeout occured and request could not be processed",
				"transformed_error_desc": "Apologies, a transient network error has occured, please retry the request. Please contact support@mycompany.com with this transaction identifier if this issue continues",
				"transformed_error_text": "Transient network failure has stopped this request from being processed",
				"transformed_error_code": "500"
			}]
		}
	]
}
{
	\"erorr code\": \"500\",
	\"errors\": [{
			\"target_identifier\": \"T01\",
			\"error_map\": [{
				\"original_error_desc\": \"The JDBC connector had a timeout issue while connecting to Oracle server at 3.20.40.60\",
				\"transformed_error_desc\": \"Our server could not process the data in your request. Please contact support@mycompany.com with this transaction identifier for further details\",
				\"transformed_error_text\": \"The server encountered an unexpected condition that prevented it from fulfilling the request.\",
				\"transformed_error_code\": \"500"
			}]
		},
		{
			\"target_identifier\": \"T02\",
			\"error_map\": [{
				\"original_error_desc\": \"NGINX timeout occured and request could not be processed\",
				\"transformed_error_desc\": \"Apologies, a transient network error has occured, please retry the request. Please contact support@mycompany.com with this transaction identifier if this issue continues\",
				\"transformed_error_text\": \"Transient network failure has stopped this request from being processed\",
				\"transformed_error_code\": \"500\"
			}]
		}
	]
}
{"erorrcode":"500","errors":[{"target_identifier":"T01","error_map":[{"original_error_desc":"TheJDBCconnectorhadatimeoutissuewhileconnectingtoOracleserverat3.20.40.60","transformed_error_desc":"Ourservercouldnotprocessthedatainyourrequest.Pleasecontactsupport@mycompany.comwiththistransactionidentifierforfurtherdetails","transformed_error_text":"Theserverencounteredanunexpectedconditionthatpreventeditfromfulfillingtherequest.","transformed_error_code":"500"}]},{"target_identifier":"T02","error_map":[{"original_error_desc":"NGINXtimeoutoccuredandrequestcouldnotbeprocessed","transformed_error_desc":"Apologies,atransientnetworkerrorhasoccured,pleaseretrytherequest.Pleasecontactsupport@mycompany.comwiththistransactionidentifierifthisissuecontinues","transformed_error_text":"Transientnetworkfailurehasstoppedthisrequestfrombeingprocessed","transformed_error_code":"500"}]}]}


I think I found the issue: your JSON is valid when not escaped, but the escaped JSON is missing a \, making it invalid.

Just add a \ at line # 9 after 500. Hope it helps 🙂

PS: If it's fine could you please share the Proxy/Shared-Flow bundle of the working solution?


shrenikkumars_0-1666546463816.png
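One way to catch this class of escaping mistake before POSTing is to round-trip the entry through JSON.parse: if either the outer payload or the escaped value string fails to parse, the entry is broken. A minimal sketch (the sample entries below are shortened, hypothetical payloads, not the full error map):

```javascript
// Returns true only if the outer entry JSON parses AND its "value" field
// is itself a parseable JSON string.
function isValidKvmEntry(entryJson) {
  try {
    var entry = JSON.parse(entryJson); // outer: { "name": ..., "value": "..." }
    JSON.parse(entry.value);           // inner: the escaped JSON string
    return true;
  } catch (e) {
    return false;
  }
}

// A correctly escaped entry passes both levels:
var good = JSON.stringify({ name: 'errormap', value: JSON.stringify({ code: '500' }) });
// An entry whose value lost a closing escape fails the inner parse:
var bad = '{ "name": "errormap", "value": "{ \\"code\\": \\"500 }" }';
```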


Great catch, and thanks for the response. But it still does not work and gives the same error. Now that I think about it, the curl command with the file option does not specify the key and the value like the raw one does - maybe that is why the error occurs?

curl -X POST 'https://apigee.googleapis.com/v1/organizations/app-modernization-istio/environments/prod/keyvaluemaps/error-handler/entries' \
-H "Authorization: Bearer $TOKEN" \
-H "Content-type: application/json" \
-F file=@/home/s_das/error_dictionary.json

These are the two versions I tried for the error_dictionary.json file:

"name" : "error-mapping",
"value" :"{
	\"erorr code\": \"500\",
	\"errors\": [{
			\"target_identifier\": \"T01\",
			\"error_map\": [{
				\"original_error_desc\": \"The JDBC connector had a timeout issue while connecting to Oracle server at 3.20.40.60\",
				\"transformed_error_desc\": \"Our server could not process the data in your request. Please contact support@mycompany.com with this transaction identifier for further details\",
				\"transformed_error_text\": \"The server encountered an unexpected condition that prevented it from fulfilling the request.\",
				\"transformed_error_code\": \"500\"
			}]
		},
		{
			\"target_identifier\": \"T02\",
			\"error_map\": [{
				\"original_error_desc\": \"NGINX timeout occured and request could not be processed\",
				\"transformed_error_desc\": \"Apologies, a transient network error has occured, please retry the request. Please contact support@mycompany.com with this transaction identifier if this issue continues\",
				\"transformed_error_text\": \"Transient network failure has stopped this request from being processed\",
				\"transformed_error_code\": \"500\"
			}]
		}
	]
}"
{
	\"erorr code\": \"500\",
	\"errors\": [{
			\"target_identifier\": \"T01\",
			\"error_map\": [{
				\"original_error_desc\": \"The JDBC connector had a timeout issue while connecting to Oracle server at 3.20.40.60\",
				\"transformed_error_desc\": \"Our server could not process the data in your request. Please contact support@mycompany.com with this transaction identifier for further details\",
				\"transformed_error_text\": \"The server encountered an unexpected condition that prevented it from fulfilling the request.\",
				\"transformed_error_code\": \"500\"
			}]
		},
		{
			\"target_identifier\": \"T02\",
			\"error_map\": [{
				\"original_error_desc\": \"NGINX timeout occured and request could not be processed\",
				\"transformed_error_desc\": \"Apologies, a transient network error has occured, please retry the request. Please contact support@mycompany.com with this transaction identifier if this issue continues\",
				\"transformed_error_text\": \"Transient network failure has stopped this request from being processed\",
				\"transformed_error_code\": \"500\"
			}]
		}
	]
}

I even tried with this content in the JSON file, but it did not work. In one of the threads I read that whitespace is the cause of the error, which might explain why the raw data works, since I had stripped off all the whitespace before the curly braces. But the solution is not feasible if the content must be raw data without whitespace and newlines. It would be a bit unfortunate if that is indeed the case.

'{
	"name" : "error-mapping",
	"value" :"{
		\"erorr code\": \"500\",
		\"errors\": [{
			\"target_identifier\": \"T01\",
			\"error_map\": [{
				\"original_error_desc\": \"The JDBC connector had a timeout issue while connecting to Oracle server at 3.20.40.60\",
				\"transformed_error_desc\": \"Our server could not process the data in your request. Please contact support@mycompany.com with this transaction identifier for further details\",
				\"transformed_error_text\": \"The server encountered an unexpected condition that prevented it from fulfilling the request.\",
				\"transformed_error_code\": \"500\"
			}]
		},
		{
			\"target_identifier\": \"T02\",
			\"error_map\": [{
				\"original_error_desc\": \"NGINX timeout occured and request could not be processed\",
				\"transformed_error_desc\": \"Apologies, a transient network error has occured, please retry the request. Please contact support@mycompany.com with this transaction identifier if this issue continues\",
				\"transformed_error_text\": \"Transient network failure has stopped this request from being processed\",
				\"transformed_error_code\": \"500\"
			}]
		}]
	}"
}'


Try JSON.stringify(content); it might work. Share the code if you're OK with it; that'd help to check as well.

SidDas
Participant II

After a lot of tries, I got the correct curl command to load the JSON data from a file and create a KVM entry. I wrapped everything up in a single shell script that does the following:

  1. Read the JSON Data from a file and remove line feeds, white spaces and escape double quotes
  2. Store the formatted JSON data (not very readable) in a temp file
  3. Use curl to take the data from the temp file and create a key/value entry in an existing KVM scoped to the environment (can be changed to an org scope as well)
  4. Finally remove the temp file

Here is the JSON Data and the script:

{
	"erorr code": "500",
	"errors": [{
			"target_identifier": "T01",
			"error_map": [{
				"original_error_desc": "The JDBC connector had a timeout issue while connecting to Oracle server at 3.20.40.60",
				"transformed_error_desc": "Our server could not process the data in your request. Please contact support@mycompany.com with this transaction identifier for further details",
				"transformed_error_text": "The server encountered an unexpected condition that prevented it from fulfilling the request.",
				"transformed_error_code": "500"
			}]
		},
		{
			"target_identifier": "T02",
			"error_map": [{
				"original_error_desc": "NGINX timeout occured and request could not be processed",
				"transformed_error_desc": "Apologies, a transient network error has occured, please retry the request. Please contact support@mycompany.com with this transaction identifier if this issue continues",
				"transformed_error_text": "Transient network failure has stopped this request from being processed",
				"transformed_error_code": "500"
			}]
		}
	]
}
#!/bin/bash
## prep_kvm_json.sh
## The script reads normal JSON data from a file, formats it and uses Apigee's Mgmt. API to create a KVM with the JSON data as value
## @author s.das@tcs.com
## @since 24-Oct-2022
##
##Declare variables
KVM_NAME=error-handler
KEY_NAME=error_map
JSON_DATA_FILE_PATH=~/error_dictionary-normal.json
APIGEE_BASE_URL=https://apigee.googleapis.com/v1/organizations/app-modernization-istio/environments/
APIGEE_ENVIRONMENT_NAME=prod
##Create a temp file to store the formatted JSON content
FORMATTED_JSON_FILE=formatted_$(basename $JSON_DATA_FILE_PATH)
##Create formatted JSON file by removing line feeds and escaping double quotes
echo {\"name\" : \"$KEY_NAME\",\"value\" : \"$(cat $JSON_DATA_FILE_PATH | tr -d '\011\012\013\014\015' | sed 's/"/\\"/g')\"} > $FORMATTED_JSON_FILE
##Form the Apigee Management API URL to add a key/value to an existing KVM
APIGEE_URL="${APIGEE_BASE_URL}${APIGEE_ENVIRONMENT_NAME}/keyvaluemaps/${KVM_NAME}/entries"
##Call the Apigee Management URL using curl
curl -X POST "$APIGEE_URL" -H "Authorization: Bearer $TOKEN" -H "Content-type: application/json" --data-binary "@$FORMATTED_JSON_FILE"
##Remove the temporary file with the formatted JSON content
rm $FORMATTED_JSON_FILE

Hopefully it will now be straightforward to read the JSON data in the Shared Flow and map the raw error thrown by the backend target to a polished, transformed error for the consumer.

The "value" part of the KVM entry is a string, so it needs to be valid escaped JSON in the curl request.
You may find @dchiesa1's solution useful; it's referenced in this Q&A: https://www.googlecloudcommunity.com/gc/Apigee/Apigee-X-KVM-script-helper/m-p/434384

@kurtkanaskie Many thanks for pointing me to @dchiesa1's solution for storing an entire JSON document as the value of a single KVM entry. But since I figured it out myself, it was good learning. I will check and compare my script against that solution to see how mine can be improved.

I would guess the most accessible way to make your script better is... to avoid using shell to handle JSON. If you were doing this in a nodejs script, it would be really easy to quote the JSON payload and do the right thing.

On the other hand, your script is already working! 
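For what it's worth, the escaping that the shell script does with tr and sed collapses to a single nested JSON.stringify in a Node.js script; re-serializing handles all quoting and keeps the whitespace inside the descriptions intact. A sketch under that assumption (the function name and key name are illustrative, not from the thread):

```javascript
// Build the body for POST .../keyvaluemaps/{map}/entries from plain data:
// the "value" field must be a string, so the data is serialized twice.
function buildKvmEntry(keyName, data) {
  return JSON.stringify({ name: keyName, value: JSON.stringify(data) });
}

// Reading the mapping from a file would look like (path hypothetical):
//   const fs = require('fs');
//   const data = JSON.parse(fs.readFileSync('error_dictionary-normal.json', 'utf8'));
//   const body = buildKvmEntry('error_map', data);
```

The resulting string can then be sent with curl --data-binary, exactly as the shell script does.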

Agree with your recommendation - nodejs is a much better option for converting the JSON to a valid string. Since I'm developing this error handler framework for Apigee as a utility, I will probably create different options to manage the error map. Node can be a more elegant option as a microservice with a potential frontend to manage changes to the error map. Until then, shell can be a quick and dirty way for admins to add or update an error description in the map and update the KVM.

Ahh yes, I know nodejs is often used to run microservices. 

What I meant by my earlier suggestion was implementing a nodejs-based command-line utility.  A direct replacement for, or alternative to, a shell script.  Something you run from a terminal. It would have a node requirement of course. 

I suppose if you're handy with Python you could do something similar with it, or Windows PowerShell if that's your bag. Shell is not the greatest when dealing with JSON: you can do it, but it takes extra work. These other scripting environments offer more capability for dealing with JSON; that's what I meant by the nodejs suggestion.

If you're willing to drop the org-level configuration and use env-level, then I'd suggest using a "Shared JS Resource" as it's the easiest to manage.

Simply put the code in a file and create or update for each environment using curl or maven.


curl -X PUT "https://apigee.googleapis.com/v1/organizations/$ORG/environments/$ENV/resourcefiles/jsc/JS-ErrorMapping.js" -H "Authorization: Bearer $TOKEN" -F data=@JS-ErrorMapping.js


The JS policy is:


<Javascript async="false" timeLimit="2000" name="JS-ErrorHandling">
    <IncludeURL>jsc://JS-ErrorMapping.js</IncludeURL>
    <ResourceURL>jsc://JS-ErrorHandling.js</ResourceURL>
</Javascript>


Updated code gets picked up automatically, no need to re-deploy anything.
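To make the IncludeURL pattern concrete, here is a hypothetical sketch of how the two resources referenced in the policy above might divide responsibilities; the names CODES_TO_DESCRIPTIONS and describeCode are invented for illustration, not part of any Apigee API:

```javascript
// --- JS-ErrorMapping.js (shared, environment-level resource): data only ---
var CODES_TO_DESCRIPTIONS = {
  P1_400_01: 'Bad Request',
  P1_400_02: 'Invalid content type',
  B2_500_01: 'Internal Error',
  B2_500_02: 'Backend Error'
};

// --- JS-ErrorHandling.js (bundled in the proxy): logic using the shared data ---
// IncludeURL evaluates the mapping file first, so CODES_TO_DESCRIPTIONS is in
// scope here at runtime.
function describeCode(code) {
  return CODES_TO_DESCRIPTIONS[code] || 'Unknown Error';
}
```

Updating only JS-ErrorMapping.js at the environment level would then change the mapping without touching the proxies that include it.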