Intuitive data mapping using Data Transformer (Script) task

We are excited to introduce the new Data Transformer (Script) task in Application Integration. Data mapping lies at the core of seamless integration and efficient data flow within iPaaS (Integration Platform as a Service) solutions. This new task offers an intuitive and powerful method to perform data mapping and transformation, leveraging the flexibility and expressiveness of a sophisticated templating language.

How does the Data Transformer (Script) task work?

The Data Transformer (Script) task helps you write, edit, and evaluate custom Jsonnet code. Jsonnet is a data templating language designed to simplify and enhance the ease with which you can manipulate JSON data. It lets you create modular and reusable JSON templates using features like variables, functions, conditionals, and imports.

The custom Jsonnet code can access and transform data payloads, from simple to complex JSON, into a meaningful schema. The transformed output can be stored in integration variables and used elsewhere in the integration. This is expressed as a mapping of target integration variables to transformation logic over the source data.

The Jsonnet script will look similar to this:

local f = import "functions"; // Import all supported functions

// Write your own custom function
local sum = function(x, y) x + y;

local diff = function(x, y) f.abs(x - y); // Use the abs() function to find the absolute difference

// Use the standard library function extVar() to access integration variables
local a = std.extVar("number1");
local b = std.extVar("number2");

// Create a mapping of output integration variables to the transformed input
{
  "sum": sum(a, b), // "sum": a + b is also correct
  "difference": diff(a, b)
}

In the above example, the returned mapping sets the values of the sum and difference variables in the integration.

Key Features and Benefits:

  1. Flexible data mapping: With support for writing code, much more is achievable with the Data Transformer (Script) task. You can use predefined functions or write your own custom functions to produce the desired output.
  2. Large set of functions: In addition to the Jsonnet standard library, the task supports a set of additional functions, enabling you to perform even complex mappings proficiently. All of these functions can be used by importing the functions library in your script.
  3. Easy debugging: The new Data Transformer (Script) task provides detailed error messages and syntax highlighting, which allow quick and easy identification and correction of errors.

Example

Let’s solve a basic e-commerce backend scenario using both the Data Mapper task and the Data Transformer (Script) task for data mapping.

Problem

Given an orders_request JSON payload that contains a few fields and a list of items in a line_items JSON array, find the total order value of all the line items in the orders_request payload. This is the sum of the order value of each item, calculated by multiplying quantity by price_per_unit. Here’s a sample payload:

{
  "order_no": "12345",
  "buyer_id": "raambo",
  "line_items": [
    {
      "line": 1,
      "sku": "tr100",
      "vendor": "Internal",
      "quantity": 1,
      "price_per_unit": 10
    },
    {
      "line": 2,
      "sku": "tbz",
      "vendor": "External",
      "quantity": 24,
      "price_per_unit": 1
    }
  ]
}

Solution

Use the Data Transformer (Script) task to perform quick data mapping:

[Screenshot: Data Transformer (Script) task script]

Jsonnet brings expressiveness, flexibility, and maintainability to your script, enabling you to handle even the most intricate requirements with ease. As the above example illustrates, the Jsonnet script is concise and intuitive.

Let us try to understand the structure of the script. The following steps are performed in the script to solve the example problem:

  1. Import all the Jsonnet standard library functions and the predefined custom functions by importing the functions library.
  2. Access the integration variable “orders_request” using the f.extVar() function.
  3. Extract the required value from the JSON and store it in a local variable for better code readability.
  4. Use the map function to iterate over the line_items JSON array, then apply the sum function to compute total_order_value as the sum of the order values of all the individual line items.
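Since the script itself appears only in the screenshot, here is a sketch of what it could look like for the sample payload. It follows the four steps above, but substitutes std.foldl for the sum function so that it relies only on calls shown elsewhere in this post:

local f = import "functions"; // Import the supported function library

// Access the integration variable holding the request payload
local orders_request = f.extVar("orders_request");

// Extract the line items array for readability
local line_items = orders_request.line_items;

// Order value of a single line item
local order_value = function(item) item.quantity * item.price_per_unit;

{
  // Map each line item to its order value, then sum the results
  "total_order_value": std.foldl(function(acc, v) acc + v,
                                 std.map(order_value, line_items),
                                 0)
  // For the sample payload above: 1 * 10 + 24 * 1 = 34
}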

This can also be solved using Data Mapper Task as follows:

[Screenshot: Data Mapper task mapping]

The steps followed here are:

  1. Get the line_items property from the orders_request JSON.
  2. Iterate over the line_items JSON array to get the two properties quantity and price_per_unit. GET_PROPERTY() returns a JSON object, so it needs to be cast to a double before multiplying to get the order value per item.
  3. Get the DOUBLE_ARRAY of the order values of all the line items.
  4. Apply SUM() over the order values of all the items to get the total order value, and assign it to the total_order_value integration variable.

We hope you find the Data Transformer (Script) task a valuable addition to your integrations. Please let us know if you have questions or feedback.

Note: Special thanks to @rohitjangid for providing feedback and helping me write this post.

Resources:

  1. Release notes
  2. Github reference for sample integration using data transformer
  3. Jsonnet Website
  4. Jsonnet Standard Library
  5. Data Transformer Script task
  6. Data Mapper Task
  7. Additional supported functions
  8. ExampleIntegration

Hi

Could you provide a simple script example that takes an input string variable "24/11/2023", which represents a date?

I need to: a. change the format from DD/MM/YYYY to MM/DD/YYYY

b. get a new output from the variable "24/11/2023" of "24th November 2023"


I'm a total newbie to Jsonnet, so any help is much appreciated.


Regards Stuart

Hi Stuart,
The Data Transformer task has a dateFormat function for such use cases.

Here are example scripts using it:
a.

local f = import "functions";
{
 "date_1": f.dateFormat("24/11/2023", "MM/DD/YYYY", "UTC", "DD/MM/YYYY")
}


b. We don't yet support the `th` ordinal format. The closest will be `24 November 2023`. An example for it:

local f = import "functions";
{
 "date_2": f.dateFormat("24/11/2023", "DD MMMM YYYY", "UTC", "DD/MM/YYYY")
}

Let us know if you face any issues in trying this out.

Excellent!
I have created the following:

[Screenshot: Data Transformer task script]

local f = import "functions"; // Import predefined Data Transformer function library


local str1 = f.extVar("APPOINTMENT_DATE"); // Access integration input variable "pubsub_input_payload.appointment_date" with value "24/11/2023"
{
 "date_1": f.dateFormat(str1, "MM/DD/YYYY", "UTC", "DD/MM/YYYY"),
 "date_2": f.dateFormat(str1, "DD MMMM YYYY", "UTC", "DD/MM/YYYY")
}
 
How can I now use/map the new output variables date_1 & date_2 as they do not appear in the mapper?
Thanks for your help

Hi Stuart,
Please create two integration variables, `date_1` and `date_2` (they must have the same names as the output variables in the Data Transformer task), of type string. These can then be used in other tasks in the integration. When the integration executes and the Data Transformer task succeeds, the values are assigned to the integration variables and can be used in subsequent tasks. You can also verify the values from the execution logs.

Thanks!

Thank you salonijuneja
that worked really well!

Got another question!
We currently use a regex CASE statement to correct malformed phone numbers:

phone_number     = CASE WHEN LENGTH(REGEXP_REPLACE(REGEXP_REPLACE(REGEXP_REPLACE(phone_number,'\D'),'^4407','447'),'^07','447')) = 12
                                 AND SUBSTR(REGEXP_REPLACE(REGEXP_REPLACE(REGEXP_REPLACE(phone_number,'\D'),'^4407','447'),'^07','447'),1,3) = '447'
                           THEN REGEXP_REPLACE(REGEXP_REPLACE(REGEXP_REPLACE(phone_number,'\D'),'^4407','447'),'^07','447')
                           ELSE NULL
                           END

i.e. we get +4407900123456 coming in, and we take off the + and, if there is a leading 0 after the 44, we remove that too.

I know in App Int that I can use REPLACE to replace the + with a blank, but the question is: is there any way in the Data Transformer task to use regex? Is there a specific function to use like "match", or something else?

Regards Stuart


Hi Stuart,
Yes, we support the match function in the Data Transformer task: https://cloud.google.com/application-integration/docs/data-transformer-functions-reference#match
You can also find the list of all supported functions here: https://cloud.google.com/application-integration/docs/data-transformer-functions-reference
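As an illustration only (not from the original thread), the SQL CASE logic above can also be sketched without regex, using only Jsonnet standard-library calls; the helper names digits_only and normalize are made up for this example:

local digits_only(s) =
  // Keep only the digit characters, mimicking REGEXP_REPLACE(phone, '\D')
  std.join("", [c for c in std.stringChars(s) if c >= "0" && c <= "9"]);

local normalize(phone) =
  local d = digits_only(phone);
  // Rewrite a leading "4407" or "07" prefix to "447"
  local d2 =
    if std.startsWith(d, "4407") then "44" + std.substr(d, 3, std.length(d) - 3)
    else if std.startsWith(d, "07") then "44" + std.substr(d, 1, std.length(d) - 1)
    else d;
  // Accept only well-formed 12-digit numbers starting with 447, else null
  if std.length(d2) == 12 && std.startsWith(d2, "447") then d2 else null;

{
  "phone_number": normalize("+4407900123456") // "447900123456"
}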


Hi ..... yes, I think we have it as follows, in case it can help others:

local f = import "functions"; // Import predefined Data Transformer function library

local str1 = f.extVar("Appointment_Date"); // Access integration input variable "pubsub_input_payload.appointment_date"
local str2 = f.extVar("Phone_Number");     // Access integration input variable "pubsub_input_payload.phone_number"

local format_phone (phone) =
  if std.substr(phone, 0,2)=='07' then '44'+ std.substr(phone, 1, 99)  
  else if std.substr(phone, 0,3)=='440' then std.substr(phone, 0, 2) + std.substr(phone, 3, 99)
  else phone;

{
 "Appointment_Date_Check": f.dateFormat(str1, "YYYY-MM-DD HH:mm:ss", "GMT", "DD/MM/YYYY"),
 "Appointment_Date_USA": f.dateFormat(str1, "MM/DD/YYYY", "GMT", "DD/MM/YYYY"),
 "Appointment_Date_Text": f.dateFormat(str1, "DD MMMM YYYY", "GMT", "DD/MM/YYYY"),
 "Phone_Number_Amended": format_phone(std.strReplace(str2, '+',''))
}