Enhancing your business integration flows with GenAI / Vertex AI

In a previous blog post, we showed how the Vertex AI task makes it easy to access the power of Google Vertex AI from Application Integration. This blog post walks through a real use case built on that capability. We strongly encourage you to understand it and think about how it could apply to your own use cases - the ideas shown here are generic and can be reused in many scenarios!

The overall flow for this business process is shown below:

[Flow diagram: carloscabral_0-1704816871318.png]

This fictional company uses a custom AppSheet app (but hey - it could be any sort of app - web, custom mobile… you name it!) for handling expense approvals. Also, this company standardizes on Jira Cloud for approvals of all sorts. Jira Cloud provides plenty of APIs for working with its objects - Issues, Tasks, Boards and much more. But this company had a few requirements:

  1. Time to market was crucial. They had to release this new app in just 2 weeks - no more than that. Furthermore, they had no developer teams at their disposal for this project - just a few "IT Pros" and business owners who knew what the process should look like.
  2. They needed to summarize and generate text for the issues and other Jira objects.
  3. They had no deep Jira Cloud developer or API knowledge.

Application Integration was their choice because…

  1. It is a no/low-code platform for developing integrations - and it is fully managed by Google Cloud!
  2. They can use the native Vertex AI task to harness the power of Google Cloud's LLMs with just some prompt experimentation!
  3. The native Jira Cloud connector does the heavy lifting - from event subscription to creating objects and handling potentially changing schemas within Jira - without having to write code or be a PhD in Jira's APIs.

As for the flow… Basically, the client app sends an API call with some information (such as an Id for the expense and the requester name) that triggers the integration. The integration then works on those inputs (maybe generating an Issue Description from them with GenAI!) and natively sends the data to Jira to create a new approval issue. Finally, we also need to get notified by Jira about updates on the issue approval process and inform the client app (also via an API call) of the changes - maybe with the current timestamp and status.
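
For concreteness, here is a rough sketch of the payloads involved, written as Python dictionaries. All field names and values are hypothetical - the actual schema is whatever you define for your integration's API trigger and callback.

# Hypothetical input the client app sends to trigger the integration
expense_request = {
    "expenseId": "EXP-1042",
    "requester": "Pedro",
    "amount": 42.50,
    "currency": "USD",
}

# Hypothetical update the integration sends back to the client app
# once Jira reports a change on the approval issue
approval_update = {
    "expenseId": "EXP-1042",
    "status": "APPROVED",
    "updatedAt": "2024-01-09T15:34:12Z",
}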

Cool! There are a lot of interesting portions in this flow, but let's now focus on the GenAI portion. Let's go!

Using Application Integration Variables as Prompt Templates for GenAI

For our use case, let's say that we want to generate a creative (or formal, or long, or short…) description for our approval. We could create static strings for all of that or… we could use GenAI! 

The strategy used here is to store a "prompt template" in a Local Variable in Application Integration. Check the sample prompt template below:

 

Within business context, generate a short request for ticket approval. You are somewhat dramatic, human and sentimental when generating such requests.
Input: Generate a desperate approval for user Pedro. 
Output: Please, please, PLEASE! Approve this ticket for Pedro, he is desperate!
Input: Generate a desperate approval for $requester$.
Output:

 

Prompt engineering is out of scope for this post, but notice we used the common "context" plus "one-shot" strategy for the prompt: we explained the overall context/scenario of what should be done and provided a single example. The point is that we can influence the LLMs in many ways depending on the prompts we provide - more examples, more details, asking for structured data to be returned, asking for classification… It is a brand new world and experimentation is recommended!

Notice that I added a variable within the prompt definition - namely, $requester$. That is the input of the integration! So each LLM answer is dynamically generated for the specific user of each integration run!
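
To make the mechanics concrete, here is a minimal Python sketch of the same idea outside Application Integration: substitute the $requester$ placeholder in the template and send the resulting prompt to a Vertex AI text model. The model name, parameter values and function name are assumptions for illustration, not a description of what the Vertex AI task does internally.

import vertexai
from vertexai.language_models import TextGenerationModel

# Prompt template mirroring the local variable above;
# $requester$ is the placeholder filled in at every run.
PROMPT_TEMPLATE = (
    "Within business context, generate a short request for ticket approval. "
    "You are somewhat dramatic, human and sentimental when generating such requests.\n"
    "Input: Generate a desperate approval for user Pedro.\n"
    "Output: Please, please, PLEASE! Approve this ticket for Pedro, he is desperate!\n"
    "Input: Generate a desperate approval for $requester$.\n"
    "Output:"
)

def generate_description(requester: str, project: str, location: str = "us-central1") -> str:
    """Fill the template for one requester and ask the model for a description."""
    vertexai.init(project=project, location=location)
    prompt = PROMPT_TEMPLATE.replace("$requester$", requester)
    model = TextGenerationModel.from_pretrained("text-bison")  # PaLM 2 text model
    response = model.predict(prompt, temperature=0.8, max_output_tokens=256)
    return response.text

Each call with a different requester yields a freshly generated description - which is the behavior we get from the Vertex AI task without writing any of this code.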

Expanding on this idea of "Prompt Template as a Variable", we could have:

  1. Multiple prompts / prompt templates stored as local variables; depending on conditions in the integration flow, we could select one or another. Let's say we want to personalize email messages for the Email task - we could have prompt templates for approvals and rejections!
  2. Prompt templates coming from external systems - such as a Google Sheets row or a database like Google Firestore. When the prompt templates live outside Application Integration, other systems can update them dynamically without anyone needing to edit or create a new local variable / integration version. For example, maybe your marketing team is using Google Sheets and can update the prompt template or sample inputs directly on Sheets - our integration would then dynamically pick up the new prompts (see the sketch after this list)!
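
As a flavor of the second idea, here is a small sketch of loading a prompt template from Firestore before filling it in. The collection, document and field names are hypothetical - any external store with an API (or Google Sheets itself) would work just as well.

from google.cloud import firestore

def load_prompt_template(kind: str) -> str:
    """Fetch a prompt template that lives outside the integration, e.g. in Firestore."""
    db = firestore.Client()
    doc = db.collection("prompt_templates").document(kind).get()
    if not doc.exists:
        raise KeyError(f"No prompt template stored for '{kind}'")
    return doc.to_dict()["template"]

# Pick a template based on a condition in the flow, then fill in the placeholder
template = load_prompt_template("approval")  # or "rejection"
prompt = template.replace("$requester$", "Maria")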

Similarly, we could also work with LLM-generated structured outputs. While the default behavior for GenAI LLMs is basically generating free-form text, we can also craft prompts that ask the model to answer in a format such as JSON. With a few sample inputs in the prompt, the LLMs can do this quite well, and the result plugs neatly into Application Integration's Data Mapping and other tasks.
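
A quick sketch of that pattern: a few-shot prompt that asks for JSON only, plus a parse of the answer. The prompt, field names and parameter values below are hypothetical, and in a real flow you would validate the output before mapping it.

import json

import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")  # hypothetical project
model = TextGenerationModel.from_pretrained("text-bison")

JSON_PROMPT = (
    "Extract the requester name and expense amount from the sentence and answer with JSON only, "
    'using the keys "requester" and "amount".\n'
    "Input: Pedro is requesting approval for a 42.50 USD taxi ride.\n"
    'Output: {"requester": "Pedro", "amount": 42.50}\n'
    "Input: Maria needs a sign-off on her 180 USD hotel night.\n"
    "Output:"
)

raw = model.predict(JSON_PROMPT, temperature=0.2, max_output_tokens=128).text
issue_fields = json.loads(raw)  # raises ValueError if the model strays from pure JSON
print(issue_fields["requester"], issue_fields["amount"])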

The generative AI use cases alongside Application Integration are many and varied - here are a few examples. Use them to spark your imagination about what makes sense for your own use cases and challenges!

  1. Generate images with Imagen and use Application Integration to store them in a Google Cloud Storage bucket (and maybe even notify someone via email when that happens - see the sketch after this list!);
  2. Generate and concatenate HTML snippets produced by code-bison and then use them to trigger rich email campaigns through one of our email service connectors;
  3. Store summarized versions of data received from one system / API in a SQL database…
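
As a flavor of the first idea, here is a minimal sketch of persisting generated content into a Cloud Storage bucket from Python. The bucket and object names are assumptions - in the no-code version, Application Integration handles this step natively.

from google.cloud import storage

def store_generated_asset(bucket_name: str, object_name: str, data: bytes, content_type: str) -> str:
    """Upload generated content (an Imagen image, summarized text, ...) to Cloud Storage."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    blob.upload_from_string(data, content_type=content_type)
    return f"gs://{bucket_name}/{object_name}"

# e.g. store_generated_asset("my-genai-assets", "approvals/pedro.png", image_bytes, "image/png")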

There are many scenarios! The point is that we can use Application Integration as a no-code platform for templatizing, selecting, parallelizing, processing and concatenating both LLM prompts and their outputs - unlocking the potential of LLMs without necessarily jumping into Python/Notebooks or a full MLOps pipeline!

Vertex / Google Cloud AI + Application Integration - beyond GenAI

While the Generative AI use cases are trending, many other useful Vertex / Google Cloud AI use cases can be achieved when combined with Application Integration.

  1. A retailer might want to segment customers into different personas for targeted marketing, or forecast future demand for products. There are many ways to achieve this with Vertex AI - but once the models are deployed, we can use Application Integration's native Vertex AI task to make an online prediction for, let's say, a new persona that is currently accessing their ecommerce website for personalization (the sketch after this list shows what such an online prediction call looks like in Python). Maybe their platform invokes an API whose fulfillment is handled by Application Integration and Vertex AI. We could also publish a Pub/Sub message from within the integration to inform upstream systems.
  2. Fraud detection is also a very common use case for ML/AI. A bank might want to make an online prediction on a transaction to detect an anomaly and also collect analytics on the trends of such behavior. A combination of Apigee, Application Integration, Vertex AI and BigQuery could implement such a use case.
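
In both scenarios, the heart of the flow is an online prediction against a deployed Vertex AI model. A minimal Python sketch of that call follows - the project, endpoint ID and feature names are assumptions; Application Integration's Vertex AI task makes the same kind of call without any code.

from google.cloud import aiplatform

def predict_persona(project: str, location: str, endpoint_id: str, features: dict) -> list:
    """Send one instance to a deployed Vertex AI endpoint and return its predictions."""
    aiplatform.init(project=project, location=location)
    endpoint = aiplatform.Endpoint(endpoint_id)
    response = endpoint.predict(instances=[features])
    return response.predictions

# e.g. predict_persona("my-project", "us-central1", "1234567890",
#                      {"recency_days": 3, "avg_basket_value": 57.2, "sessions_last_30d": 12})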

Conclusion

Application Integration excels at simple, no-code data transformation, data handling and complex flow manipulation. It is also great for connecting different business systems. Native tasks for Google Cloud services - including Vertex AI - make it easy to authenticate and manipulate data for those services without needing to become an expert on them.

Generative AI unlocks many use cases that were previously impossible or quite hard, without the need for custom training or deep ML/AI expertise. Application Integration can further accelerate such use cases with no code, which is quite useful for IT Pro users who won't necessarily work with Python or AI notebooks but rather prefer a no-code approach.

 

Note: Special thanks to @vsy for collaborating with me on the blog posts!
