Latency of Forge Custom UI bridge/resolver

Hey everyone,

I am seeing very long latencies when calling Forge functions from Custom UI, especially compared to a Connect app. I have reduced my code to the bare minimum, but I still have not seen any call take less than 1,800 ms, which seems really long.

Here’s my stripped down resolver code:

import Resolver from '@forge/resolver';
const resolver = new Resolver();

resolver.define('get-all-data', async ({ context }) => {
  return [];
});
I can trace the call in Safari’s developer console, a typical call would look like this:

[Screenshot: Safari network trace of the Forge call, 2021-04-08 16:41]

Meaning that (in this case) 3,842.1 ms are spent waiting for a response, which then takes 0.3 ms to load. Our Jira instance seems to be hosted in the AWS eu-central region, which is geographically close to where I am.
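For anyone who wants to reproduce the measurement, the frontend side is equally minimal. Here is a rough sketch of a timing wrapper; note that `invokeStub` is a hypothetical stand-in for `invoke('get-all-data')` from `@forge/bridge`, so the snippet runs outside a Forge app:

```javascript
// Hypothetical stand-in for invoke('get-all-data') from @forge/bridge,
// resolving with an empty array after a simulated 50 ms round trip.
const invokeStub = () =>
  new Promise((resolve) => setTimeout(() => resolve([]), 50));

// Await the call and report how long the round trip took.
async function timedCall(fn, label) {
  const start = Date.now();
  const result = await fn();
  const elapsed = Date.now() - start;
  console.log(`${label}: ${elapsed} ms`);
  return { result, elapsed };
}
```

In a real Custom UI app you would call something like `timedCall(() => invoke('get-all-data'), 'get-all-data')` and compare the logged time with what the browser's network trace shows.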

When I make similar calls in a Connect application, I see much smaller latencies, orders of magnitude smaller. Here’s the trace of a comparable call to a (production) Connect backend (in AWS us-east, so a bit further away):

[Screenshot: Safari network trace of the Connect call, 2021-04-08 16:59]

I understand that many factors can affect latency, but a 20-36x difference is a lot. Latency this high would make Forge an infeasible choice for building our app.

Has anyone had a similar problem and how did you work around it?



Hi @osiebenmarck ,

Do you see any performance improvement with repeated calls to your app, compared to the first call? I’m wondering how much of this is Lambda’s “cold start” problem.
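One quick way to check this is to fire the same resolver call several times in a row and compare the first latency against the rest; a warm Lambda should answer noticeably faster than a cold one. A sketch of that measurement, where `invokeStub` and its delays are made up purely to simulate cold-vs-warm behaviour:

```javascript
// Hypothetical stand-in for the real invoke('get-all-data') call:
// the first invocation simulates a slow cold start, later ones are fast.
let firstCall = true;
const invokeStub = () =>
  new Promise((resolve) => {
    const delay = firstCall ? 120 : 20; // simulated cold vs. warm latency
    firstCall = false;
    setTimeout(() => resolve([]), delay);
  });

// Run the call `runs` times sequentially and collect per-call timings.
async function measure(fn, runs) {
  const timings = [];
  for (let i = 0; i < runs; i++) {
    const start = Date.now();
    await fn();
    timings.push(Date.now() - start);
  }
  return timings;
}
```

If the first timing dominates and the rest settle down, cold starts are the culprit; if all calls stay slow, something else (routing, the bridge itself) is adding the latency.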

Hi @mventnor,

No idea how I missed your reply. I suspected it was a cold start issue at the time, but even repeated calls did not do much to improve things. If I remember correctly, the 1.8s figure was what I got after repeated calls and the 3.8s in the screenshot was a cold start.

As these calls were from a Custom UI app loading data for a user to interact with, we have since moved this particular use case to Connect. However, we have not given up on Forge altogether and are still working with it in other cases, but in the end the latency was a deal-breaker that we felt we had very little control over.