Async Events Queue times out after 25sec

I have this setup:

  modules:
    consumer:
      - key: sync-queue-consumer
        queue: sync-queue
        resolver:
          function: sync-consumer-func
          method: sync
    function:
      - key: sync-consumer-func
        handler: queues.handler

And the function is in queues.ts and looks roughly like so:

import Resolver from '@forge/resolver';

// Context and SyncPayload are app-defined types (their imports are omitted here)
const resolver = new Resolver();

resolver.define(
  'sync',
  async ({payload, context}: {context: Context; payload: SyncPayload}) => {
    // contents
  },
);

export const handler = resolver.getDefinitions();

With const syncQueue = new Queue({key: 'sync-queue'}); instantiated in a resolver, I do only one push:

  const jobId = await syncQueue.push({
    // some props here
  });
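
(For completeness, Queue here is imported from @forge/events:)

import { Queue } from '@forge/events';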

The queue runs, and then it says:

Function result will not be returned to the Forge platform, as the function did not complete within 25 seconds. In invocations outside tunnel, functions that exceed the time limit are terminated.

and it restarts the job over and over until it stops. From my understanding, the 25-second limit shouldn’t apply here…

Hi

Does this answer your question?

Apologies in advance if I’ve misunderstood.

How else would you handle it? I didn’t see any alternatives in the docs.

The current solution involves “chaining” async events.

If you have a long-running task, you split it into multiple steps. Each step constitutes another async event.

We’re looking into more support for long-running tasks in Forge here: Improving support for long-running tasks in Forge

Can you point me to docs or a code example of this “chaining”? I need to be able to make over 100 network requests for initial syncing of large projects.

Unfortunately it doesn’t seem we’ve written about this pattern in our documentation.

Our limits say you can push up to 50 events in a single request. If this is for initial setup, the 500-events-per-minute limit won’t affect you here.

This is not indicative of elegant code or best practice. I’m sketching out the idea.

import { Queue } from '@forge/events';
import Resolver from '@forge/resolver';

const queue = new Queue({ key: 'sync-queue' });
const resolver = new Resolver();

// urls: string[] is the full list of URLs to sync, defined elsewhere.
// Group them into chunks small enough to process within one invocation.
const CHUNK_SIZE = 3;
const payloads = [];
let lastPayload = [];
for (const url of urls) {
  lastPayload.push(url);

  if (lastPayload.length >= CHUNK_SIZE) {
    payloads.push(lastPayload);
    lastPayload = [];
  }
}
if (lastPayload.length > 0) {
  payloads.push(lastPayload);
}

// Each element of payloads becomes one event (and one consumer invocation).
// Keep the limit above in mind: at most 50 events per push call.
await queue.push(payloads);

// ... in the event listener
resolver.define("event-listener", async ({ payload, context }) => {
  console.log(payload);
  // payload is an array of up to 3 urls e.g. ['https://atlassian.com', 'https://example.com', 'https://random.org']
});

export const consume = resolver.getDefinitions();
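
And for the “chaining” part itself: since every event gets a fresh invocation (and a fresh 25 seconds), the consumer can push any remaining work back onto its own queue. Another rough sketch with the same caveats — the payload shape and the fetch calls are assumptions for illustration:

import { fetch } from '@forge/api';
import { Queue } from '@forge/events';
import Resolver from '@forge/resolver';

const CHUNK_SIZE = 3;
const queue = new Queue({ key: 'sync-queue' });
const resolver = new Resolver();

resolver.define('event-listener', async ({ payload }) => {
  // Assumed payload shape: { urls: string[] }
  const batch = payload.urls.slice(0, CHUNK_SIZE);
  const rest = payload.urls.slice(CHUNK_SIZE);

  // Do a bounded slice of the work, comfortably inside the 25-second limit.
  await Promise.all(batch.map((url) => fetch(url)));

  if (rest.length > 0) {
    // Chain: enqueue a follow-up event that picks up where this one left off.
    await queue.push({ urls: rest });
  }
});

export const consume = resolver.getDefinitions();

Each hop only does a bounded slice of the work, so no single invocation runs up against the limit.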

Let me know if I need to rework the code examples.

What exactly is “initial setup”?

Sorry, I had other Forge apps on the brain.

“If this happens once” is what I probably should have said.

I’ve seen a common pattern where a huge amount of work is done after installation to initialise the database into the right shape with the right contents.

E.g. if this is happening every time a Confluence comment gets made, you may have to start thinking about rate limits; you could surpass the 500-events-per-minute limit.
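
If the install-time route fits your case, one rough shape (sketching once more; the handler and helper names here are hypothetical, and you’d wire the handler to the avi:forge:installed:app lifecycle event via a trigger module) is to have the trigger seed the queue and let the consumer chain through the rest:

import { Queue } from '@forge/events';

// Hypothetical helper: load or compute the URLs to sync for this installation.
const loadUrlsToSync = async (): Promise<string[]> => {
  return [/* ... */];
};

// Hypothetical handler for a trigger module subscribed to avi:forge:installed:app.
export async function onInstalled() {
  const queue = new Queue({ key: 'sync-queue' });
  const urls = await loadUrlsToSync();

  // Seed the queue once; the consumer chains through the list in small
  // slices, so this handler itself returns well within its time limit.
  await queue.push({ urls });
}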