Forge remote bearer token for storage API

I have a few questions about calling the Storage API from an external system, for instance an Azure Function.

The documentation says that I need to supply a bearer token for authentication. I also see documentation for Forge Remote saying that the Forge Invocation Token is sent as a bearer token in the request headers. I was wondering:

  1. Does the FIT (Forge Invocation Token) contain information about which installation instance of the Forge app's storage I want to deal with?
  2. Do I have to use invokeRemote() to get the FIT in the request headers? I am sending the HTTP request from my index.js file (on the Forge platform) to an Azure Function (HTTP trigger), not from a custom UI script, and I am not seeing an authentication header on the Azure Function side.

It might, but you should not depend on the data inside the token. Atlassian's policy is that tokens should be considered opaque.

I’m not sure I understand. Can you explain your index.js in Forge terms? In other words, where is the JavaScript function that makes the HTTP request referenced in the Forge manifest?

Thanks for the reply.
The basic idea is that I have a function inside index.js that runs hourly and sends an HTTP request to the Azure Function, which triggers the external function (the Azure Function has an HTTP trigger). index.js is located in the src folder, in the same directory as the manifest file. Note that I use the custom UI approach for the app.

    - key: storageSync
      handler: index.setupDataBase

    - key: storageSyncTrigger
      function: storageSync
      interval: hour

This is part of my Manifest file that declares the function and scheduledTrigger.

await api.fetch(url, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json'
  },
  body: JSON.stringify(data)
});

My index.js file is inside the src folder, in the same directory as the manifest file, and it has a function that runs the code above to send an HTTP request to my Azure Function.
This part works: the Azure Function is triggered every hour in response.

The problem is that it is unclear to me how to call the Storage API from the Azure Function side. I went through some documentation and found that it needs a bearer token, which can be obtained from the request headers (the Forge Invocation Token is sent as a bearer token), but I could not find it in the request I made.

So I did further research and found invokeRemote() in the bridge API, but the problem is that I cannot use the bridge API inside the index.js file I was working on, as it is not part of the custom UI app (the bridge API can only be used in custom UI). So I was wondering whether triggering the HTTP request without invokeRemote() is even the proper way to do this, and whether that is why I was not getting the proper request headers.
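If a request does arrive via Forge Remote, the FIT should be carried in the Authorization header as a bearer token. A minimal sketch of pulling it out on the Azure Function side (the helper name and the lowercase-keyed headers object are my assumptions for illustration, not from this thread; Azure Functions for Node.js exposes request headers as a plain map):

```javascript
// Sketch: extracting a bearer token (such as the Forge Invocation Token)
// from an incoming HTTP request's headers.
function extractBearerToken(headers) {
  const auth = headers['authorization'] || headers['Authorization'];
  if (!auth || !auth.startsWith('Bearer ')) {
    // No token: the request likely did not come in via Forge Remote.
    return null;
  }
  return auth.slice('Bearer '.length);
}

// Example: a request carrying "Authorization: Bearer <FIT>"
const token = extractBearerToken({ authorization: 'Bearer eyJhbGciOi...' });
```

If this returns null for requests sent with plain api.fetch(), that matches the observation above: the FIT only shows up when the call goes through the Forge Remote machinery.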

Sorry if my explanation is unclear; I have a hard time going through the documentation as I am relatively new to all this.


I see. The scheduledTrigger section is what makes the problem clear to me. You will not be able to send Forge Storage tokens via Forge Fetch, so you must use Forge Remote, which means your backend (the Azure Function) cannot be called with arbitrary API tokens; it must follow the Forge Remote model.

You’ll want to follow the docs for configuring scheduled triggers to invoke a remote backend.
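For reference, a sketch of what that manifest shape might look like (the module keys, remote key, route path, and baseUrl below are placeholders I've assumed, not from this thread; check the scheduled-trigger and Forge Remote docs for the exact fields):

```yaml
modules:
  scheduledTrigger:
    - key: storageSyncTrigger
      endpoint: storageSyncEndpoint    # points at an endpoint, not a function
      interval: hour
  endpoint:
    - key: storageSyncEndpoint
      remote: azure-backend
      route:
        path: /api/storage-sync        # assumed path on the remote
remotes:
  - key: azure-backend
    baseUrl: https://example.azurewebsites.net
```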

Overall, I’m curious why you would want to synchronize storage, rather than using your Azure Function directly as a Forge Remote back-end so that you can call the actual storage from your frontend. Or maybe I don’t understand the use case for your app as a whole?

Thanks for the reply @ibuchanan.

I see that if I use ‘endpoint’ instead of ‘function’ for the scheduledTrigger, it will trigger the endpoint instead of the function, and this would be the proper use of Forge Remote (if I am understanding correctly), which will provide the correct request headers.

Perhaps the naming of the function was a bit misleading. I am using Forge storage (the Storage API) as the actual storage, and it is the only storage I am using. I am simply trying to run a function in an Azure Function that creates the database structure for the storage, due to the timeout limit in Forge. The name storageSync comes from the fact that I am trying to ‘sync’ the storage with the Confluence spaces so that it contains correct information.

Does this answer your question? Or am I misunderstanding something?

Yes, that’s correct. Coincidentally, I have been talking to one of our technical writers about better documenting the distinction between endpoint and function, if not calling out both options under the resolver node in the manifest. In short, it’s too easy to confuse these.

Yes. And I can understand why you need to off-load such an intensive task. That said, I think the Forge team would be disappointed to learn that you need to.

I wonder if you were aware of entity properties? Specifically, if your data is 1:1 with spaces, then you don’t even need Forge storage. Your app’s data is inside Confluence, which confers all the benefits of Confluence Cloud like data residency, backup/restore, etc.
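For example, a property is written through the Confluence REST API with a small JSON payload. A sketch of shaping that payload (the helper name is mine; the actual call would be something like PUT /wiki/rest/api/content/{id}/property/{key}, and the exact version semantics should be checked against the docs):

```javascript
// Sketch: building the JSON body for a Confluence content/entity property.
// Properties store arbitrary JSON against a content item or space, so the
// app's data lives inside Confluence itself. (Payload shape assumed from
// the REST API convention of key + value + version.)
function buildPropertyPayload(key, value, currentVersion = 0) {
  return {
    key,                                     // property key on the content item
    value,                                   // arbitrary JSON value to store
    version: { number: currentVersion + 1 }  // bump the version on each update
  };
}

const payload = buildPropertyPayload('space-index', { spaces: ['DEV', 'OPS'] });
```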

Thanks for the reply.

I did not know there was something called an entity property, but I am using something similar through the REST API, namely content properties. Unfortunately, the format of my data is a bit too complex, so I would need to use Forge storage for it. I think you have answered the question of this thread, so I will mark that as the solution. Thank you for the responses.