Requesting binary data with requestJira when using the Forge Bridge

In my Custom UI app I’m attempting to fetch and display attachments on Jira issues (image files). To do this I fetch the data by calling await requestJira("/secure/attachment/<attachmentId>/attachment"), since this endpoint requires authentication, which requestJira adds to the request.

When I request a PNG image I can see it returned successfully and rendered in the browser dev tools; however, when I attempt to use it in my app (for example, by base64-encoding it and displaying it in an img tag), the image is no longer valid.
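For reference, the frontend conversion I’m attempting looks roughly like this. The helper is my own sketch, not part of @forge/bridge, and it assumes the response bytes arrive intact as an ArrayBuffer:

```javascript
// Sketch: turn raw image bytes (an ArrayBuffer) into a data URL
// that can be used directly as the src of an <img> tag.
function arrayBufferToDataUrl(buffer, mimeType) {
  const bytes = new Uint8Array(buffer);
  let binary = "";
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  // btoa expects a "binary string" where each char is one byte
  return `data:${mimeType};base64,${btoa(binary)}`;
}
```

With an intact response this would be used as imageElement.src = arrayBufferToDataUrl(await response.arrayBuffer(), "image/png").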

After some digging, it appears that the response body from requestJira has already been interpreted as text, since reading the body stream returns the PNG contents with Unicode replacement characters:


IHDR00*lPLTE !$\^a�������OO/IDATxc€m����=8����'������A(��������)-Jej�IEND�B`�
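Those � characters are consistent with the bytes having been put through a UTF-8 text decode: any byte sequence that isn’t valid UTF-8 is replaced with U+FFFD, which is irreversible. A quick standalone demonstration:

```javascript
// Decoding arbitrary binary as UTF-8 is lossy: invalid byte sequences
// become the replacement character U+FFFD and the original bytes are gone.
const pngMagic = new Uint8Array([0x89, 0x50, 0x4e, 0x47]); // "\x89PNG"
const asText = new TextDecoder("utf-8").decode(pngMagic);
console.log(asText); // 0x89 is not valid UTF-8, so it becomes U+FFFD

// Re-encoding does not round-trip: U+FFFD is 3 bytes in UTF-8,
// so the result is 6 bytes instead of the original 4.
const reEncoded = new TextEncoder().encode(asText);
console.log(reEncoded.length);
```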

I found a potentially relevant snippet of code in the async-forge-ui-iframe-renderer file (note the body: r.body ? await r.text() : void 0):

                    fetchProduct: async ({restPath: e, product: t, fetchRequestInit: n}) => {
                        const r = await B(e, t, n)
                          , {status: o, statusText: i} = r;
                        return {
                            headers: Object.fromEntries(r.headers.entries()),
                            status: o,
                            statusText: i,
                            body: r.body ? await r.text() : void 0
                        };
                    }


When I use the api.asUser().requestJira method to call this endpoint in my application backend, I see the image successfully returned as binary data and can use it correctly. Since it works in the backend implementation, I was wondering whether this is a bug in @forge/bridge and, if so, whether there is any way to work around it?

Thanks very much!


@SamSmyth I saw in this thread that you were involved in building requestJira via lambda. I was wondering whether you or anyone on your team would know about the above?

Hey @joshp,

Thanks for this report. This would be a bug, and I encourage you to create a ticket on our public feedback board FRGE so that it can be tracked and triaged more easily. Since the backend is able to fetch the image, you may have some luck using a Custom UI resolver to work around this.
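For anyone hitting the same thing: one shape such a resolver workaround could take is to fetch the attachment in the backend and return the bytes base64-encoded, since plain text survives the bridge. The helper below is an illustration only (the function name and return shape are my assumptions, not a Forge API); the resolver would call it on the ArrayBuffer from api.asUser().requestJira(...):

```javascript
// Hypothetical resolver-side helper: base64-encode attachment bytes so
// they can be returned to the Custom UI frontend as plain text.
function encodeForBridge(arrayBuffer, mimeType) {
  const base64 = Buffer.from(arrayBuffer).toString("base64");
  // The frontend can use this directly as `data:${mimeType};base64,${base64}`
  return { mimeType, base64 };
}
```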

Thanks @SamSmyth, created [FRGE-531] Unable to request binary data when using requestJira from the frontend via @forge/bridge - Ecosystem Jira


@joshp were you able to read the requestJira response body back as a stream? I was trying to do that recently but found the body was always null, though I could get my results using the .json() method on the fetch response. Normally that’d be fine (I am getting JSON data, after all). However, in this case I’m calling a bulk API with a very large result set, so I’d actually like to read the results as a stream and process them with a package like JSONStream. Just wondering if there’s a trick to get the body as a stream. I’ll look through the code you referenced in case it reveals something. @SamSmyth any tips?

Interesting @jeffryan, when I request an image attachment it appears that the body isn’t null – I can do the following:

const imageResponse = await requestJira(`/secure/attachment/${attachmentId}/attachment`);
const reader = imageResponse.body.getReader();

And then I can iterate through the reader object (which is a ReadableStream):

while (true) {
    const { done, value } = await reader.read();
    if (done) {
        break;
    }
    // process `value` (a Uint8Array chunk) here
}
Here’s a screenshot from Chrome dev tools:
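In full, that read loop can collect the chunks into a single Uint8Array. This is just a sketch, with stream standing in for imageResponse.body:

```javascript
// Sketch: drain a ReadableStream (e.g. imageResponse.body from
// requestJira) into one contiguous Uint8Array.
async function readAll(stream) {
  const reader = stream.getReader();
  const chunks = [];
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.length;
  }
  // Stitch the chunks together into one byte array
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.length;
  }
  return out;
}
```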

I wonder if it isn’t the header {accept: 'application/json'} that I’m passing on the request. I’ll experiment some more and post back if I find anything. Thanks @joshp

Well, I take back what I said about body being null… it’s actually just not defined at all in the requestJira response. And changing the accept header had no effect. It looks like you’re making your request from the client though, so perhaps it’s handled differently there than from the lambda.

Yeah sorry I should have clarified – this is using requestJira in the frontend from @forge/bridge.

body isn’t available in the @forge/api implementation used in the lambda; however, could you potentially use the arrayBuffer that is returned and convert that to a stream?

Something like this:

import { Readable } from "stream";
import Resolver from "@forge/resolver";
import api, { route } from "@forge/api";

// From
function toBuffer(ab) {
  const buf = Buffer.alloc(ab.byteLength);
  const view = new Uint8Array(ab);
  for (let i = 0; i < buf.length; ++i) {
    buf[i] = view[i];
  }
  return buf;
}

const resolver = new Resolver();

resolver.define("route", async () => {
  const response = await api.asUser().requestConfluence(route`${apiRoute}`);

  const arrayBuffer = await response.arrayBuffer();

  const buffer = toBuffer(arrayBuffer);

  // From
  const readable = new Readable();
  readable._read = () => {}; // _read is required but you can noop it
  readable.push(buffer);
  readable.push(null); // signal the end of the stream

  // Use the `readable` stream
});

export const handler = resolver.getDefinitions();
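As a side note, two simplifications are possible in recent Node versions (both my own suggestions, not from the Forge docs): Buffer.from accepts an ArrayBuffer directly, replacing the manual toBuffer loop, and Readable.from wraps a buffer into a stream without the manual _read noop:

```javascript
import { Readable } from "stream";

// Buffer.from(arrayBuffer) replaces the byte-copy loop above. Note it
// creates a view over the same memory rather than a copy, which is
// fine for read-only use.
const buffer = Buffer.from(new Uint8Array([0x89, 0x50, 0x4e, 0x47]).buffer);

// Readable.from (Node 12+) replaces `new Readable()` plus the `_read`
// noop; wrap the buffer in an array so it is emitted as one chunk.
const readable = Readable.from([buffer]);
```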

Thanks @joshp. Yes, I think I could do that and could then process the data as a stream. However, my main reason for wanting a stream was to avoid reading all the results into memory at once (processing and discarding pieces as they arrive). My guess is that the arrayBuffer is fully loaded with the results, so wrapping it in a stream wouldn’t help from a memory-pressure standpoint.

Yeah makes total sense - probably worth raising this on the feedback board like my issue above :+1: