The Forge Cache EAP is open and accepting participants!

We are offering an EAP (Early Access Program) for a new Cache Service on Forge. The service will offer partners a high-throughput, low-latency cache for ephemeral data storage.

Developers will be able to leverage Forge Cache to provide a better experience for the end customer by making their apps more responsive and pleasant to use.

At this stage, we believe we can provide a high-performance solution that supports*:

  • A high-performance, high-throughput ephemeral cache solution for Forge
  • The ability to define a flexible TTL of up to 1 hour
  • Atomicity to allow consistency in the face of concurrency
  • Operations like: Get, Set, SetIfNotExists, Delete, IncrementAndGet and DecrementAndGet
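To make the semantics of those operations concrete, here is a minimal in-memory sketch in TypeScript. The class name, method signatures, and `ttlSeconds` option are assumptions for illustration only, not the actual Forge Cache API; in particular, the real service would provide atomicity across concurrent invocations, which a single-process Map cannot demonstrate.

```typescript
// Minimal in-memory sketch of the cache semantics described above.
// NOT the Forge Cache API: names, signatures, and options are illustrative.
type Entry = { value: string; expiresAt: number };

class SketchCache {
  private store = new Map<string, Entry>();

  // Return the entry only if it has not expired; lazily evict otherwise.
  private live(key: string): Entry | undefined {
    const e = this.store.get(key);
    if (e && e.expiresAt <= Date.now()) {
      this.store.delete(key);
      return undefined;
    }
    return e;
  }

  // TTL capped at 1 hour, mirroring the EAP limit above.
  set(key: string, value: string, ttlSeconds = 3600): void {
    const ttl = Math.min(ttlSeconds, 3600);
    this.store.set(key, { value, expiresAt: Date.now() + ttl * 1000 });
  }

  get(key: string): string | undefined {
    return this.live(key)?.value;
  }

  // True only if the key was absent (atomic in the real service).
  setIfNotExists(key: string, value: string, ttlSeconds = 3600): boolean {
    if (this.live(key)) return false;
    this.set(key, value, ttlSeconds);
    return true;
  }

  delete(key: string): void {
    this.store.delete(key);
  }

  incrementAndGet(key: string): number {
    const next = Number(this.live(key)?.value ?? '0') + 1;
    this.set(key, String(next));
    return next;
  }

  decrementAndGet(key: string): number {
    const next = Number(this.live(key)?.value ?? '0') - 1;
    this.set(key, String(next));
    return next;
  }
}
```

A typical use would be `setIfNotExists` to claim a lock-like key, or `incrementAndGet` to keep a rolling counter that expires on its own after the TTL.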

If you are interested in Forge Cache and believe this storage solution will help unblock your development journey in Forge, we encourage you to sign up for the EAP.

How do you participate?

What happens after you sign up:

  • You’ll be notified once the storage feature has been enabled so you can start testing on the non-production App ID you submitted in the participation form.
  • You’ll be added to the developer community Forge Cache EAP topic, where you will be able to access announcements, updates, documentation, feedback and other resources.

We are looking forward to partnering with you and receiving your feedback so we can deliver a more powerful storage solution for you!

*Please note that these features will only be accessible to those participating in the EAP. However, we will also post relevant updates here to keep everyone informed.

Thank you!


Thanks for this. I wrote a relatively simple cache on Forge Storage because I wanted to persist some aggregated/manipulated REST API responses to speed up an app’s initial loading time. The context of this was build pipelines.

As part of this implementation I needed logic that limited the number of entries stored in Forge Storage - a kind of bounded cache where the oldest entries were removed as new entries were added. Could this be added to the cache API?
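The entry-limit behaviour described here can be sketched as a small bounded cache: when adding a key would exceed the limit, the oldest entry is dropped. This is only an illustration of the requested behaviour (relying on the fact that a JavaScript `Map` iterates in insertion order), not part of Forge Cache or Forge Storage.

```typescript
// Bounded cache: oldest entries are evicted when capacity is exceeded.
// Illustrative only; not part of the Forge Cache or Storage APIs.
class BoundedCache<V> {
  private entries = new Map<string, V>(); // Map preserves insertion order

  constructor(private maxEntries: number) {}

  set(key: string, value: V): void {
    // Re-inserting moves the key to the newest position.
    this.entries.delete(key);
    this.entries.set(key, value);
    while (this.entries.size > this.maxEntries) {
      // The oldest key is first in iteration order.
      const oldest = this.entries.keys().next().value as string;
      this.entries.delete(oldest);
    }
  }

  get(key: string): V | undefined {
    return this.entries.get(key);
  }

  get size(): number {
    return this.entries.size;
  }
}
```

With `maxEntries = 2`, inserting `a`, `b`, then `c` evicts `a`, keeping only the two most recent entries.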

Additionally, a maximum cache duration of only 1 hour is a little short. The context I outlined above needed data cached for anywhere between minutes (a running pipeline) and days (older pipelines that haven’t changed) in order to provide benefit to the end user.


Hi @jbevan thank you so much for sharing the feedback, this is quite useful. Let me discuss this with the team and get back to you.
In the meantime, I was wondering if you had a chance to opt in to the EAP? We are actively collecting feedback and would love to hear your thoughts.
Thank you!