
Migrating to BuildJet Cache

The BuildJet cache action is designed with user-friendliness at its core: there is no need to set up your own object storage or learn a new syntax. It is compatible with any runner (official, self-hosted, or BuildJet), and no matter which runner you use, BuildJet doubles the free storage, providing 20 GB/repo/week of storage space for free. Just like with the GitHub Actions cache, once you've consumed your free storage quota, we simply remove the oldest entry to make room for the new one.

To fully understand why we built a GitHub Actions cache alternative, and what its benefits are, read our launch post.


Migrating actions/cache actions

Migrating to BuildJet's cache from actions/cache is a straightforward process. The two caching systems are fully compatible; simply replace actions/cache with buildjet/cache in your workflow. Once done, you'll be able to leverage BuildJet's superior reliability and speed.

```diff
  ...
- - uses: actions/cache@v3
+ - uses: buildjet/cache@v3
    with:
      path: ~/.npm
      key: buildjet-node-${{ hashFiles('**/package-lock.json') }}
      restore-keys: |
        buildjet-node-
  ...
```

Furthermore, if you're using actions/cache/restore and actions/cache/save, simply replace them with buildjet/cache/restore and buildjet/cache/save respectively.
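For instance, a job that restores the cache up front and saves it at the end would change only the action names. This is a sketch: the path, key, and step id are illustrative, and it assumes the BuildJet variants expose the same `cache-primary-key` output as actions/cache/restore.

```yaml
- uses: buildjet/cache/restore@v3
  id: restore-npm-cache
  with:
    path: ~/.npm
    key: buildjet-node-${{ hashFiles('**/package-lock.json') }}

# ... build and test steps ...

- uses: buildjet/cache/save@v3
  with:
    path: ~/.npm
    key: ${{ steps.restore-npm-cache.outputs.cache-primary-key }}
```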

As BuildJet's cache is fully compatible with actions/cache, we refer you to the official actions/cache documentation for detailed instructions. You can find it here.


Migrating actions/setup-* actions

BuildJet also provides drop-in replacements for the most popular setup actions. Today, BuildJet has compatible setup actions for setup-node, setup-python, setup-java, setup-go, and setup-dotnet. They all have the same syntax as their official counterparts, but use BuildJet's reliable and fast cache under the hood.

Simply replace actions/setup-* with buildjet/setup-* in your workflow, and you're good to go. To take full advantage of the new BuildJet cache in your setup action, make sure you enable caching by setting the cache property to npm (or pip, gradle, etc.). Please check the documentation of your setup action.

Please note that we only support the latest version of each setup action. If you're using an older version, you'll need to upgrade, or use the BuildJet cache directly.

For example (using setup-node):

```diff
  ...
  - name: Setup node
-   uses: actions/setup-node@v3
+   uses: buildjet/setup-node@v3
    with:
      node-version: 14
+     cache: npm
  ...
```

If we don't support your setup action, or you're not using one, you can still use BuildJet's cache to cache your dependencies. You just specify the path to the directory you want to cache, and a key to identify the cache. Here is a list of examples for popular languages and tools that don't have an official BuildJet setup-* action.


Rust

Cargo

```yaml
- uses: buildjet/cache@v3
  with:
    path: |
      ~/.cargo/bin/
      ~/.cargo/registry/index/
      ~/.cargo/registry/cache/
      ~/.cargo/git/db/
      target/
    key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
```


PHP

Composer

```yaml
- name: Get Composer Cache Directory
  id: composer-cache
  run: |
    echo "dir=$(composer config cache-files-dir)" >> $GITHUB_OUTPUT
- uses: buildjet/cache@v3
  with:
    path: ${{ steps.composer-cache.outputs.dir }}
    key: ${{ runner.os }}-composer-${{ hashFiles('**/composer.lock') }}
    restore-keys: |
      ${{ runner.os }}-composer-
```


Deno

```yaml
- uses: buildjet/cache@v3
  with:
    path: |
      ~/.deno
      ~/.cache/deno
    key: ${{ runner.os }}-deno-${{ hashFiles('**/deps.ts') }}
```


Swift

Carthage

```yaml
- uses: buildjet/cache@v3
  with:
    path: Carthage
    key: ${{ runner.os }}-carthage-${{ hashFiles('**/Cartfile.resolved') }}
    restore-keys: |
      ${{ runner.os }}-carthage-
```

CocoaPods

```yaml
- uses: buildjet/cache@v3
  with:
    path: Pods
    key: ${{ runner.os }}-pods-${{ hashFiles('**/Podfile.lock') }}
    restore-keys: |
      ${{ runner.os }}-pods-
```

Swift Package Manager

```yaml
- uses: buildjet/cache@v3
  with:
    path: .build
    key: ${{ runner.os }}-spm-${{ hashFiles('**/Package.resolved') }}
    restore-keys: |
      ${{ runner.os }}-spm-
```

Mint

```yaml
env:
  MINT_PATH: .mint/lib
  MINT_LINK_PATH: .mint/bin
steps:
  - uses: buildjet/cache@v3
    with:
      path: .mint
      key: ${{ runner.os }}-mint-${{ hashFiles('**/Mintfile') }}
      restore-keys: |
        ${{ runner.os }}-mint-
```


Elixir

Mix

```yaml
- uses: buildjet/cache@v3
  with:
    path: |
      deps
      _build
    key: ${{ runner.os }}-mix-${{ hashFiles('**/mix.lock') }}
    restore-keys: |
      ${{ runner.os }}-mix-
```

Rebar3

```yaml
- uses: buildjet/cache@v3
  with:
    path: |
      ~/.cache/rebar3
      _build
    key: ${{ runner.os }}-erlang-${{ env.OTP_VERSION }}-${{ hashFiles('**/*rebar.lock') }}
    restore-keys: |
      ${{ runner.os }}-erlang-${{ env.OTP_VERSION }}-
```


Flutter

yaml
1
- name: Set Flutter cache variables
2
shell: bash
3
id: set_flutter_cache_vars
4
run: |
5
echo "cache_key=flutter-${{runner.os}}-stable-${{env.FLUTTER_VERSION}}-${{runner.arch}}" >> $GITHUB_OUTPUT
6
cache_path="${{runner.tool_cache}}/flutter/stable-${{env.FLUTTER_VERSION}}-${{runner.arch}}"
7
echo "cache_path=$(echo ${cache_path} | tr '[:upper:]' '[:lower:]')" >> $GITHUB_OUTPUT
8
- name: Cache Flutter
9
uses: buildjet/cache@v3
10
with:
11
path: ${{steps.set_flutter_cache_vars.outputs.cache_path}}
12
key: ${{steps.set_flutter_cache_vars.outputs.cache_key}}-${{hashFiles('**/pubspec.lock')}}
13
restore-keys: |
14
${{steps.set_flutter_cache_vars.outputs.cache_key}}
15
- name: Setup Flutter
16
uses: subosito/flutter-action@v2
17
with:
18
channel: stable
19
flutter-version: ${{env.FLUTTER_VERSION}}
#

Haskell

Cabal

We cache the elements of the Cabal store separately, as the entirety of ~/.cabal can grow very large for projects with many dependencies.

```yaml
- name: Cache ~/.cabal/packages, ~/.cabal/store and dist-newstyle
  uses: buildjet/cache@v3
  with:
    path: |
      ~/.cabal/packages
      ~/.cabal/store
      dist-newstyle
    key: ${{ runner.os }}-${{ matrix.ghc }}-${{ hashFiles('**/*.cabal', '**/cabal.project', '**/cabal.project.freeze') }}
    restore-keys: ${{ runner.os }}-${{ matrix.ghc }}-
```

Stack

```yaml
- uses: buildjet/cache@v3
  name: Cache ~/.stack
  with:
    path: ~/.stack
    key: ${{ runner.os }}-stack-global-${{ hashFiles('stack.yaml') }}-${{ hashFiles('package.yaml') }}
    restore-keys: |
      ${{ runner.os }}-stack-global-
- uses: buildjet/cache@v3
  name: Cache .stack-work
  with:
    path: .stack-work
    key: ${{ runner.os }}-stack-work-${{ hashFiles('stack.yaml') }}-${{ hashFiles('package.yaml') }}-${{ hashFiles('**/*.hs') }}
    restore-keys: |
      ${{ runner.os }}-stack-work-
```


R

renv

```yaml
- name: Set RENV_PATHS_ROOT
  shell: bash
  run: |
    echo "RENV_PATHS_ROOT=${{ runner.temp }}/renv" >> $GITHUB_ENV
- name: Install and activate renv
  run: |
    install.packages("renv")
    renv::activate()
  shell: Rscript {0}
- name: Get R and OS version
  id: get-version
  run: |
    cat("##[set-output name=os-version;]", sessionInfo()$running, "\n", sep = "")
    cat("##[set-output name=r-version;]", R.Version()$version.string, sep = "")
  shell: Rscript {0}
- name: Restore Renv package cache
  uses: buildjet/cache@v3
  with:
    path: ${{ env.RENV_PATHS_ROOT }}
    key: ${{ steps.get-version.outputs.os-version }}-${{ steps.get-version.outputs.r-version }}-${{ inputs.cache-version }}-${{ hashFiles('renv.lock') }}
    restore-keys: ${{ steps.get-version.outputs.os-version }}-${{ steps.get-version.outputs.r-version }}-${{ inputs.cache-version }}-
```


Bazel

bazelisk does not have to be separately downloaded and installed, because it's already included in GitHub's ubuntu-latest and macos-latest base images.

```yaml
- name: Cache Bazel
  uses: buildjet/cache@v3
  with:
    path: |
      ~/.cache/bazel
    key: ${{ runner.os }}-bazel-${{ hashFiles('.bazelversion', '.bazelrc', 'WORKSPACE', 'WORKSPACE.bazel', 'MODULE.bazel') }}
    restore-keys: |
      ${{ runner.os }}-bazel-
```

Migrating Docker Cache

Docker layer caching is a feature of Docker BuildKit that allows Docker to reuse layers from previous builds, leading to much faster build times. However, if you're using Docker's experimental type=gha cache backend, you may encounter instability and speed issues, as it uses the same underlying storage as actions/cache.
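For reference, the type=gha setup this section migrates away from typically looks like the following fragment of a docker/build-push-action step (the image tag is illustrative):

```yaml
- name: Build and push
  uses: docker/build-push-action@v4
  with:
    context: .
    push: true
    tags: user/app:latest
    cache-from: type=gha
    cache-to: type=gha,mode=max
```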

To improve your Docker layer caching performance and stability, we recommend using inline caching in most cases.

The inline cache is stored in the registry, embedded into the main image, and pulled alongside it. Because of the way the inline cache is organized, it only supports min mode; if you want max mode, you'll need to use the registry cache.

Here is an example workflow using type=inline to cache the Docker layers:
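If you do need max mode, a registry-backed cache can be pushed to a separate tag instead of being embedded in the main image. This is a sketch: the user/app:buildcache tag is illustrative, and the surrounding login and Buildx setup steps are assumed to match the inline-cache workflow below.

```yaml
- name: Build and push
  uses: docker/build-push-action@v4
  with:
    context: .
    push: true
    tags: user/app:latest
    cache-from: type=registry,ref=user/app:buildcache
    cache-to: type=registry,ref=user/app:buildcache,mode=max
```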

```diff
  name: Build and push Docker image
  on: push
  jobs:
    docker:
      runs-on: buildjet-2vcpu-ubuntu-2204
      steps:
        - name: Checkout
          uses: actions/checkout@v3

        - name: Set up Docker Buildx
          uses: docker/setup-buildx-action@v2

        - name: Login to Docker Hub
          uses: docker/login-action@v2
          with:
            username: ${{ secrets.DOCKERHUB_USERNAME }}
            password: ${{ secrets.DOCKERHUB_TOKEN }}

        - name: Build and push
          uses: docker/build-push-action@v4
          with:
            context: .
            push: true
            tags: user/app:latest
+           cache-from: type=registry,ref=user/app:latest
+           cache-to: type=inline
```

Note that the first time you run this workflow, the cache will be empty, so the build will take a bit longer, as Docker has to build all the layers from scratch. On subsequent runs, however, Docker can reuse the layers from the previous build, leading to much faster build times.


Deleting BuildJet Cache Entries

When your free storage quota of 20 GB/repo/week is consumed, BuildJet automatically removes the oldest cache entry to make room for new ones. However, manual deletion can help you better manage your cache, especially when handling outdated or irrelevant data.


Manually Deleting Cache Entries

The BuildJet Cache is accessible to all GitHub Actions users, regardless of whether you're using BuildJet. This includes those on self-hosted or official runners. To extend cache management capabilities, we've introduced the buildjet/cache-delete action. This action enables users to delete their cache entries without requiring a BuildJet sign-up. Moreover, it ensures authenticated and authorized deletions, safeguarding other users' cache entries from accidental deletion.

To manually delete an entry from the BuildJet cache, you simply need to add a .github/workflows/delete-buildjet-cache.yml file to your repository with the following content:

```yaml
name: Manually Delete BuildJet Cache
on:
  workflow_dispatch:
    inputs:
      cache_key:
        description: 'BuildJet Cache Key to Delete'
        required: true
        type: string
jobs:
  manually-delete-buildjet-cache:
    runs-on: buildjet-2vcpu-ubuntu-2204
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - uses: buildjet/cache-delete@v1
        with:
          cache_key: ${{ inputs.cache_key }}
```

This workflow creates a manually-run action that can be executed from the Actions tab in your repository. Once this workflow has been added to your repository, navigate to the Actions tab, select the Manually Delete BuildJet Cache workflow in the sidebar, and click the Run workflow button.

You will then be prompted to input the cache key you wish to delete. After inputting this cache key, the workflow will start, deleting the corresponding cache entry.

The cache key can be found in the logs of any workflow that uses the BuildJet cache. It should look like this:

```text
Run buildjet/cache@v3
Received 11505181 of 24752374 (46.5%), 11.0 MBs/sec
Received 24752374 of 24752374 (100.0%), 14.0 MBs/sec
Cache Size: ~24 MB (24752374 B)
/usr/bin/tar -xf /home/runner/work/_temp/f6d0f5ad-56bc-487f-a047-6b5f483e40b8/cache.tzst -P -C /home/runner/work/cache-delete/cache-delete --use-compress-program unzstd
Cache restored successfully
Cache restored from key: Linux-npm-73339479b843f24de506811a29b7519e82adab1f40d83fecff88668fa0dc47ff
```

Simply copy the cache key from the logs and paste it into the input field when running the Manually Delete BuildJet Cache workflow.