Rules and Macros

To customize Bazel, you'll need to expand the capabilities of the BUILD file with your own rules.

By the end of this section, you'll have written a simple macro to provide a customized developer experience.

Existing rules

The first approach is simply to learn more of the APIs that are already available. If the problem you're solving isn't novel, other engineers have probably run into it before and may have provided a solution you can simply adopt.

There are some libraries with many useful rules, such as bazel-skylib and aspect_bazel_lib (both used in examples later in this section).

You should skim through these to form a rough memory of what's available. This way when you encounter an interesting problem while writing a BUILD file, you can search on this site to find that useful nugget.

You can also look through example repositories like aspect bazel-examples to find solutions to problems similar to yours. If you don't find a solution, consider donating a Feature Bounty on our OpenCollective and we can add it for you.

Key building block: run_binary

This rule is an "adapter" from an executable (something you could bazel run) to an action (something you can bazel build).

The executable (called the "tool" here) runs in a single action: Bazel spawns the tool with some declared inputs, and it produces some declared, default outputs.

caution

Bazel's built-in genrule looks a lot like run_binary, but it's best to avoid it.

  • Arbitrary bash one-liner, commonly non-hermetic
  • Bash dependency hurts portability
  • Subtly different semantics for expand_location, stamp, etc.

Here's a sample usage, which runs my_tool with three arguments to produce a folder called dir_a:

  1. The path to some.file which is the only input
  2. An --outdir flag, which we know from reading the CLI documentation for my_tool.
    • We're always required to predict what path the tool will write to. If you get it wrong, Bazel will error that the "output was not produced".
  3. A syntax-sugar shorthand for "the output folder Bazel assigns for this action"
BUILD.bazel
load("@aspect_bazel_lib//lib:run_binary.bzl", "run_binary")

run_binary(
    name = "dir_a",
    srcs = ["some.file"],
    args = ["$(execpath some.file)", "--outdir", "$(@D)"],
    execution_requirements = {"no-remote": "1"},
    mnemonic = "SomeAction",
    out_dirs = ["dir_a"],
    progress_message = "doing some work to make %{output}",
    tool = ":my_tool",
)

It also has a mnemonic, a tag that allows all similar actions to be configured together. It prints a custom progress message so that, if the action takes substantial time to execute, the user knows what they are waiting for. It also passes custom execution_requirements, in this case opting out of remote execution and remote caching.

note

The js_run_binary rule takes it a step further (see the sketch after this list), adding the ability to:

  • capture stdout/stderr/exit code as "outputs"
  • chdir to a specific working directory
  • throw away logspam on success
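
Here's a hedged sketch of those features, assuming aspect_rules_js is already set up and ":my_js_tool" is a js_binary target; the attribute names are as we recall them from rules_js, so treat this as illustrative rather than authoritative.

BUILD.bazel
load("@aspect_rules_js//js:defs.bzl", "js_run_binary")

js_run_binary(
    name = "lint_report",
    srcs = ["some.file"],
    args = ["some.file"],
    # Run the tool from this package rather than the execroot.
    chdir = package_name(),
    # Capture the exit code as a declared output instead of failing the action.
    exit_code_out = "lint_exit_code",
    # Throw away logspam when the tool succeeds.
    silent_on_success = True,
    # Capture stdout as a declared output that downstream targets can consume.
    stdout = "lint_report.txt",
    tool = ":my_js_tool",
)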

Making tools work

The tool in run_binary can be any executable. However, some tools don't work the way Bazel expects. This can usually be fixed without changing the tool, which is good since most tools are written by third parties who don't care about your Bazel migration problems!

caution

Googlers got in the habit of rewriting everything to work with Blaze. Do not follow their lead! Changing more than one thing at a time makes your migration riskier.

You can make most tools work under Bazel by asking: "How can the tool tell that it's running under Bazel?"

There are three ways to make the tool think it's still running outside Bazel:

  1. "Monkey-patch" the runtime
    • Node.js: use the --require flag to preload a patch before the tool runs
    • JVM: inject a shadowing class ahead of the tool's own class on the classpath
  2. In-process wrapper
    • Peel one layer off the tool's CLI
    • Write your own CLI that calls its entry point
  3. Parent process wrapper
    • Often a short Bash script (see the sketch below)
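
The third approach is usually the easiest to start with. Here's a hedged sketch using Bazel's native sh_binary; the labels and the wrapper script are hypothetical:

BUILD.bazel
# A parent-process wrapper: the upstream tool stays unmodified.
sh_binary(
    name = "tool_wrapped",
    # Hypothetical script that fixes up paths/env vars, then exec's the real tool.
    srcs = ["tool_wrapper.sh"],
    # Hypothetical label for the unmodified third-party binary.
    data = ["@third_party_tool//:bin"],
)

You can then pass ":tool_wrapped" as the tool of run_binary, exactly as in the example above.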

Comparing Rules and Macros

So far, we've used features that ship with Bazel or with rulesets. What do we do when we need something more?

First, recall that an Action is a transformation from some inputs to some outputs, by spawning a tool.

Definition

A "Rule" extends Bazel to understand how to produce an action subgraph from the user's dependency graph.

Features:

  • Output Groups: Multiple named sets of outputs
  • Can run multiple actions. Which actions run depends on which outputs are requested.
  • Interop API with other rules: "Providers"
  • Walk the dependency graph: "Aspects"

We'll learn to write a custom rule later; however, rules are an advanced topic and aren't needed in many cases.

Macros are significantly easier to write than custom rules.

If you're a product engineer and rarely interact with Bazel internals, it's likely not worth your time to learn how to write a custom rule, and you can nearly always accomplish your goal with a macro instead.

So, we'll learn about the more usable alternative first: Macros.

Macros

Bazel Macros are like pre-processor definitions, which compose existing rules in a novel way and provide "syntax sugar" to developers who call them from BUILD files.

Bazel design feature

At a BUILD file usage site, you cannot distinguish macro from rule

This is to allow a rule to be wrapped with a macro without a breaking change.

Thanks to this design, we can start by imagining the right way for a user to express their "bare facts" in the BUILD file, then write Starlark code that supports it. We can begin with a macro, since macros are much easier, and introduce a custom rule later if the requirements make it necessary.

leaky abstraction

If you run bazel print (Aspect CLI only), which is a syntactic operation on the BUILD file, you see the macro as it was called.

However, macros are expanded during the loading phase, so if you run bazel query you'll see the result of the macro expansion.

If the macro is named differently from the underlying rule, this can be confusing for users and also affect usability, e.g. --test_lang_filters applies to the underlying rule's name.

A macro is just a function definition in a .bzl file which composes some existing rules.

my.bzl
# Load the rule you are wrapping, e.g. load("//some:rules.bzl", "some_rule")

def my_macro(name, srcs, **kwargs):
    some_rule(
        name = name,
        srcs = srcs,
        **kwargs
    )

The run_binary rule introduced earlier is a great candidate for the some_rule here.
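
For instance, here's a hedged sketch that hides the run_binary boilerplate from the earlier example behind a one-argument macro (the //tools:my_tool label and macro name are hypothetical):

my_tool.bzl
load("@aspect_bazel_lib//lib:run_binary.bzl", "run_binary")

def my_tool_output(name, src, **kwargs):
    """Run my_tool on a single source file, producing a directory named after the target."""
    run_binary(
        name = name,
        srcs = [src],
        args = ["$(execpath {})".format(src), "--outdir", "$(@D)"],
        out_dirs = [name],
        tool = "//tools:my_tool",
        **kwargs
    )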

Example 1

This example just wraps a single run_binary rule; in this case, the tool is "mocha", a third-party package fetched from npm.

Usage:

BUILD.bazel
load("//examples/macro:mocha.bzl", "mocha_test")

mocha_test(
    name = "test",
    srcs = ["test.js"],
)

Definition:

mocha.bzl
"Example macro wrapping the mocha CLI"

load("@npm//examples/macro:mocha/package_json.bzl", "bin")

def mocha_test(name, srcs, args = [], data = [], env = {}, **kwargs):
    bin.mocha_test(
        name = name,
        args = [
            "--reporter",
            "mocha-multi-reporters",
            "--reporter-options",
            "configFile=$(location //examples/macro:mocha_reporters.json)",
            native.package_name() + "/*test.js",
        ] + args,
        data = data + srcs + [
            "//examples/macro:mocha_reporters.json",
            "//examples/macro:node_modules/mocha-multi-reporters",
            "//examples/macro:node_modules/mocha-junit-reporter",
        ],
        env = dict(env, **{
            # Add environment variable so that mocha writes its test xml
            # to the location Bazel expects.
            "MOCHA_FILE": "$$XML_OUTPUT_FILE",
        }),
        **kwargs
    )

Example 2

This example composes a few building blocks from bazel_skylib and aspect_bazel_lib.

Usage:

BUILD.bazel
ts_project(
    name = "strip",
    tsconfig = {
        # Demonstrating that rootDir compilerOption works the same as the
        # root_dir attribute.
        "compilerOptions": {
            "rootDir": "subdir",
        },
    },
)

assert_outputs(
    name = "strip_test",
    actual = "strip",
    expected = [
        "examples/root_dir/a.js",
        "examples/root_dir/deep/subdir/b.js",
    ],
)

Definition:

testing.bzl
"helpers for test assertions"

load("@bazel_skylib//rules:diff_test.bzl", "diff_test")
load("@bazel_skylib//rules:write_file.bzl", "write_file")
load("@bazel_skylib//lib:types.bzl", "types")
load("@aspect_bazel_lib//lib:params_file.bzl", "params_file")

def assert_outputs(name, actual, expected):
    """Assert that the default outputs of actual are the expected ones

    Args:
        name: name of the resulting diff_test
        actual: string of the label to check the outputs
        expected: expected outputs
    """

    if not types.is_list(expected):
        fail("expected should be a list of strings")
    params_file(
        name = "_actual_" + name,
        data = [actual],
        args = ["$(rootpaths {})".format(actual)],
        out = "_{}_outputs.txt".format(name),
    )
    write_file(
        name = "_expected_" + name,
        content = expected,
        out = "_expected_{}.txt".format(name),
    )
    diff_test(
        name = name,
        file1 = "_expected_" + name,
        file2 = "_actual_" + name,
    )

Example 3

This example creates a macro wrapping a repository rule rather than a build rule. (Actually, it uses alias, which is even shorter than a macro and simply passes everything through to the selected target.)

It uses select to pick a binary for the host platform, bypassing the need for toolchains, which are a tricky part of custom rules.

caution

If you run an unconfigured build (e.g. with bazel query), then select will eagerly load every label on the right-hand side. This eagerly fetches tools which don't run on the host platform and wastes the developer's time.

This is a good reason to get in the habit of always using bazel cquery instead, so that the build is configured.

Usage:

WORKSPACE.bazel
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# `version` is assumed to be defined earlier in this file.
http_archive(
    name = "terraform_macos_aarch64",
    build_file_content = "exports_files([\"terraform\"])",
    sha256 = "ff92cd79b01d39a890314c2df91355c0b6d6815fbc069ccaee9da5d8b9ff8580",
    urls = ["https://releases.hashicorp.com/terraform/{0}/terraform_{0}_darwin_arm64.zip".format(version)],
)
BUILD.bazel
alias(
    name = "terraform_binary",
    actual = select({
        "//platforms/config:linux_x86_64": "@terraform_linux_x86_64//:terraform",
        "//platforms/config:macos_aarch64": "@terraform_macos_aarch64//:terraform",
        "//platforms/config:macos_x86_64": "@terraform_macos_x86_64//:terraform",
    }),
)

When a Macro isn't enough

Rules create actions, which transform inputs to outputs.

Using ts_project as an example, this couldn't be a macro for several reasons:

  1. It creates a tree of actions, which might use one tool to transpile .js outputs, and a different tool for producing TypeScript typings (.d.ts files).
  2. It requires that srcs have a JsInfo provider so that it can understand their structure.
  3. It produces a JsInfo provider for interop with downstream rules that depend on it.
note

Even when Providers get in your way of "just using a macro", you can often write a tiny adapter rule and then put most of your logic in a more easily understood macro.

For example, the adapter sketched below takes the ProtoInfo from its sources and re-exposes the .proto files as a DefaultInfo output.
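
Here's a hedged sketch of such an adapter; the rule and file names are made up, and on recent Bazel versions you may need to load ProtoInfo from the protobuf rules rather than use the builtin.

proto_sources.bzl
def _proto_sources_impl(ctx):
    # Re-expose the transitive .proto files as this target's default outputs,
    # so a macro (e.g. one wrapping run_binary) can consume them as plain files.
    return [DefaultInfo(files = depset(transitive = [
        dep[ProtoInfo].transitive_sources
        for dep in ctx.attr.srcs
    ]))]

proto_sources = rule(
    implementation = _proto_sources_impl,
    attrs = {
        "srcs": attr.label_list(providers = [ProtoInfo]),
    },
)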

Try it: write a macro

Add any macro in your repository, even a trivial one.

Then change one of your BUILD files to call the macro.
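
If you need inspiration, here's a hedged, minimal example; the names are made up, and the macro only adds a tag so its targets are easy to find with a query later.

defs.bzl
def tagged_filegroup(name, srcs, **kwargs):
    # The smallest useful macro: wrap a native rule and add a conventional tag.
    native.filegroup(
        name = name,
        srcs = srcs,
        tags = kwargs.pop("tags", []) + ["from-macro"],
        **kwargs
    )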

Writing a custom rule

Pro Feature

The rest of this section is only available to Aspect Pro subscribers and training customers.

Sign up for our training course: https://www.aspect.dev/services#training