Development¶
Initial setup¶
Fork the repository on GitHub, then clone the fork:
git clone git@github.com:YourGithubUserName/bartz.git
cd bartz
Install R and uv (for example, with Homebrew do brew install r uv). Then run
make setup
to set up the Python and R environments. (Note: at the time of writing, the R installation instructions for Ubuntu omit a final sudo apt install r-base-dev step.)
The Python environment is managed by uv. To run commands that involve the Python installation, do uv run <command>. For example, to start an IPython shell, do uv run ipython. Alternatively, do source .venv/bin/activate to activate the virtual environment in the current shell.
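If in doubt whether a command is running inside the project environment, you can check from Python itself whether the interpreter belongs to a virtual environment. A minimal, generic sketch (not bartz-specific):

```python
import sys

def in_virtualenv() -> bool:
    """Return True when running inside a venv (such as .venv created by uv)."""
    # In a virtual environment, sys.prefix points at the venv directory,
    # while sys.base_prefix points at the base interpreter installation.
    return sys.prefix != sys.base_prefix
```

Running this under uv run python (or after source .venv/bin/activate) should report True.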
The R environment is automatically active when you use R in the project directory.
We don’t support using conda’s R, though it might work.
Contributing¶
To contribute code changes to the main repository, create a pull request from your fork to the main repo.
Pre-defined commands¶
Development commands are defined in a makefile. Run make without arguments to list the targets. All targets that simply invoke a tool with the right command line arguments accept an ARGS variable for extra arguments, for example:
make tests ARGS='-k test_pigs_fly'
will invoke something like
uv run pytest --foo=1 --bar=128 --etc-etc -k test_pigs_fly
Documentation¶
To build the documentation for the current working copy, do
make docs
To build the documentation for the latest release tag, do
make docs-latest
To debug the documentation build, do
make docs SPHINXOPTS='--fresh-env --pdb'
Unit tests¶
The typical workflow to debug new changes is to first run all tests with
make tests
Then, if some tests fail, use pytest directly to run and debug only the relevant tests, e.g., with
uv run pytest --lf --sw --pdb
Here, --lf selects only the tests that failed in the last run, --sw stops at the first failing test and resumes from it on the next run, and --pdb opens the Python debugger at the point where the test failed. Another useful option is -k <pattern>, which selects only tests whose name matches <pattern>.
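As an illustration of how -k matches names, consider a hypothetical test file (test_pigs_fly is used here as a made-up example, not an actual bartz test):

```python
# contents of a hypothetical test_pigs.py
def test_pigs_fly():
    assert "pig" + "s" == "pigs"

def test_cows_moo():
    assert "moo".upper() == "MOO"
```

With this file, uv run pytest -k pigs would select only test_pigs_fly, because -k matches substrings of test names.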
Debugging dependencies¶
To debug tests that fail with old versions of dependencies, it’s convenient to piggyback on the predefined make target using ARGS:
make tests-old ARGS='-n0 -k test_pigs_fly'
Here, -n0 disables test parallelization.
For more fine-grained control, it’s useful to invoke uv directly with the --with option; e.g., the following command starts an IPython shell equipped with specific versions of Python and JAX:
uv run --with='jax<0.7,jaxlib<0.7' --isolated --python=3.11 --dev python -m IPython
Benchmarks¶
The benchmarks are managed with asv. The basic asv workflow is:
uv run asv run # run and save benchmarks on main branch
uv run asv publish # create html report
uv run asv preview # start a local server to view the report
asv run writes the results into files under ./benchmarks. These files are tracked by git, so consider not committing every result generated while developing.
There are a few make targets for common asv commands. The most useful command during development is
make asv-quick ARGS='--bench <pattern>'
This runs only benchmarks whose name matches <pattern>, only once, within the working copy and current Python environment.
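New benchmarks follow asv’s naming conventions: files in the benchmarks directory containing functions or methods whose names start with time_ (or mem_, peakmem_, and so on), with an optional setup method run before each measurement. A minimal hypothetical example, not an actual bartz benchmark:

```python
# hypothetical benchmarks/bench_example.py
class TimeSuite:
    """asv times each time_* method; setup runs before each measurement."""

    def setup(self):
        # Build the input once per measurement, outside the timed region.
        self.data = list(range(10_000))

    def time_sum(self):
        # Only the body of this method is timed by asv.
        sum(self.data)
```

With this file in place, make asv-quick ARGS='--bench time_sum' would time only this benchmark.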
Profiling¶
Use the JAX profiling utilities to profile bartz. They work well on GPU, but not on CPU.
from jax.profiler import trace, ProfileOptions
from jax import block_until_ready
from bartz.BART import gbart
traceopt = ProfileOptions()
# this setting makes Python function calls show up in the trace
traceopt.python_tracer_level = 1
# on cpu, this makes the trace detailed enough to understand what's going on
# even within compiled functions by manual inspection of each operation
traceopt.host_tracer_level = 2
with trace('./trace_results', profiler_options=traceopt):
    bart = gbart(...)
    block_until_ready(bart)
On the first run, the trace will include compilation operations, while subsequent runs (within the same Python shell) will be warmed up. Start an xprof server to visualize the results:
$ uvx --python 3.13 xprof ./trace_results
[...]
XProf at http://localhost:8791/ (Press CTRL+C to quit)
Open the provided URL in a browser. In the sidebar, select the tool “Trace Viewer”.