Switching from conda to pixi

Many improved, fully compatible alternatives to conda exist today, and pixi has become my preferred replacement. This post explains why, and how to get started.

Why conda was (previously) chosen

Before explaining why I left conda, it is only fair to recognize why it became the dominant choice for bioinformaticians in the first place.

Most package managers are language-specific: pip and uv for Python, Bioconductor for R, cargo for Rust. Conda is different. It manages packages in any language, including pre-compiled C/C++ binaries, Python libraries, R packages, and command-line tools, all from the same interface. For bioinformaticians who chain together Python scripts, R analyses, and compiled aligners in a single workflow, this matters.

The bioconda channel hosts over 9,000 bioinformatics tools (samtools, STAR, DESeq2, nextflow, and thousands more) maintained by contributors across the field. Because bioconda is built on top of conda’s packaging infrastructure, any tool in bioconda is one conda install away, with dependencies handled automatically. This lowered the barrier to installing and sharing scientific software more than any other single project.

Conda also supports Linux, macOS, and Windows on both x86 and ARM. In collaborative environments where team members use different operating systems and analyses eventually need to run on HPC clusters, cross-platform support is not optional.

These strengths made conda the de facto standard. Conda was a great idea and a pioneer, but the ecosystem grew faster than the conda maintainers could keep up.


Background: the birth of pixi

Back in 2019, Wolf Vollprecht, a scientific software engineer at QuantStack, was frustrated by how slow conda had become. As the conda-forge channel grew to tens of thousands of packages, the “Solving environment…” spinner could run for minutes, or never finish at all. In a blog post titled Making conda fast again, he introduced mamba: a drop-in conda replacement that reimplemented the dependency solver in C++, backed by libsolv (the same solver used by RPM-based Linux distributions). Mamba solved environments in seconds where conda took minutes. It became widely adopted, and Anaconda eventually made libmamba conda’s default solver in late 2023 (conda 23.10.0).

Vollprecht also started boa, a faster replacement for conda-build, but it never took off. The real successor came later.

Around 2022, Vollprecht left QuantStack to found prefix.dev, a company focused on package management. There, together with engineer Bas Zalmstra, the team built rattler: a suite of Rust libraries that implement the full conda package ecosystem (reading metadata, solving dependencies, downloading, and installing) without needing Python at all. Rattler proved solid enough that in October 2024, the conda GitHub organization adopted it, which says something about community confidence in the project.

On top of rattler, prefix.dev built two tools. pixi, launched on August 16, 2023, is the environment and dependency manager. rattler-build, also built on rattler, is the package builder: a Rust-based replacement for conda-build (and the archived boa) that compiles source recipes into conda packages. If you maintain bioconda or conda-forge recipes, rattler-build is worth knowing about; for most users who only consume packages, pixi is the relevant tool. The motivating blog post for the pixi launch was called Let’s stop dependency hell.

The lineage in brief:

conda (before v23.10, Python, slow solver)
  └─> mamba (2019) — C++ reimplementation
        └─> boa (2020) — faster conda-build, later archived
rattler (2022) — Rust crate library
  ├─> pixi (2023) — project-based environment manager
  └─> rattler-build (2023) — conda package builder, successor to boa

Why leave conda?

No real global packages

The base environment of conda is not truly “global”. Once you activate another environment, you lose access to packages installed in base. As a bioinformatician, there are tools I need everywhere and every day (samtools, bedtools, minimap2) and it would be nice to have those available system-wide without switching between environments. And if conda is managed by your server admin, updating or installing packages into the base environment without root privilege is a nightmare.

Slow solving and dependency conflicts

Once an environment grows large (or sometimes when it is not that large), conda can take a very long time to resolve dependencies. I have waited half an hour before conda either found a solution or reported a conflict. Mamba solves things much faster, but it can still throw dependency conflicts or remove existing packages to work around them. (Nevertheless, I still recommend mamba if you do not want to jump ship from conda entirely.)

A real cross-language package manager

To my knowledge, conda (and now pixi) is the only mainstream package manager that works across multiple programming languages. uv is excellent, but Python-only. cargo is one of my favorites, but Rust-only. Bioinformaticians work with software written in Python, R, C++, Rust, and sometimes Perl. Having one tool that handles all of these while pulling from bioconda and conda-forge matters. Pixi also has better pip compatibility than conda, which only partially supports mixing pip-installed packages. (This Stack Overflow discussion covers the conda-and-pip problem well.)
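To illustrate what mixing ecosystems looks like in practice, here is a hedged sketch of a pixi.toml that declares conda and PyPI packages side by side. The project name, package names, and version pins are illustrative, and the exact table names can vary between pixi versions:

```toml
# Hypothetical pixi.toml sketch: conda and PyPI packages in one environment.
# All names and versions here are examples, not a recommendation.
[project]
name = "my_analysis"
channels = ["conda-forge", "bioconda"]
platforms = ["linux-64"]

[dependencies]          # resolved from conda channels
python = "3.11.*"
samtools = "*"

[pypi-dependencies]     # resolved from PyPI into the same environment
pysam = "*"
```

On the command line, pixi add targets conda channels by default, while pixi add --pypi adds a PyPI dependency; both end up pinned in the same pixi.lock.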

Open source and free

Pixi is open-source and free with no licensing restrictions (BSD 3-Clause). Conda itself is open-source, but the defaults channel is not free for commercial use, which can create uncertainty depending on your institution’s situation. Pixi uses conda-forge by default, which is fully open.

Reproducibility

Conda has no built-in lockfile. Reproducing an environment from an environment.yml does not guarantee the same package versions will be resolved: channels change over time, and conda env create can produce a different set of packages than what was originally installed. conda list --export does not always produce a reproducible file (in my experience, it rarely produces a ready-to-use file without manual edits). Tools like conda-lock exist to fill this gap, but they are a separate step that most people skip.

Pixi generates a pixi.lock automatically on every pixi add or pixi install. This file pins every package in the resolved environment, including transitive dependencies, and is meant to be committed to version control. Reproducing the exact environment on another machine is then a single pixi install.

The prefix.dev blog covers this in more depth here.


Installing pixi

Run the following one-liner to install pixi on Linux or macOS:

curl -fsSL https://pixi.sh/install.sh | sh

After installation, restart your terminal or source your shell config to make the pixi command available:

# Linux (bash)
source ~/.bashrc

# macOS (zsh)
source ~/.zshrc

If the installation succeeded, typing pixi in the terminal should print pixi’s help message.


Setting up a project

Pixi is project-centric: dependencies are tracked per directory, not per named environment (though you can install packages globally). To initialize a new project:

pixi init my_project --channel conda-forge --channel bioconda
cd my_project

This creates a pixi.toml config file (analogous to environment.yml) and a hidden .pixi/ directory where the environment lives.
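For orientation, the generated manifest looks roughly like the sketch below. The exact contents vary by pixi version and platform, so treat this as an approximation rather than the literal output:

```toml
# Approximate pixi.toml produced by `pixi init` with the two channels above;
# exact fields and defaults depend on your pixi version.
[project]
name = "my_project"
channels = ["conda-forge", "bioconda"]
platforms = ["linux-64"]

[dependencies]
```

Packages you later add with pixi add appear under [dependencies].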

If you are already inside a directory you want to use:

pixi init . --channel conda-forge --channel bioconda

Either way, the project is now configured to pull packages from the conda-forge and bioconda channels.


Installing packages

Once inside your project directory, add packages with pixi add:

pixi add bioconda::samtools
pixi add bioconda::minimap2 bioconda::bedtools

Pixi resolves and installs quickly, and records the exact versions in pixi.lock for full reproducibility.


Running tools

Option 1: pixi run (no activation needed)

The simplest way to use a tool is to prefix it with pixi run. This works from within the project directory without activating anything:

pixi run samtools view -h input.bam
pixi run python my_script.py
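pixi run can also execute named tasks defined in pixi.toml, which is handy for commands you repeat often. A hedged sketch, with task names and commands invented for illustration:

```toml
# Illustrative [tasks] table in pixi.toml; the task names and commands
# are examples, and the dependency key spelling can differ across pixi versions.
[tasks]
sort-bam = "samtools sort -@ 4 input.bam -o sorted.bam"
qc = { cmd = "python my_script.py", depends-on = ["sort-bam"] }
```

With this in place, pixi run qc would run the sort-bam task first, then the qc command, all inside the project environment.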

Option 2: pixi shell (activate the environment)

If you prefer an interactive session where tools are available on your $PATH, use pixi shell. This is the pixi equivalent of conda activate:

# Enter the environment (similar to `conda activate`)
pixi shell

# Now you can run tools directly
samtools view -h input.bam
python my_script.py

# Leave the environment
exit

pixi run only works from within the project directory. After pixi shell, you can use tools anywhere until you exit.


Global packages

For tools you want available everywhere regardless of project, pixi supports global installs:

pixi global install bioconda::samtools
pixi global install bioconda::minimap2

Some tools have conflicting dependencies. Pixi handles this by letting you group them into named global environments:

# Install two packages into a shared global environment called "biotools"
pixi global install --environment biotools bioconda::samtools bioconda::bedtools

This avoids conflicts while keeping tools globally accessible.


Project-based vs named environments

Pixi ties environments to directories, not names. Conda users find this jarring: conda activate my_env works from anywhere on the system, while pixi shell only works from within the project directory (except for global packages).

The current workaround is the --manifest-path flag, which lets you activate any project’s environment from any working directory by pointing pixi at the pixi.toml:

pixi shell --manifest-path /path/to/my_project/pixi.toml

In recent versions of pixi, this activated environment persists across future pixi calls for the duration of that shell session, so you only need to specify the path once. The downside is obvious: it’s verbose, and you have to remember where the pixi.toml lives.

A common workaround shared on the prefix.dev Discord is a small shell function that mimics conda activate:

# Add to ~/.bashrc or ~/.zshrc
function pixi-activate() {
    local project_path="$1"
    pixi shell --manifest-path "${project_path}/pixi.toml"
}

Then you can do:

pixi-activate ~/projects/my_analysis

This is still more explicit than conda activate my_env, and it requires you to know the path rather than a memorable name, but it works for most workflows. Whether pixi will support named, globally-registered environments remains an open question.


Cache directory

Conda stores downloaded package tarballs and extracted environments in ~/.conda/pkgs/ and ~/anaconda3/pkgs/ by default, both inside $HOME. On shared servers and HPC clusters where $HOME quotas are tight (often 10–50 GB), a growing conda installation can fill up your home directory and cause cryptic failures. I have run into this more than once. The workaround with conda is to redirect the cache via .condarc:

# ~/.condarc
pkgs_dirs:
  - /path/to/larger/disk/conda_pkgs

Pixi uses a different location by default:

Platform   Default cache path
Linux      ~/.rattler/cache
macOS      ~/Library/Caches/rattler/cache

On Linux, ~/.rattler/cache is still in $HOME, so the same quota concern applies on HPC systems. To relocate the pixi cache, set the RATTLER_CACHE_DIR environment variable:

# e.g., in ~/.bashrc or ~/.zshrc
export RATTLER_CACHE_DIR=/path/to/larger/disk/rattler_cache

The .pixi/ directory inside each project folder contains the installed environment, not the cache. The cache holds the downloaded packages shared across all projects; .pixi/ holds the hard-linked or extracted files for that specific project. You can delete .pixi/ and recreate it with pixi install. Pixi will reuse cached packages and skip re-downloading.


Using pixi on a job scheduler (SLURM)

pixi shell spawns an interactive subshell, so it does not work in SLURM batch scripts.

Make pixi available in SLURM

SLURM jobs don’t source ~/.bashrc by default. You need to expose pixi’s executable to $PATH first.

export PATH="$HOME/.pixi/bin:$PATH"

From there, there are two good ways to run pixi-managed tools.

Option 1: pixi run

Prefix each command with pixi run from within the project directory:

#!/bin/bash
#SBATCH --job-name=my_job
#SBATCH --cpus-per-task=4
export PATH="$HOME/.pixi/bin:$PATH"
cd /path/to/project
pixi run samtools sort -@ 4 input.bam -o sorted.bam
pixi run python my_script.py

Option 2: pixi shell-hook

Use pixi shell-hook to emit the activation commands and eval them. This activates the environment inside the script:

#!/bin/bash
#SBATCH --job-name=my_job
export PATH="$HOME/.pixi/bin:$PATH"
cd /path/to/project
eval "$(pixi shell-hook)"
# Tools are now on $PATH
samtools sort input.bam -o sorted.bam
minimap2 -ax sr ref.fa reads.fastq > aligned.sam

To activate a specific named environment (if your pixi.toml defines multiple):

eval "$(pixi shell-hook -e myenv)"
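For context, multiple environments are defined in pixi.toml through features. A hedged sketch, with the feature, environment, and package names invented for illustration:

```toml
# Illustrative multi-environment pixi.toml; names and packages are examples.
[dependencies]          # shared by every environment, including the default
samtools = "*"

[feature.myenv.dependencies]
deeptools = "*"

[environments]
myenv = ["myenv"]       # "myenv" environment = default deps + the myenv feature
```

With a manifest like this, pixi shell-hook -e myenv activates an environment containing both the shared dependencies and the feature-specific ones.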

Is pixi a full replacement for conda?

For most bioinformatics use cases, yes. Pixi installs from bioconda and conda-forge, so any tool you used via conda is available. A few things to adjust to:

Pixi environments are tied to directories, not names. If you work across many one-off analyses, get in the habit of creating a project directory per analysis. pixi shell must be invoked from the project directory (though once activated, you can navigate freely). Tools installed with pixi global install are separate from project environments; think of them as your always-available utilities.

If you are running conda today and frustrated by solver speed, dependency conflicts, or reproducibility, pixi is worth the switch. If conda is working fine for you, it is fine to stay.

AI tools were used in preparing background research for this post.