Conda Is Fast Now: The Guide to Universal Environments
Conda is a local, cross-language, cross-OS software stack manager.
Is conda a Python tool?
Pretend it is, and conda will quietly install you a C compiler, OpenSSL, BLAS, R, Node, and a few helpful system libraries while you weren’t looking.
Conda’s superpower is not “it installs Python packages.”
Conda’s superpower is: it installs software stacks into a directory you own—no root/admin required—and it does this across operating systems (as long as the packages exist in your chosen channels).
How best to use conda?
As a universal compiler/interpreter version manager. Just google conda install -c conda-forge <your-interpreter> to see whether a conda package is available for your case (the documentation sites are hosted by the anaconda organization).
Steps I often do:
# Install conda on macOS
curl -O https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-arm64.sh
bash Miniconda3-latest-MacOSX-arm64.sh
# Linux
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh
# Windows PowerShell
Invoke-WebRequest -Uri "https://repo.anaconda.com/miniconda/Miniconda3-latest-Windows-x86_64.exe" -OutFile ".\miniconda.exe"
Start-Process -FilePath ".\miniconda.exe" -ArgumentList "/S" -Wait
del .\miniconda.exe
# or easily with scoop:
scoop bucket add extras
scoop install miniconda3
#-----------
# update/upgrade conda
conda activate base
conda update conda
conda update conda-build

# I want to install Python version 3.14
conda create --name py314 python=3.14 -c conda-forge
conda activate py314
conda deactivate
# Install python 3.12
conda create --name py312 python=3.12 -c conda-forge
conda activate py312
conda deactivate
# Create an R4.4.3 environment
conda create --name R443 r-base r-essentials r-devtools -c conda-forge
conda activate R443
conda deactivate
# an alternative channel for R is `-c r`, which is also quite good.
# all R packages in conda have the prefix `r-`
# find system libraries by googling `conda install mylibrary-or-compiler-interpreter`
# anything you want to install into your environment: search by
# googling with `conda install` in front of it.
# list all environments to get an overview
conda env list
# activate deactivate environment as shown above
# remove an environment
conda env remove --name R443
# overview over current conda environment
conda info
active environment : base
active env location : /opt/homebrew/Caskroom/miniconda/base
shell level : 1
user config file : /Users/myname/.condarc
populated config files :
conda version : 24.3.0
conda-build version : not installed
python version : 3.12.2.final.0
solver : libmamba (default)
virtual packages : __archspec=1=m1
__conda=24.3.0=0
__osx=15.5=0
__unix=0=0
base environment : /opt/homebrew/Caskroom/miniconda/base (writable)
conda av data dir : /opt/homebrew/Caskroom/miniconda/base/etc/conda
conda av metadata url : None
channel URLs : https://repo.anaconda.com/pkgs/main/osx-arm64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/osx-arm64
https://repo.anaconda.com/pkgs/r/noarch
package cache : /opt/homebrew/Caskroom/miniconda/base/pkgs
/Users/myname/.conda/pkgs
envs directories : /opt/homebrew/Caskroom/miniconda/base/envs
/Users/myname/.conda/envs
platform : osx-arm64
user-agent : conda/24.3.0 requests/2.31.0 CPython/3.12.2 Darwin/24.5.0 OSX/15.5 solver/libmamba conda-libmamba-solver/24.1.0 libmambapy/1.5.8 aau/0.4.4 c/. s/. e/.
UID:GID : 501:20
netrc file : None
offline mode : False
Conda got a new brain (libmamba is now default)
Why did conda feel slow for years?
Because the classic dependency solver was polite, tried hard, and was slow, sometimes needing half an hour to resolve dependencies.
Since conda 23.10.0, the libmamba solver became the default. libmamba builds on libsolv, a much faster solver.
Thus the sentence “conda is slow” has now expired. (Like a yogurt you forgot in your fridge or in ~/miniconda3/pkgs.)
The other plot twist: micromamba exists (a portable conda engine)
What is micromamba?
A tiny, statically linked C++ executable that integrates seamlessly into the “conda ecosystem” without requiring a base environment or a default Python installation: its strongest point.
micromamba is what you reach for when you want:
- very fast solves + installs,
- a minimal footprint,
- a clean use on HPC / locked-down systems,
- and you want to avoid “big distribution” baggage.
How to install micromamba?
# macOS
brew install micromamba
# MacOS, Linux, Windows (Git Bash)
"${SHELL}" <(curl -L micro.mamba.pm/install.sh)
And further:
# update
micromamba self-update
micromamba self-update --version 1.4.6
# set channels and mode
micromamba config append channels conda-forge
micromamba config set channel_priority strict
# create, activate, install, deactivate
micromamba create -n env_name xtensor -c conda-forge
micromamba activate env_name
micromamba install mypackage
micromamba deactivate
My biggest lesson: don’t surgically remove packages—rebuild the env
Why should you actually never conda remove anything?
Because environments are dependency snowflakes. Removing one package often causes a cascade of necessary changes that leaves your env in a weird half-broken state.
It took me a long time to finally realize that it is better to:
- Export the intent of your env (what you asked for).
- Edit out the thing you no longer want.
- Recreate a fresh environment from that spec (creation is so much faster than resolving the consequences of removals, and much less error-prone).
conda is simply not built to remove dependencies, only to add them.
You can end up with a broken environment after removing one essential package (and its dependencies). Moreover, conda doesn’t check in a logically sound manner what may be removed so that the environment still functions.
And the time you need to figure out how to fix the broken environment is orders of magnitude bigger than the time to recreate the environment without the single package you want removed.
(This tip will save you many hours of lifetime!)
How do I “freeze” a conda environment, remove one package, and rebuild it cleanly?
Freeze the intent, edit it, then recreate a fresh env from that file (don’t perform surgery inside the old env). This also works well for cross-OS transfers.
# 1) Freeze what you explicitly asked conda for (best for rebuilding)
conda env export -n myenvname --from-history > environment.yml
# 2) Edit environment.yml and delete the package you don’t want
# (open it in your editor and remove the line)
# 3) Rebuild a fresh environment from the edited spec
conda env create -n myenv2 -f environment.yml
If you want an exact, fully pinned snapshot instead of “intent-only”, freeze the full state:
conda env export -n myenv > environment.lock.yml
# edit environment.lock.yml and save
conda env create -n myenv2 -f environment.lock.yml
(Exact exports are heavier and can be less portable across OS, but they’re the closest thing to a true snapshot.)
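The edit step itself can even be scripted. Here is a minimal sketch, assuming an intent-only environment.yml like the hypothetical one below, and assuming pandas is the package you want gone:

```shell
# A hypothetical intent-only export (roughly what --from-history might produce)
cat > environment.yml <<'EOF'
name: myenvname
channels:
  - conda-forge
dependencies:
  - python=3.12
  - numpy
  - pandas
EOF

# Drop the unwanted package (here: pandas) from the spec,
# leaving the old environment itself completely untouched
grep -v '^  - pandas$' environment.yml > environment.edited.yml

# Inspect the edited spec; feed it afterwards to:
#   conda env create -n myenv2 -f environment.edited.yml
cat environment.edited.yml
```

The point of the sketch: the spec file is the thing you operate on, never the live environment.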
The “90% of life” command sessions
Session A — Conda (modern, with libmamba default)
1) Create a new env
conda create -n ds python=3.12 numpy pandas -c conda-forge
2) Activate it
conda activate ds
3) Install more stuff (with strict channels)
conda config --set channel_priority strict
conda install -n ds -c conda-forge scipy scikit-learn
Strict channel priority is one of those “why didn’t I do this years ago” settings—especially for bioconda/conda-forge stacks.
4) Run a command without activating (great for scripts/CI)
conda run -n ds python -c "import numpy as np; print(np.__version__)"
5) Export the env
Full, exact state (heavy but reproducible):
conda env export -n ds > environment.yml
Intent-only (“from history”):
conda env export -n ds --from-history > environment.yml
This exports only what you explicitly installed, which is perfect for rebuilding, but be aware it may omit version pins depending on your workflow.
6) The sacred rebuild ritual (remove-by-rebuild)
# edit environment.yml: remove the package you regret
conda env create -n ds2 -f environment.yml
Then you test ds2, and when it’s good, you stop using ds.
This is how you keep sanity.
Session B — Micromamba (fast, minimal, ecosystem-compatible)
Micromamba is a separate CLI and expects you to work a lot with prefixes (paths) or named envs.
1) Create an env
micromamba create -n ds -c conda-forge python=3.12 numpy pandas
2) Activate it (after shell init)
micromamba activate ds
3) Install more packages
micromamba install -n ds -c conda-forge scipy scikit-learn
4) Export / recreate (same philosophy)
Creating envs from YAML specs is a common micromamba workflow in practice (especially in CI/HPC).
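Such a spec is just a small YAML file. A minimal, hypothetical example (the file name env.yml and the package list are made up for illustration):

```yaml
# env.yml, a hypothetical spec consumed with: micromamba create -f env.yml
name: ds
channels:
  - conda-forge
dependencies:
  - python=3.12
  - numpy
  - pandas
```

Recreating from such a file follows the same rebuild-instead-of-remove philosophy as with conda: edit the spec, then create a fresh env from it.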
“But what can’t micromamba do compared to conda?”
Is micromamba a perfect drop-in replacement for the full conda UX?
Not quite.
Here are real, practical differences:
- Different CLI surface: it’s compatible with the ecosystem, but not a verbatim clone of every conda subcommand.
- Shell integration can be fussier, especially in some Windows setups and “shell function vs binary on PATH” situations.
- Some conda conveniences like conda env config vars set/unset/list have historically not been fully covered the same way in micromamba workflows (people ask for it explicitly).
- conda-build (building packages) is a conda toolchain story; micromamba is focused on fast env management, not being “the builder.” (You can still build packages—just don’t expect micromamba alone to replace conda-build.)
So: micromamba is amazing when you want speed + minimalism. Conda is still the “full Swiss army knife” experience.
The licensing trap (and the clean escape)
Should I install Anaconda?
A: If you’re in a company, a government org, or anywhere that worries about compliance, treat “Anaconda Distribution + default repository access” as something you must check carefully.
Anaconda’s own legal page spells out that repository access and updates can trigger Terms-of-Service acceptance and commercial licensing requirements in certain organizational contexts, and that Miniforge/Mambaforge do not require an Anaconda commercial license (they’re community installers configured to conda-forge by default).
Practical takeaway many teams adopt:
- Prefer Miniforge (conda-forge default) or micromamba for a cleaner compliance story.
The universal advantage: conda is not an environment manager, it’s a stack manager
What do I gain over pip/venv/Poetry/uv?
A: Conda installs non-Python dependencies reliably—system libraries and compiled binaries—inside the environment prefix. Packages can be .tar.bz2 or .conda, and they can include system-level components.
Also: no root privileges needed—which is exactly why conda/micromamba are beloved on HPC and locked-down enterprise machines.
Some Tricks Also For the “years-long conda user”
1) Make channel behavior deterministic
conda config --set channel_priority strict
This reduces “mixed channel soup” and can speed solves.
2) Stop touching base
Keep base for conda itself! Do real work in named envs. (Many institutions explicitly teach this.)
3) Rebuild instead of “remove”
Treat envs as cattle, not pets. Export intent → edit → recreate. (This is the single habit that saves you the most time. Don't tinker, don't correct!)
4) Use conda run for scripts
No activation edge cases. No shell voodoo. Cleaner CI.
5) Lean on cache
Conda caches package archives (this is why rebuilds can be surprisingly fast!)
When you solve a nasty env problem: build a conda package
I finally got it working. How do I stop the pain from returning?
Package the fix!
A conda package is an archive with metadata + files installed into a prefix, and can include link/unlink scripts.
The official conda-build tutorial walks through creating packages on Windows/macOS/Linux.
And you can even generate a local channel index with conda index if you want to distribute internally.
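As a taste of what a recipe looks like, here is a minimal sketch of a meta.yaml; the package name mytool and every field value are hypothetical placeholders, not the official template:

```yaml
# meta.yaml, a minimal hypothetical conda-build recipe
package:
  name: mytool            # hypothetical package name
  version: "0.1.0"

source:
  path: ..                # build from a local checkout instead of a release tarball

build:
  number: 0
  script: python -m pip install . --no-deps -vv

requirements:
  host:
    - python
    - pip
  run:
    - python

test:
  imports:
    - mytool              # smoke test: the package must import cleanly

about:
  summary: Example package that captures a hard-won environment fix
```

conda build <recipe-dir> would then produce the package archive, and conda index can turn a directory of such archives into a local channel.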
This is the moment you go from “environment victim” to “ecosystem contributor.”
And of course this is also good for the community: you and me :)
“Community dependency” is real—and here’s how to contribute
Is contributing to conda-forge hard?
It’s surprisingly structured.
The short, simple process:
- Submit a recipe PR to conda-forge/staged-recipes.
- When merged, a feedstock repo gets created for ongoing maintenance.
- Updates happen via PRs (often with bot help), and maintainers work through forks/PRs.
Want a turbo button? Use Grayskull to generate recipes for many Python packages, then submit to staged-recipes.
This is the hidden secret of conda: when you contribute once, you save hundreds of people from repeating your pain.
Closing
So what should I do tomorrow morning?
- Use conda with libmamba (modern conda is finally quick by default).
- Or use micromamba when you want a tiny, fast, clean binary.
- Use strict channel priority.
- Never “remove” in place when it matters—export intent, edit, rebuild.
- And when you solve a hard env puzzle: turn it into a package and contribute it!
Because the only thing better than a reproducible environment…
is never having to debug it again.