The future of mamba

The recent adoption of libmamba by the conda project was a great validation of our work. The mamba team has several other game-changing innovations in the works. Stay tuned!

Wolf Vollprecht
5 min read · Mar 31, 2022

The mamba team has brought a lot of innovation to the package management ecosystem in the past three years! With the blazing-fast mamba package manager, the extensible quetz package server, the fast package builder boa, and the micromamba project, it has been quite a ride — with a lot of contributions and adoption from the larger open-source community.

In this post, we lay out our vision for the next few months, and how mamba will continue to push boundaries and disrupt the package management ecosystem.

🚀 Better, faster, and more secure downloads and installation

Mamba makes package installation fast by parallelizing downloads and package extractions. We are currently working on a new backend for even faster downloads of repodata by only downloading the changed bits using zchunk.
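The parallel-download idea can be sketched in a few lines of Python. This is purely illustrative — the real implementation lives in C++ on top of libcurl — and `fetch` is a hypothetical stand-in for an actual HTTP download:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a real HTTP download (mamba uses libcurl).
# Here we just synthesize some bytes from the package name.
def fetch(package):
    return f"contents of {package}".encode()

def fetch_all(packages, workers=4):
    # Download several packages concurrently instead of one by one;
    # this is the basic idea behind mamba's parallel downloads.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(fetch, packages)
    return dict(zip(packages, results))

archives = fetch_all(["numpy", "scipy", "pandas"])
```

The same pattern applies to package extraction: independent archives can be unpacked on separate worker threads.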

The new library is called libpowerloader, and it combines mamba's experience with that of the Linux rpm/dnf ecosystem in talking to "package repositories": it is a C++ port of the librepo library used by rpm & dnf.

libpowerloader automatically performs SHA256 or MD5 checksum validation, as well as retries and restarts of downloads. It can select one out of multiple mirrors and find the fastest among them. libpowerloader supports compression and chunked downloads through the zchunk library — to which we recently added macOS and Windows support. Additionally, libpowerloader can natively talk to GitHub Packages (and any other OCI registry) and Amazon S3 buckets — not only for downloading but also for uploading! Everything is still based on libcurl.
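To make two of those behaviors concrete — checksum validation and fastest-mirror selection — here is a hypothetical Python sketch. This is not the libpowerloader API, just an illustration of the idea:

```python
import hashlib
import time

def verify_sha256(data, expected_hex):
    # Reject a download whose SHA256 digest does not match the
    # checksum published alongside the package.
    return hashlib.sha256(data).hexdigest() == expected_hex

def fastest_mirror(mirrors, probe):
    # Probe each mirror (e.g. a small HEAD request in practice)
    # and keep the one that responds quickest.
    timings = {}
    for url in mirrors:
        start = time.perf_counter()
        probe(url)
        timings[url] = time.perf_counter() - start
    return min(timings, key=timings.get)

data = b"example package payload"
digest = hashlib.sha256(data).hexdigest()
```

In the real library both steps happen transparently during a download; a failed checksum triggers a retry, possibly from a different mirror.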

🔐 Properly storing credentials cross-platform

Currently, credentials (such as tokens or basic HTTP authentication credentials) are stored in plain text in the home directory. This is not ideal, so we've made an effort to maintain a port of the excellent keytar / node-keytar library that we are calling libcred. It allows users to use the native credential store of their platform (libsecret on Linux, Keychain on macOS, and wincred on Windows), and we will add a fallback mechanism for non-interactive systems.
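The fallback idea can be sketched as a chain of backends that are tried in order. This is an illustrative Python toy, not the actual libcred API — all class and method names here are made up:

```python
# Illustrative sketch only — not the actual libcred API. It shows the
# idea of trying a native credential store first and falling back to a
# simpler store on non-interactive systems (e.g. CI machines).

class InMemoryStore:
    """Fallback store for systems without a native keychain."""
    def __init__(self):
        self._secrets = {}
    def set_password(self, service, account, secret):
        self._secrets[(service, account)] = secret
    def get_password(self, service, account):
        return self._secrets.get((service, account))

class UnavailableStore:
    """Stands in for a native backend (libsecret / Keychain / wincred)
    that is unreachable, e.g. no D-Bus session on a CI runner."""
    def set_password(self, service, account, secret):
        raise RuntimeError("native credential store unavailable")
    def get_password(self, service, account):
        raise RuntimeError("native credential store unavailable")

class CredentialChain:
    def __init__(self, backends):
        self.backends = backends
    def set_password(self, service, account, secret):
        for backend in self.backends:
            try:
                return backend.set_password(service, account, secret)
            except RuntimeError:
                continue  # try the next backend in the chain
        raise RuntimeError("no usable credential backend")
    def get_password(self, service, account):
        for backend in self.backends:
            try:
                return backend.get_password(service, account)
            except RuntimeError:
                continue
        raise RuntimeError("no usable credential backend")

chain = CredentialChain([UnavailableStore(), InMemoryStore()])
chain.set_password("anaconda.org", "wolf", "s3cret-token")
```

The point of the chain is that the same calling code works on a developer laptop (native store) and on a headless CI machine (fallback).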

🏝 More secure environments with sandboxing

Recently we’ve made a big effort to add a micromamba run command that largely follows the docker run syntax. Now we want to add sandboxing capabilities. So far, we have figured out how to do it for Linux and macOS (this was also promised as part of the CZI grant we received). On Linux we can use the excellent bubblewrap library (used by the awesome flatpak project). On macOS we found sandbox-exec, which is largely undocumented but actually used by Chromium and Bazel. With both of these mechanisms we should be able to launch processes in encapsulated environments — for example, restricting network access or access to the host system to prevent accidental data leaks. This will also be a useful component for sandboxed builds with conda-build or boa.
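As a rough illustration of what this could look like on each platform — the flags shown are standard bubblewrap and sandbox-exec options, but the exact invocations micromamba will use are still being designed:

```shell
# Linux: run a command with a read-only view of the host filesystem
# and no network access, using bubblewrap.
bwrap --ro-bind / / --dev /dev --proc /proc --unshare-net python script.py

# macOS: deny all network access via sandbox-exec and an SBPL profile.
sandbox-exec -p '(version 1) (allow default) (deny network*)' python script.py
```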

🧨 Better error messages

Another item we promised in the CZI grant is better error messages! Here we are hoping to extend libsolv in ways that will allow us to produce more useful messages about which packages were actually conflicting — and we have already found some functions to obtain this information in a structured way, which we have exposed to Python for experimentation. There is still some R&D left, so if you are really into graph theory and extracting information from conflicts in a graph, let us know!
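To show the kind of structured output we are after, here is a deliberately tiny toy in Python — the real solver works on libsolv's rule graph, not on anything like this, and all names below are made up:

```python
# Toy sketch: report pairs of packages whose version constraints on a
# shared dependency cannot both be satisfied.

def find_conflicts(requirements):
    """requirements maps a requesting package to a
    (dependency_name, allowed_versions) tuple. Returns triples
    (requester_a, requester_b, dependency) where the two allowed
    version sets for the same dependency do not overlap."""
    conflicts = []
    items = list(requirements.items())
    for i, (pkg_a, (dep_a, vers_a)) in enumerate(items):
        for pkg_b, (dep_b, vers_b) in items[i + 1:]:
            if dep_a == dep_b and not (set(vers_a) & set(vers_b)):
                conflicts.append((pkg_a, pkg_b, dep_a))
    return conflicts

reqs = {
    "tensorflow": ("numpy", ["1.19", "1.20"]),
    "new-lib":    ("numpy", ["1.21", "1.22"]),
}
report = find_conflicts(reqs)
# → [("tensorflow", "new-lib", "numpy")]
```

A message like "tensorflow and new-lib require incompatible versions of numpy" is far more actionable than a dump of raw solver rules — that is the goal.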

Note that any improvement here will now directly benefit users of the conda-libmamba integration as well — which highlights the power of libraries.

📘 Boa with a new recipe format

We are currently working on an improved recipe format to be used with the boa build tool. This new recipe format features some important improvements:

  • more logical behaviour for multiple outputs
  • clean & pure YAML (no logic in comments, no un-parseable Jinja syntax)
  • much faster parsing & build start-up, as no expensive recursive parsing & solving is necessary anymore

There is discussion about this new recipe format in the conda/specs channel on Slack! We are super excited to work with the conda-forge community on solidifying this recipe format.

The extensible package server quetz has a new frontend

Quetz, our fully open-source package server built on modern technologies (FastAPI, pluggy, PostgreSQL), has a new frontend. We are building it completely from scratch, yet it is in many ways familiar, because it uses the JupyterLab extension system under the hood! We are building on really solid foundations, and we hope there will be a lot of cross-pollination. For example, it would be awesome if we could run certain JupyterLab extensions right away as extensions to the quetz frontend (such as Gator!).

Finally: micromamba

Awesome autocompletion in micromamba

If you haven’t tried micromamba yet, we wholeheartedly encourage you to do so. We already have many happy users of micromamba in CI systems (e.g. using this excellent GitHub Action), but the interactive experience is shaping up really nicely as well (it comes with shell autocompletion out of the box). The lack of a “base” environment makes things less brittle compared to traditional conda/mamba. It’s a fully native binary and feels very snappy. Some CLI arguments from conda/mamba have also been cleaned up (for example, micromamba create -f can also read YAML files).
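For example, creating an environment from a conda-style YAML file and running a command in it is as simple as:

```shell
# micromamba create -f accepts environment.yml files natively.
micromamba create -n myenv -f environment.yml

# Run a command inside the environment, docker-run style.
micromamba run -n myenv python --version
```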

It also comes with parallel pyc compilation (thanks to Chris Burr) and parallel package extraction (thanks to Jonas Haag) that makes micromamba run as fast as possible on any given machine.

About the author

Wolf Vollprecht is the CTO of QuantStack — the open source consulting company at the heart of mamba and a core contributor to conda-forge and Jupyter. At QuantStack and beyond, there are many contributors to the mamba stack: Johan Mabille, Madhur Tandon, Frederic Collonval, Joel Lamotte, Andreas Trawoger, as well as previously Adrien Delsalle and Bartosz Telenczuk!
