Automating ArchLinux setup with meta packages

August 19, 2018

I've been using Linux for four years now. After playing with various distros (Ubuntu, Fedora, OpenSuse, ...), I switched to ArchLinux more than two years ago on my desktop and laptop. During this time, I reinstalled Arch a few times because I wanted a fresh start to do things better as my knowledge about Linux, programming and system management evolved. Also, each time I wanted to configure something, I needed to set it up on both my laptop and my desktop, which is quite cumbersome.

As I'm an automation maniac, I've recently been looking for a solution to reduce the time spent configuring my system (DRY, right?).

Linux automation

My first idea was to create a list of packages to install with pacman, plus a shell script to copy configuration files and enable services. This approach is neither elegant nor maintainable in the long run.
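For the record, that first idea essentially boils down to the classic pacman package-list dance (the pkglist.txt name is arbitrary):

```shell
# On the configured machine: export the explicitly installed,
# native (non-AUR) packages.
pacman -Qqen > pkglist.txt

# On a fresh machine: reinstall everything from the list.
pacman -S --needed - < pkglist.txt
```

It works, but the configuration files and enabled services still have to be handled by a separate, hand-maintained script.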

Afterwards, I looked into software configuration management tools (such as Ansible, Salt, Puppet, ...) built for managing big infrastructures. Those tools felt quite hard to pick up for a single system.

After putting this idea aside for a while, I stumbled across a post on the ArchLinux subreddit from a fellow ArchLinux user. The post linked a series of articles about automating ArchLinux, explaining how to create a custom repository, manage packages and write an installation script. In the second post, the author explains how to create meta packages to group packages together. This way, the user only has to install a handful of packages to set up their system. On top of that, meta packages have hooks that can be triggered after an installation or an update to enable services, copy configuration files, ...

Meta packages

A meta package is defined just like any other package in the ArchLinux repository: as a PKGBUILD file. A PKGBUILD example can be found here.

Just like mdaffin (the author of the series about automating ArchLinux), I created multiple meta packages to deal with the different configurations and hardware used. For example, my packages are:

charlesvdv-laptop           # TODO
charlesvdv-laptop-asus-ux32 # TODO

On my desktop computer, I will only install the first three packages and on my current laptop, I will install all of them. The advantage of this approach is, of course, modularity: when I change hardware, I only have to change a few things.
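As an illustration, a minimal meta package PKGBUILD could look like the sketch below. The package name and dependencies are hypothetical; the depends array is what actually pulls in the grouped packages, and the optional install file is where the post-install hooks live.

```shell
# Hypothetical meta package: the depends array does all the work.
pkgname=charlesvdv-base
pkgver=1.0.0
pkgrel=1
pkgdesc='Base packages and configuration for all my machines'
arch=('any')
# Regular repository packages and pre-built AUR packages can both be
# listed here, as long as pacman can find them in some repository.
depends=('git' 'openssh' 'vim')
# Optional: post_install()/post_upgrade() functions that enable
# services or copy configuration files go in this file.
install=charlesvdv-base.install
```

Bumping pkgver (or pkgrel) is then enough to push a new version of the whole group to every machine.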

Custom repository

Meta packages are great, but pacman, the package manager of ArchLinux, requires every dependency to be available in a repository known to the system.

As I'm an avid user of the AUR (Arch User Repository), I also wanted to include AUR packages in my meta packages. Unfortunately, AUR packages can't be installed directly with pacman because the AUR is just a collection of PKGBUILDs.

Since dependency resolution for meta packages is done against the repositories available locally, AUR packages can be pre-built and added to a custom repository located either on the computer itself or on a remote server. The meta packages are also added to this custom repository so that they can be installed and updated easily via pacman.
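Concretely, publishing a package to such a repository boils down to makepkg plus repo-add, and a small pacman.conf entry. A minimal sketch (the repository name and paths below are just examples):

```shell
# Build the package from its PKGBUILD, then register the result in
# the repository database.
makepkg --clean
repo-add /srv/pkgs/custom.db.tar.gz ./*.pkg.tar.xz

# Declare the repository in /etc/pacman.conf:
#
#   [custom]
#   SigLevel = Optional TrustAll
#   Server = file:///srv/pkgs
#
# After a `pacman -Sy`, the meta packages install like any other
# package, e.g. `pacman -S charlesvdv-laptop`.
```

For a remote server, the Server line simply points at an HTTP URL instead of a file:// path.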


All the code can be found in my dotfiles repository (in the arch-pkgs/ folder). The root of the arch-pkgs/ directory is composed of two things:

  1. Directories containing PKGBUILDs
  2. Scripts to automate the building and update of packages

The directories containing PKGBUILDs currently fall into two categories. First, there are custom package definitions (like my meta packages). Second, there are package definitions coming directly from other sources (mostly the AUR). Since those packages are updated in a remote repository, git submodules are used to keep their origin intact and avoid taking ownership of others' work.


The build script is a Python script operating in multiple steps. It makes heavy use of aurutils to handle various things such as resolving dependencies or building packages in a container. The main function describes quite well the steps taken by the build script:

pkgs = get_pkgs_name()

# Generate the SRCINFO files, which contain the packages metadata.
...

deps = get_pkg_build_deps()

# Check whether a package should be built based on its current version
# and its db version.
pkgs_to_build = [pkg for pkg in pkgs if should_pkg_be_build(pkg, args.rebuild_vcs)]
pkgs_to_build = order_pkgs(pkgs_to_build, deps)

# Set up the package repository if it doesn't exist, then build the
# packages inside a container.
if pkgs_to_build:
    ...

Basically, we first get every package name. Then, we generate the SRCINFO files to extract each package's dependencies. Afterwards, packages are filtered depending on whether the version in the local database is outdated or not. The packages that should be built are then ordered to ensure that build dependencies are met. Finally, the packages are built inside a container.
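The ordering step is essentially a topological sort of the dependency graph restricted to the packages being built. A minimal sketch of what order_pkgs could look like (the function and argument names mirror the snippet above, but the implementation is my own guess, not the author's):

```python
from graphlib import TopologicalSorter  # Python 3.9+


def order_pkgs(pkgs, deps):
    """Order packages so that build dependencies are built first.

    `deps` maps a package name to the packages it depends on.
    Dependencies that are not themselves being built are ignored:
    they are assumed to already be available in a repository.
    """
    building = set(pkgs)
    graph = {pkg: {d for d in deps.get(pkg, ()) if d in building}
             for pkg in building}
    return list(TopologicalSorter(graph).static_order())
```

For example, ordering ['meta', 'app', 'lib'] with deps {'meta': {'app'}, 'app': {'lib'}} yields lib before app before meta, so each package finds its build dependencies already in the repository.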


Updating is straightforward.

For git submodules, the remote repository is first checked for new commits. If there are any, the diff is reviewed to ensure the changes are legit and don't contain anything shady. Finally, the remote master is merged into the local master with git.
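That workflow maps to a few plain git commands. The demo below stands in for a submodule checkout with a throwaway upstream repository (the repository layout, remote name origin and branch master are assumptions for the example):

```shell
set -e
# Throwaway setup: an "upstream" repo and a local clone playing the
# role of a submodule checkout.
tmp=$(mktemp -d)
git init -q -b master "$tmp/upstream"
git -C "$tmp/upstream" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'initial PKGBUILD'
git clone -q "$tmp/upstream" "$tmp/local"
git -C "$tmp/upstream" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'bump pkgver'

cd "$tmp/local"
git fetch -q origin                      # grab new upstream commits
git log --oneline HEAD..origin/master    # anything new?
git diff HEAD origin/master              # review the changes
git merge -q origin/master               # merge upstream into local master
```

The log and diff steps are where the "no shady stuff" review happens before the merge.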

For custom packages, we just need to bump the package version number inside the PKGBUILD to update the package.
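A hypothetical illustration of that bump, using sed on a throwaway PKGBUILD (in practice the source checksums would also need refreshing, e.g. with updpkgsums from pacman-contrib, and pkgrel resets to 1 whenever pkgver changes):

```shell
# Throwaway PKGBUILD standing in for a real custom package.
printf 'pkgname=demo\npkgver=1.0.0\npkgrel=1\n' > PKGBUILD

# Bump the version in place.
sed -i 's/^pkgver=.*/pkgver=2.0.0/' PKGBUILD
cat PKGBUILD
```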

After the update, a build operation is required to actually update the package inside the custom repository.

Future work

In the near future, I will need to focus on the meta packages for my laptop. It would also be a good idea to reinstall ArchLinux from scratch to ensure every service, configuration file, ... is properly set up.

In the end, it will just take a bit of maintenance work to keep my meta packages in sync with my current setup. I'm not yet at the level of NixOS (I recommend you check it out if you've never heard of this project!) when it comes to configuring my system declaratively, but I feel this is a step in the right direction to make my computer management a bit easier.