
My favourite tools of 2024

Here’s a quick summary of the tools I picked up along the way in 2024 that I continuously find useful. I’ve divided the list into three sections: the first comprises mostly general shell “helpers” that are part of my day-to-day workflow in one way or another, the second focuses on my current nvim setup, and the last lists some tools deployed in my home lab, or things I run on my servers, that I find useful and worth mentioning as well.

shell

zoxide

If zoxide doesn’t sound familiar, then maybe autojump does? zoxide is basically a rewrite of the latter in Rust. In short, it’s a tool that collects your most visited paths and lets you jump quickly to one of them using partial matching. The z command becomes your cd replacement. Here’s a quick demo:

zoxide in action.

There’s also zi, which lets you choose the path interactively, but I’ve found myself never (or very rarely) using it.
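For reference, a rough plain-shell equivalent of the demo above (the paths here are made up):

eval "$(zoxide init zsh)"   # in .zshrc - hooks zoxide into the shell
cd ~/projects/twdev-blog    # visit a directory once so zoxide learns it
cd ~
z twdev                     # partial match jumps back to ~/projects/twdev-blog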

tmuxp

I was a bit tired of having to constantly restore the same tmux layout when coming back to a given project. Turns out I wasn’t the only one. tmuxp solves exactly that problem.

You create a YAML file describing your tmux windows and panes and their content. This defines a named session which can then be easily re-instantiated. Here’s my simple setup for working on this blog. I’ve created ~/.tmuxp/twdev.yaml containing:

session_name: 'twdev'
windows:
  - window_name: twdev/vim
    start_directory: /Users/tomasz/blogs/twdev
    panes:
      - shell_command: nvim

  - window_name: twdev/shell
    start_directory: /Users/tomasz/blogs/twdev
    panes:
      - shell_command: git s

  - window_name: twdev/server
    start_directory: /Users/tomasz/blogs/twdev
    panes:
      - shell_command: hugo server

Now it’s just a matter of calling tmuxp load twdev - all my windows, panes and programs are up and running again.
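The reverse direction works too - if you already have a session arranged the way you like, tmuxp can export it to a config file (the session name here is just my example):

tmuxp freeze twdev    # dump the current session's layout to YAML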

direnv

I started using direnv early this year. I even mentioned it in one of my posts about wasm. This tool has solidified into a quintessential part of my workflow, as it’s very helpful in its simplicity.

The premise is simple: upon entering a directory, direnv processes .envrc, which might run some code for you, but in the majority of cases it’s just going to read in the environment variables from .env using dotenv_if_exists.

It’s an effective way to store configuration and credentials outside of the code itself, and it helps reinforce good practices in the long run. Additionally, your project might have an associated set of tools and its own bin/ directory - thanks to direnv, all the needed paths can be configured automatically without having to document anything in README files.

direnv in action.
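As an illustration, a minimal .envrc for the setup described above could look like this (the bin/ directory is just an example):

# .envrc
dotenv_if_exists    # read variables from .env if present
PATH_add bin        # prepend the project's bin/ directory to PATH

Note that direnv refuses to run a new or modified .envrc until you approve it once with direnv allow.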

just

Initially, I was a bit sceptical and failed to notice the added value that just brings to the table, but after spending some time with it, I’ve been converted. The tool is great. I used to have a habit of creating a dev/bin directory in my projects, which would contain a set of development helper scripts to make the most common project tasks more convenient to run. I no longer do that - that’s exactly the niche just fills. It allows you to create a documented set of commands in a single file that is easy to track and use. I’ve written more about it in one of my older posts, which I encourage you to read.
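For a Hugo blog like this one, a sketch of a justfile might look like this (the recipe names are my own choice):

# justfile
default:
    @just --list    # running bare `just` prints the available recipes

serve:
    hugo server

build:
    hugo --minify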

pass

pass is a CLI password manager, but I’m using it mostly as an API token store. If you find yourself writing small CLI tools that rely on external REST APIs, you have probably faced the problem of storing credentials as well. Keeping tokens hard-coded anywhere on the filesystem in the clear is pretty bad. That’s where pass comes in. I keep my tokens in pass, from which they can be retrieved programmatically in shell scripts like so:

pass show SOME_API_TOKEN
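Storing a new token and consuming it from a script is equally simple (the token name and the API endpoint are placeholders):

pass insert SOME_API_TOKEN            # prompts for the secret and stores it encrypted
TOKEN="$(pass show SOME_API_TOKEN)"   # retrieve it in a script
curl -H "Authorization: Bearer ${TOKEN}" https://api.example.com/v1/things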

fd

fd is a replacement for the standard UNIX find. The thing I like about it is that by default it skips all patterns contained in .gitignore. Thanks to that, when working on a given project, it’s easy to skip build directories and in general focus strictly on the project’s contents.
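A few invocations to give a feel for it (the search patterns are examples):

fd parser            # find files matching "parser", respecting .gitignore
fd -e md             # every markdown file in the project
fd -H config         # include hidden files in the search
fd --no-ignore main  # disable the .gitignore filtering when needed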

nvim

I’m still using packer as my plugin manager. I didn’t really feel the need to transition to something like lazy, although eventually I’ll have to, as packer is no longer maintained.

In general, the majority of my nvim setup remains unchanged. I’ve made only a handful of rather uncontroversial transitions.

neogit

I really like emacs’ magit for working with git, and neogit is exactly that for nvim. Prior to that, I was using fugitive, but the functionality it offers is rather limited (or at least I didn’t find it useful enough, or failed to integrate it well into my workflow). With neogit I tend to stay in the editor for the majority of source control management, which is great.

gitsigns

Another git-related change. gitsigns replaced my git-gutter setup. I felt like git-gutter was ageing a bit and it was time for something new. I find gitsigns more visually appealing, more responsive and more feature-rich. It comes with a built-in git blame mode, which is great.

treesitter

Treesitter is becoming essential, whether it’s for writing context-aware snippets or for syntax highlighting. I find it useful in combination with luasnip for writing some pretty clever snippets. Previously I had it installed, but I wasn’t really using it to its full potential.

copilot?

I was using copilot quite heavily during the first couple of months of 2024, but after a while I found it annoying. The majority of suggestions were lacking, and most of the time it tended to sneak in things which were simply wrong. I no longer use copilot and have transitioned back to my old setup built around LSP, snippets and treesitter code completion - I’m not saying it’s superior, far from it, it’s just what I like better.

home lab

borg & borgmatic

Usually I just rsync stuff around when it comes to backups - the main argument being that with rsync you have an unobscured copy. If a file gets corrupted, it’s only that file; you can easily access all the rest without any hassle. That’s not the case with backup programs that maintain “archives” or store the data in a packed format. With some of them, if parts of the archive get corrupted, all data might be gone, even if the rest remains readable.

To be clear, I’m still maintaining a backup with rsync, but I now complement it with borg and borgmatic (borg is the backup program; borgmatic is just a wrapper with its own configuration file that makes backups more automated and convenient to run). The main advantage is that borg deduplicates data between the archives within a repository, so it only performs incremental backups of new data. Additionally, borg supports encrypted backup repositories.
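To give a rough idea of the workflow (the repository path and archive name are placeholders):

borg init --encryption=repokey /backups/repo        # create an encrypted repository
borg create /backups/repo::docs-{now} ~/documents   # deduplicated, incremental archive
borgmatic                                           # or: run everything defined in borgmatic's config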

rclone

rclone synchronises data between your machine and various cloud storage providers. Personally, I’m using it in combination with borg to store my backups remotely. You might ask: why borg and not restic? I’ve played around with restic and found it unreliable. It seems to have some memory management problems which lead to excessive memory usage - some of my restic backups were terminated by the OOM killer. I’ve lost trust in that project.
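In my case that boils down to something like this (the remote name and paths are placeholders):

rclone config                                   # one-time interactive setup of the remote
rclone sync /backups/repo remote:borg-backups   # mirror the local borg repository remotely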

ansible

I’ve adopted the IaC philosophy and try to write deployment playbooks for the services I deploy on my servers. It takes a bit longer, I have to admit, but it’s great when you have to restore something using a playbook. The initial time invested in writing the playbooks definitely pays off in the long run. Not to mention that you’re documenting the exact configuration of services and machines in ansible, which is great - it’s a documented source of knowledge.
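The day-to-day usage then looks roughly like this (the inventory and playbook names are placeholders):

ansible-playbook -i inventory.ini site.yml --check   # dry run to see what would change
ansible-playbook -i inventory.ini site.yml           # apply the playbook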

compiler-explorer

Now, you might ask: why would you ever bother deploying your own instance of compiler-explorer? The reason is surprisingly simple - custom libraries.

I often want to quickly check something with one of my own libraries or some 3rd-party libraries I’m working with (one example is libzmq) - you can do that if you have your own instance of compiler-explorer. This is the main and (probably) only reason.

Additionally, you might want an instance that has some proprietary libraries integrated, so you can experiment with proprietary code as well. None of that is possible with the mainstream public instance.
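For reference, getting a local instance up is roughly this (per the project’s README, if I remember correctly; it requires a recent node.js):

git clone https://github.com/compiler-explorer/compiler-explorer.git
cd compiler-explorer
make    # pulls down dependencies and starts a local instance

Custom libraries are then wired in through the instance’s configuration files.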