Louka Ménard Blondin

How to hate Linux

March 19, 2025

Do not hate Linux because it is different. Do not hate Linux because it requires a greater technical understanding of computers to operate (if you're using those DIY distributions that don't hold your hand). Do not hate Linux because software vendors don't care for it, especially those non-free program distributors that stuff their releases full of DRM that borders on malware. All of those reasons are too easy. They'll see it coming, and use that head start to convince you that the problem all along was you, not the operating system. And, with all due respect, they'd have a point: it usually is the user's unfamiliarity with Linux that causes the most problems.

If you want to hate Linux, you gotta do it smart. You can't hate it for something too obvious. It must be a frustration partly created by your desire to be different, but mostly brought on by some ridiculous yet standard-fare bullshit from the innermost layer of Linux hell. Here's my woeful case study involving screensharing.

You may remember that there was a time when Linux operating systems used Xorg to provide you with windows. Those days are long gone with the advent of Wayland, which is popularly branded as a safer, better, modern-er implementation of windows. I had no objections to this change, partly because how windowing works eludes me, but mostly because Xorg was indeed an old and inflexible piece of technology that wasn't keeping up with the times.

One of the biggest advantages of Wayland touted by its users is safety. Wayland is distinctly more macOS-like in that it grants programs absolutely nothing by default in terms of access to keyboard input and window contents. They start with access only to the input they receive and the windows they create, and anything else requires the consent of the user through an implementation of an XDG desktop portal, which provides facilities for file pickers, filesystem access, screenshotting, screencasting, camera access, microphone access, and so on and so forth.
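If you're curious what this looks like in practice, the portal is just a service on the user's D-Bus session bus, and you can poke at it directly (assuming systemd's busctl is installed; gdbus works just as well):

    # List the portal interfaces (ScreenCast, FileChooser, and friends) on the session bus
    busctl --user introspect org.freedesktop.portal.Desktop /org/freedesktop/portal/desktop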

XDG desktop portals

XDG desktop portals, however, are funny. They're funny because, first, they were never intended to be used by Wayland compositors (AKA window managers, AKA implementations of the Wayland protocol, AKA the thing that creates your windows). The XDG desktop portal was initially designed to give Flatpaks, which are essentially semi-sandboxed Linux programs, a safe way of interacting with system resources. Its use as a collector and provider of standard system protocols to Wayland compositors is an off-label, secondary use.

Second, they're also funny because they're not straightforward at all. Desktop environments like GNOME or KDE Plasma implement their own XDG desktop portals since they have the manpower to do so, so it's never a concern for their users, but it becomes one for just about anyone else. If you use a cool compositor like Sway, labwc, or even Hyprland, then you need to fetch a desktop portal that implements the protocols (interfaces, in portal-speak) you need… and maybe even multiple. For example, Hyprland ships the aptly named xdg-desktop-portal-hyprland, which you would assume does everything an XDG desktop portal must do. Sadly, it doesn't. It implements only four interfaces (out of 19), and the rest is left to another portal that you must install separately.
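If you want to check for yourself which interfaces a given backend actually covers, each one ships a small .portal file that declares them. On Arch at least, they live here (paths may differ elsewhere):

    # Each backend lists the interfaces it implements under an Interfaces= key
    grep -H Interfaces /usr/share/xdg-desktop-portal/portals/*.portal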

I use Sway, and Sway also maintains its own XDG desktop portal in the form of xdg-desktop-portal-wlr. It's actually a bit wrong to say that it's their own portal, since it's the portal intended to be used by compositors built on wlroots (TL;DR: a stable implementation of common Wayland protocols), but since Sway is the parent project of wlroots, it's kinda their own portal. When you install Sway on any common-sense Linux distribution, it usually installs xdg-desktop-portal-wlr as a dependency. What you're not being told is that this portal provides only two of the interfaces you need. Out of 19, again. I was dumbfounded when I found out that my application picker in Dolphin wasn't working because of this. Turns out, you need to install another portal, like xdg-desktop-portal-gtk.
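For what it's worth, recent versions of xdg-desktop-portal let you spell out explicitly which backend handles which interface, which is roughly how a Sway setup ends up wired. A sketch of the config, assuming the file name matches a lowercased XDG_CURRENT_DESKTOP of "sway":

    # ~/.config/xdg-desktop-portal/sway-portals.conf
    [preferred]
    # Everything not listed below (file pickers, "Open With", etc.) goes to the GTK portal
    default=gtk
    # Screen capture stays with the wlroots portal
    org.freedesktop.impl.portal.ScreenCast=wlr
    org.freedesktop.impl.portal.Screenshot=wlr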

Third and finally, XDG desktop portals are also funny because sometimes, they just don't work. xdg-desktop-portal-wlr is supposed to provide reliable screencasting, but it sometimes chooses not to. Who knows why. Half the planet currently uses Zoom for videoconferencing, and xdg-desktop-portal-wlr straight up doesn't work with it. The GNOME and KDE Plasma implementations work, but unfortunately, they aren't compatible outside of their respective desktop environments. xdg-desktop-portal-hyprland worked until a few months ago, when a regression broke Zoom. You cannot rely on XDG desktop portals to provide you with functional screencasting.
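When it does break, the usual first checks (in my experience, at least) are making sure the compositor's environment actually reached the user session, since the portals are D-Bus-activated systemd user services that otherwise never see WAYLAND_DISPLAY, and then restarting the whole stack:

    # In the Sway config: push the relevant variables into the user session
    exec dbus-update-activation-environment --systemd WAYLAND_DISPLAY XDG_CURRENT_DESKTOP=sway

    # After changing anything portal-related, restart the services
    systemctl --user restart xdg-desktop-portal xdg-desktop-portal-wlr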

This, by the way, is all I'm trying to do: share my screen on Zoom. Somehow, something so simple has become so hard on Linux. This is what you can hate Linux for, because it's partly my fault (for running a Wayland compositor instead of a desktop environment), but it's mostly the fault of other developers failing to align on something as simple as recording the screen.

Fortunately, we're on Linux, the land of alternatives and endless rewrites. You would expect some other form of screensharing to exist, resolving all of my woes with XDG desktop portals. Sadly, you would be disappointed. There are, unfortunately, zero other protocols explicitly made for screensharing, which means you must suffer. However, again, we're on Linux, the land where you can also just duct-tape together three things not meant for a purpose and still achieve that purpose anyway.

Alternative #1: PipeWire

PipeWire is a new multimedia server for Linux. On the audio side, it replaces PulseAudio, which has long been the standard sound server on Linux, and it ships compatibility layers for applications that talk to PulseAudio or ALSA directly, so switching is thankfully straightforward. I've been using PipeWire for years now without problems.
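You can see the compatibility layer at work: PulseAudio clients still function, they just report PipeWire as the server behind the curtain.

    # PulseAudio tooling keeps working; it simply talks to pipewire-pulse underneath
    pactl info | grep "Server Name"
    # -> Server Name: PulseAudio (on PipeWire x.y.z)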

I do not know exactly why, but PipeWire decided to implement protocols for screencasting on Wayland. At least, Christian F.K. Schaller's blog posts seemed to indicate that. I presume implementing these protocols inside what would eventually become a quasi-universal provider of multimedia access on Linux systems would be highly convenient, a bit like how systemd standardized central seat management for the majority of Linux distributions.

While what I initially read seemed promising, with its developers promising a "multimedia revolution", PipeWire decided to be lazy and abandon its own implementation of screenshotting and screencasting protocols in favor of using, you won't believe it, XDG desktop portals. Yes, the very off-label, hackish shortcut used by Wayland compositors has independently occurred in PipeWire. In fact, if you're on Xorg and you're using PipeWire, then using PipeWire's screencasting interface to share your screen is basically a roundabout way of requesting window data straight from the Xorg server. Instead of doing the obvious, you go through an XDG desktop portal.

As a matter of fact, the 6.1+ versions of Zoom switched from interacting directly with the XDG desktop portal to interacting with PipeWire instead, presumably with the objective of attaining an extra degree of universality. Unfortunately, this achieved nothing except breaking Zoom for virtually everyone (1, 2) outside of GNOME. Even those using KDE Plasma suffered a wee bit before a half-baked patch went out.

Alternative #2: xwaylandvideobridge

If you’re not capable of screensharing over Wayland, then maybe you can screenshare over emulated Xorg.

Much like how most 64-bit operating systems implemented multilib support for older 32-bit software, there was a somewhat obvious need for Wayland to support older Linux programs that still relied on Xorg. This resulted in the development of Xwayland, a custom implementation of an Xorg server nested within Wayland that allows older programs to render correctly on Wayland, often with little to no difference other than perhaps the occasional oddity with HiDPI screens. Some compositors such as KWin or Mutter include an implementation of Xwayland, but others (such as Sway) require a separate installation. In my case, I installed xorg-xwayland, and it works completely fine. Games are usually the number one use of Xwayland since most game engines have yet to implement Wayland support.
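If you're ever unsure which of your windows are going through Xwayland, Sway's tree dump tags every view with the shell it uses, and xlsclients asks the nested X server directly (jq assumed for the first one):

    # Views reporting "xwayland" are X11 programs; "xdg_shell" means native Wayland
    swaymsg -t get_tree | jq '.. | objects | select(has("shell")) | {name, shell}'

    # Or just list the clients connected to the nested X server
    xlsclients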

Anyway, the folks over at KDE suffered the same problems with screencasting that a lot of other people had: XDG desktop portals acting wonky and PipeWire being a crashing mess. Even their own xdg-desktop-portal-kde proved insufficient for certain applications like Discord or Microsoft Teams, which still rely on Xorg, so they had to come up with an alternative. The result is the XWayland Video Bridge, which essentially creates a fake Xorg window within Xwayland that acts as a "mirror" for a proper window rendered within Wayland. Of course, it ultimately still relies on an XDG desktop portal somewhere.
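Setup-wise there's nothing to it; the bridge is just a background process that offers up its fake X11 window whenever an X11 application goes looking for something to capture. In a Sway config that's a one-liner:

    # Start the bridge at login so X11 apps have something to share
    exec xwaylandvideobridge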

However, for me, this was a potential solution. The problem wasn't that my XDG desktop portal failed to capture window data; it was that Zoom specifically failed to interact with my XDG desktop portal. I thought that I could, just maybe, force Zoom to render within Xwayland, then use KDE's xwaylandvideobridge to provide my screen to Zoom. It's a ridiculously redundant process, but perhaps it would fix the problem.

After some trying, I verified that xwaylandvideobridge did indeed work with the official Discord client, but Zoom simply refused to use Xwayland. I even uninstalled the native program offered in Arch Linux's AUR, replaced it with the Flatpak version, which runs in a sandboxed environment, then used Flatseal to force Zoom to use Xorg instead of Wayland by cutting its access to the Wayland socket. This also failed, as Zoom simply refused to launch if Wayland wasn't present. The great irony is that just last year, it took everything to get Zoom to stop bitching and moaning about Wayland. Now it wants Wayland and nothing else.
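For the record, the Flatseal toggle boils down to a flatpak override. Assuming the Flathub ID us.zoom.Zoom, cutting (and later restoring) the Wayland socket looks like this:

    # Deny the Wayland socket so the app should fall back to X11/Xwayland (in theory)
    flatpak override --user --nosocket=wayland us.zoom.Zoom

    # Undo it once you realize Zoom refuses to launch without Wayland
    flatpak override --user --reset us.zoom.Zoom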

Alternative #3: The web application

Zoom ships a web application. Not many people know this, as Zoom goes out of its way to hide its existence (you need to pretend that Zoom isn't launching while joining a meeting in order for the link to appear), but it does exist, and it offers more or less the same features as the native client.

When I discovered this, I immediately tested whether its screensharing worked. Unbeknownst to most, modern browsers implement some amount of screencasting, and they often rely on their own libraries to achieve this. I knew for a fact that Firefox, for example, uses its own screen capture mechanism on Windows. I thought that, given all their manpower, they might have implemented their own screencasting mechanism on Linux too. Same goes for Chromium.

I was, of course, wrong yet again. Both Firefox and Chromium still branch out to the XDG desktop portal. For a minute, I thought this wouldn't be a problem, because the XDG desktop portal does work in, for example, Vesktop, which is Chromium-based. Unfortunately, for some arcane reason, it also failed to work in the browser. At this point, I was getting heavily bored of this shit. I started looking into why Vesktop worked but Chromium didn't, and I never got an answer. My guess is that Electron implements its own screenshare mechanism.
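One thing worth ruling out when the browser path misbehaves is whether the browser is even running natively on Wayland, because a browser living in Xwayland generally captures through X11 rather than the portal. The usual knobs, with the caveat that defaults and flag names shift between versions:

    # Firefox: force the native Wayland backend (about:support shows "Window Protocol")
    MOZ_ENABLE_WAYLAND=1 firefox

    # Chromium: ask for the Wayland Ozone backend
    chromium --ozone-platform=wayland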

Alternative #4: OBS Studio’s virtual camera

I remembered that OBS Studio did work on my system. I had used it a couple of times to record footage of a game, and I knew that OBS Studio offers a virtual camera. In fact, I had used the virtual camera at least once in the past and remembered that it worked. I thought this would fix it: maybe I couldn't screenshare in any way, but perhaps I could simply broadcast my stream to a virtual camera device and then use it as the webcam in Zoom!
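Starting it doesn't even require clicking around the GUI, since OBS has a launch flag for the virtual camera; on Linux, the device it creates is itself backed by the v4l2loopback kernel module, which foreshadows alternative #5 nicely:

    # Launch OBS with the virtual camera already running (it shows up as a /dev/video* device)
    obs --startvirtualcam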

Long story short, this also failed. But Louka, oh Louka, why did it fail now when it worked before? Because as of three (3!!!) weeks before I made this post, a regression was introduced in OBS Studio: they screwed up their implementation of some ioctl protocol that was apparently non-compliant, which results in OBS Studio's virtual camera failing to initialize NVENC, my GPU's hardware encoder.

Alternative #5: v4l2loopback

This was the last resort, the last option I considered. This is another form of virtual camera, but instead of going through a video stream like OBS, it goes through an emulated camera stream on /dev/video0, like a webcam. You write to /dev/video0 like you would any other video device, and you can even read from it using ffmpeg or mpv.

It's a bit of a convoluted process, which is why I relegated it to last resort. It was one of the first ideas that came to me, but I thought I wouldn't have to do it. You install your distribution's v4l2loopback package (on Arch Linux, it's called v4l2loopback-dkms), load the driver with modprobe v4l2loopback exclusive_caps=1, and use whatever screen utility you have (like wf-recorder) to push data to the stream it creates at /dev/video0. However, you need to make sure that the color format and codec used are compatible with the software on the other end (since you are, after all, emulating a hardware connection), and testing this is tricky: sometimes it appears to work locally but fails during transmission because the software can't encode an incoming stream in the wrong format.
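Concretely, the whole dance looks roughly like this on Arch; wf-recorder's v4l2 output does the pushing, and the pixel format is the part you'll likely have to fiddle with:

    # Install and load the loopback driver; exclusive_caps=1 makes apps treat it as a real camera
    sudo pacman -S v4l2loopback-dkms
    sudo modprobe v4l2loopback exclusive_caps=1 card_label="Screen"

    # Push the screen into the new device (adjust the device node and pixel format as needed)
    wf-recorder --muxer=v4l2 --codec=rawvideo -x yuv420p --file=/dev/video0

    # Sanity-check locally
    mpv av://v4l2:/dev/video0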

At first, I thought it worked, because mpv displayed the stream in its window! Sure, it was operating at half a frame per second, but that was something I could fix later by toying around with v4l2loopback's driver settings. A further test in ffplay (which is built on FFmpeg, the same technology used under the hood by software like Discord, Teams, and Zoom to encode/decode video streams) also worked, so I thought it was ready. I booted up Zoom and created yet another empty meeting to test it.

The camera appeared in the list! And Zoom didn’t fucking crash when I selected it!

But Zoom didn't see the stream. It was greyish-black and nothing was coming through. I thought maybe the preview was screwed up, but I joined separately on my phone and it didn't work either. Some additional reading revealed that none of the video formats exposed through v4l2loopback were supported by Zoom. The only way to fix this was to set up a background instance of ffmpeg that piped the raw stream coming through /dev/video0 into a separate emulated video device, re-encoded into a format Zoom accepts, which means I would have had up to three encoding/decoding processes running at the same time.
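For completeness, the three-process monstrosity would have looked something like this, with two loopback devices and ffmpeg converting between them; consider it a sketch, since I gave up before getting Zoom to accept it:

    # Two loopback devices: video0 receives the raw screen, video1 is what Zoom would consume
    sudo modprobe v4l2loopback devices=2 video_nr=0,1 exclusive_caps=1

    # Re-encode whatever lands on video0 into a plain yuyv422 stream on video1
    ffmpeg -f v4l2 -i /dev/video0 -pix_fmt yuyv422 -f v4l2 /dev/video1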

Despair and disaster

It was at this point that I gave up and told myself that tomorrow's meeting would take place over my trusty MacBook Pro. I gained a new reason to hate Linux, but I still love it. What a bittersweet load of crap this all was.
