On Virginia Giuffre

And here’s that tweet with its date:

December 10, 2019.

So about a month and a half after Epstein’s suicide. (And yes, I still firmly believe Epstein did kill himself.) Yet, as we see in the above screenshot, like with Epstein, many are basically saying that suicide is impossible with Virginia Giuffre.

Anyone familiar with the psychological dynamics of suicidality – on which I have… first-hand experience – will tell you that someone who says they aren’t suicidal can become suicidal in a shockingly short amount of time – even as short as a few months, depending on life events and environment – simply because… the human brain is very, very strange. That’s the whole “driving someone to suicide” thing, for which people have been successfully prosecuted – though such prosecutions are difficult.

And if you’re going to change your tune and say “Okay, she committed suicide, but [insert person you wish to accuse] drove her to it!” you’d better have plenty of evidence at the ready that isn’t just speculation or circumstantial.

Even someone who is suicidal or going through suicidal ideation can lie and say they aren’t suicidal. Suicide doesn’t play out how it’s portrayed in media. Someone who is suicidal can look surprisingly calm and collected, even “normal”. It’s one of the reasons the survivors of someone who committed suicide often have a hard time accepting the death was a suicide – with “They didn’t seem suicidal” or “I didn’t realize they were suicidal” reactions not being uncommon – even when all the evidence points to it.

Like with Epstein, many so desperately want this to be a homicide. That her death cannot possibly be suicide, as if God himself is somehow making it impossible for her to kill herself… but that requires the kind of mental gymnastics that are characteristic of… leftists. Showing those insisting it’s murder are little different from leftists in how they think, only different in what thoughts they hold.

Making an Arch-based router

For a while I’ve been using OPNsense for my custom router. I recently decided, though, to migrate away from that over to Arch.

Hardware

I haven’t changed anything on the router since the last hardware update, so here’s a recap:

Correction: the LAN card was upgraded to the ConnectX-3, but everything else has stayed the same.

Why not OPNsense (or pfSense)? And why Arch?

The router’s first incarnation was at the tail end of 2022, built using leftover AMD FM2+ parts, after I got frustrated both with Google Fiber’s router interface and MikroTik’s RouterOS… non-performance.

And in the 2+ years since, it’s become very apparent that… for a home router, OPNsense (and pfSense) is overkill. Very, very much so. And I don’t need… the vast majority of what OPNsense provides. I almost never log into the OPNsense front-end except for package upgrades.

But updates can end up looking like… this:

This was the proverbial straw in this instance. os-wireguard is the OPNsense Wireguard plugin. And it appears to have been replaced by something else. Meaning to upgrade OPNsense, I would need to back up my Wireguard configurations, remove everything Wireguard-related, upgrade, add the new Wireguard plugin, then add back my configurations, provided I could actually do that cleanly… Really?

Does FreeBSD not allow packages to be declared as replacements for something else? Good thing I haven’t needed to restore my router from a configuration backup… Anyway…

Routers are typically one-and-done like most servers. Once you have it set up and configured to your liking, there isn’t much need to pay it any mind except for periodic software updates. The virtual machine that serves this website, for example, is like that. Once I have everything on it that needs to be installed, there’s nothing more for me to do except keep the software up to date.

And those software updates shouldn’t require removing anything to upgrade them. Anyway…

Why Arch?

OPNsense, pfSense, and other similar distributions are built with general-purpose use in mind, along with providing a clean, hopefully intuitive UI for configuration. (Minus the “clean, hopefully intuitive UI” in VyOS’s case.) They try to anticipate what you’re going to use based on feedback from their audience. This is, by the way, why Windows and a lot of Linux distributions (most distros anymore, it seems) don’t give you any option on what to install, pushing a pre-determined set of software instead.

Which is why I looked to Arch.

Arch provides two significant benefits: it’s lean, and it’s a rolling release. The latter means I don’t have to worry about a distro going end-of-life, like what’s happening with Ubuntu 20.04 LTS in April 2025. Meaning I don’t have to worry about replacing an entire distribution to keep everything up-to-date, with the downtime that comes with that.

Which is a consideration for OPNsense and pfSense, since both ride on FreeBSD, which is definitely not a rolling-release operating system. And while OPNsense’s upgrade is typically pretty smooth, it isn’t perfect – see the above screenshot. In all seriousness, though, if OPNsense migrated everything to sit on top of Arch – similar to how TrueNAS created SCALE, which rides on top of Debian – I think they’d be much better positioned given the far superior hardware support the Linux kernel offers. Along with not needing to create a magic file to enable the Mellanox driver.

Then there’s how lean you can make an Arch installation.

The base installation instructions on the Arch Wiki start you with only three packages: base, linux, and linux-firmware (plus their dependencies, obviously). You add whatever you need on top of that – and you’ll definitely need more than just that. But it’ll probably surprise you how little you need for a router.
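
For reference, that starting point is a single pacstrap invocation from the live ISO, run after partitioning and mounting your target at /mnt. The packages beyond the first three are illustrative router extras, not my actual list – that’s in the GitHub repo linked below:

# the three-package base from the Arch Wiki, plus illustrative router
# extras - see the repo below for the real list
pacstrap -K /mnt base linux linux-firmware nftables wireguard-tools dnsmasq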

The final install footprint for this project is… under 3GB. Tempting, then, to run this off a USB drive but, for the sake of stability and performance, especially reboot performance… definitely don’t do that. There’s a reason even TrueNAS doesn’t recommend doing that, even though they once did. And I initially ran it that way until switching to an SSD when the USB drive started showing issues.

Though if you really wanted to take things to the edge (pun intended), you could rebuild the kernel with it stripped down to the essentials and nothing more.

Who is this NOT for?

I need to stress that this is NOT a project for the inexperienced. At minimum you need to be comfortable with Linux.

And reasonably comfortable with Arch. The install steps aren’t written with the expectation that you’re daily-driving it, but you should be familiar with Arch’s concepts and how they differ from the other root distributions, like RedHat and Debian, so you can, at the least, keep it up to date. If you’ve never installed Arch to even a virtual machine to create a simple web host or even a desktop for limited purposes, get comfortable doing that first.

And you absolutely need to be familiar with IP networking concepts – e.g., IPv4, DHCP, DNS, subnets, etc.

Setting it up

All the scripts and configurations I created to set this up are over on GitHub, along with the instructions for setting this up for yourself.

For Dynamic DNS updating, there are a few options available. OPNsense relied on ddclient, so that’s what I kept. And I migrated my previous Wireguard configuration as well – thankfully it wasn’t difficult to do by hand. Though without a UI, adding another Wireguard configuration will be a little more involved, but not substantially difficult.
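
For the curious, a hand-written WireGuard configuration is only a few lines in /etc/wireguard/wg0.conf. This is a minimal sketch with placeholder keys and addresses, not my actual setup:

[Interface]
Address = 10.0.100.1/24
ListenPort = 51820
PrivateKey = <server private key>

[Peer]
# repeat one of these blocks per client
PublicKey = <client public key>
AllowedIPs = 10.0.100.2/32

From there it’s just systemctl enable --now wg-quick@wg0 to bring the interface up at boot.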

Which underscores that, in the end, OPNsense is really just a UI on top of existing services.

Follow-up on my “Linux challenge”

I’ll preface this by saying that I’m no longer using OpenMandriva. I switched to openSUSE Tumbleweed after dealing with some rather interesting instabilities over the couple weeks I used it. But a few of the things I discovered are likely not specific to OpenMandriva.

I already said in my previous article that photography is one of the reasons I’m keeping Windows 11. And it seems I might have another to add to that list:

VC1 support

I have a DJI Osmo Action 4. Great little action camera that I use as a body camera with a modified chest harness. And I have the camera configured to record video at 4K 60fps in HEVC and 10-bit color. The Osmo Action 4 uses VC1 to encode its video when in HEVC mode – which it requires when recording in 10-bit.

I can review the VC1 footage without issue in VLC on Windows. On Linux… it’s unplayable.

And this is a long-standing issue, as there are articles and threads going back years about this. The only option is… converting it to H.264 or H.265 first. 4K UHD Blu-Rays are encoded in H.265. But some HD-DVDs used VC1, so I may have a few movies on my NAS to convert.
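
If you’re doing the conversion with ffmpeg, something like this should work – a sketch assuming you want H.265 output, with placeholder filenames:

# transcode the VC1 video stream to H.265, copying the audio untouched
ffmpeg -i input.mkv -c:v libx265 -crf 18 -preset medium -c:a copy output.mkv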

The Osmo also records an LRF – low-resolution file – that does play in VLC without issue, so those can be used for footage review and skipping through to find where to make cuts.

Clipboard and Printing

Turning OFF clipboard history in Plasma solved most of my clipboard problems. There are still a few periodic quirks, but for the most part it’s now cooperating. And I think those annoyances are caused more by the X11/Wayland boundary.

I didn’t bother trying to figure out the printer.

Brave on Wayland

I mentioned in my previous article that I chose OpenMandriva’s Plasma6 on Wayland installation. And… Brave had some interesting issues on Wayland. Easily the most frustrating was the window rendering being… off, wherein the tab bar at the top would fail to render properly as if the entire window was shifted up. Usually minimizing and restoring the window would correct that, but it was annoying to have to do.

Another issue: positioning the mouse cursor over the bottom edge of a web page would auto-scroll the page down. And it would do this even if the browser window in question wasn’t the top-level window.

There’s an experimental flag you can enable by going to brave://flags/#ozone-platform-hint, setting that option to Auto or Wayland, and restarting the browser. That eliminated most of the issues, but there is still an interesting rendering glitch where a tab will be blank when restoring a window or switching to that tab.
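
The same can be forced per-launch from the command line, which is handy for testing before committing to the flag – the binary name varies by distro and install method:

brave-browser --ozone-platform=wayland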

NVIDIA

I think I’ve got a pretty good rhythm on this now and have figured out the workflow that allows me to keep the kernel and driver in sync.

First, as mentioned in my previous article, I have the kernel packages excluded from update so I can keep everything but the kernel updated.

Also, using OpenMandriva ROME in VMs allows me to keep an eye on when kernel packages get updated so I can choose when that’s going to happen. I’m not concerned with keeping the kernel “bleeding edge”.

And because… if it can be scripted, it should be. Until my two reported issues are handled (here and here), this will be the norm for installing the NVIDIA driver on OpenMandriva:

#!/bin/bash

if [ -z "$1" ]; then
    echo "Did you intentionally forget the path to the NVIDIA installer?"
    exit 1
fi

# Prime sudo up front (and bail if authentication fails) just to get
# the password prompt out of the way...

if ! sudo true ; then
    exit 1
fi

# resolve_btfids is installed at /usr/bin when installing the
# kernel-source package, so link to it

sudo ln -rfs /usr/bin/resolve_btfids "/usr/src/linux-$(uname -r)/tools/bpf/resolve_btfids/resolve_btfids"

# Create a symlink for vmlinux

cd "/usr/lib/modules/$(uname -r)/build/" || exit 1
sudo ln -fs /sys/kernel/btf/vmlinux vmlinux

# Circle back to the downloads folder and run the NVIDIA driver installer

cd ~/Downloads || exit 1
sudo "$1" --no-rebuild-initramfs --no-nouveau-check --skip-module-load

Note that this again requires these packages also be installed: kernel-source, kernel-desktop-devel, clang, libglvnd-devel, and pahole.

And this is the order of operations for updating the driver, whether due to a new driver or a new kernel:

  1. Modify the grub.cfg file directly to re-enable Nouveau only for the console mode entry
  2. Reboot to the console
  3. Comment out the dnf.conf line to allow the kernel packages to be updated (see the sketch after this list)
  4. Update the kernel packages and anything else out-of-date
  5. Repeat step 1, then reboot back to the console
  6. Un-comment the line in dnf.conf to disable kernel package updates
  7. Run the above script to install the NVIDIA driver
  8. Reboot back into desktop
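
For steps 3 and 6, toggling that dnf.conf line is a one-liner each way – a sketch assuming the exclude line added during setup is the only kernel exclude in the file:

# step 3: comment out the exclude so the kernel packages can update
sudo sed -i 's/^exclude=kernel/#exclude=kernel/' /etc/dnf/dnf.conf

# step 6: un-comment it to freeze the kernel packages again
sudo sed -i 's/^#exclude=kernel/exclude=kernel/' /etc/dnf/dnf.conf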

I’ve noticed that periodically I’ll get some video glitches as well, or one of my monitors will just completely disconnect and reconnect. No idea what’s going on there, but it’s one of two things: Wayland or OpenMandriva’s kernel being compiled with LLVM instead of gcc. Speaking of…

Gaming and my controller

I pulled Steam through Flathub to give it a go since, you know, everyone keeps saying Linux gaming is either on-par with Windows or… somehow better. That very much depends on what you’re talking about, but Windows gaming is still better and anyone who says otherwise is either delusional or grasping at any tiny data point – like FPS – they think proves their overall idea. Anyway…

Haven’t tried much in my library yet, but I did want to get my controller working.

There is a reverse-engineered driver available that unfortunately does not appear to be kept up to date anymore – the last commit was in early 2024, as of this writing, and there are several open pull requests to fix build issues with newer kernels. But one of that project’s forks is being kept up to date. So I pulled the code.

Again, OpenMandriva’s kernel is built using LLVM, not gcc, making them very much an outlier as of this writing. NVIDIA’s driver package already accounts for this… edge case. Thankfully the adjustments weren’t all that difficult to figure out once I found the Linux kernel documentation on building with LLVM.
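
The short version from the kernel docs: pass LLVM=1 to make so an out-of-tree module gets built with the same toolchain as the kernel. Roughly this, for a module source tree in the current directory – a sketch, not the fork’s exact build invocation:

# build an external module against the running kernel using clang/LLVM
make -C /usr/lib/modules/$(uname -r)/build M=$PWD LLVM=1 modules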

And after building and installing, along with confirming my controller was working with the code – testing with Steam and PCSX2 – I submitted a pull request for the build changes. If he doesn’t accept the PR, I’ll likely just fork his repo and keep it synced but for the LLVM change.

Switching distros

I ran into so many strange instabilities with OpenMandriva that I decided to jettison it for something else. I went looking at other rolling distros, even for a short bit considering Manjaro and Arch, until settling on openSUSE Tumbleweed.

The blame here lies in one of two camps, perhaps both: the LLVM kernel build in OpenMandriva, or the NVIDIA driver either not being completely stable when built using LLVM or not playing the greatest with Wayland.

So why Tumbleweed? Well, for one it’s considered the most stable rolling-release Linux distro, and I wanted to stay with rolling release. I had looked at Tumbleweed initially but had some difficulties trying it out in VirtualBox.

This also brings my Linux experience back around nearly full circle. While I started with Mandrake 6, later using Caldera 2.something and RedHat 6.1, I would settle on SUSE for a while. And what brought me back was seeing this:

NVIDIA has an official repository for openSUSE Tumbleweed. No need to build the driver. No convoluted steps to keep the driver up to date. And the fact Tumbleweed’s kernel release isn’t built using LLVM meant I didn’t need to do anything special to get the xone driver built and installed.

I only wish they also had the Brave repository as a “community” repository option in YaST so I didn’t need to add the repo manually. It ships with Firefox by default. I’ve been using Brave for a couple years, long before the current drama. And I had been a regular Firefox user on Windows since it was initially called Phoenix, then Firebird, while I was in college.

Yes, kids, Firefox is that old. And the Mozilla name is older still.

The only thing about Tumbleweed I’m not a fan of is that it installs Plasma6 onto X11, not Wayland. There isn’t even the option to select Wayland for Plasma6, though they do give that option for GNOME. (Correction: it installs both X11 and Wayland, and you select which you want from the login screen.)

This is for the better, honestly. Plex Desktop (through Flathub) on Wayland with the NVIDIA proprietary driver does not work. And plenty of other applications also have issues unless you tell them specifically through a config option that you’re on Wayland. (See above about Brave on Wayland.) NVIDIA’s driver also seems more stable overall and better supports X11.

I also haven’t noticed any issues with the clipboard. So far. But what I did notice is the printer working on the first go when I installed HPLIP.

So yeah… OpenMandriva definitely has… issues. But when you’re talking about a distro with nowhere near the user base of the big players – openSUSE, Fedora, Ubuntu, etc. – it’s not all that surprising. The ones who’ve been using it the longest may not have issues, but as more and more newcomers switch over to it, any issues they do have will become a LOT more apparent. So hopefully there are plenty of people willing to contribute PRs and file bug reports. Personally I decided to just find something that works.

It wasn’t all sunshine and roses with Tumbleweed, though, since trying to install it confirmed the Nouveau driver is trash:

This doesn’t really bode well for Linux adoption…

Since I’m using 4K displays, Tumbleweed’s installer defaults to that: 4K at 60Hz. And I get some… interesting display corruption with no obvious option to change the resolution and refresh rate for the installer. And since it defaulted to that in the installer, it defaults to that in the desktop as well, meaning I had similar display corruption on the desktop. Dropping the refresh rate cured the issue while I figured out how to install the NVIDIA driver through the repo above.
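
For anyone hitting the same thing: under X11 you can drop the refresh rate from a terminal with xrandr. The output name here is a placeholder – run xrandr by itself to see yours:

# list outputs and supported modes
xrandr

# drop the problem display to 4K at 30Hz
xrandr --output HDMI-0 --mode 3840x2160 --rate 30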

But let’s put a few things into perspective here. 4K@60Hz has been an option for nearly a decade, with NVIDIA’s cards leading the charge on bringing 4K gaming to the desktop with the GTX 1080 back in 2016 and the 1080 Ti solidifying their position on that in 2017. Again, I’ve personally been using it since 2021.

So why the fuck does the Nouveau driver NOT support that option without display corruption? In all seriousness, there has to be someone working on the Nouveau driver with a 4K@60Hz display, right? Right?

Sure, if you’re using an NVIDIA card with Linux, chances are you want the proprietary driver anyway unless you’re using an older NVIDIA card (like the GT620 in my NAS). So at least Tumbleweed makes that easy to incorporate – again, it’s the reason I switched over.

But that is what I saw merely trying to install Tumbleweed, and then saw it again after I got to the desktop. I had an idea on what to do to get around the issue. A lot of people new to Linux, though, won’t. And it’s shit like that which pushes people away from Linux. And telling those of us with NVIDIA cards to switch to AMD isn’t an option.

And I had similar display corruption with OpenMandriva’s Plasma6 on Wayland install, but it wasn’t consistent. And like with Tumbleweed, the issue went away entirely after installing the NVIDIA driver. Mostly. There was still something… interesting about using the NVIDIA driver on an LLVM-compiled kernel that meant things weren’t playing completely right.

Scam? Hardly…

I’m going to play devil’s advocate here on the part of the repair business since “scam” gets thrown around way too much anymore. A “scam” would be charging $275 and NOT fixing the issue. Or getting the diagnosis completely wrong and demanding more money to set it right.

First, though the pad is a ground pad, it exists for a reason as part of the HDMI spec: to spread out current. So it is not true that “it’s not even really needed”. Sure, the connector will function without it, but it should still be connected.

And retaining $75 for trying to diagnose the issue isn’t unreasonable. The time it takes to take apart a PS5 is still the repair tech’s time. He had to tear apart the PS5 to get to the mainboard and HDMI port, then he removed the HDMI port and saw there was a missing solder pad. The console owner paid $175 in advance, and probably signed a contract acknowledging he may be charged more and was given a quote. That he would need to reroute the missing solder pad is additional cost beyond what he initially quoted.

So $275 to tear apart the PS5 to get to the mainboard and remove the HDMI port. That’s time and incidentals going into this. Plus time to reassemble the PS5 and test the console to make sure the repair worked. How much is that time worth? You can’t act like the repair business isn’t out anything merely trying to diagnose the issue.

And to everyone like “OMG, why don’t you name and shame the shop that tried to SCAM this PS5 owner?” One simple reason… DEFAMATION LAWSUIT. What the repair shop is demanding isn’t unreasonable given what’s needed to repair the PS5.

Don’t consent to searches

If you ever want a damned good reason to never consent to a police search of your phone or other electronic device, take this case out of Oregon recently decided by the Ninth Circuit. First, a few facts.

Tyler Smith is a deputy with the Grant County Sheriff’s Department in Oregon. Haley Olson is his girlfriend. But they were keeping their relationship under wraps. Olson drove into Idaho and was arrested for marijuana possession. While in custody, she signed a form that waived her Fourth Amendment rights and granted Idaho police permission to search her phone, and the internal storage was cloned.

Let’s sidebar for a moment.

I’ve said before that your consent is the government’s absolute defense to the Fourth Amendment when it comes to a search without a warrant. There are very few exceptions to the warrant requirement, none of which would’ve applied here.

And sitting at the intersection of the Fourth and Fifth Amendments is an exception called the “foregone conclusion” doctrine. In short, absent your consent, law enforcement must be able to articulate what crime they are investigating, what evidence of that crime they expect to find on your phone, and what evidence – testimony or something tangible – gives them reason to believe that evidence exists. I discussed that when discussing whether the government can compel you to unlock your phone. (Short answer: yes, but they have to satisfy the foregone conclusion doctrine first.)

But again, that’s absent your consent. In the above scenario with Miss Olson, she consented to the search, so the Idaho State Police imaged her phone. Again, do not consent to a search. Let the police hold your phone while trying to get a warrant. Since consenting to a search means also consenting to whatever they find being used against you in a Court of Law.

But consenting to that search definitely does NOT mean she consented to… everything else that allegedly happened with what they found.

Unfortunately Miss Olson wasn’t able to get satisfaction out of the Courts. The Ninth Circuit ruled that, while there was clearly a violation of the Constitution, since no one beyond the Idaho State Police had reason to be in possession of the phone’s storage clone, the police officers and prosecution officials who participated in that are protected by qualified immunity.

So, again, another reason to never consent to a search.

The above case is Olson v. County of Grant, Oregon.

AOC did nothing wrong

Two things to say up front before getting into Rep. Ocasio-Cortez’s recent letter to the Attorney General. First, it is not a crime to inform people of their rights under the Constitution. And second, illegal aliens do have rights under the Constitution.

On the second point, one thing so many people seem to forget is the Federal government is indiscriminately restrained by the Bill of Rights and the rest of the Constitution. As I’ve pointed out on this blog before, there is nothing in the Constitution limiting the protections of rights only to citizens or legal permanent residents.

So let’s get into the details.

Representative Alexandria Ocasio-Cortez (NY-14) sent a letter to the Attorney General of the United States requesting clarification on whether the Department of Justice is pursuing a criminal investigation against her. This stems from a webinar she held on February 12 called “Know Your Rights”:

I’ve mentioned an organization on this blog called “Flex Your Rights” and their two videos informing you of your rights under the Bill of Rights when it comes to interacting with law enforcement.

Now what Ms Ocasio-Cortez released, both a pamphlet and her above webinar, is no different from the above videos with one exception: it explicitly calls out ICE and is targeted to helping illegal immigrants and migrants. Something many have called “aiding and abetting”.

And in response to the webinar, Tom Homan, the Executive Associate Director of Enforcement and Removal Operations (ERO) and the “Border Czar”, threatened an investigation for obstructing law enforcement, saying on Fox News: “I’m working with the Department of Justice and finding out. Where is that line that they cross? So maybe AOC is going to be in trouble now.”

But, again, the Representative did nothing wrong here.

It is not illegal to inform someone of their rights. Doing so is not “practicing law without a license”, something I’ve seen asserted countless times. Nor is it obstructing law enforcement. If I walk by someone who is in handcuffs while the police are processing them and inform them of their Fifth and Sixth Amendment rights, those officers cannot then turn around and arrest me. They absolutely can tell me to go away. But they can’t arrest me for merely informing someone of their rights, even if the person I’m informing is under arrest and in police custody!

I could make up my own pamphlet, walk into the inner cities, talk to gang members and illegal immigrants, distribute the pamphlet, and hold conversations with people informing them of their rights. And absolutely nothing in the law can stop me from doing that.

People knowing their rights impedes law enforcement. And many conservatives treat that as a bad thing. I’ve pointed out several times on this blog cases where police have done some… shady things to circumvent the Constitution. Such as the cases of Timothy Bass and Jimmie Bowen. And even defending the Boston Marathon Bomber’s right to remain silent.

Again the Constitution indiscriminately restrains the government. The Bill of Rights is supposed to impede law enforcement. And the more informed everyone is of their rights, the better. And contrary to a very popular belief on the right, illegal aliens are still protected by the Constitution. That is why there is a “due process” they are entitled to per the Fifth Amendment before they are deported.

Again, Ms Ocasio-Cortez did not violate the law by hosting a webinar informing people of their rights. She is absolutely protected by the First Amendment here.

Claudia Tenney seeks to commandeer the States

Now that the text of the bill is finally available, let’s go through this.

Claudia Tenney (R-NY-24) introduced HR.373 as part of the 119th Congress, called simply the SAGA Act, or “Second Amendment Guarantee Act”. It had previously been introduced in the 115th Congress by Chris Collins, also from New York. The meat of the bill seeks to amend 18 USC § 927. Currently that statute reads:

No provision of this chapter shall be construed as indicating an intent on the part of the Congress to occupy the field in which such provision operates to the exclusion of the law of any State on the same subject matter, unless there is a direct and positive conflict between such provision and the law of the State so that the two cannot be reconciled or consistently stand together.

So that’s a bit of legalese, isn’t it?

In short, what this means is no part of Title 18, Chapter 44 – where you’ll find that section – shall be interpreted as precluding States from being able to have their own firearm laws except where Federal law and the laws of a State conflict in a “direct and positive” manner. A “direct and positive” conflict basically means a State is trying to impose looser restrictions than what Federal law requires. For example, if Federal law had a magazine limit set at 10 rounds, but a State had a magazine restriction of 20 rounds, the latter is not enforceable and the State would instead be required under Federal law to enforce the 10-round limit. This is known as preemption.

Claudia Tenney wants to change §927 to this:

(a) Except as provided in subsection (b), no provision of this chapter shall be construed as indicating an intent on the part of the Congress to occupy the field in which such provision operates to the exclusion of the law of any State on the same subject matter, unless there is a direct and positive conflict between such provision and the law of the State so that the two cannot be reconciled or consistently stand together.

(b) (1) A State or a political subdivision of a State may not impose any regulation, prohibition, or registration or licensing requirement with respect to the design, manufacture, importation, sale, transfer, possession, or marking of a rifle or shotgun that has moved in, or any such conduct that affects, interstate or foreign commerce, that is more restrictive, or impose any penalty, tax, fee, or charge with respect to such a rifle or shotgun or such conduct, in an amount greater, than is provided under Federal law. To the extent that a law of a State or political subdivision of a State, whether enacted before, on, or after the date of the enactment of this subsection, violates the preceding sentence, the law shall have no force or effect. For purposes of this subsection, the term ‘rifle or shotgun’ includes any part of a rifle or shotgun, any detachable magazine or ammunition feeding device, and any type of pistol grip or stock design.

(2) In an action brought for damages or relief from a violation of paragraph (1), the court shall award the prevailing plaintiff a reasonable attorney’s fee in addition to any other damages or relief awarded.

This statute sounds somewhat similar to another that was overturned by the Supreme Court.

In 1992 at the tail end of GHWB’s sole term as President, Congress passed and Bush signed into law the Professional and Amateur Sports Protection Act (PASPA) of 1992, which created Title 28, Chapter 178 of the United States Code. Under that Chapter is §3702, which says:

It shall be unlawful for—

(1) a governmental entity to sponsor, operate, advertise, promote, license, or authorize by law or compact, or

(2) a person to sponsor, operate, advertise, or promote, pursuant to the law or compact of a governmental entity, a lottery, sweepstakes, or other betting, gambling, or wagering scheme based, directly or indirectly (through the use of geographical references or otherwise), on one or more competitive games in which amateur or professional athletes participate, or are intended to participate, or on one or more performances of such athletes in such games.

The law not only was a nationwide ban on sports betting, it prohibited States from legalizing it. In 2011, New Jersey voters would approve legalizing sports betting in direct conflict with PASPA – also known as the Bradley Act. Other States joined in on the legal challenge. And the Supreme Court would rule in Murphy v. NCAA, 584 US 453 (2018), that PASPA is unconstitutional.

The PASPA provision at issue here—prohibiting state authorization of sports gambling—violates the anticommandeering rule. That provision unequivocally dictates what a state legislature may and may not do. And this is true under either our interpretation or that advocated by respondents and the United States. In either event, state legislatures are put under the direct control of Congress. It is as if federal officers were installed in state legislative chambers and were armed with the authority to stop legislators from voting on any offending proposals. A more direct affront to state sovereignty is not easy to imagine.

Well, Claudia Tenney thought of one. She seeks to rewrite §927 and take it from preemption to commandeering.

In jurisprudence, commandeering is when Congress enacts a law, or the Executive Branch enacts a regulation, requiring that States enact or refrain from enacting laws or regulations that include specified provisions. The case in which the anticommandeering doctrine was established is, rather ironically, New York v United States, 505 US 144 (1992). And in establishing the concept of commandeering a State government, the Supreme Court also established the distinction between an incentive and an attempt to commandeer.

The Federal government can provide incentives to States to enact certain laws – e.g., helmet laws and setting the drinking age to 21 – but cannot directly require they pass or refrain from passing legislation or regulation meeting specified requirements or including specified provisions.

In short, except where the Constitution directly allows it, Congress cannot tell the States what they can and cannot do.

Because the States are also sovereigns. And the Federal government is also sovereign, but only to the extent the States have allowed via the Constitution. This is known as the dual sovereignty doctrine, which is derived from Federalist 39. First,

Each State, in ratifying the Constitution, is considered as a sovereign body, independent of all others, and only to be bound by its own voluntary act.

And,

Among communities united for particular purposes, it is vested partly in the general and partly in the municipal legislatures. In the former case, all local authorities are subordinate to the supreme; and may be controlled, directed, or abolished by it at pleasure. In the latter, the local or municipal authorities form distinct and independent portions of the supremacy, no more subject, within their respective spheres, to the general authority, than the general authority is subject to them, within its own sphere. In this relation, then, the proposed government cannot be deemed a NATIONAL one; since its jurisdiction extends to certain enumerated objects only, and leaves to the several States a residuary and inviolable sovereignty over all other objects.

And that dual sovereignty means, again, that Congress largely cannot tell the States what they can and cannot do. Yet Tenney is explicitly doing that in the language for her bill by saying that State laws that run afoul of the language of her bill “shall have no force or effect”.

That is commandeering, since she is basically trying to tell the States what laws they can and cannot enforce. That absolutely runs afoul of the Constitution, and sets a very, very dangerous precedent as well.

My own Linux challenge

Plenty of tech YouTube channels have taken a plunge into daily-driving Linux. And I’d actually been wanting to do so for the longest time. The holdup hasn’t been any specific requirements. I need Windows for some things, so dual-booting was part of the intent.

Which, up front, I’ll say that if you want to dual-boot Windows and Linux, get one drive for each, install Windows first and get it completely updated, set this setting in the registry, then install Linux.

Anyway… The holdup has actually been the distro.

I’ve been using Linux in virtual machines for… years using VirtualBox on Windows – and briefly with VMware and Proxmox on a server. And that’s a great way to try out new distros and compartmentalize projects. For example, I created a Linux Mint virtual machine with VS Code for working on the theme I mentioned in this article. Everything was self-contained, separate from my Windows environment. And with bridged networking, I could still test the theme in my Windows browser and on my phone.

So before going further, let’s address the elephant in the room: Windows 11.

Why do I still need Windows 11?

Two things: games and photography.

On the first, while there have been significant leaps in Linux gaming, courtesy of Valve’s work with Proton, Steam isn’t the only service where I have games. I also have games through Epic, Ubisoft Connect, EA Origin, Bethesda, Battle.net, GOG, and the Rockstar Social Club. And getting games from those clients working on Linux can be… interesting, to say the least.

But with the light-years leap Valve made for Linux gaming with Proton and Steam, I would not be surprised if Epic is actively developing a Linux client if, for no other reason, to get their client on the Steam Deck and other SteamOS-compatible devices. Windows-based handhelds can already run… most any game you throw at them – just watch the many reviews Craft Computing has made. But it is definitely in Epic’s interests to get their client on SteamOS. Same with Blizzard with Battle.net. It’s also in Microsoft’s interest to write a Linux driver for the Xbox controllers so we don’t have to rely on reverse-engineered drivers from GitHub. (The PS5 controller apparently works out of the box.)

However (comma)… until the issues with anti-cheat in online multiplayer games are handled in a viable fashion, gaming on Linux won’t be able to completely catch up to Windows.

Overall gaming is just easier on Windows. Period.

I also use WeMod, which absolutely does not work on Linux and likely never will simply due to what it is and how it works.

Yes, I use cheats and trainers in offline games. If that bothers you, and you feel the need to say anything negative about it, kindly fuck off. I’m approaching my mid-40s. I don’t have enough time on my calendar to spend 100 hours trying to finish a game when I’ve got a backlog a mile long and a trainer or cheat will allow me to see it all and do it all in a fraction of that time.

As an example, even with a trainer, it still took me over 30 hours to 100+% Hollow Knight. And I’m considering going achievement hunting on that, which will probably take another 20+ hours with a trainer. And yes, I’m very much looking forward to Silksong.

On photography…

I use the Adobe suite. Photoshop (Ps) and Lightroom (Lr) specifically. Plain and simple, there is currently no viable alternative to either in the open source space. At least not without making significant sacrifices to my workflows.

“But GIMP!!!”

Shut the fuck up!

GIMP absolutely is very functional, very capable for photo editing and “manipulation”. I do use it on occasion, though not nearly as much since adding Ps to my Adobe subscription in mid-2023.

But Photoshop is just better. Plain. And. Simple. Far superior where it counts. And one place where it absolutely counts is… context-aware heal. Why GIMP’s devs have not integrated this feature is beyond me. Why “integrate” and not “implement”? Because a plugin has been available for over a decade that provides a context-aware fill. Yet that code has never been adapted to provide a context-aware heal that works very similar to Photoshop’s implementation.

In all seriousness, if they were to implement just. that. feature. they’d bring themselves far closer to Photoshop in needed functionality. That feature alone could give them the leg-up they need, especially among photographers who want to walk away from Photoshop and would be willing to walk to something available for free, like GIMP, if only the functionality was there.

“Why don’t you implement it then? After all GIMP is open source!” Because I value my time and will readily use something that already works, even if I have to pay for it, over spending who knows how much time figuring out GIMP’s source code, the plugin’s source code, and how to integrate them.

“Photoshop works in Wine!”

Shut the fuck up! No it doesn’t!

If it can’t be installed and run through Wine in Linux without issues, it. doesn’t. work. Period. So stop claiming it does. And almost no one using Ps or Lr cares about the ancient versions that you claim work. Only Linux fanboys care about that, and only so they can claim Ps and Lr work in Wine.

And darktable is not a viable alternative to Lr. Plus it’s a lot more difficult to use. Yes, I’ve tried it. Lr’s UI and UX are just far superior.

Moving on…

My primary camera is the Nikon Z5 (buy it on Amazon, Adorama, or direct from Nikon). My previous camera was a Nikon D7200. I’ve been relying on Nikon NX Studio for initial image review before importing photos into Lr for editing as it’s pretty snappy. I’ve used RawTherapee, but Nikon’s NX Studio is just… better. It’s from Nikon, so I would expect it to be. Plus it’s also available for free.

But one hard requirement is… functional 10-bit-per-channel color.

I’ve been relying on that for a couple years, since acquiring two 4K televisions that support it. My camera is configured to write 14-bit RAW files. And the aforementioned software I use on Windows can display 10-bit color without question.

The situation is a bit different on Linux.

10-bit support appears to be working on Linux from what I can find. Kind of, at least. But it’s seemingly impossible to determine whether it’s enabled, let alone being used. In all seriousness, it should be as easy to find in Linux as it is in Windows. Put it on the same panel for changing the display resolution and refresh rate.
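
One indirect check does exist under X11: the root window depth, where 30 means 10 bits per channel. It’s only a hint, though – it still doesn’t tell you whether an application is actually using it:

# depth 24 = 8 bits per channel; depth 30 = 10 bits per channel (X11 only)
xdpyinfo | grep "depth of root"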

And, lastly, to a lesser extent is Microsoft Office. But most of the functionality I need with the Office suite is available in the browser, so it’s not a major loss. I have a 365 subscription mostly for OneDrive. I know I lose OneDrive syncing, but I still have access to that through the browser. Plus, as mentioned, I’m going to be dual-booting with Windows.

So moving on…

A few things I didn’t need to worry about

Before getting into the requirements for my setup, day-to-day, workflows, etc., let’s first talk about what I need that’s also pretty ubiquitous with Linux.

Browser. Most everything that doesn’t require significant compute performance or an involved UI has moved to the browser. And even some things you wouldn’t expect to be browser-enabled have become so. For example, you can do light photo and video editing in a browser. Google Docs and Office Online have been available for years – though there are plenty of features Office Online does NOT have compared to Microsoft Office that just aren’t really possible in the browser.

And virtually every browser is also available on Linux. Firefox, Chrome, Brave, and even Microsoft Edge. So there’s no worries with using my preferred browser – Brave – on Linux.

Media players. VLC and Plex in particular. (I do have a Plex Pass.) Plex has been available on Linux since the beginning – both the server and client. And I’ve written a few articles about it here. Same with VLC since it relies on other open source projects.

SSH. This is pretty ubiquitous on Linux. I use PuTTY on the regular on Windows to, for example, remote access the server behind this blog for software updates and the like. PuTTY is also available readily on Linux, so no changes needed here.

Virtualization. VirtualBox is readily available. QEMU is there as well with virt-manager, though it isn’t just a single-command install. But I might use this as a chance to play around with it since it has some features that VirtualBox does not.
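
For reference, “not a single-command install” typically means something like the following on a dnf-based distro – the package names are approximate and vary between distros:

# hypothetical package set - check your distro's repos for exact names
sudo dnf install qemu virt-manager libvirt

# start the libvirt daemon and have it come up at boot
sudo systemctl enable --now libvirtd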

Requirements

So now let’s talk about functionality I need. While that doesn’t necessarily govern what distro to use, it does determine whether this is viable in the first place.

Lean installation. One thing I don’t like is a distro that doesn’t give me the choice of what to install. So if that choice is going to be taken away from me, I want what is installed to be a lean set of packages so I’m not spending time removing a ton of stuff I know I won’t ever use. Needing to remove some things is fine, but needing to remove half of what’s installed… No.

This means… Mint is pretty much out, as are a lot of distros that don’t have a “slim” or “minimalist” option.

I understand wanting to be more welcoming of people new to Linux by giving them an install base with a bunch of software that meets most anyone’s requirements for a desktop. But… please… please also always give a lean install option. Seriously, Arch should not be the default choice for lean Linux setups. This was one of my gripes finding a Linux distro for Cordelia.

Rolling release. I’ve played around with Arch, and I like the idea of the operating system not having a set version number. Since that means not having to worry about doing an in-place upgrade – though Ubuntu and Fedora make that mostly trivial – or a reformat and reinstall like when I upgraded from Windows 10 to 11. (I did the in-place upgrade first to register my machine against the Microsoft license servers, then reformatted and reinstalled.)

Now this is a risk in part because rolling releases may include experimental, beta, or even alpha packages. But it also means you don’t have to worry about being horribly out of date on some things like… the kernel.

KDE or Cinnamon. Linux Mint is one of my go-to distros for running in a VM, even with all the packages I need to remove, largely because of Cinnamon. Being somewhat lightweight, it works well in a virtual machine. I’ve fooled around with Arch a little as well, using Cinnamon there. But Cordelia runs Manjaro with KDE. And KDE is the desktop environment I’ve leaned toward ever since first using Linux… over 25 years ago. And in playing around with the latest versions of it recently, I prefer KDE to Cinnamon.

Never liked GNOME. And I avoid distros and install options where that’s the default – meaning Ubuntu is out just on that alone.

I’ve played around with COSMIC in virtual machines, but it’s only in Alpha right now. And as this is for my personal desktop – i.e., mission critical – I need something that’ll just work. This is a workstation, not a bug testing setup.

Password management. I’ve been relying on KeePass for… years. Before I even contributed code for it all the way back in… 2006. And while I’m evaluating switching to Proton Pass, since I use Proton for email, right now I need something compatible with KeePass. And KeePassXC fits the bill, so it needs to be readily available in the package manager as well. I’ve played around with it on Windows, and I definitely prefer its password generator to the one in KeePass, which has some… issues.

What distro?

OpenMandriva ROME. As of this writing, the latest released version is 24.12, though an update will upgrade you to 25.02. And specifically I’m using the Plasma6 on Wayland ISO. I tried to use the “ROME Plasma Slim” from this page but was not able to get the image to boot on my desktop. Worked fine in VirtualBox, though.

But that also meant I couldn’t get the “slim” release I wanted since the “slim” build seems to be new for the 2025 builds. There isn’t a “slim” option for download on their SourceForge. Meaning I’d be removing packages after installing. But while it isn’t “slim”, I’ve definitely seen worse – e.g., Linux Mint.

So why OpenMandriva? I learned about it through Lunduke’s YouTube channel and checked it out.

Mandriva is a spiritual successor to Mandrake Linux. Mandrake was a RedHat derivative, and version 6.0 (Venus) is where I got started with Linux back in 1999. At the time, it was billed as one of the easier Linux distros to get into. And a boxed set of CDs for it was on the shelf at CompUSA, saving me from having to download it. (Alongside boxed sets for Caldera, RedHat, Debian, and Slackware.)

The last release of Mandriva Linux was in 2011, and OpenMandriva followed a couple years later. There’s also Mageia, another fork of Mandriva, but its last major update was in 2023.

I’ve bounced between various distros over the years, and I’d considered daily driving Linux for a while. So upon learning about OpenMandriva, I tried it out. And it was what finally gave me the push to daily drive Linux for the first time in… well, a long time. I only wish I could get the 25.01 “slim” release working.

What didn’t work out of the box?

I’ve only tried the Plasma6 release, so I can’t speak for anything else. Out of the box, it gives you Chromium plus quite a few other packages I just… removed. There are third party repos for Brave, VS Code, and a few other things.

And one point to the OpenMandriva team as well if they see this: take out the “We strongly recommend…” text from the third-party repository descriptions. If a user enables one of the non-free repos, it’s likely out of necessity – e.g., the Microsoft Teams repo – so your “We strongly recommend” is largely meaningless.

And with VS Code, a LOT of developers are very used to VS Code – like me! – and so will NOT transfer to something else because… why?

Anyway… There isn’t much I have a gripe with here so far, so I’ll just mention the specific packages that gave me issues.

Printing. I have an HP LaserJet P1505 printer that I bought brand new back in 2009. Still on the original toner cartridge, too, which goes to show how printing is no longer ubiquitous. And while the system did detect and appear to install the printer, printing didn’t actually work.

I needed to install another package called hplip and then run hp-setup after first removing the printer through the System Settings so it could be re-added using HP’s official Linux driver.

Printing still isn’t working properly directly from an application, though. Brave can see the printer, but print jobs sent to it get blackholed, and GIMP doesn’t see it at all. I hardly ever print anything, so I’m not hugely concerned with getting this working. Most of what I print is PDFs, though, which can be printed using HPLIP directly.

Clipboard. Imagine copying something to the clipboard, but it doesn’t actually get copied. So when you try to paste something into a form, for example, it instead pastes something you had previously copied, if anything.

Yeah, for some reason I have to clear the clipboard history periodically to keep this from happening. This became especially apparent when trying to use KeePassXC, where I’d think I copied the username or password to the clipboard only for… something else to be pasted into the login form. I’m leaning toward that being a problem with Plasma6, since I don’t recall having this issue on Cinnamon. Perhaps I need to downgrade to Plasma 5. I don’t know…

Anyway…

But it’s little things like these which can keep people from adopting Linux for daily use. Since I definitely should NOT have to keep clearing the clipboard to keep using it. Sure with Windows I still needed to download and install a driver for my printer, but then it just… works after that.

And then there’s…

NVIDIA

So this took more than a few tries to get right. And I definitely spent way more time on this than was reasonable.

I’ve used the NVIDIA proprietary Linux driver in the past, too. A LOT. So that I had this much difficulty getting it working on OpenMandriva is either a problem with NVIDIA or OpenMandriva. And I’m leaning toward the latter simply due to a couple quirks I discovered while trying to get this working.

Out of the box, the Nouveau driver does at least provide something to get NVIDIA owners going. But with more advanced setups, the driver’s lackluster performance becomes very apparent. Especially on a system with two 4K monitors like mine.

The OpenMandriva non-free repo does have an RPM for the NVIDIA proprietary driver, but… I was not able to get it to work. So going with the download available straight from NVIDIA, I’ll outline the steps I followed.

Note: make sure to fully update your install before following these steps.

I also highly recommend backing up your boot and data partitions since this… could get messy. Even if you’re operating on a fresh install since restoring a backup might be quicker than reinstalling everything. In my instance, I just imaged the entire drive using dd from the OpenMandriva live USB, copying the image off to Nasira, making restoration as simple as reversing the process while I was figuring out these steps.

First, you’ll need to download the NVIDIA driver package. If you’re on the OpenMandriva Rock 5.0 distro, go with the “New Feature Branch” version – 565.77 as of this writing – since Rock is still on the 6.6 kernel. But if you’re using ROME, the rolling-release distro, which was recently upgraded to kernel v6.13, the v565 driver will not build, so you’ll need the v570 beta driver.

Next up is package installations. You can do this from within the graphical desktop from a command line – e.g., Konsole for KDE Plasma – or reboot to the console.

sudo dnf install clang libglvnd-devel pahole kernel-desktop-devel kernel-source

Notice I did NOT add “-y” to auto-confirm. You need to pay attention to what’s going to be pulled in here, specifically the kernel-desktop-devel package. The version MUST match your kernel. If you’re fully up-to-date, this should be the case. Make sure it is NOT grabbing something newer. If it is, you need to update your packages first.

After this, disable kernel package updates. This is because of a couple additional steps we must take in order to build the kernel modules – steps that absolutely will brick your installation if you later allow the kernel packages to be updated. Make this change by running this command:

echo "exclude=kernel*" | sudo tee --append /etc/dnf/dnf.conf

I’ve filed two bugs with OpenMandriva’s GitHub (here and here), so hopefully those will be addressed. Not sure how easy or difficult those will end up being.

So now to overcome the issues I reported in the above bugs:

# Build resolve_btfids and link to it 

cd /usr/src/linux-`uname -r | sed -E 's/(.+?)-.+?-(.+)/\1-\2/g'`/tools/bpf/resolve_btfids
sudo make
sudo ln -rs resolve_btfids /usr/src/linux-`uname -r`/tools/bpf/resolve_btfids/resolve_btfids

# Create a symlink for vmlinux

cd /usr/lib/modules/`uname -r`/build/
sudo ln -s /sys/kernel/btf/vmlinux vmlinux

That first line will strip “desktop” from the middle of the kernel version string returned by uname -r, so it should take you right to the kernel source folder for the resolve_btfids tool, which the NVIDIA installer will use as part of building the kernel modules. As of this writing, on my updated installation, uname -r will return 6.13.4-desktop-2omv2590, so that first line should resolve to /usr/src/linux-6.13.4-2omv2590/tools/bpf/resolve_btfids

So after all of this, you’ll want to run the installer. If you have not yet rebooted your system to the console, do so now, then run this when you get settled:

sudo /path/to/NVIDIA_install.run --no-rebuild-initramfs --no-nouveau-check --skip-module-load

Let’s go over the options here.

  • --no-rebuild-initramfs: the in-built step to rebuild the initramfs will fail on OpenMandriva, so disabling it here to avoid the error message and early abort.
  • --no-nouveau-check: pretty self-explanatory. This won’t check for nouveau, but it also won’t attempt to disable it. We’ll handle that later.
  • --skip-module-load: This will prevent the installer from loading the module, which would otherwise error out if Nouveau is still loaded, causing the installer to abort early.

During the install, you’ll get prompted for a few things:

  • NVIDIA Proprietary or MIT/GPL: Proprietary
  • Install 32-bit libraries: Yes. Not sure what uses them, but better to have them and not need them than to need them and not have them
  • Register the kernel module with DKMS?: No! Unless you previously installed DKMS for something else, you likely won’t even get prompted for this.
  • Configure X automatically: Yes if you’re on X11, No if you’re using Wayland

The installer will dump out some progress messages as it goes and eventually confirm at the end that the kernel module was installed. But you will also get a message about needing to reboot due to Nouveau being enabled.

Don’t reboot just yet! We’ve got one more step: disabling Nouveau. First, create a new file at /etc/modprobe.d/nvidia-proprietary.conf:

blacklist nouveau
options nouveau modeset=0

options nvidia_drm modeset=1
options nvidia_drm fbdev=1

Now, edit the /etc/default/grub file. The line you’re looking for begins with GRUB_CMDLINE_LINUX_DEFAULT. Add modprobe.blacklist=nouveau to the end of that string.
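
The result should look something like this – the parameters already on your line will differ:

GRUB_CMDLINE_LINUX_DEFAULT="quiet splash modprobe.blacklist=nouveau"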

Now, a few more commands:

sudo dracut --force
sudo cp /boot/grub2/grub.cfg /boot/grub2/grub.cfg.bak
sudo grub2-mkconfig -o /boot/grub2/grub.cfg

That copy command ensures that your grub config is backed up before it’s overwritten so you can easily recover it using a boot USB if necessary.

And now… Reboot.

If you did everything successfully, you should be coming back up into your desktop of choice with everything working as expected. Meaning if you have your system configured for auto-login, that should work as before. And you can verify from a terminal that your system is using the NVIDIA driver instead of Nouveau by running this:

lsmod | grep nouveau

If the NVIDIA driver is properly configured and working, you should get a blank result. You should also be able to run nvidia-smi from the command line and have it return details about your card and what applications are trying to use it:
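
Conversely, the positive check should come back non-empty, listing the loaded NVIDIA kernel modules:

lsmod | grep nvidia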

And – holy shit! – that was far more difficult to figure out than it realistically should have been. So be glad you’re reading this so you don’t have to go through the pain – or at least not for as long. These instructions may also work on other distros with some tweaks for distro-specific commands and package names.

Conclusions

So that’s it for now. I’ll probably revisit this some months down the line – especially as package and kernel updates get rolled out along with a new NVIDIA driver release.

I’ve been daily driving Linux for about two weeks as of this writing, so it’ll be interesting going forward. I’ve been back in Windows a few times, but that was mostly while I was figuring out the NVIDIA driver issues and just wanted a break from it. I have VMs under VirtualBox in Windows I need to migrate to QEMU, which, along with virt-manager, works a LOT better under Linux than VirtualBox does. But things for the most part have been… just working.

Once I figured out the NVIDIA driver and realized what was going on with the clipboard.

And again, the only reasons I really have for going back into Windows are gaming and photo work, along with the very infrequent print job. I can use the HPLIP GUI to print some document types, though, so that’s a workaround for the occasional need to print a shipping label.

So I’ll report back sometime down the line as to how this goes. But so far I’ll likely be sticking with Linux and popping over to Windows only when necessary – e.g., the above-mentioned reasons why I’m keeping it around.

Open source++

Given what’s been coming out of open source projects in the last few months, with waves of purges and “cleansings” to rid projects and repositories of anyone who is not left of Bernie Sanders, I’m starting to wonder if FOSS is about to have its own version of “Atheism+”.

I first wrote about Atheism+ not long after it got off the ground back in 2012. To summarize, Atheism+ was about politics, with atheist speakers and “leadership” demanding all atheists move hard left and adopt a preordained set of tenets. And in my above-linked article, I said this:

For one there are more atheists than those who actually use the label “atheist”, and more using the label “atheist” than those active in the atheist movements, meaning Atheism+ is, by definition, another minority within the totality of all who fit the definition of “atheist” even if they don’t use the label.

Atheism+ would die the death it deserved in 2016, but its underlying tenets haven’t gone away. The lesson they failed to learn is, quite simply, that atheists were a LOT more politically diverse than first presumed. Those who formed Atheism+ expected they’d get the majority of atheists on their side. And when that didn’t happen – as shown by the dismal attendance numbers for the 2016 Reason Rally when compared to the 2012 Reason Rally – to say they turned hostile would be an understatement.

I’ll let Peach’s video do the talking:

I’m libertarian and atheist. Skeptical of governments and gods. So when Atheism+ first came around and tried to redefine atheism, despite their assertions they weren’t trying to do that, I wasn’t going along with it. Instead it showed that a lot of people who typically advocated for free thought and free expression were now adopting the idea that you had freedom of thought and expression only if you used that freedom to think the same way they did.

And one of the counters to Atheism+ by other atheists, such as me, was simply… what does any of that have to do with atheism and being an atheist? Rather than focusing just on… atheism and offsetting or countering the influence religion has over society, along with countering false ideas like creationism, Atheism+, I guess, decided that wasn’t enough. That atheism meant more than just saying “gods don’t exist”.

And now with open source, we have something similar brewing. Requiring a person hold certain political ideas to contribute to a repository rather than being inclusive of everyone who wishes to contribute. Where a lot of projects have people at the reins who are hard left. With “purges” and “cleansings” happening because of it. So are we now witnessing the formation of “Open Source++” or “FOSS++”? Borrowing from Jen McCreight’s tenets for Atheism+:

  • Open source += “we care about social justice”
  • Open source += “we support women’s rights”
  • Open source += “we protest racism”
  • Open source += “we fight homophobia and transphobia”

And to finish with adapting Peach’s ending statement, since that’s the apparent trend: Open source += everyone we haven’t yet blacklisted.

What do all the += have to do with writing software?

Software development, especially open source software development, is the ultimate in egalitarianism. Projects live or die on their merits and with whatever effort their creators or forkers are willing to put into them. User bases are earned, not owed. Though every so often we get project creators and maintainers with a god complex. But the central idea of contributing to open source is… leaving your identities and politics at the door to the code repository. Focus instead on writing code, fixing bugs, and delivering value.

But right now we definitely have people on a power trip. And this has the potential to threaten wider adoption of FOSS projects.

So will we see “FOSS++” or “Open Source++” coming down the pike? I very much hope not. If anything, the pushback by others within open source spaces should pressure these projects to reverse course.

“Bloat” doesn’t mean what you think it does

So a few weeks ago I came across a post calling the engineers of old “lazy” because a decision to not record all 4 digits of a year led to a worldwide crunch that became known as “Y2K”. And now… we have a post saying “And yet we wonder why software today is so bloated and inefficient” because… Doom, from 1993, could fit on two floppy disks…

Oh brother…

“Fast forward to today, and a simple video player app like YouTube is more than 300MB. Just to play videos.”

The YouTube app is a lot more than just a video player. And playing videos involves a lot more than he seems to realize. VLC has an installation footprint of nearly 200 MB, and it is literally *just* a media player.
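That footprint claim is easy to check yourself, by the way. On an RPM-based distro like OpenMandriva – assuming the package is just named vlc in your repos – this will report the installed size in bytes:

# assumes the package is named vlc in your distro’s repos
rpm -qi vlc | grep Size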

He’s also vastly overstating Doom’s sophistication. It wasn’t “full 3D”, for starters, nor all that immersive. And Doom overall is very simple: run and shoot, grab keys and other items and weapons, open doors, push buttons.

Doom also targeted the i386 processor. Yes, that detail matters. A lot.

To get an idea of what writing a game like Doom for the i386 involved, the source code for it is on GitHub.

Here’s a challenge: go through the code and count the number of coding best practices that are NOT being followed. To get something like Doom performing reasonably well on the i386, you needed to make a lot of compromises.

Simply due to the limitations of the hardware. Limitations that largely don’t exist today.

He’d say in another comment: “When your options shrink, your creativity expands. Constraints are the best teacher.”

To an extent, this is true. But constraints also dictate whether everything you want to include in the product can be included at all.

Since that’s the singular reason software today could be called “bloated” compared to years prior: more and better functionality.

VLC, for example, plays nearly every media file format that’s ever existed. It takes a LOT of code to support that. Even browsers haven’t been immune to that “bloat” with everything they need to support, with Chromium-based browsers and Firefox both having install footprints of a few hundred megabytes.

And sure, constraints are a good teacher. I’d like to see young devs today try to write programs for the TI-82 and its 28 kilobytes of user space.

But those constraints can also push you to write bad code to get things working per your requirements, provided that’s even possible at all, potentially introducing bugs that may not be fixable within those same constraints.

Having personally faced the limitations of the past, including coding for the TI-82 and TI-85, I’m glad they, for the most part, don’t really exist anymore.

Since, again, much of the “bloat” Marc is describing (and complaining about) is due to… more and better functionality. Giving customers what they want and what they didn’t know they needed till they had it.