Fedora 41 and LVM driving me insane

I’m trying to get my system up to Fedora 41, but I’m fighting a weird issue involving LVM and the SSD cache on my RAID LVM setup.

I have a 5-drive RAID setup (LVM) that is cached by a 2-drive RAID on NVMe, and this has worked since roughly F38. Due to health reasons I’ve been away for nearly a year, and that old F38 or F39 system was no longer supported, so I decided to update to 41, and ever since then I’ve been trying to get back to a working setup.
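For reference, this is roughly how I look at the layout to confirm which LVs are RAID or cache segments and which PVs back them (a sketch only; `data_vg` stands in for my actual VG name):

```bash
# List physical volumes, then the LV segments and their backing devices.
sudo pvs
sudo lvs -a -o lv_name,vg_name,segtype,pool_lv,devices data_vg
```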

I tried accessing my data by editing /etc/lvm/lvm.conf, setting scan_lvm = 1, and running vgchange -ay. This worked when booting from the USB stick with the F41 installer (burned with Fedora Media Writer), but on a clean install it doesn’t work.
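Roughly, these are the steps I run from the live/installer environment (a sketch; the exact lvm.conf knob may differ, and `data_vg` is a placeholder for my VG name):

```bash
# Rescan for volume groups, activate everything in the group, then check
# whether the cached LV is visible.
sudo vgscan
sudo vgchange -ay data_vg
sudo lvs data_vg
```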

To be clear, I’m not getting any errors; a volume is simply missing that is found when doing the same steps booted from USB.

Yes, I have good backups, but I’m trying to avoid having to restore 60 TB, and trying to understand the issue.

LUKS Group doesn't show up - #2 by vtrefny


I’ll test this over the coming weekend, but it might be related <3

@vgaetera was on the money; it seems it was a recent change in LVM that had me baffled. I still need to run vgchange -ay to be able to open the LUKS volume, but the Fedora 41 system itself is installed and working properly as far as I can tell.
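For anyone following along, this is the manual sequence that currently gets me to my data on the installed F41 system (names like `data_vg`, `data_lv`, `data_crypt` and the mount point are placeholders):

```bash
# Activate the VG, unlock the LUKS volume on top of it, then mount it.
sudo vgchange -ay data_vg
sudo cryptsetup open /dev/data_vg/data_lv data_crypt
sudo mount /dev/mapper/data_crypt /mnt/data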

I’m fairly positive my issues stem from the LVM devices file: Chapter 9. Limiting LVM device visibility and usage | Red Hat Product Documentation. For now I still need to resolve the issue where mounting home fails after I update to the latest lvm2 packages, but progress is being made :slight_smile:
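If it is the devices file, my understanding is that PVs missing from it get silently skipped at activation time. This is what I plan to check and fix (a sketch assuming the default file location; the device path is only an example):

```bash
# Show what LVM's devices file currently contains.
sudo lvmdevices

# Add the PVs of all visible VGs to the devices file...
sudo vgimportdevices -a
# ...or add a single PV explicitly:
# sudo lvmdevices --adddev /dev/nvme0n1p1

# Verify the entries.
cat /etc/lvm/devices/system.devices
```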