this is the error message:
an empty password is not allowed by default. Pass the flag
--insecure-no-password to restic to disable this check
on Fedora 40 everything worked fine
Any thoughts on this?
Follow the instructions in that message to start restic without a password
There is a restic password. Anyway, DejaDup is a graphical application; I see no way to pass this parameter.
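For reference, outside DejaDup the flag can be passed when invoking restic directly. A sketch (the repository path is an assumption; adjust it to your actual backup location, and only use this flag on a repository that really has no password):

```shell
# Hypothetical path to the restic repository DejaDup created.
REPO="/media/backup/restic-repo"

# Run only if restic is installed and the repository exists.
if command -v restic >/dev/null 2>&1 && [ -d "$REPO" ]; then
    # List snapshots in a passwordless repository.
    restic -r "$REPO" --insecure-no-password snapshots
fi
```

The problem in this thread is that DejaDup offers no such pass-through for its restic backend, so the error can only be worked around from the command line.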
I am also using Deja-Dup, but with the default Duplicity backend. When choosing Restic as a backup tool for Deja-Dup, the user is warned that this is still an experimental feature.
The message received seems to be an error message by Restic not being handled properly by Deja-Dup.
Or, if you did actually use a password and consider this a regression bug, you could report it.
Already reported by others a month ago with the Fedora 41 beta, and again after Fedora 41 was released. I don’t know if the problem appears on Fedora only, but all reporters are using Fedora.
I’m experiencing this problem as well. I’m currently doing some troubleshooting to see if it’s a problem within restic or deja-dup. See also: Restic backend: "an empty password is not allowed by default" when trying to restore, but nothing ever prompts for the password (#519) · Issues · World / Déjà Dup Backups · GitLab
I don’t think that issue should have been closed since “back up to previous version of restic” doesn’t really seem like a fix to me.
I am wondering (sincerely, out of curiosity), what are the reasons for using an experimental feature (Restic backend), other than supporting the testing efforts?
Are there additional features or performance benefits in comparison to Deja-Dup with Duplicity backend?
For me the primary reason for using restic rather than duplicity was that duplicity often failed due to running out of memory. It also had the effect, as it was getting close to running out, of bogging down the whole machine to the point that it was unresponsive and unusable.
Edward, I am nobody, but instead of trying the “experimental feature” in Deja Dup you should consider that you are probably using the wrong tool. For example, you could try another basic tool, Pika Backup, which uses “Borg”; then you need a “borg-compatible” hosting service (I believe there is a free 10G account). Yes, it is harder to set up, and other backup tools are harder still.
I will never voluntarily send my data to the Borg, despite the reassuring and charming name. My backups are to an external hard drive which I change out monthly for off-site storage.
Ok, it makes no difference; you can use an external drive as well, read above: “plug your USB drive and let Pika do the rest for you”. The point here is to try different software if Deja Dup doesn’t perform as expected, instead of using an alpha- or beta-level feature (like restic in Deja Dup) for potentially critical backups.
BTW, I find Deja Dup useful because it allows easy backups to GDrive. I don’t worry about “privacy” because I already use GMail, so they can spy on me there. GDrive is limited in size, so incremental backups are good to have, plus there are bandwidth constraints.
When I make backups on external drives I do them manually, by just copying files. No need to automate, because the point of external drives, IMHO, is to NOT have them connected to the system, so that they aren’t killed by whatever kills the system.
For clarification:
BorgBackup (short: Borg) is an open source deduplicating backup program. Pika Backup provides a more user-friendly GUI front-end to Borg. BorgBase is a Borg-compatible cloud hosting service.
You don’t need to send Borg data to “the Borg” (BorgBase) if you don’t want to. Pika Backup (and the Borg backup engine) can use local storage as its backup store. That’s how I’m using it: backing up to a USB hard drive.
Borg can optionally perform client-side encryption of its backup data (encrypting it before it is sent to the data storage). That’s a valuable feature if you’re sending backups off-site – either a cloud hosting service or a local hard drive that you’re going to store off-site.
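To illustrate the client-side encryption mentioned above, here is a sketch of creating an encrypted Borg repository on a USB drive from the command line (the mount path and passphrase are assumptions; Pika Backup does the equivalent through its GUI):

```shell
# Hypothetical USB-drive mount point and repository path.
BORG_REPO="/media/usb-drive/borg-repo"
# Illustrative only -- in practice, choose your own strong passphrase.
export BORG_PASSPHRASE="choose-a-strong-passphrase"

if command -v borg >/dev/null 2>&1 && [ -d /media/usb-drive ]; then
    # repokey mode: data is encrypted on the client; the key is stored
    # in the repository, protected by the passphrase.
    borg init --encryption=repokey "$BORG_REPO"
    # Create a first archive; {now} is expanded to a timestamp by borg.
    borg create --stats "$BORG_REPO::home-{now}" ~/Documents
fi
```

Nothing leaves the machine unencrypted, which is what makes the repository safe to store off-site.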
Sorry everybody but I guess we should clarify some basic things here.
Some time ago I was reading in the “water cooler” about losing hard drives because of rain storms.
Then there are two other possible issues: one, typical on Windows, is ransomware and the like that propagates to all connected drives; the other is silent corruption of the drives.
Having an external drive connected as a backup is good, but as the only backup strategy it is a bit weak.
Having backups on remote hosting services, instead, diversifies the strategy and the risk coverage.
Given that all remotely serious backup software encrypts the backups, the “privacy” concerns are related to possible backdoors against the encryption, and that kind of concern pretty much extends to anything.
IMHO remote hosting is not an issue for “privacy”; it is more about cost and convenience. I am lazy and I don’t feel like working on a more sophisticated backup method, and I have very little data to save. If I had to put in place a small-business backup strategy, with little time for manual procedures, I would go for remote hosting instead of external drives because it is safer.
The issue with remote backup storage is definitely security. If the remote host has your (encrypted) data, hackers have all the time in the world to crack the encryption. No encryption is foolproof, and none is strong enough that it cannot be cracked.
Of course it is up to the user how they structure the back ups and where it is stored, and they must be willing to accept the security risks with whatever choice they make.
Deja Dup is for personal files.
You can place the backup on an external drive or a remote hosting service.
Now the point is:
What is more probable: a hardware failure, a mistake by me that makes the backup useless, or a “hacker” who accesses my GDrive account and breaks the backup encryption? Which should I fear the most?
To put things in perspective, I have lost several backups because I removed the external drive before Fedora had finished writing to it. I was in a hurry, and I forgot that Fedora reports the copy as complete while it is actually still writing to the drive.
For me it is convenience. I have been using DejaDup for ages and like the simplicity; also, I am familiar with the restic command line. So if anything bad happens, I know how to handle the raw restic backups that DejaDup made.
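That fallback can be sketched like this: since DejaDup’s restic backend writes a standard restic repository, plain restic can inspect and restore it directly (the repository path and password handling here are assumptions; adjust both to your setup):

```shell
# Hypothetical location of the repository DejaDup wrote.
REPO="/media/backup/deja-dup"
# Illustrative only -- normally you would be prompted, or use --password-file.
export RESTIC_PASSWORD="the-backup-password"

if command -v restic >/dev/null 2>&1 && [ -d "$REPO" ]; then
    # List the snapshots DejaDup created.
    restic -r "$REPO" snapshots
    # Restore the most recent snapshot into a scratch directory.
    restic -r "$REPO" restore latest --target ~/restic-restore
fi
```

This is exactly why a broken GUI front-end is not fatal here: the underlying repository format stays usable with the upstream tool.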
So Fedora needs to update DejaDup from 46.1 to 47.
Theoretically this is correct. Practically speaking: what’s the real likelihood that AES256 encrypted data will be exposed? While there are some published theoretical weaknesses, they only seem to slightly reduce the amount of effort for brute force attacks. (This is slightly akin to the old joke that given enough monkeys with typewriters and enough time they’ll write Shakespeare’s plays).
The other attack vector is a compromise of the encryption key for the data. The big no-no with storing encrypted data in the cloud is relying on the cloud to encrypt and/or hold the encryption key. That’s a big exposure should the cloud’s keystore get compromised. Encrypting locally, with only you holding the key (only encrypted data goes to the cloud), keeps the keys to your data out of the bad guys’ hands, and they will be spending “all the time in the world” to decrypt your data – and that’s extremely likely to be a long, long time. Way past the likelihood that the data will still be of value.
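The encrypt-locally pattern described above can be sketched with GnuPG (the file names and passphrase are illustrative; a real setup would use a backup tool’s built-in encryption or a properly managed key):

```shell
# Stand-in for a real backup archive.
echo "example data" > /tmp/backup-example.tar

if command -v gpg >/dev/null 2>&1; then
    # Symmetric AES256 encryption with a passphrase that never leaves
    # the machine; only the resulting .gpg file would be uploaded.
    gpg --batch --yes --pinentry-mode loopback \
        --passphrase "local-secret" --symmetric \
        --cipher-algo AES256 \
        -o /tmp/backup-example.tar.gpg /tmp/backup-example.tar
fi
```

The cloud provider only ever sees the ciphertext, so a keystore breach on their side exposes nothing of yours.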
There’s another issue, though, with relying on cloud backup: time to restore.
Restore from cloud storage works fine for a few files. Restoring gigabytes of content from the cloud will be a function of your network speed and server responsiveness. Which will almost always be slower than a local disk. This may or may not be acceptable, depending on your use case.
Very true. Just make sure you properly assess the risks to your needs before making the decision.
Have you tested the combination of DejaDup 47 and Restic 0.17?