I think that could work ;P
But it would probably be nice to be able to mount the storage as a filesystem. Still, it might be a good solution; it depends on the use case.
Looking through my vaults: when I began to include keys in them, they quadrupled in size. So .txt files and notes are completely feasible in a KeePassXC vault…
This topic reminds me of the olden days when I carried around a USB drive on my keychain, with a portable instance of TrueCrypt on it together with the encrypted file.
Indeed, we don’t know the OP’s exact use case. I would conclude that:
if the user only wants local encryption but is not willing to reinstall the system with full-disk encryption, then cryptsetup can be a solution;
if both local encryption and cloud backup are needed, then GPG could be it, with all its pros and cons;
if only the files backed up to the cloud need to be encrypted, then there are nice backup tools such as Deja-Dup (a GNOME app, though) with the duplicity backend, which encrypts all the files backed up to the cloud with gpg.
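For the cloud-backup case, a minimal duplicity run might look like the sketch below. The paths and backup target are placeholders; by default duplicity gpg-encrypts the backup symmetrically with the passphrase taken from the PASSPHRASE environment variable:

```shell
# Hypothetical paths; adjust to your own setup.
export PASSPHRASE='example-passphrase'              # used by duplicity for gpg symmetric encryption
duplicity ~/Documents file:///mnt/backup/documents  # encrypted incremental backup
duplicity restore file:///mnt/backup/documents ~/restored-documents
unset PASSPHRASE
```

Deja-Dup drives essentially the same machinery through a GUI, so the files that land on the remote are gpg-encrypted either way.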
Are you asking from a security perspective or only from a size-management one? From a security perspective there shouldn’t be an issue, I guess, as it is more or less the same setup as with full-disk encryption (files being constantly added and deleted).
I am still curious (in case anyone knows) whether increasing the file size of the “vault file” created with cryptsetup reduces the encryption’s security. To be clearer, an example:
I create a 10 GB vault file and use it to store all my documents. After a year of usage the file gets almost full, so I increase its size to 20 GB with the truncate and resize2fs commands. I have tested this and it works: I didn’t re-encrypt the file, just increased the size of the image file and of the virtual filesystem.
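For reference, a sketch of that grow procedure (the file and mapper names here are made up; growing the backing file while the vault is closed means the LUKS mapping picks up the new size on the next open):

```shell
# Hypothetical names; run while the vault is closed and unmounted.
truncate -s 20G ~/vault.img                 # grow the backing image file
sudo cryptsetup open ~/vault.img myvault    # reopen; the mapping now spans the new size
sudo e2fsck -f /dev/mapper/myvault          # resize2fs wants a clean fsck first
sudo resize2fs /dev/mapper/myvault          # grow the ext4 filesystem to fill the mapping
sudo cryptsetup close myvault
```

If the vault is grown while open, an additional `sudo cryptsetup resize myvault` extends the active mapping before resize2fs.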
Hmmm. I think they would be if you tar’d them up into an archive and then gpg’d the archive. Depending on the use case, I would consider doing just that. A person can get pretty handy with tar quickly.
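A sketch of that tar-then-gpg approach (paths and passphrase are placeholders; `--symmetric` normally prompts for the passphrase interactively, and the batch flags here are only to make the example self-contained):

```shell
set -e
mkdir -p /tmp/gpgdemo/sensitive
echo "secret note" > /tmp/gpgdemo/sensitive/note.txt

# Bundle the directory, then encrypt the single archive with a passphrase.
tar czf /tmp/gpgdemo/docs.tar.gz -C /tmp/gpgdemo sensitive
gpg --batch --yes --pinentry-mode loopback --passphrase "example-passphrase" \
    --symmetric --cipher-algo AES256 /tmp/gpgdemo/docs.tar.gz

# Decrypting and unpacking restores everything in one go,
# instead of decrypting files one by one.
gpg --batch --yes --pinentry-mode loopback --passphrase "example-passphrase" \
    -o /tmp/gpgdemo/restored.tar.gz -d /tmp/gpgdemo/docs.tar.gz.gpg
tar xzf /tmp/gpgdemo/restored.tar.gz -C /tmp/gpgdemo
```

Remember to remove the plaintext archive afterwards, or the exercise is pointless.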
The metadata of the contained files can be encrypted by means other than gpg, as you say. Such solutions would also work for fscrypt. But gpg’s own metadata is still unencrypted; gpg isn’t intended for that. And it has to be said that in average gpg use cases, metadata usually does not need to be part of the secret. The same goes for fscrypt.
But again, let’s focus on the author’s case and not on theoretical constructions.
We don’t have sufficient information to start constructing solutions that involve different tools and make trade-offs in any direction.
The author asked for encryption tools, and we have already provided a list. We don’t know whether the encrypted data needs to be synced or backed up, whether metadata needs to be protected, or whether the approach needs to be automated or standardized (e.g., without self-made scripts or commands the author has to maintain on their system). So we can only suggest tools at this time and wait for whatever information the author adds.
In short, we don’t yet know whether tar can fit the author’s needs in any solution approach.
To avoid ambiguity in my question due to the breadth of use cases for encryption (although a little late), I will be more specific. What I am looking for is to protect some files that contain sensitive information, information I have to access and modify constantly. When I say “have easy access to them” I mean the ease of managing some system (I had thought about using a vault or something like that) that protects my files, but where simply decrypting it gives me access to all of them, without having to individually redirect them to a location through standard output or decrypt them one by one with their respective private key, as happens with gpg. Before this topic I had thought about using gpg, since it seems to me the safest way to encrypt information, but because of the accessibility problems I mentioned I was looking for an alternative. Among those I found Cryptomator, which seemed like a good open-source option, but it did not work well on my system. So I wanted to ask here if you knew of any alternative (the best option that you know of). Two things to keep in mind:
first, the protection of those sensitive files is all managed on my own system; I do not need to transfer them anywhere (except when, for some reason, I see it necessary to create a backup of my files and move it to another external storage location);
second, my main mount (where I currently have all my files) is not encrypted.
An issue you have to watch for is how the apps you use to edit the sensitive data work.
For example, if you decrypt the file and leave an in-the-clear copy that survives a reboot, then you have potentially lost the data to an attacker.
If the data is swapped to disk, you risk it being saved to the swap file.
Which apps will process the data files, and what format is the data in?
I am thinking simple text, office documents, an SQL database, etc.
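A quick way to check whether the swap concern applies on a given machine (plain Linux commands, nothing assumed beyond procfs):

```shell
# List active swap areas; decrypted file contents paged out to an
# unencrypted swap device can survive on disk after the vault is closed.
swapon --show || cat /proc/swaps
# zram swap, or a LUKS-encrypted swap partition, avoids this leak path.
```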
That being said, if you have hundreds of documents that need to be presented at once, you need to rethink your use case and model. Say you have a large collection of images you need to present at once: I would zip/7z the directory and use flags to protect it accordingly.
7z has many flags that serve this use case nicely.
GPG is best suited to protecting data transferred over unencrypted channels. If someone gains unauthorized local access, they can perform signature-based data recovery to restore deleted files or fragments of them from a filesystem dump, so LUKS or EncFS should better fit the described use case.