This is about requests for stored data, in which case encryption is mostly moot: encryption chiefly protects data in flight or on seized computers. In the latter case you will probably be forced to cough up the decryption keys anyway.
When we're talking about protection against government data requests, only companies that make sure they hold the absolute minimum of client information they possibly can truly have our backs. Everyone else just has good intentions.
Colin has it right. If you never want to be in a position to compromise your clients' data, make sure you can't read any of it. It's that simple. Anything else simply won't do.
That's why I keep recommending tarsnap to customers.
Or you could... you know... recommend an appropriate client-side encryption tool so they can then store the archive/backup data on the storage provider of their choice...
The advantage of having client-side encryption built into tarsnap is that it encrypts only after data deduplication and compression.
Obviously there could be a tarsnap option to stream the data to be uploaded through an encryption program of your choice, but doing it just as you suggest would nerf a few of tarsnap's prime advantages.
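To illustrate why the ordering matters, here's a toy sketch (the fixed-size chunking, the XOR "cipher", and the chunk size are all made up for illustration, nothing like tarsnap's actual internals): because real encryption uses a fresh random nonce per chunk, identical plaintext chunks encrypt to distinct ciphertexts, so deduplicating after encryption finds nothing.

```python
import hashlib
import os

def chunks(data, size=4):
    # Naive fixed-size chunking, purely for illustration.
    return [data[i:i + size] for i in range(0, len(data), size)]

def toy_encrypt(chunk, key):
    # Toy stand-in for real encryption: random nonce + hash-derived keystream XOR.
    nonce = os.urandom(8)
    stream = hashlib.sha256(key + nonce).digest()
    return nonce + bytes(a ^ b for a, b in zip(chunk, stream))

data = b"AAAA" * 3 + b"BBBB"  # three identical chunks plus one unique chunk
key = b"secret"

# Dedup before encrypting: only 2 distinct chunks need to be stored.
unique_plain = {hashlib.sha256(c).digest() for c in chunks(data)}

# Encrypt first: every ciphertext is distinct, so dedup finds nothing to share.
unique_cipher = {hashlib.sha256(toy_encrypt(c, key)).digest() for c in chunks(data)}

print(len(unique_plain))   # 2
print(len(unique_cipher))  # 4
```

The same reasoning applies to compression: random-looking ciphertext doesn't compress, so both steps have to happen before encryption to keep their benefit.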
Tarsnap is a combination client-side application and remote service.
I am suggesting instead to use/recommend one of the existing client-side tools that work similarly to the tarsnap client but don't lock the user into a single service provider.
By using a client-side tool that just generates archives (and isn't tied to a single storage service provider), you can store them anywhere: AWS, iCloud, Google Drive, Rsync.net, a rented VPS, a friend's computer, an external hard drive, all of the above. You name it.
I understood what you said, I just didn't know that there were tools in existence that are as good as or better than tarsnap at the archiving part which allow you to specify the storage location.
Edit: I used 'specify the storage location' very loosely. I.e., I realise it could mean simply piping the archive data to yet another program in the shell.
I've never used it, but I've seen people on HN recommend Attic. It dedups and encrypts. https://attic-backup.org
Personally, I use git-annex, which isn't exactly a backup tool but a general distributed file manager which can, among other things, automatically make encrypted copies of the files to various places (SSH servers, S3, Google Drive, etc).
Both Attic and Obnam do dedup and encryption. Attic can also store the data remotely via SSH (with or without installation on the remote end), and Obnam can handle remote storage to an SFTP server.
It's an almost completely transparent user-space filesystem. Basically you store your files in a given folder, and it automatically stores a parallel encrypted copy in a different folder.
If a group planned ahead they could give out some secondary kind of key. Gmail gives out these really long codes I can use to log in should I not have the authenticator app.
Sorry, I might be missing something here, but would there be any tangible differences between the service provider having access to a secondary key vs them having access to the primary key if both can be used to access your data?
I'm honestly interested because I'm building a distributed system where only the user has the decryption key, and I've always just assumed that password recovery is a lost cause in such systems.
I would assume the recovery key is not stored in plain-text - it's likely hashed, similar to a password. If you need to use it, you enter the (hopefully safely stored) recovery key you have, they re-hash it and compare to the hashed one they keep.
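As a sketch of that flow (the code format, the PBKDF2 parameters, and the function names here are my own assumptions, not any particular provider's scheme): the service keeps only a salted hash, and verification re-derives and compares in constant time.

```python
import hashlib
import hmac
import os
import secrets

def make_recovery_code():
    # Hypothetical long recovery code handed to the user once.
    return secrets.token_hex(16)

def store(code):
    # Server keeps only (salt, digest), never the code itself.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", code.encode(), salt, 100_000)
    return salt, digest

def verify(code, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", code.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

code = make_recovery_code()
salt, digest = store(code)
print(verify(code, salt, digest))          # True
print(verify("wrong-code", salt, digest))  # False
```

Note this only lets the service confirm you hold the code; in an end-to-end encrypted system the recovery code would also have to wrap the actual decryption key client-side, since a hash can't decrypt anything.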
I believe it is the only way if you truly want a single user to have complete control of decryption. There are other solutions if you don't. I heard of one the other day (from MaidSafe maybe?) where you have a shared secret amongst your "friends" and if a quorum of them agree, it can reset your password. I assume this means your data is duplicated and encrypted via that shared secret as well which could be coerced I suppose.
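That "quorum of friends" construction is usually Shamir's Secret Sharing: the secret is the constant term of a random degree k-1 polynomial over a prime field, each friend gets one point on it, and any k points recover the constant by Lagrange interpolation at x=0. A minimal sketch (the field size, share counts, and integer secret are illustrative; a real system would share a key-encryption key and use a vetted library):

```python
import secrets

PRIME = 2**127 - 1  # Mersenne prime defining the field

def split(secret, n, k):
    """Split an int secret (< PRIME) into n shares; any k recover it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 1234567890
shares = split(key, n=5, k=3)
print(recover(shares[:3]) == key)   # True: any 3 of 5 shares suffice
print(recover(shares[1:4]) == key)  # True
```

Fewer than k shares reveal nothing about the secret, which is what makes the scheme resistant to coercing any single friend.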