[HN Gopher] Show HN: FractalCrypt 2.0 - free deniable encryption...
       ___________________________________________________________________
        
       Show HN: FractalCrypt 2.0 - free deniable encryption cryptoarchiver
        
       Author : zorggish
       Score  : 109 points
       Date   : 2021-09-09 12:13 UTC (10 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
        | tifadg1 wrote:
        | For use within Linux I wouldn't trust anything but LUKS. Speaking
        | of LUKS, it can theoretically accomplish this with offsets and
        | payload alignment, but I'm not sure I'd trust it not to fudge up
        | when reaching allocation limits.
        
       | Comevius wrote:
        | This works like TrueCrypt hidden volumes, which are volumes
        | created in the free space of other volumes.
        | 
        | This is not secure against multi-snapshot adversaries, i.e.
        | adversaries who can take multiple snapshots of your storage at
        | different times.
       | 
       | The solution is to hide the access pattern, for example by using
       | a write-only oblivious RAM.
       | 
        | I'm currently working on a cloud database that uses searchable
        | encryption. In a database the smallest things can hurt you, both
        | the access pattern and the search pattern (you must hide which
        | encrypted data satisfies a query condition or set of conditions,
        | the volume of that data, and which queries are identical). And
        | the attacker can have auxiliary information (known-data, known-
        | query, inference attacks). On top of that, the database must be
        | verifiable (authentic, sound, complete, fresh). Encrypted and
        | non-encrypted data might be searched together (partitioned data
        | security). A database must be resizable; that's the point of a
        | cloud database. And then there is data sharing. And it must be
        | cheap. The existing solutions in the literature compromise either
        | security or practical efficiency.
        
         | anonypla wrote:
          | TrueCrypt was abandoned 7 years ago and has since been forked
          | and replaced by VeraCrypt ... Just sayin ...
        
         | temptemptemp111 wrote:
          | Can you share more about this project? Thoughts on Management
          | Engine access to registers and DMA being a weakness?
        
       | bb88 wrote:
       | > Whereas, using FractalCrypt you can safely give away the keys
       | to unclassified volumes, and there is no way to prove that there
       | are actually more volumes than you have disclosed.
       | 
       | So the problem is that you're going to use what exactly to
       | decrypt it to prove plausible deniability? FractalCrypt? And then
       | what do the adversaries do once they google FractalCrypt and see
       | the phrase above?
       | 
       | Once your adversaries know you're using FractalCrypt, you've
       | negated any plausible deniability.
        
       | FpUser wrote:
        | Unless the functionality is supported in common software like
        | zip, the mere fact that one is using it can be a clue.
        
       | cdumler wrote:
       | > The capability of plausible deniability is that the encrypted
       | file is indistinguishable from noise; There is no way you can
       | find out the amount of data stored in the cryptocontainer.
       | 
        | The problem with deniable encryption is: if the attacker can
        | watch the file change, they can determine the rough size of the
        | data in the volume. The attacker notes where in the file the
        | changes occur. Once they get you to unlock the file, they check
        | whether the revealed data is smaller than the extent of the file
        | changes. If so, they know there is more data.
        | 
        | Once an attacker can see your encrypted volume, you can no longer
        | safely make changes to the hidden data.
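The multi-snapshot attack described in this comment can be sketched as a simple diff between two captures of the container (a hypothetical illustration, not anything FractalCrypt-specific; all names and figures here are made up):

```python
# Sketch of the multi-snapshot attack: diff two captures of the
# container to find which byte ranges changed, then compare that
# extent against the size of the data the suspect revealed.

def changed_extent(snap_a: bytes, snap_b: bytes) -> int:
    """Return the size of the region spanning all changed bytes."""
    diffs = [i for i, (a, b) in enumerate(zip(snap_a, snap_b)) if a != b]
    if not diffs:
        return 0
    return diffs[-1] - diffs[0] + 1

# Toy example: 1 KiB container, writes at offsets 100-199 and 800-899.
before = bytes(1024)
after = bytearray(before)
after[100:200] = b"\x01" * 100
after[800:900] = b"\x02" * 100

extent = changed_extent(before, bytes(after))
revealed_size = 100  # suppose the suspect unlocks only 100 bytes

# The changed extent (800 bytes) far exceeds the revealed data,
# suggesting additional hidden writes.
print(extent, extent > revealed_size)  # -> 800 True
```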
        
         | derefr wrote:
         | This presumes that you're working in a random-access manner
         | with your data. A lot of deniable-encryption use-cases involve
         | archival access (i.e. tossing new files into the container,
         | with each written file then treated as immutable.)
         | 
         | An attacker watching the side-channel of your access pattern
         | would just see you growing the volume. But that doesn't tell
         | them whether you're adding new real data, or adding noise.
        
           | cdumler wrote:
            | An important part of deniable encryption is that you can
            | "provably" deny that data is there when it is. You want the
            | authorities to agree that when you unlock your volume they
            | see "all" of your data, so they don't hold you in contempt.
            | If the visible data is smaller than the total changes, the
            | assumption is that you're hiding more data. Adding noise
            | isn't deniable. In fact, it would hurt you: if you can't
            | prove your noise is noise, then the authorities are going to
            | assume you are hiding data.
            | 
            | I think you make a good point: the only way to be secure with
            | this method is to A) work on an uncompromised machine, B)
            | write each layer in series, and C) never change it again.
        
         | thesz wrote:
         | I think yours is a very deep comment.
         | 
          | Some log-structured or copy-on-write file systems may provide
          | insight into the changes made. From what I know, ZFS is one
          | example of such a file system, and btrfs is another.
        
         | anigbrowl wrote:
         | Sure, but all security mechanisms have problems. That doesn't
         | mean they can't be good for limited purposes as long as you're
         | mindful of limitations.
        
         | [deleted]
        
         | 3np wrote:
         | This is a specific threat model where the attacker can watch
         | live file changes undetected. This may be acceptable e.g. for a
         | laptop without historical archives of the file.
        
           | malf wrote:
           | If there's wear leveling, there's metadata. If not, there's,
           | like, wear.
        
       | inetsee wrote:
       | I am not an expert in encryption or plausible deniability, but
       | couldn't steganography be used to conceal sensitive information?
       | I realize steganography would be problematic if you need to store
       | large amounts of information, or need to modify the information
       | often, but couldn't it be used for small amounts of information
       | that doesn't change often?
       | 
       | Couldn't you tell an attacker "It's just a picture of my cat."?
        
         | gzm55 wrote:
          | https://github.com/gzm55/dpad-enc Here is a PoC that encrypts
          | several static short secrets into a large enough file, and
          | decrypts only one secret by selecting the correct password.
        
         | shpongled wrote:
         | I'm not an expert either, but I think you could also use a one
         | time pad to do exactly that.
         | 
         | You have a picture of a cat, and 2 one time pads (OTPs). OTP #1
         | is the key for your real data, and you can generate OTP #2 such
         | that it decrypts the ciphertext (in this case, an image) into
         | whatever data you pick.
         | 
         | Whether this is practical is a completely different question
         | though.
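The one-time-pad construction described above is just XOR arithmetic: given a fixed ciphertext, a second pad can be derived that "decrypts" it to any cover message of the same length. A minimal sketch (all strings here are placeholder examples):

```python
# One ciphertext, two pads: pad1 yields the real message, pad2 is
# constructed after the fact to yield an innocuous cover message.

import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

real  = b"secret plans here"
cover = b"just my cat photo"          # same length as the real message
assert len(real) == len(cover)

pad1 = os.urandom(len(real))          # the genuine key
ciphertext = xor(real, pad1)

# Construct pad2 so that ciphertext XOR pad2 == cover.
pad2 = xor(ciphertext, cover)

print(xor(ciphertext, pad1))          # -> b'secret plans here'
print(xor(ciphertext, pad2))          # -> b'just my cat photo'
```

This works because XOR is self-inverting: `ciphertext ^ (ciphertext ^ cover) == cover` for any cover of the right length, which is exactly what makes a true OTP information-theoretically deniable.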
        
       | ComodoHacker wrote:
       | Not a word about security analysis, threat model, etc. That's not
       | how a security tool your life could depend on is presented these
       | days.
       | 
       | Also, is it a coincidence that the "Made with C++" badge is
       | colored red?
        
       | DarkmSparks wrote:
        | Can't speak for this implementation, but deniable encryption has
        | an additional benefit over just encrypting stuff: even if you
        | are actually targeted, they need to get really, really deep into
        | your life to even know it exists, let alone where.
        | 
        | Be that your super secret draft annual report or a fat bitcoin
        | wallet, it will pass a casual inspection and they will move on
        | to more interesting targets.
        
       | m3kw9 wrote:
       | Can it be detected that you are using fractalcrypt archive type?
        
       | hartator wrote:
       | Can't the attacker just see that FractalCrypt is installed, read
       | the same Readme, and ask for the 2nd key?
        
         | matharmin wrote:
         | There is no way for the attacker to know how many keys you
         | have. So you can give the attacker 2 keys, while you have your
         | actual sensitive data behind the 5th one.
         | 
         | It could still be a challenge to convince the attacker that you
         | really only had n-1 keys, so you may need to include plausibly-
         | sensitive data in earlier layers.
        
           | hartator wrote:
            | Hum, if the attacker knows that he wants the key for a
            | specific secret, like a Bitcoin wallet, can he just torture
            | you until he gets it?
        
             | p_j_w wrote:
             | Sure, but there's a significant subset of attackers for
             | whom this isn't an option. For example, I live in the USA.
             | Torturing me because the government thinks there's a 7th
             | key I'm not being up front about isn't an option for them,
             | at least on paper.
        
             | j_walter wrote:
             | Have a different and much smaller bitcoin wallet that
             | serves as a decoy...and have the keys in the 1st layer of
             | encryption.
        
             | istjohn wrote:
             | Well, yeah. There's no system that will defeat rubber hose
             | exploits.
        
       | 3np wrote:
       | > First, it creates a cryptocontainer of a user-specified size,
       | filled with random characters. To create the first volume, the
       | program archives the user-specified files to the beginning of the
       | cryptocontainer and encrypts it using the first key.
       | 
       | This is problematic; key reveal gives important metadata hints as
       | to size and location of other volume(s).
       | 
       | This could be redeemed by encoding offset and size parameters in
       | the key. These could be randomized or fixed at initialization.
       | 
       | Great ambition, I'll be keeping tabs on how this evolves.
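The fix 3np suggests could look something like deriving each volume's position from its key, so revealing one key leaks nothing about where other volumes live. A hypothetical sketch (this is not FractalCrypt's actual format; all constants and names are invented for illustration):

```python
# Hypothetical sketch: derive a volume's offset inside the
# container deterministically from its key, so disclosed keys give
# no metadata about undisclosed volumes.

import hashlib

CONTAINER_SIZE = 1 << 20          # 1 MiB container (example figure)
ALIGN = 4096                      # align offsets to 4 KiB blocks

def volume_offset(key: bytes) -> int:
    """Map a key to a deterministic, aligned offset in the container."""
    digest = hashlib.sha256(b"offset-derivation" + key).digest()
    raw = int.from_bytes(digest[:8], "big") % CONTAINER_SIZE
    return raw - (raw % ALIGN)    # round down to the block boundary

off = volume_offset(b"correct horse battery staple")
print(off % ALIGN == 0, 0 <= off < CONTAINER_SIZE)  # -> True True
```

A real design would also need collision handling between volumes and a way to bound each volume's size, which is where the "fixed at initialization" option 3np mentions would come in.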
        
       | akx wrote:
        | So... assuming there are bad guys demanding access to your data
        | and you say "oh yes, I've been using this plausible deniability
        | encryption/archive format", chances are that they're going to
        | torture you for exactly as long as it takes to get the data they
        | want.
       | 
       | Also - assuming you have three layers of equal compressed size in
       | your container, and you provide two passwords, can't your
       | interrogator see that only 2/3 of the container file gets
       | accessed, and has a reason to believe there's more data to be
       | found?
        
         | sildur wrote:
         | That format really needs widespread adoption. Using it is
         | suspicious by itself right now.
        
           | arsome wrote:
           | Yeah, TrueCrypt or VeraCrypt are widespread enough right now
           | and most people are just using them in normal, non-deniable
           | form, so it seems like better cover currently.
        
         | davidhbolton wrote:
          | In countries like the UK, where you can be jailed or fined for
          | not giving up a password, this provides a way to hand one over
          | and escape jail. TrueCrypt did it, and after the developers
          | stopped supporting that, VeraCrypt came along.
          | 
          | You obviously don't reveal that you are using a plausible-
          | deniability storage method. Give it a zip extension and rename
          | the application you access it with to something like Zip
          | Archiver. "It's an encrypted zip file and the password is ...."
          | How do they know it's not a zip, or that there's secret data
          | there?
        
           | akx wrote:
           | For one, it's not clear that this tool creates standard .ZIP
           | files, so the bad guys using an off-the-shelf `unzip` tool
           | would probably suspect things.
           | 
           | If the tool does create regular ZIPs with irregular contents,
           | they could still see that there's noise that isn't read
           | during the course of decryption/extraction, which is suspect.
        
           | flixic wrote:
            | The app literally says "Welcome to FractalCrypt!" when you
            | open it, not only revealing the encryption format's name but
            | clearly hinting at how it works.
            | 
            | I'd much prefer an encryption format that hides itself behind
            | a well-known single-layer encrypted format (like an encrypted
            | zip).
        
             | KingOfCoders wrote:
              | I agree, something like VeraCrypt where the partition has a
              | certain size, with or without hidden data.
              | 
              | But state-level actors might nevertheless have methods to
              | find out that you wrote 120 GB of data compressed into a
              | 100 GB file; there must be something hidden, because
              | otherwise it would have come out at, say, 122 GB -
              | something like that.
              | 
              | Or single-stepping VeraCrypt machine code execution (you
              | can see I have no clue).
        
         | moritonal wrote:
         | To expand on your second point, these kinds of systems should
         | let you set the fixed-size of the volume, like 1G or 5G, with
         | the payload being unrelated.
        
         | XorNot wrote:
         | Partly because these systems are designed to destroy the data
         | if not unlocked. Your "plausible" container, if not unlocked,
         | makes the rest of the container look like free space - i.e.
         | destroyed by an OS not aware it shouldn't write to it.
         | 
          | Which is common with HDD block-device format containers anyway
          | (not sure this tool makes as much sense): if my laptop here
          | (which is encrypted) gets unlocked with 2 passwords, you would
          | need to independently verify that in fact I normally used 3,
          | and the idea is that you can't prove the "free space" is
          | actually not just normal free space on my HDD.
         | 
          | Combined with a TPM chip and _not_ having any recovery codes,
          | the HDD can't be realistically extracted except by nation-
          | state level actors with a motivated interest.
         | 
         | Also why would "truly secret" data be large in size to start
         | with? The more likely relationship would be 100:10:1 or greater
         | in terms of "plausible" to "implausible".
        
           | dane-pgp wrote:
           | > and _not_ having any recovery codes
           | 
           | An alternative might be to use something like Shamir's Secret
           | Sharing to split the recovery codes between a dozen mutually-
           | unknown friends in different jurisdictions, such that the
           | secrets held by some threshold of them could produce the
           | recovery codes.
           | 
           | These friends would have to be trusted to only hand you their
           | share if they meet you in person in their jurisdiction, and
           | should perhaps also first tweet out that they were doing so,
           | in order to warn anyone whose security might depend on your
           | encrypted data not being compromised.
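The Shamir split suggested above can be sketched in a few lines: the secret is the constant term of a random polynomial over a prime field, and any k points reconstruct it by Lagrange interpolation. A toy educational illustration only; a real deployment should use an audited library and a CSPRNG, not `random`:

```python
# Toy Shamir's Secret Sharing: split a recovery code into n shares
# so that any k of them recover it and k-1 reveal nothing.

import random

P = 2**127 - 1  # a Mersenne prime, large enough for a short secret

def split(secret: int, n: int, k: int):
    """Evaluate a random degree-(k-1) polynomial at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (P is prime).
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

secret = int.from_bytes(b"recovery-code", "big")
shares = split(secret, n=12, k=4)     # a dozen friends, any four suffice
recovered = combine(random.sample(shares, 4))
print(recovered == secret)            # -> True
```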
        
             | XorNot wrote:
              | Well, the data is going to get wiped after you unlock
              | without enough passphrases anyway, so it's kind of
              | pointless - you need a backup. The point of not having
              | recovery codes for the TPM is to ensure the disk is
              | completely unusable if the machine is tampered with - i.e.
              | you have to be forced to unlock that machine, and not a
              | copy, to ensure the data is destroyed. I do wonder if TPMs
              | would detect the use of SATA/PCI-E write blockers (or some
              | elaborate shim system - but again, nation-state level).
             | 
             | Of course this is the real fiction: in reality I'm somewhat
             | too lazy to set all that up for the much more likely
             | scenario of a preventable glitch hosing my system.
        
           | qwerty456127 wrote:
           | > you can't prove that the "free space" is actually not just
           | normal freespace on my HDD.
           | 
            | Isn't normal free space supposed to contain at least
            | partially recoverable traces of deleted files? I think we
            | need a file system that wipes everything deleted (including
            | file names!) and replaces it with random data by default.
        
         | dane-pgp wrote:
         | > until they get the data they want.
         | 
         | The game theory here is interesting. If they are sure that you
         | have the information (for example, the private key to your
         | bitcoin wallet) then "plausible deniability" isn't really a
          | useful feature. It means you can credibly bluff "The key isn't
          | on _this_ device", but they can just torture you until you
          | reveal which device it _is_ on.
         | 
         | In contrast, the threat model of Rubberhose[0] assumes that the
         | secret police believe that you have an incriminating file on
         | your device, but they aren't sure. That means if you are
         | innocent and disclose all your passwords to them, they won't be
         | satisfied and will have to keep on torturing you forever,
         | hoping that you might give them the information you don't
         | actually have. Therefore they have to convince you that there
         | is some information that you could hand over which would
         | satisfy them, and they mustn't over-estimate what information
         | you have, otherwise they are committing to torturing you
         | forever and there is no advantage to you disclosing even the
         | information you do have.
         | 
         | [0] https://en.wikipedia.org/wiki/Rubberhose_%28file_system%29
        
           | orev wrote:
           | Exactly this. In True/VeraCrypt, there's only the possibility
           | of having two keys, the main and hidden one. Just the
           | existence of this feature places everyone using the software
           | in danger (at least people who are potential targets of this
           | type of regime), because if you're not using the hidden
           | volume, you can't ever prove it. To be really safe, everyone
           | would need to use both volumes, with the hidden one being
           | empty so it can be proven nothing is in there.
           | 
           | But with something that has an arbitrary number of hidden
           | volumes, you have no way to prove it and they can interrogate
           | you forever.
        
             | anigbrowl wrote:
             | It's bleakly amusing that you think torturers are worried
             | about some sort of credibility calculus. Where torture is
             | sanctioned or tolerated, people are sometimes tortured for
             | information, sometimes for compliance - but those
             | considerations are often _excuses_ offered to justify the
             | torture to external critics. In many cases, people are
             | tortured purely in order to terrorize others into
             | compliance, or because the torturers are sadists who get
             | off on it.
             | 
              | A lot of HN discussions on this topic are based on the
              | implicit assumption that torture is a rational tactic, if
              | an extremely brutal and unpleasant one, because most people
              | will eventually tell torturers what they want to hear in
              | hopes of making it stop, and giving up secrets is a
              | bargaining option. The sad fact is that many torturers are
              | motivated by their enjoyment of others' suffering, so you
              | could give them everything only to have them laugh at your
              | dismay when you figure out they never cared about your
              | secrets in the first place.
             | 
              | In some historical conflicts, this realization has been
              | exploited by the underdogs; Algerian guerrillas under
             | French occupation had standing agreements to maintain
             | silence for 24 hours if arrested, but after that they could
             | spill everything freely without fear of moral compromise,
             | thus denying the incumbent powers a credible excuse for
             | carrying out torture. Guerrillas were expected to keep
             | abreast of each others' liberty status and to have an
             | unshared plan to bail out if their network was compromised.
             | 
             | I point this out purely as a tactical maneuver; following
             | the ejection of the French the newly independent Algerian
             | state itself instituted all kinds of unethical and
             | repressive practices.
        
               | ativzzz wrote:
               | > It's bleakly amusing that you think torturers are
               | worried about some sort of credibility calculus. Where
               | torture is sanctioned or tolerated, people are sometimes
               | tortured for information, sometimes for compliance - but
               | those considerations are often excuses offered to justify
               | the torture to external critics. In many cases, people
               | are tortured purely in order to terrorize others into
               | compliance, or because the torturers are sadists who get
               | off on it.
               | 
               | They actually are in some ways. I toured a former secret
               | East German prison in Berlin, and they would keep
               | prisoners for a long time, and psychologically torture
               | them until they confessed, and would then send them to
               | "trial" with their confession as proof.
               | 
                | I asked the guide why they didn't just physically torture
                | them or falsify the trial right off the bat, and he
                | answered something along the lines of the prison guards
                | thinking they were civilized people who wouldn't resort
                | to such barbarous methods.
               | 
               | Torturers are still people and have some level of
               | cognitive dissonance going on, but do require _some_ kind
               | of credibility.
        
         | Seirdy wrote:
         | One of the best mitigations against rubber-hose and similar
         | attacks is a hardware key. If you leave it at home, you can't
         | be compelled to decrypt unless an attacker also breaks into
         | your home and searches the place.
         | 
         | In a pinch, you might be able to conveniently "lose" your
         | hardware key, or smash it if you hear the front door break
         | open. Doing so effectively erases your data without actually
         | erasing it, since it's unreadable without the key.
        
       | [deleted]
        
       | anonypla wrote:
        | Old but relevant: https://defuse.ca/truecrypt-plausible-
        | deniability-useless-by... . Be careful with plausible deniability
        | depending on your threat model, as it's only effective against a
        | soft "lawful" adversary. It's probably a terrible idea against an
        | adversary willing to resort to "enhanced interrogation
        | techniques" (not to mention the usual $5 xkcd wrench).
        
       | istjohn wrote:
       | Here's a quick and dirty DIY system:
       | 
       | 1. Create 10 files of encrypted random data. Discard the
       | passwords.
       | 
       | 2. Replace a random selection of these files with encrypted fake
       | data, using different passwords for each one.
       | 
       | 3. Replace one file with the actual encrypted data.
       | 
        | If challenged, openly share the scheme you used with your
        | opponent. Under duress, give up the passwords to the fake-data
        | files. Insist that one fake-data file is the real data and that
        | the keys to all the other files were discarded.
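The three steps above can be sketched as follows. This is a toy illustration of the scheme only: XOR with a full-length random pad stands in for a real cipher, and all filenames, sizes, and plaintexts are invented. A real version would use an authenticated cipher (e.g. AES-GCM from the `cryptography` package) and `secrets` for randomness:

```python
# DIY deniable store: ten files of noise, some replaced with
# encrypted decoys, one with the real data. Every file is uniform
# random bytes to an observer without the matching key.

import os, random

FILE_SIZE = 4096
NUM_FILES = 10

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

# Step 1: ten files of pure noise (their "keys" never existed).
files = [os.urandom(FILE_SIZE) for _ in range(NUM_FILES)]

# Step 2: replace a random selection with encrypted fake data.
decoy_slots = random.sample(range(NUM_FILES), 4)
decoy_keys = {}
for slot in decoy_slots:
    key = os.urandom(FILE_SIZE)
    decoy_keys[slot] = key
    files[slot] = xor_encrypt(b"fake tax records".ljust(FILE_SIZE), key)

# Step 3: one remaining slot holds the real encrypted data.
real_slot = random.choice(
    [i for i in range(NUM_FILES) if i not in decoy_slots])
real_key = os.urandom(FILE_SIZE)
files[real_slot] = xor_encrypt(b"the real secret".ljust(FILE_SIZE),
                               real_key)

# Under duress: reveal only the decoy keys, claim the rest were
# discarded at creation time.
slot = decoy_slots[0]
print(xor_encrypt(files[slot], decoy_keys[slot])[:16])
```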
        
         | tsujp wrote:
         | Get punished anyway because "only bad people have things to
         | hide" would be my guess. It's a shame we even need to have
         | plausible deniability in the first place.
        
         | [deleted]
        
       ___________________________________________________________________
       (page generated 2021-09-09 23:01 UTC)