[HN Gopher] Show HN: HiddenVM - Use any desktop OS without leaving a trace
       ___________________________________________________________________
        
       Show HN: HiddenVM - Use any desktop OS without leaving a trace
        
       Author : aforensics
       Score  : 367 points
       Date   : 2020-03-05 10:09 UTC (12 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | unnouinceput wrote:
        | Or, or... hear me out: swap your HDD for a gaming one, so when
        | a smart guy takes a look at your HDD he'll find only benign
        | games. You think customs agencies don't have smart people who
        | can look past a simple boot screen? Think again.
        
         | incompatible wrote:
         | Or just wipe the HDD and install a fresh OS? I have to admit
         | that I'm uncertain exactly what the goal is.
        
           | saagarjha wrote:
           | A fresh OS is pretty suspicious.
        
             | Insanity wrote:
              | It takes almost no time to reinstall an OS and install a
              | few games from Steam.
              | 
              | If you have a bit more time to spend, you don't even need
              | to wipe your install: you can dual boot and temporarily
              | remove the other boot option from GRUB / the MBR or
              | whatever the Windows equivalent is.
              | 
              | Or check out a few GitHub repos if you don't want to
              | install Steam games. :)
        
               | saagarjha wrote:
               | I'm not sure I understand your response. Am I missing
               | something in my original comment?
        
               | Insanity wrote:
               | My reply was to say that it's relatively easy to make a
               | clean install not look 'fresh' :P
        
             | incompatible wrote:
              | A zeroed drive and a stock OS install are surely unlikely
              | to have any hidden data.
        
           | unnouinceput wrote:
            | The GitHub page says the goal is to protect your privacy
            | from customs agents. That's the honorable use of this. But
            | it's a tool, and like any tool it can be used for both good
            | and bad. A scammer or spammer will find a use for this
            | faster than a person facing customs agents.
        
         | jedieaston wrote:
         | Put your sensitive data on a microSD card and put the card in
         | your mouth while in the checkpoint, the detector won't be
         | sensitive enough to pick it up (it doesn't detect fillings and
         | they are around the same size).
         | 
         | If you get in trouble, just swallow.
        
           | unnouinceput wrote:
            | Or have VeraCrypt volumes uploaded to some service and use
            | them from anywhere in the world when you need them? Much
            | simpler.
        
       | walrus01 wrote:
       | If your laptop is getting inspected at the border of an actual
       | authoritarian police state:
       | 
       | https://en.wikipedia.org/wiki/Rubber-hose_cryptanalysis
        
         | bergheim wrote:
         | Reminds me of this xkcd: https://www.xkcd.com/538/
        
         | goblin89 wrote:
         | The same tradeoff as with any security measure applies to
         | border officials. They may or may not go beyond scanning for
         | low-hanging fruit, and in a typical scenario probably won't.
        
         | joantune wrote:
          | That's why TrueCrypt, and I think VeraCrypt too, support a
          | volume that can be decrypted with two keys: one yields a
          | decoy volume whose free space is actually the real volume
          | you want to protect, which is decrypted only with the second
          | key.
          | 
          | I'm not sure whether entropy analysis of that free space can
          | suggest there's something funky about it or not. Usually free
          | space is either actual data just marked as deleted, or data
          | reset to zeros by some proactive wiping of the free space.
          | So a bunch of wacky data that doesn't look like any kind of
          | file could probably be used as a telltale sign? No?
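The entropy check wondered about above is easy to sketch. Below is a minimal, hypothetical Python version (not any forensic tool's actual method): Shannon entropy per byte cleanly separates zeroed free space from random-looking fill.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte, from 0.0 (constant) to 8.0 (uniform)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

# Zero-filled free space is trivially low-entropy; encrypted or
# randomly wiped space sits near the 8 bits/byte ceiling.
zeros = bytes(4096)
random_fill = os.urandom(4096)
assert shannon_entropy(zeros) == 0.0
assert shannon_entropy(random_fill) > 7.5
```

A sliding-window scan of this kind over "free" space is exactly how a region that is neither plausible file remnants nor zeros would become the telltale sign being discussed.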
        
           | RcouF1uZ4gsC wrote:
           | I think the authoritarian regime will just torture anyone
           | with any VeraCrypt or TrueCrypt volume and the plausible
           | deniability will come back to bite you as you can't prove
           | that there are no other hidden volumes.
        
             | rolltiide wrote:
             | The point of a decoy is to satisfy scrutiny. Truecrypt
             | WITHOUT decoy will escalate the scrutiny, truecrypt WITH
             | decoy will avoid further scrutiny.
        
               | XMPPwocky wrote:
               | The problem is the level of scrutiny. Against some
               | attackers, decoys have very very nasty game-theoretic
               | failure cases.
               | 
               | Specifically- there's no limit to the number of decoys
               | that could be on a disk. So you can get into the
               | situation where you've decrypted every volume that
               | exists, under coercion, but your adversary believes there
               | are more volumes remaining.
               | 
               | By design, you have no way to prove that there isn't more
               | hidden data on that disk. This is unlikely to end very
               | well for you.
        
               | sveme wrote:
               | If that's the operation mode of your foe, it's not going
               | to end very well for you anyways.
        
               | [deleted]
        
               | heeen2 wrote:
                | When the sum of non-free data on all volumes reaches
                | the capacity of the drive, there is no more space for
                | hidden data.
        
               | shaftway wrote:
               | If I remember correctly, the decoy volume treats all the
               | hidden space as available disk space. TrueCrypt used to
               | have a warning that booting into the decoy volume could
               | scramble the hidden volume when the OS wrote files to
               | disk if it happened to choose some space that overlapped
               | with your data.
               | 
               | If your decoy only lists 5GB of space on a 5TB drive,
               | then it isn't a very good decoy.
        
           | bArray wrote:
           | An encrypted volume (fixed space) should even remove the
           | white space. After all, knowing the size of a file contained
           | within could leak information about its contents.
           | 
           | I imagine the only way to detect a volume would be to have it
           | decrypted (enforced by law enforcement), to take the supposed
           | volume type and files within and then re-encrypt with the
           | same data. If your volume and the supposed clone are
           | different, it would suggest that you have hidden another
           | volume within.
        
             | abhorrence wrote:
              | I think the defense is that (assuming IVs, nonces, etc.
              | are held constant) the files would encrypt identically.
              | And the excuse for the rest of the disk is "it gets
              | filled with random bytes to obscure how much disk space
              | is actually being used."
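The held-constant-IV argument can be illustrated with a toy stream cipher. A SHA-256-based CTR keystream stands in here for the real XTS disk cipher VeraCrypt uses, and all names are illustrative: the point is only that same key, same nonce, same plaintext always yields the same ciphertext, so an examiner's re-encryption of the disclosed files matches the disk byte for byte.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy CTR-style keystream; a stand-in for a real disk cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; encryption and decryption are the same op.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

key, nonce = b"k" * 32, b"n" * 16
c1 = encrypt(key, nonce, b"the files you disclosed")
c2 = encrypt(key, nonce, b"the files you disclosed")
assert c1 == c2  # deterministic: a re-encryption test matches the disk
assert encrypt(key, nonce, c1) == b"the files you disclosed"  # round-trips
```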
        
           | threatofrain wrote:
           | It's also why, IMO, obscurity is a valid component to
           | security.
        
             | ivankolev wrote:
             | Yes, but defence in depth/layering is the over-arching,
             | higher-order concept in the security game.
        
             | tetha wrote:
             | As I've started putting it: Make your infrastructure and
             | systems hard. Then don't tell anyone the details.
        
             | OJFord wrote:
             | I agree, not least because I can't see how to define
             | 'obscurity' without it also being a basic explanation of
             | encryption.
             | 
             | (Good) Encryption is a (secure) mechanism for obscuring
             | data, surely?
        
               | antepodius wrote:
                | It seems like there's a way to measure different
                | mechanisms in terms of how inherently decoupled they
                | are from their surroundings. So, the fact that you have
                | to send messages to my server in a particular format is
                | one type of obscurity, but it's highly non-incidental,
                | linked to many different parts of the world (i.e. it's
                | some common network protocol) and more easily
                | investigated (you could get interestingly different
                | responses by varying what string of bits you send).
               | 
               | In comparison, which particular password I use can be
               | very highly decoupled from the rest of the world and my
               | architecture, which makes it vastly more (reliably)
               | obscure.
               | 
               | Somewhere inbetween "you have to know my server exists to
               | send 'login:admin password:pass' to it" and "the volume's
               | encrypted with a 2048-bit cypher generated from
               | atmospheric entropy" is, maybe, a useful middle ground.
               | 
               | Hidden volumes seem like more of a defensive meta-
               | obscurity, in that they obscure your metadata (your
               | ownership of a particular piece of encrypted data).
        
               | XMPPwocky wrote:
               | https://en.wikipedia.org/wiki/Kerckhoffs%27s_principle
        
               | OJFord wrote:
               | I'm aware. But Kerckhoff's principle is just saying that
               | your mechanism of encryption shouldn't be obscured. It
               | doesn't change that I can't define 'obscure' in a way
               | that doesn't make it a mechanism (in itself inobscure) of
               | obscuring data.
               | 
               | Also, there are plenty of historical ciphers that fall
               | foul of Kerckhoff, I don't think we can say
               | retrospectively that they weren't done for security, and
               | in many cases were probably totally adequate for some
               | time, if not their lifespan.
        
             | grifball wrote:
             | When cryptographers talk about "security through obscurity"
             | they're talking about cryptographic algorithms and
             | protocols. So even systems that aim to prevent "rubber-
             | hose" attacks could benefit by avoiding algorithms (like
              | AES) whose security is based on obscurity, even if there
             | are parts of the system that are obscured.
        
             | TheFiend7 wrote:
              | Only circumstantially, IMHO. Obscurity can be a semi-
             | decent security tool in some situations, and in others
             | completely and utterly useless. It depends on what you're
             | trying to secure.
        
           | brianjlogan wrote:
           | In most instances I think torture would yield all of your
           | knowledge including the secondary unlock.
           | 
           | Unless you're thinking your attacker would exclude you from
           | torture for "yielding" the password.
           | 
              | I would think that if they are capable of torturing you,
              | then they wouldn't stop at a polite confession.
        
             | bootlooped wrote:
             | Border agents inspect and copy many more digital devices
             | than just those of people they actively suspect of
             | something, or are willing to torture.
        
             | varenc wrote:
              | VeraCrypt hidden volumes are supposed to be 100%
              | deniable. There's no way to prove they exist.
             | 
             | The idea is that they'll stop torturing you because they
             | don't know of the existence of the hidden volume.
             | 
              | (In practice I suspect it leaves some subtle cues, but
              | maybe it's perfect for a border crossing.)
        
               | frandroid wrote:
               | My understanding is that people who torture you don't
               | know what you don't know; so they don't know when to
               | stop. As such, they'll keep torturing you way past the
               | point where you've admitted to everything you know. This
               | is why information obtained under torture is considered
               | unreliable: eventually you'll just say anything to stop
               | the torture; further admissions will support the use of
               | torture as an information extraction tactic, and then
               | lead to more torture.
        
               | elif wrote:
               | Yes but, under what pretext would you torture someone who
               | has complied with all of your requests?
               | 
               | ~"Do you have any encrypted data we can't see?"
               | 
               | "Yes. The entire drive is encrypted"
               | 
               | ~"What is the key?"
               | 
               | "iloveapplesauce6969"
               | 
               | ~"Well, that worked and I see your data here. What a
               | lovely family.. is that Disney world?"
               | 
               | "Yes it was Timmy's 5th birthday"
               | 
               | ~"I'm going to waterboard you"
        
               | frandroid wrote:
               | Them: "You're still hiding something"
               | 
               | You: "This was everything!"
               | 
               | Them: "We don't believe you" _-rubber hose-_
               | 
               | You: "Stop it! I planned to blow up the world trade
               | center!"
               | 
               | Them: "We knew you were a liar."
               | 
               | To themselves: "Wow, torture works."
               | 
               | Rinse, repeat.
        
             | lukifer wrote:
             | In theory, you could just keep adding n+1 layers of fake
             | passwords (maybe with realistic fake data), on the hope
             | that after n attempts, they think they've broken you and
             | hit the jackpot.
             | 
             | But as sibling commenters describe, if sufficiently
             | motivated, there's no reason that an authoritarian state
             | wouldn't just keep torturing you anyway. :(
        
         | dmos62 wrote:
         | What countries do this? I saw a mention of Australia.
        
           | chupasaurus wrote:
           | Any, if they would really want the data.
        
             | dmos62 wrote:
              | Do no borders have privacy protection laws? Hard to
              | believe.
        
               | duxup wrote:
               | Depends on the border / country.
        
               | IggleSniggle wrote:
               | All things are fungible
        
               | kube-system wrote:
               | That just raises the bar for how much they need to want
               | it. It doesn't eliminate it.
        
         | 87zuhjkas wrote:
          | Are there any protection mechanisms against that? Something
          | like, even if you are being tortured, you cannot provide
          | access to anyone else?
        
           | Santosh83 wrote:
           | Sharded secrets... you only have one part of a key or keys
           | needed to decrypt some data so even extracting that from you
           | by torture will not suffice. Of course this isn't always
           | practical.
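The sharded-secret idea can be sketched with an n-of-n XOR split. This is simpler than Shamir's threshold scheme (which is what adds the redundancy mentioned above); the function names are illustrative.

```python
import os
from functools import reduce

def split_secret(secret: bytes, n: int) -> list:
    """n-of-n XOR split: all n shares are needed; any n-1 of them are
    statistically indistinguishable from random noise."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    # The final share is the XOR of the secret with all random shares.
    last = bytes(reduce(lambda a, b: a ^ b, bs) for bs in zip(secret, *shares))
    return shares + [last]

def combine(shares) -> bytes:
    """XOR all shares back together to recover the secret."""
    return bytes(reduce(lambda a, b: a ^ b, bs) for bs in zip(*shares))

key = os.urandom(32)
shares = split_secret(key, 3)       # hand one share to each friend
assert combine(shares) == key       # all three together recover the key
assert combine(shares[:2]) != key   # coerce one holder: still just noise
```

A real deployment would likely use Shamir's scheme instead, so that any k of n shares suffice and a lost or unreachable holder doesn't destroy the key.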
        
             | maxaf wrote:
             | "They" (whoever they are) can torture more than one person
             | at a time.
        
               | gruez wrote:
               | Presumably the other key holders are in a different
               | jurisdiction.
        
               | bluGill wrote:
               | The trick is to ensure you are never near all the people
               | who know the secret when there is the possibility of
               | trouble. That makes kidnapping everyone harder.
               | 
               | Of course if you really worry about such things you
               | shouldn't be trusting the other people you are working
               | with either...
        
           | ben_w wrote:
           | Yes, but:
           | 
           | 1) they might not believe you, and
           | 
           | 2) that's still true even if the reason you don't have a key
           | is because you don't actually _have_ a secret encrypted
           | partition -- or whatever -- to supply a decryption key for
           | 
           | So the best thing to do is avoid being in a situation where
           | someone is allowed to do that in the first place.
        
             | aforensics wrote:
             | This is interesting. This also means that using encryption
             | or anything that can plausibly make someone even slightly
             | suspect you're using encryption (even if you are not) can
             | make your situation worse, with certain classes of enemies.
             | 
             | I'm sure advanced configurations with well-crafted decoys
             | and steganography can help combat that, but as we can see,
             | encryption can only take you so far and it's only one
             | element of the picture.
        
           | smichel17 wrote:
            | Encrypt with two keys, one that you know and one that a
            | trusted third party knows. When you reach your destination,
            | establish secure contact with the third party and have them
            | share their key.
        
           | amoshi wrote:
           | Plausible deniability, like hidden containers in TrueCrypt?
           | 
           | That's a double edged sword though - imagine you give up,
           | surrender the password and are then being asked to unlock a
           | hidden volume, which you don't have.
        
           | jmt_ wrote:
           | I believe Truecrypt supported a feature where different
           | passwords would unlock different partitions in a volume. So
           | someone could ask you to input the password for BadBoy.tc,
           | and if you enter password1, then you get say the data they
           | actually want. But if you enter password2, it mounts a
           | different part of the file which gives the appearance that
           | you unlocked the whole thing. So, you could stage a dummy
           | partition that has false but convincing data and hopefully
           | fool any captors.
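A toy model of the two-password container described above. Real TrueCrypt/VeraCrypt volumes use proper ciphers and salted, trial-decrypted headers; everything below is an illustrative simplification with made-up names.

```python
import hashlib

MAGIC = b"VOL0"  # stands in for a recognizably-decrypted volume header

def derive(password: str) -> bytes:
    return hashlib.sha256(password.encode()).digest()

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def seal(plaintext: bytes, password: str) -> bytes:
    return xor(MAGIC + plaintext, derive(password))

# Decoy region up front, hidden region tucked in behind it.
container = (seal(b"vacation photos, tax returns", "password1")
             + seal(b"the secrets worth protecting", "password2"))

def unlock(container: bytes, password: str, region: int = 32) -> bytes:
    """Trial-decrypt each region; the password alone selects which one
    opens, so surrendering password1 reveals nothing about region two."""
    key = derive(password)
    for start in range(0, len(container), region):
        plain = xor(container[start:start + region], key)
        if plain.startswith(MAGIC):
            return plain[len(MAGIC):]
    return b""

assert unlock(container, "password1") == b"vacation photos, tax returns"
assert unlock(container, "password2") == b"the secrets worth protecting"
assert unlock(container, "wrong password") == b""
```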
        
           | contravariant wrote:
           | Don't take the laptop with you.
           | 
           | At this point setting up a secure connection to a device in a
           | secure location is _way_ easier than trying to protect your
           | data against someone with physical access.
           | 
           | You can also get your collaborators to revoke access if you
           | fear you might be 'compromised', although ultimately it's
           | hard to protect a system against yourself.
        
           | bArray wrote:
           | I think one method would be to ensure you don't have the full
            | key, i.e. you have some select friends, each of whom has part
           | of the key (with some redundancy) - all unaware of one
           | another and potentially all unaware that they even have part
           | of the key.
           | 
           | Then you position your friends over multiple jurisdictions so
           | that they cannot legally compel all of them to play along.
        
           | megous wrote:
           | If you're talking about protection for the people that may be
           | captured with the help of the data you may have on your
           | computer, yes.
           | 
           | Otherwise, no. When they have you they can just torture you
           | to death for whatever reason or no reason at all.
        
       | jstanley wrote:
       | > The VM will even connect to full-speed pre-Tor Internet by
       | default, while leaving the Tor connection in Tails undisturbed.
       | 
       | This doesn't strike me as a selling point? Surely the default
       | should be to have the VM traffic all go over Tor?
       | 
       | Cool project though.
        
         | aforensics wrote:
         | Well, the idea is that you don't have to be limited by Tor's
         | speed or handicaps like being IP blocked when web browsing, if
         | you don't want that by default. We love Tails' amnesia for
         | anti-forensics, but we prefer Whonix's more secure Tor
         | anonymization, if you want to be anonymous. Now you can easily
         | combine both benefits.
        
         | norswap wrote:
         | I think the point is to make a decoy OS that you can boot into
         | if forced to unlock your laptop. Running on Tor would be highly
         | suspicious.
         | 
         | The point of running this on Tails is to prevent the use of
         | forensic tools inside the decoy OS to unearth what's
         | underneath.
        
           | joosters wrote:
           | If using Tor is suspicious, then having Tails on your
           | computer is also going to be suspicious. I'm certain that no
           | border agent will be swayed by your "but I don't actually use
           | this Tor I have installed" arguments.
        
       | bluesign wrote:
        | Safest way is to hack the SSD/HDD firmware and make it report
        | half its size. Depending on some condition (e.g. some byte in
        | the first sector, or some ATA command), make it use the
        | selected half.
        
         | flyGuyOnTheSly wrote:
         | And when they pull the hdd and realize it says 512gb when
         | you're only showing 256gb?
        
           | Someone1234 wrote:
            | Replacing an SSD's sticker doesn't seem particularly
            | challenging relative to modifying the firmware to misreport
            | its size but keep the storage available.
        
         | threatofrain wrote:
         | Any entity big enough to seize your laptop for analysis is also
         | going to be able to look up the specification for any
         | particular part in your laptop, and eventually this portion of
         | the cat and mouse game will end.
        
           | ArchReaper wrote:
            | That's definitely not true; the article mentioned people
           | traveling between countries - TSA/border patrol/airport
           | police aren't going to send your laptop over to the NSA/KGB
           | to have it cracked by an expert. That being said, actually
           | encrypting the data is way more secure than fucking with HD
           | self-reporting.
        
           | Someone1234 wrote:
            | That's predicated on an all-knowing adversary with
            | unlimited time and budget. In other words, it is a largely
            | fictional problem.
           | 
           | Most of the people we're talking about just run off the shelf
            | forensics software and have minimal actual expertise (the
           | government doesn't pay well enough for legitimate experts
           | doing it by hand).
           | 
           | But then again, very few people are crazy enough to modify
           | computer hardware to protect their information. So both sides
           | of this coin might be largely fictional.
        
             | threatofrain wrote:
             | Are most people getting their laptops seized?
        
               | Someone1234 wrote:
                | At international borders? It isn't at all uncommon.
                | I've had my electronics searched before.
        
               | jedieaston wrote:
               | Out of curiosity, if you're comfortable saying, what
               | country was it and were you a citizen there?
               | 
               | I'm always curious what the breakdown is.
        
       | 6510 wrote:
        | I forget where I heard it or if it was my own idea (the shame,
        | I know, I know!) but... can't you have an unknown number of
        | username/password pairs that decrypt/unpack the same chunk of
        | data into different things? Say you have the same OS 51 times;
        | as clean installs, the data shouldn't have to be all that much
        | larger than one copy. You install some games on one, some
        | office apps on the next, put some downloaded movies on the
        | third. You could give them "all" 50 passwords and they could
        | never find OS nr 51.
        
         | lousyd wrote:
         | Perhaps you're thinking of this:
         | https://en.wikipedia.org/wiki/Rubberhose_%28file_system%29
        
         | jetrink wrote:
         | It's not possible to encrypt 51GB of real-world data in 1GB
         | space for the same reason that compression algorithms can't
         | achieve 51x compression ratios. Given that, such a scheme
         | presents some challenges if you want to maintain plausibility.
         | Either,
         | 
         | 1. Each filesystem lives within an allocated area and knows not
         | to overwrite its neighbors' data.
         | 
         | 2. Some filesystems (the real ones) are privileged and know
         | their actual allocated area. Others (the decoys) think they own
         | areas of the storage volume that contain hidden data and
         | therefore have the potential to overwrite the hidden
         | filesystems if they are written to.
         | 
         | In the case of (1), you need to be able to explain why your
         | computer has unallocated areas filled with pseudorandom data.
         | That is never going to pass the plausibility test, imo.
         | 
         | In the case of (2), a lot of effort needs to be put into making
         | the decoys look normal while not letting them overwrite the
         | hidden data. There are a number of strategies you could use
         | here that would work, but it will never be as convenient or
         | simple as dual-booting and the more convenient you try to make
         | it, the less innocent a hard drive will appear under close
         | inspection.
        
           | cheztir wrote:
           | I think the original commenter was going for an encrypted
           | copy-on-write setup, not some magical compressed fs. Just a
           | base image (eg 50GB) with various encrypted delta images (1GB
           | each) that are assigned to each user.
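The copy-on-write idea can be sketched in a few lines. This is a conceptual toy (not how any real hypervisor stores qcow2-style backing files): reads fall through to a shared base image, and writes land only in a per-user delta.

```python
class DeltaImage:
    """Toy copy-on-write overlay over a shared, read-only base image."""

    BLOCK = 512  # bytes per block in this toy

    def __init__(self, base: dict):
        self.base = base   # shared base image: block number -> data
        self.delta = {}    # this user's private writes

    def read(self, block: int) -> bytes:
        # Prefer the delta; fall back to the base; else an empty block.
        return self.delta.get(block, self.base.get(block, bytes(self.BLOCK)))

    def write(self, block: int, data: bytes) -> None:
        self.delta[block] = data  # the base is never modified

base = {0: b"common OS install".ljust(512, b"\0")}
alice, bob = DeltaImage(base), DeltaImage(base)
alice.write(1, b"alice's documents".ljust(512, b"\0"))

assert alice.read(0) == base[0]   # both users share the base blocks
assert bob.read(1) == bytes(512)  # bob never sees alice's delta
```

Encrypting each small delta under its own key would then give the 50-passwords scenario above without needing 51x the storage.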
        
       | jstewartmobile wrote:
       | if i had reason to be this paranoid about doing something on the
       | computer, i probably wouldn't do it on the computer... see what
        | Ron Minnich and Bunnie Huang have to say about the state of
       | modern hardware and bios.
       | 
       | that, and i believe AMT is still _a thing_
        
       | ThePowerOfFuet wrote:
       | > The VM will even connect to full-speed pre-Tor Internet by
       | default
       | 
       | Snatching defeat from the jaws of victory.
        
       | paulcarroty wrote:
        | It would also be interesting to have macOS as a guest, with
        | VeraCrypt and an encrypted volume, etc.
        
       | Santosh83 wrote:
       | Maybe good for hiding activity when you're already below the
       | radar. If you're a person of interest for a large enough state
       | then they can and will use all manner of dirty tactics to nail
       | you and simply encrypting is not enough. You will have to flee
       | like Snowden did. And once they bring in legislation that says a
       | govt agent can ask for your decryption keys under reasonable
       | doubt then everyone is in soup since encrypted data is easy
       | enough to detect as such. One may have to shift to steganography
       | of increasing sophistication. Basically this fight has to be
       | clinched politically. While technology can help it can't ensure
       | absolute privacy/security against an all-powerful state. The key
       | question is if a state should be all-powerful at all in the first
       | place...
        
         | aforensics wrote:
         | Indeed. Software is a supplement to the physical world. But we
         | do what we can, and at least in the realm of software, we can
         | have freedom.
         | 
          | It's possible that Tor and Tails are dangerous software to
          | use in certain states. But if people can safely use them,
          | they're here for them.
        
         | tuxxy wrote:
         | > everyone is in soup since encrypted data is easy enough to
         | detect...
         | 
         | This is only half-true. Any secure encryption is going to
         | result in ciphertext that is indistinguishable from random
         | data.
         | 
         | In cases where the ciphertext is designated by a header or file
         | format, then it's trivial to know that something is encrypted.
         | Then there are cases where we can try to forensically determine
         | that there's encrypted data via the existence of an encryption
         | tool (e.g. VeraCrypt).
         | 
         | If you wipe a disk with random data, for example, then it would
         | be relatively difficult to determine whether or not the disk is
         | encrypted (implying that there are no headers on it). In fact,
         | one method of wiping disks is to generate a random encryption
         | key and encrypt a stream from /dev/zero to fill the disk
         | (https://wiki.archlinux.org/index.php/Dm-
         | crypt/Drive_preparat...).
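The dm-crypt trick works because encrypting a stream of zeros simply emits the keystream. A toy model (a SHA-256 CTR keystream stands in for AES-XTS; this is not dm-crypt's actual internals):

```python
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Toy CTR-mode keystream; dm-crypt would use a real cipher."""
    out = b""
    block = 0
    while len(out) < length:
        out += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        block += 1
    return out[:length]

key = os.urandom(32)  # throwaway key: generated, used once, never stored
zeros = bytes(4096)   # the /dev/zero stream being "encrypted" onto disk

wiped = bytes(z ^ k for z, k in zip(zeros, keystream(key, len(zeros))))

# Zero XOR keystream == keystream: once the key is discarded, the disk
# holds data indistinguishable from random, with no header to betray it.
assert wiped == keystream(key, 4096)
assert wiped != zeros
```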
         | 
          | This tool is making use of a VeraCrypt hidden volume, which
          | is a really interesting application of plausible deniability
          | in cryptography. Essentially, this lets you have two volumes
          | where both are encrypted, but each has a different key. In
          | this setup, you'd put some files on one of the volumes to
          | make it appear that it's your "used" volume. On the other
          | "hidden" volume, you'd place the real files you want to keep
          | safe.
         | 
         | In a case where the government is demanding that you release
         | your encryption keys, you would give up the keys to the "fake"
         | volume. Unless you divulge the keys to the "real" volume, the
         | attackers wouldn't necessarily know that it exists.
         | 
         | Unless there's evidence of you using one (maybe chat logs or
         | google searches asking for help on using it, for example),
         | there's no reason for anyone to suspect you use it.
         | 
         | The VeraCrypt documentation explains the technical details
         | (https://www.veracrypt.fr/en/Hidden%20Volume.html) well enough.
        
           | ansible wrote:
           | > _This is only half-true. Any secure encryption is going to
           | result in ciphertext that is indistinguishable from random
           | data._
           | 
           | A new SSD with very little data in the filesystem isn't going
           | to have many, many sectors filled with random bytes. They're
           | going to be blank instead.
           | 
            | A used drive will have free sectors (not used by the
            | filesystem) containing unencrypted contents of old, deleted
            | files. This is also not random data. Chunks of movies,
            | pictures, applications, and music will be easily
            | identifiable.
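The distinction being described here is mechanically checkable: blank or file-backed sectors have low byte entropy, while encrypted or wiped sectors sit near the 8-bits-per-byte maximum. A hedged sketch of such a check (the threshold is illustrative, not what any real forensic tool uses):

```python
import math
import os
from collections import Counter

def shannon_entropy(block: bytes) -> float:
    """Empirical entropy in bits per byte: 0.0 for constant data, ~8.0 for random."""
    counts = Counter(block)
    n = len(block)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def classify_sector(block: bytes, threshold: float = 7.5) -> str:
    """Crude heuristic: near-maximal entropy suggests encrypted or wiped data."""
    return "random-looking" if shannon_entropy(block) >= threshold else "structured"

print(classify_sector(bytes(4096)))       # blank sector -> "structured"
print(classify_sector(os.urandom(4096)))  # wiped/encrypted -> "random-looking"
```

A blank SSD fails this test everywhere, a fully wiped or fully encrypted drive passes it everywhere, and a normally used drive shows a mix, which is the parent's point.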
        
             | Piskvorrr wrote:
             | A previously-used disk, wiped to NIST standards, will be
             | filled with random data - that's exactly the point of the
             | wipe.
        
               | ansible wrote:
                | > _A previously-used disk, wiped to NIST standards, will
                | be filled with random data - that's exactly the point of
                | the wipe._
               | 
               | Yes, and that is suspicious. Random data is suspicious.
        
               | Piskvorrr wrote:
                | Suspicious, true. But in such a case, you could be in
                | trouble for refusing to provide the password to a
                | suspected hidden drive _which doesn't exist_. How does
                | that even make sense? (Rhetorical question)
        
               | ansible wrote:
               | From your point of view, it doesn't of course. You know
               | that you don't have an encrypted partition.
               | 
               | But from the authorities' point of view, they will beat
               | you until they're convinced you don't have anything of
               | interest you can give up. That could last quite a
               | while...
        
               | wpietri wrote:
               | Exactly. Something being suspicious is about small
               | differences from what's expected, differences that
               | correlate with something bigger. Whether the disk is full
               | of random data because it's encrypted or because it was
               | securely wiped, either way it correlates with somebody
               | having something they're working to hide.
        
               | Piskvorrr wrote:
               | Or a second-hand computer: I do not wish to carry
               | previous owner's use history into my usage, and neither
               | should anyone else. Do not conflate "unusual" with
               | "therefore hiding stuff", and don't even try "hiding
               | stuff, therefore bad".
               | 
               | There are legitimate reasons for wiping data...can't
               | believe we're having _this_ discussion, _here_ of all
               | places.
        
               | ansible wrote:
               | In no way am I arguing that you shouldn't wipe your data,
               | use encrypted filesystems, or anything like that. That is
               | a totally legitimate thing to do.
               | 
               | We're just talking about drawing attention to yourself
               | from governmental agencies that probably don't have your
               | well-being as their highest concern.
               | 
               | Using state-of-the-art encryption to keep your files safe
               | is good. But if it leaves _any_ evidence that you are
               | indeed using encryption, you are potentially drawing
               | attention to yourself. And people should be aware of
               | that.
        
               | wpietri wrote:
               | I agree there are legitimate reasons for doing so. None
               | of which will matter to some official busybody rummaging
               | through your drive. To them, a drive filled with random
               | noise (instead of, say, being zeroed out) is going to be
               | unusual in a way that correlates with bigger things they
               | worry about. Or, in short, suspicious.
        
           | laumars wrote:
           | > _Any secure encryption is going to result in ciphertext
           | that is indistinguishable from random data._
           | 
            | While that's technically true, it feels a bit like a moot
            | point: if a drive holds large volumes of random data that
            | can't be attributed to any other application, it's a
            | reasonable conclusion that you've just detected an encrypted
            | volume.
           | 
           | > _In a case where the government is demanding that you
           | release your encryption keys, you would give up the keys to
           | the "fake" volume. Unless you divulge the keys to the "real"
           | volume, the attackers wouldn't necessarily know that it
           | exists._
           | 
            | Unless they inspect the storage properties (either physically
            | or how it registers itself on the host) and see that it's a
            | 1TB drive with only a 500GB mountable volume. Again, it
            | wouldn't be a foregone conclusion that the individual has
            | other hidden volumes, but it would be suspicious enough to
            | warrant further investigation / interrogation.
           | 
           | As always though, it really depends on the risk level you're
           | trying to protect yourself against.
        
              | ajphdiv wrote:
              | I would also assume the 'dummy' operating system wouldn't
              | show much activity, since the user would be using the
              | hidden OS. That, coupled with the unaccounted-for space,
              | would raise even more flags.
        
       | a_imho wrote:
        | Doesn't downloading additional software defeat the purpose of
        | inspecting the code?
        
       | [deleted]
        
       | hleszek wrote:
       | It is kind of ridiculous to still use md5sum to check software
       | for integrity.
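For context: MD5 has had practical collision attacks since 2004, so an attacker can craft two different files with the same md5sum. A stronger check is SHA-256, which Python's hashlib exposes directly. A minimal sketch of verifying a download against a published digest (the file contents and digest here are invented for the demo):

```python
import hashlib
import hmac
import tempfile

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Stream the file through SHA-256 so large downloads needn't fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, expected_hex: str) -> bool:
    """Timing-safe comparison against the published hex digest."""
    return hmac.compare_digest(sha256_of(path), expected_hex.lower())

# Self-contained demo: write a stand-in "release" file, then verify it.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"example release tarball contents")
    path = f.name
published = hashlib.sha256(b"example release tarball contents").hexdigest()
print(verify(path, published))  # True
```

Of course a checksum only proves integrity against corruption; authenticity still requires fetching the digest (or a signature) over a trusted channel.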
        
       | ralphc wrote:
        | If you want to use Tor in another country and then come through
        | a border, what's the advantage of HiddenVM over putting Tor on a
        | bootable thumb drive, using it, then throwing away the drive
        | before crossing the border? Just the persistence?
        
       | aforensics wrote:
       | Hello HN,
       | 
        | We're finally sharing our GitHub with the world. This post is
        | the first announcement of our project, apart from our thus-far
        | unpopulated subreddit. No one has discovered us yet; until now
        | we had told only one person in the world. We're new to
        | development and we're very humble and willing to learn, so any
        | suggestions and help are welcome.
       | 
       | What we aren't as humble about is the potential we think this
       | application has. HiddenVM allows full-scale anti-forensic use of
       | any desktop OS. (No longer just Tails.) If you place your
       | installed files inside good deniable encryption like VeraCrypt,
       | it means that no digital trace of your chosen OS is left on your
       | hard drive or can be forensically proven to exist. That is
       | significant.
       | 
       | There are many reasons why you may want to use HiddenVM. Some use
       | cases include:
       | 
       | - You're a spy protecting national security and you need to leave
       | no digital trace on the hard drive of the computer you just used.
       | 
       | - Law enforcement agents conducting sensitive investigations.
       | 
       | - Diplomats, politicians, and military personnel.
       | 
       | - Whistle-blowers needing to safely carry their information in
       | any situation.
       | 
       | - Activists, dissidents, political asylum seekers, and
       | journalists in need of stronger protection of their information
       | from corrupt governments when their equipment is forcibly seized.
       | (We know that the risk of the rubber hose remains a complex
       | problem and limitation of encryption.) Now that you can use
       | Windows once you set it up inside Tails, keeping your data
       | private could become easier for you.
       | 
       | Border agents forcibly invade our privacy and potentially steal
       | our secrets with no respect to who we are or what our rights are.
       | We need tech solutions to protect our data. More use cases
       | include:
       | 
       | - Lawyers carrying sensitive client information.
       | 
       | - People in business protecting their IP or trade secrets.
       | 
       | - Tactics in fighting against corporate espionage. It could be
       | expensive or impossible to sue for someone's unlawful intrusion
       | into your data. Easier to technologically prevent them in the
       | first place.
       | 
       | - Protect your basic privacy and dignity for any of the one
       | thousand other reasons why privacy matters.
       | 
       | - You travel a lot and you want to use Windows/macOS/Linux in a
       | way that prevents malware code from being forcibly installed
       | inside your operating system simply because you entered a
       | country.
       | 
       | - Digital currency: store a more private Bitcoin wallet. Secure
       | your assets against unwanted and unwarranted access. When data
       | literally is money you have a lot to lose.
       | 
       | - Domestic violence victims, and people in other dangerous
       | situations in life.
       | 
       | Data privacy is a human right. If you don't want someone
       | searching your naked body and violating your dignity in that way,
       | why should your data be any different? Airport border agents not
       | only perform a full digital strip search, but they're also
       | potentially stealing your data or implanting spyware and malware
       | without you knowing. It is a devastating act.
       | 
        | Using Tails should never be a reason to suspect you are a
        | criminal or a spy. It also protects basic data privacy and
        | democracy.
       | Tails should become a standard USB that anyone who values their
       | digital safety carries around in their briefcase, bag, purse or
       | wallet. We hope our application increases the size of the Tails
       | user base.
       | 
       | Thank you for your interest. We invite you to rip apart our
       | assertions and code (but with courtesy), try out HiddenVM, and
       | contribute to our project.
       | 
       | Sincerely, aforensics
        
         | smashah wrote:
         | Hi, very cool project! I'm getting more into security so
         | apologies if this is a stupid question. Is the veracrypt drive,
         | and therefore the HVM, linked to my specific instance of tails
         | or can it be accessed by anyone with a tails usb and my
          | veracrypt authentication details? Also, is it possible to have
          | tails on one usb and a veracrypt drive on a separate USB drive?
          | How would that affect deniability at, say, a border?
        
           | aforensics wrote:
           | > Is the veracrypt drive, and therefore the HVM, linked to my
           | specific instance of tails or can it be accessed by anyone
           | with a tails usb and my veracrypt authentication details?
           | 
            | If someone has your VeraCrypt volume password, they can
            | unlock the volume using any Tails stick, or potentially any
            | other operating system. What HiddenVM does is quite
            | fundamentally reduce the digital forensic evidence of using
            | that volume.
           | 
            | > Also, is it possible to have tails on one usb and a
            | veracrypt drive on a separate USB drive?
           | 
            | Yes. It may be faster if you make your computer's internal
            | SSD one entire partitionless hidden VeraCrypt volume.
           | 
            | > How would that affect deniability at, say, a border?
           | 
           | We want to be careful about making claims about deniability,
           | and it's still a field we have a lot to learn about at
           | HiddenVM. Someone more knowledgeable might dare to answer
           | this. We already give two examples on the github page. Your
           | situation is unique and only you can know what deniability
           | strategy works best.
        
         | jcahill wrote:
         | Copy notes:
         | 
         | 1. Pictures.
         | 
         | > What we aren't as humble about is the potential we think this
         | application has.
         | 
         | So you're not humble? Ditch the marketing goofiness. You think
         | it has major potential. Be humble or don't. It's inessential to
         | conveying what HiddenVM is.
         | 
         | > Like Tor, Tails, or Whonix, HiddenVM can be used for bad
         | purposes
         | 
         | Unnecessary. You're already on the back foot.
         | 
         | > - You're a spy
         | 
         | This isn't a normatively 'good' reason.
         | 
         | > Activists, dissidents, political asylum seekers, and
         | journalists (like Laura Poitras)
         | 
         | Don't cite a specific person unless that person is endorsing
         | the product.
         | 
         | > Using Tails should never be reason to suspect you are a
         | criminal or a spy. It protects basic data privacy and
         | democracy.
         | 
         | Don't lead with a user story that exactly matches the
         | stereotype, then. You're walking right into it.
        
           | aforensics wrote:
           | Thank you for the feedback. Some of it made sense and I've
           | updated the parent comment.
        
         | gadders wrote:
         | Don't forget private bankers trying to evade the tax
         | authorities:
         | 
         | "According to one former UBS banker, managers gave the private-
         | wealth team specially encrypted laptops that could be easily
         | deleted in case U.S. authorities barged in. "They told us about
         | the computers, 'if ever you run into problems in the U.S. with
         | the IRS, just push button X twice, and everything will be
         | deleted,' " said the banker. "It was like James Bond.""
         | 
         | https://www.cnbc.com/2015/04/30/why-did-the-us-pay-this-form...
        
         | mkl wrote:
         | I may be misunderstanding how it works, but aren't the presence
         | of Tails, the drive full of random-looking data, and the
         | absence of a visible consumer OS all massive red flags? It
         | seems like it would be completely undeniable that you're trying
         | to hide something.
        
           | aforensics wrote:
            | Red flags might not actually matter in many use cases. But
            | where they do, setting up a decoy OS that boots on the
            | computer by default when it's turned on may be one good
            | strategy.
           | 
           | For a VeraCrypt volume, setting up an outer volume with
           | convincing files and providing the decoy password may be
           | effective.
           | 
           | Whether the mere possession of a Tails USB adversely affects
           | your situation is a matter that remains to be discussed at
           | length. There is clearly no one situation that applies to
           | everyone.
           | 
           | HiddenVM's potential to provide deniability is about
           | cryptographic deniability, not human deniability. Software
           | can only do so much. If humans are suspicious, software alone
           | cannot change their minds.
        
             | justsid wrote:
              | The thing is, if you are outside the norm, you are raising
              | suspicion. Using this kind of setup will prevent you from
              | flying under the radar; instead, you are painting a nice
              | target on your back. This is going to bite you in
              | particular if you are already a person your adversaries
              | are keeping an eye on.
        
               | aforensics wrote:
               | Right. So all we can do at our end is better document the
               | risks and limitations, and then work on the political
               | advocacy side of things to promote diversity and
               | nonconformity.
               | 
               | Such is the spirit of Linux. Is every Linux user
               | automatically suspicious to various enemies? If so, the
               | work to be done is not in our code repositories.
               | 
               | If we could incorporate some steganography in the future
               | that could also help. Open to ideas.
        
       | lovetocode wrote:
        | So does Tails run as the root OS but display a separate OS in a
        | VM? For example, does it look like you're booting into Windows
        | when really it's just a VM inside Linux? If so, does that Windows
        | VM, or whatever it is you choose, act like what is essentially a
        | read-only OS?
        
         | aforensics wrote:
         | Yes and yes. No, the VM is not read-only. (But you can run a VM
         | as read-only.)
        
           | lovetocode wrote:
           | Interesting, thank you for sharing.
        
       | louwrentius wrote:
       | I wonder if it isn't easier to buy a laptop with two drives.
       | Install a regular OS on the first, hide the second in the BIOS
       | and nobody will notice.
       | 
       | The people doing the cloning / data theft would have to know
       | about your particular model. Obviously, you encrypt the second
       | drive, that in itself contains a hidden partition in case they do
       | discover it.
        
         | jmnicolas wrote:
          | Wouldn't they see that the laptop has 2 drives on their X-ray
          | scanner?
        
         | raxxorrax wrote:
          | The project here is quite neat. But I wonder if your idea
          | would also work, and I ask myself how competent the forensic
          | teams of airport security really are. Even if they are
          | competent, they certainly don't have a lot of time per device.
          | 
          | IT specialists are expensive, and it would be a shame to waste
          | them on something as benign as airport security, which was
          | mainly established out of paranoia and the wish to save face.
          | 
          | And what exactly are they targeting? Are they looking for
          | howToBlowUpAnAirplane.txt? Just some industrial espionage?
          | Just some display of authority? I don't really get what would
          | prompt these measures.
         | 
         | Was there ever anything they found on a device someone took on
         | a plane?
        
           | IanSanders wrote:
           | >An what exactly are they targeting?
           | 
           | journalists, I heard
        
             | raxxorrax wrote:
              | That doesn't seem like it would increase airport security
              | too much.
        
               | wpietri wrote:
               | But it definitely increases the security of the people
               | who control airport security.
        
         | nkrisc wrote:
         | If you're only trying to slip by a cursory inspection, mount
         | the second drive in your computer but don't attach any cables
         | to it. Could be trickier depending on how drives are mounted in
         | your laptop.
        
       | haunter wrote:
       | How about running from RAMdisk? Feels like that would be the
       | safest
        
         | GekkePrutser wrote:
         | It would be, but how do you go through the whole install every
         | time you need it?
        
       | kchr wrote:
       | Please consider an acronym other than "HVM", which already has
       | meaning in virtualization context (Hardware Virtualized Machine).
        
       ___________________________________________________________________
       (page generated 2020-03-05 23:00 UTC)