[HN Gopher] Tillitis Key - Mullvad spin-off inspired by measured...
       ___________________________________________________________________
        
       Tillitis Key - Mullvad spin-off inspired by measured boot and DICE
        
       Author : km
       Score  : 269 points
       Date   : 2022-09-19 10:42 UTC (12 hours ago)
        
 (HTM) web link (www.tillitis.se)
 (TXT) w3m dump (www.tillitis.se)
        
       | croon wrote:
       | Unimportant trivia from a Swedish speaker:
       | 
       | Mullvad means mole (for the tunnelling more than planted spy
       | connotations, hopefully), and the "Tillit" part of the name means
       | trust.
       | 
        | They're working on some IKEA-style naming, which I enjoy.
        
         | kfreds wrote:
         | Thanks! The other news of today is that we've started a second
         | sister company - Glasklar Teknik AB - which will focus on
         | maintenance and development of System Transparency and Sigsum.
         | 
         | System Transparency: Mullvad's security architecture we'll use
         | to eventually make our running VPN systems transparent.
         | 
         | Sigsum: A transparency log design with distributed trust
         | assumptions (witness cosigning).
        
           | croon wrote:
           | Glad to hear it! Both valiant efforts, and good naming here
           | too.
           | 
            | For non-speakers: "Glasklar" literally means "glass clear",
            | but is better explained as the Swedish equivalent of the
            | phrase "clear as day".
        
             | cinntaile wrote:
              | You can say "crystal clear" in English; it's a bit closer
              | to the original version.
        
       | trelane wrote:
       | Wonder how this compares to NitroKey, which also is open, but not
       | FPGA-based.
       | 
       | It's almost certainly a good time to update
       | https://lwn.net/Articles/736231/
        
       | teddyh wrote:
       | RYF certification?
        
       | AtNightWeCode wrote:
        | From a security perspective, people should glue their USB ports
        | shut. Do the people endorsing this crap even consider that
        | anyone with a bit of know-how can buy a malicious replica?
        
       | octoberfranklin wrote:
       | The most important part of this project is in the very last
       | sentence: it's all implemented on an FPGA (one which doesn't have
       | any backdoorable-for-surveillance hard cores).
       | 
       | Without that, none of the other stuff would be trustable.
        
         | kfreds wrote:
         | Note that we specifically chose the Lattice ice40 UltraPlus 5K
         | because:
         | 
         | - It is supported by an open-source FPGA toolchain
         | 
          | - It has an in-package non-volatile configuration memory
          | (NVCM) that is lockable. This is where we'll eventually keep
          | the FPGA configuration bitstream, including the unique
          | per-device secret.
          | 
          | After some reverse-engineering work we're also able to
          | program and lock the NVCM with open tooling, as opposed to
          | having to use Lattice's proprietary tools.
        
       | fiat_fandango wrote:
       | So basically a pluggable HSM? Curious if this should be
       | considered more similar to the Yubi HSM 2 [0] (think dedicated
       | HSM, not very useful for anything else but very secure) or the
        | USB Armory [1], a very adaptable small ARM SBC with a USB port?
       | 
       | [0] https://www.yubico.com/product/yubihsm-2/
       | 
       | [1] https://inversepath.com/usbarmory
        
         | kfreds wrote:
         | Yes, except YubiHSM boots using verified boot. In the case of
         | Tillitis Key each program gets its own secret key, which it can
         | use to e.g. derive a key pair.
        
         | JoachimS wrote:
         | Or CrypTech: https://cryptech.is/
         | 
         | There are things developed by the CrypTech project I would like
         | to try and reuse for TillitisKey.
        
           | fiat_fandango wrote:
           | Very cool, thanks for the link! This context helps a ton :)
        
       | greyface- wrote:
       | https://github.com/tillitis/tillitis-key1/
       | 
       | https://github.com/tillitis/tillitis-key1-apps/
        
       | pxeger1 wrote:
       | It strikes me that open source hardware should be more common.
       | It's surely much easier to monetise than open source software:
        | you just sell the hardware, because no one wants to build one
       | themselves. Why isn't it?
        
         | Semaphor wrote:
         | Just from reading comments and articles, I'd guess that's
         | because you very often rely on 3rd-party parts that are not
         | open. So you either need to limit what you use, or design the
         | whole thing.
        
         | arsome wrote:
          | No one wants to build it themselves until people actually
          | want it. If your device is popular, then your device is $2 on
          | AliExpress/eBay and you have no part in that; look at Arduino
          | for a good example.
        
           | tyingq wrote:
           | This...if it's successful, a knockoff of your device will end
           | up on eBay, AliExpress, Amazon, etc, often with sketchy
           | parts, at a price you'll never match. With some exceptions,
           | where you're able to make the product depend on a hard-to-
           | get-cheap part, etc.
        
         | spicyjpeg wrote:
         | Because hardware is, well, hard. There is a huge upfront
         | investment that isn't even remotely comparable to the amount of
         | money you can spend on software development, and equally huge
         | incentives for third parties to undercut you by taking your
         | designs, manufacturing them for cheap and offloading support
         | onto you (as already pointed out Arduino is a great example of
         | this happening in real life). Even if everything is open source
         | you have to build an entire business and marketing department
         | around selling the hardware, while with pure software you can
         | just put it up on GitHub and call it a day.
         | 
         | Not to mention that in this day and age every piece of hardware
         | has software at its core, so open source hardware does not save
         | you from also writing the code that runs on it. If anything
         | developing open source firmware is actually harder, because
         | most chip vendors expect your product to be closed source and
         | want you to sign countless NDAs before you can access a
         | datasheet or even just buy the chips. You are restricted to
         | using older and/or more expensive parts whose documentation is
         | freely available; it's the exact opposite of the software
         | world, where the latest and greatest is one npm install away.
        
         | _def wrote:
          | It's risky. You still need the software, otherwise only some
          | devs and tinkerers will buy it - and even for them, there had
          | better be toolchains etc.
        
       | Havoc wrote:
       | Am I right in thinking that this is basically like a yubikey
       | except with openness as key differentiator?
       | 
        | Or is its function something else?
        
         | michaelt wrote:
         | According to [1]
         | 
         |  _> It offers both security and flexibility by being end-user
         | programmable while also preventing applications loaded onto the
         | device from knowing each other's secrets. During use firmware
         | on Tillitis Key derives a unique key for each application it
         | runs by measuring it before execution. This is done by
         | combining an application's hash value with a unique per device
         | secret. Applications are loaded onto the device from the host
         | computer during use, and are not stored persistently on the
         | device._
         | 
         | So the idea here is:
         | 
         | * General purpose, reprogrammable security coprocessor
         | 
         | * If you save secrets with application A, then install evil
         | application B, it can't access the secrets from A.
         | 
         | * And if you revert back to A, those saved secrets will still
         | be there.
         | 
         | * Therefore, it's more practical to run two different
         | applications - and safer to experiment with your own
         | applications, because you won't lose all your website logins.
         | 
         | [1] https://www.tillitis.se/
        
           | xani_ wrote:
           | * If you save secrets with application A, then install evil
           | application B, it can't access the secrets from A.
           | 
           | * And if you revert back to A, those saved secrets will still
           | be there.
           | 
            | What stops app B from pretending it's app A?
        
             | michaelt wrote:
             | Something like 'Secure Boot' / 'Measured Boot' on modern
             | PCs, I imagine.
             | 
             | A bootloader will checksum the current application before
             | running it, checking its digital signatures and version and
             | whatnot, and deriving an encryption key based on that.
        
             | kfreds wrote:
             | 1. The hardware contains a UDS (unique per device secret)
             | which can only be read once per boot cycle.
             | 
             | 2. Firmware in ROM does unconditional measurement of the
             | first mutable boot stage, which is loaded from the host,
             | over USB.
             | 
             | The KDF used for measurement is Blake2s(UDS,
             | Blake2s(application), USS).
             | 
             | Note that when I say hardware I mean FPGA hardware design.
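              | 
              | As a rough sketch of that derivation in Python (hashlib's
              | blake2s supports keyed hashing; the parameter layout here
              | is illustrative, not necessarily the firmware's exact
              | construction):
              | 
              |   import hashlib
              | 
              |   def derive_app_secret(uds, app, uss):
              |       # Measure the application: hash the loaded binary.
              |       app_hash = hashlib.blake2s(app).digest()
              |       # Key the outer hash with the 32-byte UDS; mix in
              |       # the measurement and the user-supplied secret.
              |       return hashlib.blake2s(app_hash + uss,
              |                              key=uds).digest()
              | 
              | The same device, application and USS always yield the same
              | secret; a modified application hashes differently and so
              | derives a different one.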
        
               | xani_ wrote:
                | Again, what stops malicious app B from just taking
                | app A's ID and presenting it to the device? The token
                | doesn't know who sent the USB packet.
        
               | stavros wrote:
               | I think the app is installed on the stick itself, kind of
               | like how you install coin apps on a Ledger.
        
               | kfreds wrote:
               | I'm not sure I understand your question.
               | 
               | If you're asking about applications running on the device
               | (Tillitis Key) the answer is measured boot. You can read
               | more on tillitis.se.
        
             | timando wrote:
             | It would have to hash to the same value as app A
        
               | xani_ wrote:
                | The device is not checking the app's hash though; it
                | has no way to verify that the USB frame containing
                | that hash is really from app A.
        
               | craggyjaggy wrote:
               | The app runs _on_ the USB device. The code is loaded from
               | the host, and if it hashes to the correct value, it will
               | be able to access the secrets on the Tillitis.
        
               | [deleted]
        
         | kfreds wrote:
         | The Tillitis Key is a new kind of USB security key inspired by
         | measured boot and DICE.
         | 
         | Tillitis Key's design encourages developers to experiment with
         | new security key applications and models in a way that makes
         | adoption easier and less risky for end-users.
         | 
         | It offers both security and flexibility by being end-user
         | programmable while also preventing applications loaded onto the
         | device from knowing each other's secrets. During use firmware
         | on Tillitis Key derives a unique key for each application it
         | runs by measuring it before execution. This is done by
         | combining an application's hash value with a unique per device
         | secret. Applications are loaded onto the device from the host
         | computer during use, and are not stored persistently on the
         | device.
         | 
         | A user- or host-supplied secret can also be mixed into the key
         | derivation function, providing further protection. A
         | sophisticated physical attacker should be assumed to have
         | knowledge of the target application's hash, and will likely
         | eventually succeed in extracting the UDS from the hardware. By
         | adding a host-supplied secret, knowledge of the application
         | used as well as the security key's UDS is not sufficient to
         | produce the application secret. This makes the security impact
         | of a lost or stolen Tillitis Key less than for conventional
         | security keys.
         | 
         | Device applications can be chain-loaded where the first
         | application stage hands off its secret to the second stage.
         | This improves user experience as it makes it possible for the
         | application secret (and its public key) to remain the same even
         | if a device application is updated. It also enables developers
         | to define their own software update trust policies. A simple
         | first-stage application might do code signing verification of
         | the second stage, whereas a more advanced one will require
         | m-of-n code signatures, or a Sigsum inclusion proof. Sigsum was
         | designed with embedded use cases in mind.
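          | 
          | As a minimal sketch of such a first-stage trust policy
          | (Python, with hypothetical names; the real loader runs on
          | the device itself):
          | 
          |   import hashlib
          |   from cryptography.exceptions import InvalidSignature
          |   from cryptography.hazmat.primitives.asymmetric.ed25519 import (
          |       Ed25519PublicKey)
          | 
          |   def load_second_stage(stage2, sig, pubkey_bytes, secret):
          |       # Trust policy: stage 2 must carry a valid Ed25519
          |       # signature over its hash, made by the key baked
          |       # into this first stage.
          |       digest = hashlib.sha256(stage2).digest()
          |       try:
          |           Ed25519PublicKey.from_public_bytes(
          |               pubkey_bytes).verify(sig, digest)
          |       except InvalidSignature:
          |           return None  # reject: withhold the secret
          |       return secret    # hand the secret to stage 2
          | 
          | Since the first stage's own hash stays fixed, the derived
          | secret (and public key) survives updates of the second stage.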
         | 
         | Tillitis Key is and always will be open source hardware and
         | software. Schematics, PCB design and FPGA design source as well
         | as all software source code can be found on GitHub.
         | 
         | https://www.tillitis.se
         | 
         | https://github.com/tillitis/
         | 
         | (Full disclosure: I'm Fredrik Stromberg, cofounder of Mullvad
         | VPN and co-designer of Tillitis Key)
        
           | Semaphor wrote:
           | One question, with the two-stage approach, isn't that a
           | security risk? App 1 gets measured and can't change, but it
           | needs to identify app 2 in a way that still allows it to be
           | updatable, invalidating the whole security chain.
           | 
           | I'm probably not understanding something, so I'd love an
           | explanation (preferably one that non-cryptographers
           | understand)
        
             | kfreds wrote:
             | > One question, with the two-stage approach, isn't that a
             | security risk? App 1 gets measured and can't change, but it
             | needs to identify app 2 in a way that still allows it to be
             | updatable, invalidating the whole security chain.
             | 
             | It doesn't invalidate it if it works as the application
             | developer intended. The essential idea is that the first
             | mutable boot stage contains a trust policy which somehow
             | verifies the second stage. Let's say that's a valid
             | signature over the hash value of the second mutable stage.
              | The trusted public key is contained in the first stage.
             | 
             | What we've done there is first used measured boot, and then
             | verified boot.
             | 
              | Measured boot is basically load-hash-measure-execute, where
              | "measure" means "store the hash value of whatever I'm about
              | to execute somewhere safe, where the soon-to-be-executed
              | thing won't be able to undo my storing of its hash".
             | 
              | Verified boot, on the other hand, is about verifying the
              | next stage and only executing it if the trust policy
              | deems it valid.
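              | 
              | Schematically, in Python (execute() here is a
              | hypothetical "jump to this code" primitive):
              | 
              |   import hashlib
              | 
              |   def execute(stage):
              |       ...  # hand control to the loaded stage
              | 
              |   def measured_boot(stage, measurements):
              |       # Record the hash where the stage can't
              |       # rewrite it, then run it unconditionally.
              |       measurements.append(
              |           hashlib.sha256(stage).digest())
              |       execute(stage)
              | 
              |   def verified_boot(stage, signature, pubkey):
              |       # Run the stage only if the trust policy
              |       # accepts it (valid signature; raises if not).
              |       pubkey.verify(signature,
              |                     hashlib.sha256(stage).digest())
              |       execute(stage)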
             | 
             | > I'm probably not understanding something, so I'd love an
             | explanation (preferably one that non-cryptographers
             | understand)
             | 
             | Honestly I'm beginning to realize it's not all that simple
             | to explain.
        
           | WaitWaitWha wrote:
           | @kfreds thank you for the response, and the commitment you
           | have for this project.
           | 
           | >> ... this is basically like a yubikey ...
           | 
           | > ... new kind of USB security key ...
           | 
            | The things you have listed are indeed very nice, but they
            | are not a _new_ kind, as they are available elsewhere.
           | 
           | Can you give a bit more compare and contrast to the original
           | question?
           | 
           | Again, thank you.
        
             | bestham wrote:
              | No, the integrity is within the device. You load the
              | small (64k) apps onto the key, and the app can use its
              | own content together with the device's unique key to
              | perform cryptography; the apps' integrity can be audited.
              | This is similar to JavaCard with cryptographic integrity
              | of the applets. Read more at:
              | https://github.com/tillitis/tillitis-key1/blob/main/doc/syst...
        
             | kfreds wrote:
              | > The things you have listed are indeed very nice, but
              | they are not a new kind, as they are available elsewhere.
             | 
             | Really? I wasn't aware that there is another USB security
             | key with measured boot-based key derivation. Please provide
             | a link!
             | 
             | > Can you give a bit more compare and contrast to the
             | original question?
             | 
             | Except for Tillitis Key, all USB security keys I'm aware of
             | either boot any software, or only boot software that has
             | been signed with a key pair. Tillitis Key is different in
             | that it measures the application, and uses that measurement
             | to derive a secret specific to the application as well as
             | the stick it's running on.
        
           | fiat_fandango wrote:
           | So basically a pluggable HSM? Curious if this should be
           | considered more similar to the Yubi HSM 2 [0] (think
           | dedicated HSM, not very useful for anything else but very
            | secure) or the USB Armory [1], a very adaptable small ARM
            | SBC with a USB port? (For this thread, option 3 - more
            | similar to a bone-stock Yubikey?)
           | 
           | Also curious if there are plans to support BLS signatures
           | natively?
           | 
           | [0] https://www.yubico.com/product/yubihsm-2/
           | 
           | [1] https://inversepath.com/usbarmory
        
           | Havoc wrote:
           | Thanks for the detailed response!
        
           | AviKav wrote:
           | > A user- or host-supplied secret can also be mixed into the
           | key derivation function
           | 
           | To clarify, this secret does not affect the program's hash,
           | right? (e.g. to prove liveness, the parameter is a nonce to
           | be signed with a deterministic private key)
        
             | JoachimS wrote:
              | No, the USS would be used (mixed in, as kfreds stated)
              | during the hash operation. So the hash result would be
              | based on (1) the hash of the application, (2) the unique
              | device secret, and (3) the user-supplied secret. The
              | result is called the Compound Device Identity (CDI) in
              | DICE parlance, and is basically
              | 
              | CDI = Hash(UDS, Hash(application) + USS)
              | 
              | If the application were to use the CDI to derive a pair
              | of keys, the keys would thus be based on the hardware
              | (the specific device you have), the integrity of the
              | application, and what you know.
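              | 
              | As a small illustration (not the actual on-device
              | signer), an application could turn the 32-byte CDI into
              | a deterministic Ed25519 key pair:
              | 
              |   from cryptography.hazmat.primitives.asymmetric.ed25519 import (
              |       Ed25519PrivateKey)
              | 
              |   def keypair_from_cdi(cdi):
              |       # Same device + same application + same user
              |       # secret -> same CDI -> same key pair, every boot.
              |       priv = Ed25519PrivateKey.from_private_bytes(cdi)
              |       return priv, priv.public_key()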
        
           | throwoutway wrote:
           | What is DICE? I'm searching but the only results are the
           | company and not whatever you're referring to
        
             | transpute wrote:
              | https://trustedcomputinggroup.org/work-groups/dice-architect...
             | 
             |  _> The DICE Architectures Work Group is exploring new
             | security and privacy technologies applicable to systems and
             | components with or without a TPM. The goal is to develop
             | new approaches to enhancing security and privacy with
             | minimal silicon requirements. Even simple silicon
             | capabilities combined with software techniques can
             | establish a cryptographically strong device identity,
             | attest software and security policy, and assist in safely
             | deploying and verifying software updates._
        
             | mwcampbell wrote:
              | From elsewhere in the thread:
              | https://www.microsoft.com/en-us/research/project/dice-device...
        
           | mwcampbell wrote:
           | IIUC, popular security key devices like the YubiKey securely
           | store a private key, but only allow it to be used for
           | specific authentication applications (e.g. OTP or U2F). Would
           | the Tillitis Key be able to securely store a private key,
           | then with appropriate authentication from the host, use that
           | key for encryption and decryption?
        
             | kfreds wrote:
             | > Would the Tillitis Key be able to securely store a
             | private key, then with appropriate authentication from the
             | host, use that key for encryption and decryption?
             | 
             | Yes. I don't think that would be very hard to do.
        
             | MaKey wrote:
             | Devices like the YubiKey can also be used for encryption
              | (e.g. with LUKS or OpenPGP), digital signatures and SSH
             | authentication.
        
           | [deleted]
        
         | ldng wrote:
          | It is an FPGA, fully open at both the software and hardware
          | levels. So it's quite a bit more future-proof, inspectable
          | and upgradable than a YubiKey.
        
           | JoachimS wrote:
            | (For full disclosure, I am the primary FPGA designer of
            | TillitisKey.)
            | 
            | It also performs a measurement of the application being
            | loaded. The measurement, together with the Unique Device
            | Secret (UDS), generates the primary secret that applications
            | can use to derive the keys etc. they need. This means that
            | you can verify the application's integrity.
            | 
            | This is very close to, and inspired by, DICE:
            | https://www.microsoft.com/en-us/research/project/dice-device...
        
             | zimbatm wrote:
             | Does this mean that a software upgrade will change the
             | keys?
        
               | JoachimS wrote:
                | For now, yes. But as Fredrik (kfreds) has written in
                | another comment, what is possible is a two-stage
                | approach, with an application (which gets measured)
                | loading other applications.
        
             | layer8 wrote:
             | What exactly is the "measurement"? A hash of the
             | application code?
        
               | JoachimS wrote:
                | Yes. The hash of the application code and the 256-bit
                | Unique Device Secret are hashed to generate a primary
                | secret, which the application can then use to derive
                | the secrets it needs.
                | 
                | You can additionally supply a secret from the host (the
                | User Supplied Secret). This means that the keys
                | generated are tied to the specific device (the UDS), to
                | the integrity of the application, and to you as a user.
        
             | tinco wrote:
             | Did you design the board? It looks sick, such high density
             | of components on the top layer.
        
               | JoachimS wrote:
               | No, the board design is done by the wizard Matt Mets at
               | https://blinkinlabs.com/
        
               | zx85wes wrote:
               | OMG. Just saw the Thinking Machines CM-2 replica on their
               | homepage. What an awesome idea.
        
               | panick21_ wrote:
                | Yeah, super cool. An old-school Lisp machine would be
                | cool as well.
        
           | dabeeeenster wrote:
           | I was under the impression that not being upgradeable was a
           | security feature of Yubikeys?
        
             | lucideer wrote:
             | It is. And therein lies the innovation here: upgradable
             | with verification.
        
         | kreetx wrote:
         | Not exactly sure, but the OSFC conference page has some extra
         | info on what it can do:
         | https://www.osfc.io/2022/talks/tillitis-key-a-usb-security-k...
         | 
         | Maybe it will be ~YubiKey plus extras?
        
           | LibertyBeta wrote:
           | So Solokey V2. Not that I'm complaining by the way. Any open
           | competition to yubikey is a win in my book.
        
             | byyll wrote:
             | I really dislike how "Yubikey" is being used in many places
             | as a name for U2F and FIDO2.
        
               | riedel wrote:
                | This is because the FIDO2 libraries follow a lot of de
                | facto standardisation: they match either YubiKey or
                | Windows Hello, as the only real implementations in the
                | wild. FIDO2 CTAP extensions would support all the use
                | cases of the new device, except you won't be able to
                | use them e.g. in Windows, because vendors ignore all
                | the openness to push their own agenda.
        
               | jffry wrote:
               | It's the same as "Google Authenticator" being used
               | instead of TOTP. I think it's reasonable for apps'
               | documentation to meet users on their turf to aid
               | understanding.
        
         | badrabbit wrote:
         | Probably for use with something like secure enclave.
        
       | dang wrote:
        | Related:
        | https://mullvad.net/en/blog/2022/9/19/mullvad-creates-a-hard...
       | 
       | (via https://news.ycombinator.com/item?id=32896658, but we merged
       | that thread hither)
        
       | PhilippGille wrote:
       | > Something that makes the key unique is the fact that both its
       | software and hardware are open source
       | 
       | Aren't SoloKeys [1] also open hardware and software? Or is the
       | Tillitis key more general purpose and thus not in the same
       | category?
       | 
       | [1] https://solokeys.com/
        
         | FredFS456 wrote:
         | My understanding is that it's both a more general platform
         | (targeting more than 2FA) and also uses an FPGA running open-
         | source code, so that the "secure enclave" functionality can be
         | inspected and found to be secure, rather than just trusting
         | NXP/ARM's chip as SoloKeys have done.
        
           | kfreds wrote:
           | Good explanation. Thank you.
        
           | ptman wrote:
           | FTR SoloKeys targets FIDO2, not just U2F
        
             | Matl wrote:
              | I think what they mean is that this can be reprogrammed
              | for more use cases than FIDO2 and U2F; it can, say, be
              | programmed to support my own homegrown thing that I've
              | made up just now, or perhaps even a more general concept
              | than just logging into things.
        
               | JoachimS wrote:
                | Yes. And your application will get a per-device unique
                | primary secret when loaded, which the application can
                | then use for whatever it needs. (Including not using it
                | at all.)
               | 
                | TOTP, FIDO2, PIV, simple touch-triggered
                | challenge/response... or something completely different.
                | If it can fit in around 100 kByte of RAM when compiled
                | for RV32IMC and is not too computationally expensive, it
                | could be a Tillitis app.
               | 
                | Just to give you some indication: the Ed25519 signing
                | operation in the SSH authentication we showed on stage
                | today takes ~ one second. And we already know of several
                | ways to improve that.
        
             | JoachimS wrote:
              | The TillitisKey should be usable for FIDO2, as a TOTP
              | generator, etc. Right now there is an SSH agent
              | application, which allows you to sign in by touching the
              | device.
             | 
             | Personally I'm very excited to see what applications will
             | be developed at the hackathon at the OSFC conference, and
             | onwards. We have had people at the conference showing
             | interest in trying to write applications in Rust. I will
             | try and implement an application of my own tomorrow.
        
               | RL_Quine wrote:
                | SSH supports FIDO2, so I'm not really sure of the
                | purpose of having an agent.
        
           | ecesena wrote:
           | Correct. I think the difference is just NFC.
           | 
           | If you want to power your key via NFC (tap to phone to
           | authenticate), you need a micro which consumes very little,
           | powers up quickly and can do a signature before the FIDO
           | protocol times out. I'm not sure this is currently possible
            | with an FPGA, but maybe it is.
        
       | [deleted]
        
       | badrabbit wrote:
        | Good VPN company (one of the best) and good idea (sounds like
        | the USB Armory). But the best it can do is assure that their
        | VMs are not logging anything and that they keep their other
        | promises. Will they also be able to share details of their
        | hosting setup in a way you can independently verify (because
        | they can always have more transparent traffic-logging middleware
        | VMs)? I doubt it, and the same goes for whomever they use for
        | hosting.
        | 
        | My point is: while I don't subscribe to the extremes of pro- or
        | anti-VPN sentiment, having a good understanding of what services
        | like this can and cannot do, and performing rudimentary yet
        | essential security and privacy risk assessment, is essential
        | before trusting them with all your traffic.
        
         | Foxboron wrote:
          | > Good VPN company (one of the best) and good idea (sounds
          | like the USB Armory). But the best it can do is assure that
          | their VMs are not logging anything and that they keep their
          | other promises. Will they also be able to share details of
          | their hosting setup in a way you can independently verify
          | (because they can always have more transparent traffic-logging
          | middleware VMs)? I doubt it, and the same goes for whomever
          | they use for hosting.
         | 
         | We are working on this as part of the System Transparency
         | project.
         | 
         | https://system-transparency.org/
         | 
         | Disclaimer: I work on this.
         | 
          | Beyond this, penetration testing reports on the Mullvad
          | infrastructure are public.
        
           | LASR wrote:
           | I've always wondered what is feasible through a state-issued
           | mandate along with a gag order to circumvent the technology
           | for something like this.
        
             | vladvasiliu wrote:
             | Couldn't this be solved by something like remote
             | attestation?
        
             | badrabbit wrote:
              | That's what I mean about risk assessment. You should not
              | expect Mullvad or any other legally liable organization
              | to resist lawful orders or unlawful coercion; these are
              | not reasonable expectations, and your security posture
              | should account for that.
        
           | badrabbit wrote:
            | Thanks for the response and your transparency; it looks
            | like you folks really believe in your mission.
            | 
            | The most revolutionary thing you are doing, in my opinion,
            | is registration- and email-free account management while
            | accepting various forms of payment. You are way ahead of
            | your time! Other apps and sites outside of VPN services
            | would do well to follow your example.
        
         | handsclean wrote:
         | Let's not frame this as trust them vs don't, it's trust them vs
         | trust your ISP. On one hand, you have a company that seems to
         | be doing as much as possible to commit to privacy, and on the
         | other, a company that straight up tells you they're monitoring
         | you and sending the data all over the place. Does that scale
         | really tilt differently if you point out there's a non-zero
         | chance the first company is secretly just as bad as the second?
        
       | fleventynine wrote:
       | From the photo, that looks like a stock iCE40 FPGA, which does
       | not support hardware attestation of the loaded bitstream. How
       | does the user verify that the FPGA loaded the expected bitstream
       | instead of something with a backdoor? A DICE chain that is not
       | rooted in physical, immutable hardware isn't very useful.
        
         | kfreds wrote:
         | > From the photo, that looks like a stock iCE40 FPGA, which
         | does not support hardware attestation of the loaded bitstream.
         | 
         | Which FPGA models support _attestation_ of the loaded
         | bitstream? Do any?
         | 
         | > How does the user verify that the FPGA loaded the expected
         | bitstream instead of something with a backdoor?
         | 
         | It's a Lattice ice40up5k, which contains a programmable and
         | lockable NVCM memory in-package. The engineering samples we
         | handed out today at OSFC store the FPGA configuration bitstream
         | on a SPI flash memory though.
         | 
         | > A DICE chain that is not rooted in physical, immutable
         | hardware isn't very useful.
         | 
         | When we start selling them we'll likely sell both security keys
         | with pre-provisioned bitstreams in NVCM as well as
         | unprovisioned security keys so you can provision your own.
        
       | dtx1 wrote:
        | Being FPGA-based is, from what I can tell, a brilliant idea.
        | It makes it possible to fix hardware-level security issues.
        | I've been shocked by the recent exploit found in the Google
        | Pixel's security chip, since I rely heavily on it (using
        | GrapheneOS). An unfixable hardware-level bug in it would turn
        | my phone into e-waste for me. This solves that quite elegantly.
        
       | dbrgn wrote:
       | Are you aware of Trussed, an initiative by SoloKeys and Nitrokey?
        | https://solokeys.com/blogs/news/trussed-announcement and
        | https://trussed.dev/
       | 
       | From what I understand, this is an API to write applications
       | against a common interface, which can run on different hardware
       | devices. An abstraction layer for security key apps. Similar to
       | Java Card, but in a more modern way. Is this something that would
       | or could be compatible with Tillitis?
        
         | Perseids wrote:
          | A bit off-topic: Can anyone recommend a platform that is
          | production-ready today, if I want to (develop and) deploy a
          | custom Smartcard / HSM application at small scale? JavaCard
          | seems to fit the bill, but I've not yet found an approachable
          | tutorial.
        
         | ecesena wrote:
          | I've been dreaming of an FPGA-based key since I read about
          | Precursor. Not sure if it's yet possible to power it via NFC.
          | But with that said, sharing at least the FIDO implementation
          | would be outstanding.
        
         | kfreds wrote:
         | Yes, I'm aware of it. I'm not sure if it's small enough for the
         | Tillitis Key to be able to use it.
        
       | irusensei wrote:
       | Quick question about such devices: can I use stuff like Yubikey
       | or similar to luksOpen a crypt device during boot or operation?
       | 
       | Thanks in advance.
        
         | VTimofeenko wrote:
          | Yes, there are multiple ways. Systemd offers
          | systemd-cryptenroll, which works with FIDO2 and X509
          | certificates on the hardware key to unlock a drive.
          | 
          | The key is enrolled into the LUKS header of the partition.
          | 
          | The information about the key and the device is passed to
          | initrd through /etc/crypttab for unlocking during boot.
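          | 
          | For example, enrolling a FIDO2 key into an existing LUKS2
          | volume looks roughly like this (device path illustrative):
          | 
          |   systemd-cryptenroll --fido2-device=auto /dev/sda2
          | 
          | with a matching fido2-device=auto option in the crypttab
          | entry.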
         | 
          | I wrote a couple of posts describing how this can be
          | sort-of-handrolled with a Nitrokey and a GPG key for the X509
          | cert:
         | 
         | https://vtimofeenko.com/posts/unlocking-luks2-with-x509-nitr...
        
         | vinay_ys wrote:
         | Do you mean something like this:
         | https://github.com/agherzan/yubikey-full-disk-encryption
        
       | jeroenhd wrote:
       | I'm not sure what problem this solves. I see per-application keys
       | based on the hash of the application, but wouldn't this prevent
       | updates of those applications without key loss? It's clear to me
       | that this device can be used for _some_ kind of cryptographic
       | operation/verification mechanism, but I'm at a loss for what
       | problem this is actually designed to solve.
       | 
       | What's the practical application of this key?
        
         | xani_ wrote:
         | The app key would need to stay the same, but I can't think of a
         | mechanism that would deny one app trying to pretend it's
         | another.
         | 
          | Also, the fact that it doesn't emulate a smartcard means
          | every piece of software supporting it would have to implement
          | a special client, so yeah, that's a problem.
          | 
          | "Just" being a smartcard allows for, at the very least, GPG
          | signing and SSH agent use without much fuss, and also HTTPS
          | client cert auth.
        
           | layer8 wrote:
           | I believe the only thing needed is someone writing a PKCS #11
            | driver for it; then it should be interoperable.
        
         | dmurray wrote:
         | We used to use the same version of applications for _years_.
         | 
         | It's OK to say this has a serious limitation in that it can't
         | easily support updating applications, but that hardly rules out
         | it being useful at all.
        
         | kfreds wrote:
         | Tillitis Key's design encourages developers to experiment with
         | new security key applications and models in a way that makes
         | adoption easier and less risky for end-users.
         | 
         | You can read more on tillitis.se or in the comment I made
         | below.
         | 
         | Tillitis Key will allow you to chain-load applications. This
         | means that you could have a thin loader which does code signing
         | verification of the next application stage, and hand off the
         | secret to it. Basically it's a trust policy that defines under
         | what circumstances the next application stage gets the secret.
         | 
         | Another trust policy the loader could have is requiring m-of-n
         | code signatures, or perhaps that as well as transparency log
         | inclusion. Check out sigsum.org.
        
         | Semaphor wrote:
         | This comment explains it:
         | https://news.ycombinator.com/item?id=32897307
        
       | switch007 wrote:
       | The name sounds like a disease.
        
         | MisterTea wrote:
         | The letters themselves form the side profile of a restaurant
         | dining room in disarray.
        
         | notemaker wrote:
         | Probably a play on words in Swedish, "tillit" means trust /
         | confidence.
        
           | alrlroipsp wrote:
           | Move over, IKEA.
        
         | encode wrote:
         | It's a play on the Swedish word "tillit", which means trust. So
         | tillitiskey = trust is key.
        
         | layer8 wrote:
         | It's tillitating.
        
       | filleokus wrote:
       | Cool! Always nice to see extra competition in this space.
       | 
        | One thing I've wanted for a while is a way to properly back up
        | a webauthn token. An approach I discussed a couple of weeks ago
        | [1] was:
       | 
       | 1: Generate on-hardware webauthn master key on device A.
       | 
       | 2: Generate on-hardware key-pair on device B
       | 
       | 3: Export B's public key, import to A
       | 
       | 4: On Device A: Encrypt master key with B's public key
       | 
       | 5: Export encrypted master key to B
       | 
       | 6: Decrypt on B
       | 
        | I guess this would probably be possible with this device?
        | Perhaps there is some even more clever way to do it.
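        | 
        | As a sketch of steps 3-6 using Python's cryptography package
        | (off-device, for illustration; on real hardware the private
        | halves would never leave the devices):
        | 
        |   import os
        |   from cryptography.hazmat.primitives import hashes
        |   from cryptography.hazmat.primitives.asymmetric.x25519 import (
        |       X25519PrivateKey, X25519PublicKey)
        |   from cryptography.hazmat.primitives.ciphers.aead import AESGCM
        |   from cryptography.hazmat.primitives.kdf.hkdf import HKDF
        | 
        |   def wrap(master, b_pub):
        |       # A: ephemeral ECDH against B's public key, then
        |       # AES-GCM-encrypt the master key (ECIES-style).
        |       eph = X25519PrivateKey.generate()
        |       kek = HKDF(hashes.SHA256(), 32, None,
        |                  b"webauthn-backup").derive(eph.exchange(b_pub))
        |       nonce = os.urandom(12)
        |       ct = AESGCM(kek).encrypt(nonce, master, None)
        |       return eph.public_key(), nonce, ct
        | 
        |   def unwrap(b_priv, eph_pub, nonce, ct):
        |       # B: recompute the shared secret and decrypt.
        |       kek = HKDF(hashes.SHA256(), 32, None,
        |                  b"webauthn-backup").derive(b_priv.exchange(eph_pub))
        |       return AESGCM(kek).decrypt(nonce, ct, None)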
       | 
       | [1]: https://news.ycombinator.com/item?id=32621426
        
         | kfreds wrote:
         | Hi! Interesting. Which company do you work for?
         | 
         | Yes, that'd be possible. I don't know how webauthn works, but
         | if it relies on ECC you could probably do ECDH between all
         | security keys you wanted to carry your master key, and then use
         | the combined ECDH values as the master key.
        
       | Mandatum wrote:
       | This doesn't solve cookie theft, but this does massively raise
       | the bar for hackers getting into service provider environments.
       | Couldn't think of a better time to launch.
        
       | Hacker_Yogi wrote:
        
       | kreetx wrote:
       | dupe, https://news.ycombinator.com/item?id=32896658
        
         | SadTrombone wrote:
         | To be fair, this thread was posted first.
        
           | dang wrote:
           | OK, we'll merge that one hither.
        
       ___________________________________________________________________
       (page generated 2022-09-19 23:00 UTC)