[HN Gopher] Apple Security Research Device Program
       ___________________________________________________________________
        
       Apple Security Research Device Program
        
       Author : dewey
       Score  : 233 points
       Date   : 2020-07-22 17:09 UTC (5 hours ago)
        
 (HTM) web link (developer.apple.com)
 (TXT) w3m dump (developer.apple.com)
        
       | saurik wrote:
       | Anyone who wants to should be able to buy such a device, as it
       | isn't like any of the machine code you are getting elevated
       | access to is even secret (you can download, from Apple,
       | unencrypted copies of the entire operating system). (You can try
       | to make an argument that this is about keeping you from getting
       | access to third-party encrypted assets to prevent some aspect of
       | piracy in the App Store, but this doesn't accomplish that either
       | as you need only have a single supported jailbroken device for
       | that to be easy, and the world already has millions of those and
       | you can't really prevent them as the act of fixing bugs discloses
       | the bug for the older firmware.)
       | 
       | The real problem here is that Apple is so ridiculously
       | controlling with respect to who is allowed to develop software
       | (in Apple's perfect world, all software development would require
       | an Apple license and all software would require Apple review)--in
       | a legal area that isn't really conducive to that (see Sega v.
       | Accolade, which was important enough to later ensure permanent
       | exemptions on reverse engineering and even jailbreaking for
       | software interoperability purposes in the original DMCA anti-
       | tampering laws)--that they are even working right now on suing
       | Corellium, a company which makes an iPhone emulator (which again,
       | has strong legal precedent), in order to prevent anyone but a
       | handful of highly controlled people from being able to debug
       | their platform.
       | 
       | Apple just has such a history of being anti-security researcher--
       | banning people like Charlie Miller from the App Store for showing
       | faults in their review process, pulling the vulnerability
       | detection app from Stefan Esser, slandering Google Project Zero,
       | denying the iPhone 11 location tracking until proven wrong,
       | requiring people in their bug bounty program to be willing to
       | irresponsibly hold bugs indefinitely so Apple can fix things only
       | at their leisure, and using the DMCA to try to squelch research
       | via takedowns--that this ends up feeling like yet another flat
       | gesture: they should have done much more than this device at
       | least a decade ago. I'd say Apple is in store for a pretty big
       | fall if anyone ever manages to get a bankroll large enough to
       | actually fight them in court for any protracted length of time
       | :/.
        
         | rkagerer wrote:
         | The move is in line with their reputation. Handing out a bunch
         | of research devices which come with a catch is a great way to
         | exert more control and influence over vulnerability reporting,
         | and skew the bargaining power when it comes to disclosure. I
         | expect the motivation is largely genuine to encourage security
         | research that bolsters their platform, but also stems in part
         | from an increasing fear of PR ramifications outside their
         | control.
         | 
         | "I'm handing out a bunch of water bottles; sign up here. The
         | contents remain my property so when you're done please urinate
         | back into them and return to me on demand."
        
         | ladberg wrote:
         | If anyone could buy this device, then tons of scammers would
         | buy them, install malware, and sell them to people as normal
         | phones. They could then control banking apps and whatever else
         | they wanted.
        
           | except wrote:
           | I doubt they would cost the same as regular iPhones.
        
             | ladberg wrote:
             | Does it matter what they cost if scammers could net
             | tens/hundreds of thousands for a single one? (Assuming they
             | pick targets right)
        
               | pvg wrote:
               | A scam that requires an individually targeted bespoke
               | device that nets tens or hundreds of thousands (how does
               | that even work? how would the proceeds be exfiltrated
               | untraceably?) is just a really expensive way to have a
               | very short career as a scammer.
        
               | strbean wrote:
               | People with tens/hundreds of thousands of dollars are
               | buying iPhones from random third parties?
        
           | derefr wrote:
           | Most such devices (e.g. "development kit" devices for game
           | consoles) look very different than the release product.
           | Usually in such a way that it'd be impractical to use them
           | casually for your personal needs.
           | 
           | In other cases, e.g. Android/Chromebooks, there's a common,
           | immutable early chain-of-trust that stays the same between
           | production and development devices (or in this case, between
           | rooted and unrooted devices); which pops up a message during
           | boot warning that a device is currently a development/rooted
           | device, and therefore should not be trusted for production
            | use-cases. It could just as well also say "DO NOT BUY THIS
            | PHONE ON THE SECONDARY MARKET; IT HAS BEEN TAMPERED WITH, AND
            | CANNOT BE TRUSTED WITHOUT FACTORY RE-VERIFICATION" -- and
            | users could then be told repeatedly in the company's
            | messaging to look for such messages at boot before buying.
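            | 
            | For reference, the bootloader reports that state to the OS,
            | and software can read it back through system properties; a
            | minimal sketch, assuming an Android NDK toolchain (the file
            | name vbstate.c and the exact handling are just illustrative):
            | 
            |     /* vbstate.c: query the bootloader-reported
            |        verified-boot state via Android system properties. */
            |     #include <stdio.h>
            |     #include <string.h>
            |     #include <sys/system_properties.h>
            |     
            |     int main(void) {
            |         char state[PROP_VALUE_MAX] = "";
            |         char locked[PROP_VALUE_MAX] = "";
            |     
            |         /* "green" = locked and verified; "orange" = unlocked
            |            bootloader; "yellow"/"red" = custom key or failed
            |            verification. */
            |         __system_property_get("ro.boot.verifiedbootstate", state);
            |         __system_property_get("ro.boot.flash.locked", locked);
            |     
            |         if (strcmp(state, "green") != 0 || strcmp(locked, "1") != 0) {
            |             printf("WARNING: bootloader unlocked or image "
            |                    "unverified (state=%s, locked=%s)\n",
            |                    state, locked);
            |             return 1;
            |         }
            |         printf("verified-boot state looks stock\n");
            |         return 0;
            |     }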
        
             | ladberg wrote:
             | I agree, but I think there are non-technical people who
             | would happily buy a cheap iPhone on Craigslist or Facebook
             | and enter in all their iCloud or banking info without
             | rebooting or looking at warnings.
        
               | saagarjha wrote:
               | Who is honestly going to buy this highly sought-after
               | iPhone, backdoor it and flip it for a discount on
               | Craigslist in an attempt to what, hack a random person?
               | And you can just prevent that by doing something simple
               | like disabling the App Store...
        
               | Wowfunhappy wrote:
               | I agree with the first sentence, but disabling the app
               | store would greatly hamper security research, since you
               | can't debug third party apps.
        
               | saagarjha wrote:
               | You'd just download the IPA manually and self-sign it or
               | whatever. Basically just make it unmistakable that this
               | device is not normal and block normal people from being
               | able to use it as normal without realizing it's a
               | development device.
        
         | markstos wrote:
         | Consider Android is open source and people have been able to
         | get root access on devices since the beginning.
         | 
          | Still, it seems better for overall security to have this
          | program exist than to have no program at all.
        
           | saurik wrote:
           | To be fair (as a problem we have is that there is a small
           | oligopoly of players and they all mostly suck), the original
           | Android G1 was a locked down device: to get root on it
           | required you to buy the Android ADP1 (which cost a lot more
           | and had serious limitations with respect to apps from the
           | Market); we only got root on it due to a bug (so the same as
           | with iOS... it was just a dumber bug ;P). But yeah: Android
           | wouldn't be usefully open source to anyone if you couldn't at
           | least build your own device to test and run it on, and they
           | clearly support that and they even provide emulators (and,
           | thankfully, more recent, first-party Google devices are nice
           | and open: Google learned their lesson there pretty quickly).
        
           | oneplane wrote:
            | The 'android' that you get from your phone vendor definitely
            | isn't open source. The AOSP project is, and the distro from
            | your vendor will have parts of it that can be traced back to
            | that source.
            | 
            | But the Android ecosystem as a whole is not 'open source' and
            | not 'open' in the way people like you describe. Almost all
            | access is due to hacks and bugs, not because the bootloader
            | and the OS came with an option for user access on the inside.
        
             | Wowfunhappy wrote:
             | > The 'android' that you get from your phone vendor
             | definitely isn't open source.
             | 
             | But it's so, so far ahead of iOS in openness. You can
             | sideload apps. You can spin up Android in a VM. You can buy
             | Android devices with unlocked bootloaders and install your
             | own roms or drivers or kernel modules or whatever else. The
             | system can be made yours to command.
             | 
             | iOS doesn't even let me run my own code in userland.
        
         | [deleted]
        
         | saagarjha wrote:
          | Not only is this a flat gesture, I think by this they are
          | actively gunning for companies like Corellium and will have a
          | huge amount of control over security researchers who join the
          | program. Disclose your bugs to us on our terms or have your
          | access yanked? Pretty yikes. (And this is completely ignoring
          | the rest of your comment, because it's pretty clear that they
          | don't want consumers with debuggable iPhones.)
        
           | tptacek wrote:
           | We don't have to speculate about this; Apple is actively
           | gunning for Corellium. Your best tip-off to that was them
           | suing Corellium.
        
             | saagarjha wrote:
             | Oh, I know that Apple is gunning for Corellium, my question
             | was if this specific thing was meant to be another arm of
             | that dispute (it probably is).
        
               | tptacek wrote:
               | In the sense that Apple probably wouldn't have done it if
               | Corellium had never existed, sure.
        
         | devwastaken wrote:
         | If only open source software licenses could have predicted the
         | level of vertical integration control their software would be
         | used in. Apple continually violates the good will of developers
         | and puts forth their own bad will. I'm tempted to make up an
         | 'MIT minus non-free platforms' agreement. If the OS can't be
          | completely emulated and freely installed without restriction,
         | then you can't use the library.
         | 
         | I'd like to see Apple survive having to recreate half their
         | software from scratch.
        
           | saagarjha wrote:
           | Sounds like you might want GPL?
        
             | searchableguy wrote:
             | Probably AGPL
        
           | exikyut wrote:
           | They made APFS in (probably?) less than ten years :(
           | 
           | The license structure you suggest sounds kind of interesting.
        
         | Despegar wrote:
         | >that they are even working right now on suing Corellium, a
         | company which makes an iPhone emulator (which again, has strong
         | legal precedent), in order to prevent anyone but a handful of
         | highly controlled people from being able to debug their
         | platform.
         | 
         | Anyone can debug their platform, as they have been. You just
         | need to be approved for this specific program.
         | 
         | Apple's case against Corellium is about intellectual property,
         | and it's frankly going to be a slam dunk in court. There's
         | already established precedent with Apple v. Psystar with an
         | almost identical set of facts.
        
           | Wowfunhappy wrote:
           | > You just need to be approved for this specific program.
           | 
            | Since not just anyone can be approved, I don't think I'd
           | consider that "anyone".
           | 
           | I mean, I guess technically "anyone" can learn to become a
           | security researcher and spend years building up a "proven
           | track record of success in finding security issues on Apple
           | platforms, or other modern operating systems and platforms",
           | but that's not generally how I think of the term "anyone". ;)
        
             | ganoushoreilly wrote:
             | Yep, definitely designed to be a barrier that's just enough
             | to say "look we're doing something", but restrict enough so
             | that most won't bother. Sort of like their repair program.
             | 
             | I tend to like Apple in general, I think they do a lot of
             | things right, but I feel there are a few things they do
             | that are clearly about money and not the purported reasons.
             | I guess no company is impervious to that.
        
               | tptacek wrote:
               | "Look we're doing something"? Apple has a well-regarded
               | security team that is among the largest in the industry.
               | The locked platform is part of the premise of their
               | security model. You can disagree with that; many smart
               | people do. But you can't pretend that anything other than
               | unlocking the platform constitutes a half-measure.
        
               | saagarjha wrote:
               | Apple's security model is not the only one that exists,
               | and its model has the additional benefit of giving them
               | the ability to control software distribution for the
               | platform.
        
               | ganoushoreilly wrote:
                | I'm actually not in disagreement with either point. I
                | think they just need to stop with the whole _see, people
                | can do stuff_. Own it; I like the locked environment and
                | security.
                | 
                | That said, on the back end outside of devices, they don't
                | implement the same precautions, i.e. encrypting iCloud
                | data/backups so that they themselves have limited access.
                | I would just like consistency, one way or the other.
        
               | Despegar wrote:
               | In what way do they not "own it"? They've been owning the
               | fact that it's a closed system for 40 years, when tech
               | nerds complained about their products being "appliances".
        
               | ganoushoreilly wrote:
                | I'm more on the side of their argument that they have to
                | lock the phone down without their own ability to access
                | the data (which I agree with), but they fall back when it
                | comes to encrypting the backend the same way (buckling to
                | Federal pressure).
                | 
                | I would love it to be encrypted throughout, even if it
                | means that if I lose access I may not be able to get it
                | back. It's a tradeoff. The marketing position of being
                | secure is factually true for the phone, but they imply
                | the data is secure too when in fact it really isn't.
        
           | saagarjha wrote:
           | > Anyone can debug their platform, as they have been.
           | 
           | Apple would not like "anyone" debugging their platform. The
           | fact that people have jumped through hoops to do so is
           | considered a bug and something that they actively try to
           | prevent.
        
             | Despegar wrote:
             | That doesn't change the fact that you're able to do
             | security research on Apple products without their
             | permission.
        
               | saagarjha wrote:
               | Apple gives me a locked box, I complain that I can't open
               | it. You're saying that I shouldn't because some people
               | have figured out how to pick the specific lock they're
                | using, even though Apple doesn't want people to do that
               | that and their next lock is not pickable using that
               | technique anymore. Oh, and the pickable locks will be
               | obsolete in a few years. Do you see the problem?
        
               | xondono wrote:
               | > Apple gives me a locked box, I complain that I can't
               | open it.
               | 
               | Well, if you don't like closed boxes, just don't buy a
               | closed box.
        
               | saagarjha wrote:
               | There's a locked box, and there's the box that spies on
               | you and also looks uglier to some people. Not much of a
               | choice, is there?
        
               | strbean wrote:
               | Part of the point of security research is that
               | practically 0% of consumers are capable of doing this
               | research themselves and making informed purchasing
               | decisions. Security researchers hold businesses
               | accountable when consumers can't.
               | 
               | In light of that, your argument is "Just don't hold Apple
               | accountable!", which Apple would love... but that would
               | also be harmful to consumers.
        
               | xondono wrote:
               | > Part of the point of security research is that
               | practically 0% of consumers are capable of doing this
               | research
               | 
               | Well, then by that statement alone security researchers
               | aren't just "anyone".
               | 
               | I don't see a problem in a company imposing restrictions
               | that make finding vulnerabilities harder for everyone in
               | general _provided_ they are willing to allow security
               | researchers to jump those restrictions.
               | 
               | In fact I actually see it as a big win, since now bad
               | actors have _significantly higher costs_ while good
               | actors that are legally liable have lower barriers of
               | entry.
        
               | saurik wrote:
               | ...but these hurdles (at least the ones I know of from
               | the bug bounty program; and which I see elsewhere in this
               | thread do seem to apply to these devices also) contain
               | things like clauses most security researchers consider
               | unethical (holding bugs indefinitely without public
               | disclosure no matter how long it takes Apple to fix the
                | issue) and seem to exclude people who don't generally
               | show Apple in a favorable light.
               | 
                | (And no: I entirely disagree that "bad actors" have
                | significantly higher costs because of this, as bad actors
                | can do illegal stuff like buy internal developer
                | prototypes off the black market from corrupt factory
                | employees: there was a massive expose about this in
                | Forbes last year. Hell: Apple bugs are actually less
                | valuable on the black market now than Android bugs
                | because there are so many of them! Apple's attempts to
                | hide their devices from public scrutiny are about PR, not
                | part of some coherent security strategy.)
        
               | tptacek wrote:
               | If this doesn't impact costs for bad actors, it's hard to
               | see how it impacts costs for good actors, since, in the
               | _status quo ante_ of this program, both good and bad
               | actors shared the same vectors to get kernel access to
               | devices. Apple is, on this page, explicit about the
                | notion that this program doesn't impact vulnerability
               | research done outside the program. In what way does this
               | program do anything but add an option for good actors?
               | 
               | I may just not be understanding you; maybe we just agree
               | that this program doesn't change a whole lot.
        
               | saagarjha wrote:
               | It gives researchers who don't want to do illegal things
               | debugging access to the kernel, whereas previously this
               | was not possible on newer devices because the only way to
               | do that outside of Apple was to somehow (illegally)
               | obtain access to a development-fused iPhone.
        
               | tptacek wrote:
               | Yes; I'm asking, how does providing that new option harm
               | software security researchers?
               | 
               | I understand the subtext that Apple could more
               | efficiently help software security researchers by freely
               | unlocking phones, but I'm not here to litigate that.
        
               | briandear wrote:
               | How many consumers do in depth security analysis on
                | devices before buying them? You are in a bubble if you
                | think that's normal behavior. And to do such research on
               | a device, wouldn't you need to buy a device? You're going
               | to buy a device to research if you should buy the device?
               | And sales numbers would indicate that most people are
               | confident enough in Apple's security.
        
               | Despegar wrote:
               | It's not illegal for you to do security research and
               | Apple is not attempting to make it so. You'd like it to
               | be more convenient to do security research and that is
               | what this program is designed to do. I don't see why it's
               | unreasonable for Apple to have terms you need to agree to
               | to benefit from this program.
        
               | saagarjha wrote:
               | I think it is not unreasonable for Apple to make security
               | research convenient without adding onerous restrictions
                | on how it is done. Many other platforms do this already;
                | in fact, it's the norm for most of them.
        
       | ChrisMarshallNY wrote:
       | Looks fairly cool, but I'll bet it isn't that popular with
       | security boffins. I would be cautious about something that might
       | not actually reflect a current "in the wild" device.
       | 
       | For example, if the OS isn't quite at the same level as the
       | release OS, it could be an issue.
       | 
       | That said, this is not my field, and I am not qualified to offer
       | much more than the vague speculation, above.
        
         | saagarjha wrote:
         | I would expect it to be exactly the same except that you can
         | debug it, basically. iPhones have a special fuse in them that
         | prevents that from being done on production hardware, and these
         | will presumably have that "unblown". If you want to test on
          | production hardware you always can; this just lets you do
          | research (a metaphor might be that this is "a debug build with
          | symbols", while normal iPhones are a "release build").
        
       | shantara wrote:
       | >If you report a vulnerability affecting Apple products, Apple
       | will provide you with a publication date... Until the publication
       | date, you cannot discuss the vulnerability with others.
       | 
        | In addition to the mandatory bug reporting, Apple reserves the
        | right to dictate a mandatory publication date to researchers. No
        | more 90/180-day responsible disclosure deadline policy. I highly
        | doubt any serious researcher would agree to work under such
        | conditions.
        
         | saagarjha wrote:
         | Would be interesting to see if Google Project Zero joins the
         | program, given their inflexibility in disclosure.
        
           | bobviolier wrote:
            | Looks like they won't:
            | https://twitter.com/benhawkes/status/1286021329246801921?s=1...
        
           | shantara wrote:
           | That's what I've been thinking about as well. I'm leaning
           | towards "no", but let's wait and see.
        
       | jedieaston wrote:
       | I wonder how much people are able to publish about the device.
        | I'd expect not much, but it'd be nice to be able to compare an
        | iPhone that was completely unlocked (at least, to whatever that
       | means for Apple) with whatever security they put on the ARM Macs
       | which are supposed to be "open for hobbyists". I'd expect that
       | the ARM Macs have much of the same security stack (by default)
       | that iOS devices have given what they said in the WWDC talks, but
       | maybe that's not the case.
       | 
       | Also, if you found an exploit on a research iPhone because you
       | made use of entitlements that were Apple-only, I wonder if that'd
        | be worth anything bounty-wise. Nobody can/should be able to write
        | an app that'll get through App Store checks if they asked for
       | PLZ_NO_SANDBOX_ILL_BE_GOOD or something (at least, that's what I
       | thought before the whole Snapchat system call thing happened).
       | But hypothetically the App Store review process is vulnerable to
       | a bad actor inside Apple pushing an update to a big app that
       | included malware, so I'd think that private entitlements
       | shouldn't be available at all to binaries that didn't ship with
       | the device/in a system update (unless some kind of hobbyist flag
       | was flipped by the consumer). So I'd say that would be worth
       | something, even if smaller than a more interesting exploit.
        
         | bluesign wrote:
         | > But hypothetically the App Store review process is vulnerable
         | to a bad actor inside Apple pushing an update to a big app that
         | included malware.
         | 
         | I don't think this is technically possible.
        
         | saagarjha wrote:
         | We'll see how the shipping ARM Macs are "fused" when they come
         | out, but my guess is that they will be more locked down than
         | these devices: their OS will be more permissive but you will
         | not have meaningful kernel debugging.
         | 
          | > Nobody can/should be able to write an app that'll get through
          | App Store checks if they asked for PLZ_NO_SANDBOX_ILL_BE_GOOD or
         | something (at least, that's what I thought before the whole
         | Snapchat system call thing happened).
         | 
          | Snapchat (on iOS at least) is still subject to the app sandbox;
          | no app on iOS has been granted an exception there to my
         | knowledge. On macOS there are apps that are "grandfathered in"
         | to not require the sandbox on the App Store, but new apps are
         | supposed to have it. Due to the way the dynamic linker works,
         | until recently it was possible to upload an app that could
         | bypass the sandbox, but Apple has said they have fixed this.
         | Some apps do have an exception to this as well, as the broad
         | way they fixed one of the issues broke legitimate functionality
         | in library loading. You can find those hardcoded in AMFI.kext,
         | theoretically they could turn off the sandbox for themselves if
         | they wanted.
        
           | mrpippy wrote:
           | > you will not have meaningful kernel debugging
           | 
           | Given that kext development is still supported (although
           | highly discouraged), won't they have to support the same
           | level of kernel debugging as usual?
           | 
           | > On macOS there are apps that are "grandfathered in" to not
           | require the sandbox on the App Store
           | 
           | Can you name any of these apps? Apple's own apps don't have
           | to be sandboxed (like Xcode or macOS installers), but I don't
           | know of anything else that gets an exception. Some apps like
           | Office get special "holes" out of the sandbox (in the form of
           | additional SBPL), but fundamentally they're still sandboxed.
        
             | saagarjha wrote:
             | > Given that kext development is still supported (although
             | highly discouraged), won't they have to support the same
             | level of kernel debugging as usual?
             | 
             | They just need to support loading kernel extensions. As
             | watchOS has shown, developers will figure out a way to get
              | their thing working on your device even if you make
             | debugging extremely painful. (Apple's current silicon
             | prevents debugging entirely because the kernel is prevented
             | from being patched in hardware.)
             | 
             | > Can you name any of these apps?
             | 
             | Sure. If your app's bundle ID matches one of
              |     com.aspyr.civ6.appstore
              |     com.aspyr.civ6.appstore.Civ6MetalExe
              |     com.aspyr.civ6.appstore.Civ6Exe
              |     com.tencent.WeWorkMac
              |     com.tencent.WeWork-Helper
              |     com.igeekinc.DriveGenius3LEJ
              |     com.igeekinc.DriveGenius3LEJ.DriveGenius
              |     com.igeekinc.DriveGenius3LEJ.dgdefrag
              |     com.igeekinc.DriveGenius3LEJ.dgse
              |     com.igeekinc.DriveGenius3LEJ.dgprobe
              |     com.prosofteng.DGLEAgent
              |     com.prosofteng.DriveGeniusLE
              |     com.prosofteng.DriveGenius.Locum
              |     com.prosofteng.DriveGenius.Duplicate
              |     com.prosofteng.DriveGenius.Benchtest
              |     com.prosofteng.DriveGenius.FSTools
              |     com.prosofteng.DriveGenius.Scan
              |     com.prosofteng.DriveGenius.Probe
              |     com.prosofteng.DriveGenius.SecureErase
              |     com.prosofteng.DriveGenius.Defrag
             | 
             | dyld interposing is enabled for your app even if it comes
             | from the App Store, opening the door for subverting the
             | mechanism for applying the sandbox.
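              | 
              | For a sense of what dyld interposing looks like in
              | practice, here is a minimal sketch (assuming a macOS box
              | with the command line tools; the file name interpose.c is
              | just illustrative, and SIP-protected platform binaries
              | ignore inserted libraries):
              | 
              |     /* interpose.c: build with
              |        clang -dynamiclib interpose.c -o libinterpose.dylib */
              |     #include <stdio.h>
              |     #include <unistd.h>
              |     
              |     /* Replacement that logs and then calls the real
              |        getpid(). dyld does not interpose calls made from
              |        the image that defines the interposer, so this does
              |        not recurse. */
              |     static pid_t logging_getpid(void) {
              |         fprintf(stderr, "getpid() intercepted\n");
              |         return getpid();
              |     }
              |     
              |     /* dyld scans the __DATA,__interpose section of loaded
              |        images and rebinds calls to `replacee` so they land
              |        on `replacement` instead. */
              |     __attribute__((used, section("__DATA,__interpose")))
              |     static struct {
              |         const void *replacement;
              |         const void *replacee;
              |     } interposers[] = {
              |         { (const void *)(unsigned long)&logging_getpid,
              |           (const void *)(unsigned long)&getpid },
              |     };
              | 
              | Loading that into a binary that permits it, e.g.
              | DYLD_INSERT_LIBRARIES=./libinterpose.dylib ./some_tool,
              | reroutes its getpid() calls through logging_getpid(); the
              | point above is that the listed apps keep access to this
              | facility even when delivered through the App Store.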
        
           | GekkePrutser wrote:
           | > We'll see how the shipping ARM Macs are "fused" when they
           | come out, but my guess is that they will be more locked down
           | than these devices: their OS will be more permissive but you
           | will not have meaningful kernel debugging.
           | 
           | My big worry is them dropping terminal access altogether like
           | on iOS. That would really make the platform useless to me.
           | 
           | However I don't think they would do this at this point.
            | There are many user groups (like cloud developers) who
            | specifically favour the Mac because of its strong terminal
            | access.
        
             | saagarjha wrote:
             | There will be a terminal, and virtual machines, and kernel
             | extensions. It's pretty much a "full" macOS experience to
             | all but the most serious users.
        
             | easton wrote:
             | Craig specifically said that this wasn't going to happen,
             | in one of the podcasts he said people came up to him
             | internally and said "Wait. There's still Terminal, right?"
             | and he said "Yeah, it's a Mac.". The Platforms State of the
             | Union host also said that they had made contact with a
             | bunch of open-source projects with assistance (and in some
             | cases, iirc the OpenJDK and CPython, pull requests) on
             | moving to ARM.
        
               | GekkePrutser wrote:
               | Thanks, I didn't know that. Good to hear!
        
       | natvert wrote:
       | :O
       | 
       | This is huge. Not as a security device, but if this were the
       | normal permission model on all iPhones (e.g. owners of devices
       | get root on the devices they own... like a normal general purpose
       | computing device) I could ditch my android and my mac and use an
       | iPhone for everything.
       | 
       | I'm not saying this will ever happen, but in my mind this paints
       | a bright picture of what the iPhone could be.
       | 
       | It's also a bit sobering as I'm quite concerned Apple is actually
       | pushing the other direction in their shift from Intel to ARM.
        
         | cookiengineer wrote:
         | > like a normal general purpose computing device
         | 
         | Maybe just go for a PinePhone instead? [1] I mean, Linux GUIs
         | aren't fully mobile and touchscreen friendly yet, but it's
         | getting there real quick. I mean, they started in November
         | 2019.
         | 
         | In my opinion the PinePhone is the most promising device, as
         | all upstream projects use it as an official developer device
         | and upstream linux has integrated support.
         | 
         | [1]
         | https://wiki.pine64.org/index.php/PinePhone_Software_Release...
        
         | yalogin wrote:
          | I don't get the allure of this. As someone working in security,
         | the phone is an extremely leaky thing and very bad for privacy
         | to begin with. On top of that you want to remove all
         | restrictions and make it a security nightmare too? I get that
         | you want to install what you like. Sure, but I don't think the
         | convenience is worth the security trade off.
         | 
         | Honestly the mac or desktop is where I enjoy the openness and
         | do stuff I want to do. I would want to leave the phone
         | untouched and as secure as possible.
         | 
         | I would like to hear your and others' take on it though.
        
           | saagarjha wrote:
           | > As someone working in security, the phone is an extremely
           | leaky thing and very bad for privacy to begin with. On top of
           | that you want to remove all restrictions and make it a
           | security nightmare too?
           | 
            | One major issue is that Apple's security model is "we don't
            | trust you". And by that I mean everything works from _their_
            | root of trust, not yours. This isn't the usual "I think
            | Apple is backdooring my iPhone, FWIW"; what I'm really saying
            | is that I want the ability to elevate some of _my_ software
            | to the same permissions that Apple gives theirs. There is no
            | reason that I should not be able to vet my own code and add
            | it to the "trust cache". So this isn't just "every app
            | should run without a sandbox", but rather "the copy of GDB
            | that I personally vetted should be able to attach to other
            | apps, but nothing else".
        
             | donor20 wrote:
              | This is because Apple feels it's too easy to trick users
              | into elevating software permissions - which in turn may
              | cause risks and harms to their user base.
              | 
              | Let me ask you - do you have your elderly parents on an
              | Android? Then you already know how totally owned those
              | phones can become.
        
               | saagarjha wrote:
               | If you don't mind, here's a comment I wrote recently that
                | I think is very relevant, and I figured it would just be
                | easier to link to rather than retype:
               | https://news.ycombinator.com/item?id=23784763
        
               | jedieaston wrote:
               | If you require the user to hook into iTunes/Xcode, flip
               | the device into recovery mode, click a few buttons, and
               | agree to a "You're hecked if you break it now" policy,
               | it'll be enough to scare off 99.9% of people from getting
               | owned. After that, just have it work like the current
               | profiles/supervision system where Settings makes it clear
               | that non-verified code is running and has a big "make it
               | go away!" button (sideloaded IPAs show up in profiles
               | with a delete app button, and that works well enough
               | except for the time limit).
        
               | stqism wrote:
               | I don't really agree to this, the end result is going to
               | be a large number of YouTube tutorials instructing people
               | on how to do this with captions like: watch free movies
               | on iPhone, "popular mobile game" money hack, and Snapchat
               | take screenshots without notifying hack.
               | 
               | Half of these developer / root mode required secrets are
               | going to be occasionally working mods and tweaks except
               | with tons of baked in spyware and ads that can no longer
               | easily be removed.
               | 
                | Perhaps some sort of per-device profile which requires a
                | paid developer account could work, but I've gotten a
                | number of odd calls from family about YouTube videos
                | involving Kodi before, so I'm not sure about trusting the
                | "give users freedom" front.
        
               | GekkePrutser wrote:
               | This proves exactly the point made above of Apple not
               | trusting the user.
               | 
               | However if someone wants to be an idiot, how far do you
               | go to stop them? Apple's approach stops too many great
               | possibilities for knowledgeable users. It should be in
               | the same category as those "will it blend" types. Screw
               | it up? No warranty.
               | 
                | For me there are several things I need that are
                | impossible because Apple won't allow them, so I have to
                | use Android. But that comes infected with Google spyware
                | out of the box :(
        
               | giovannibajo1 wrote:
               | I think Apple point is that users that need being
               | protected from themselves without even realizing it are
               | far more than those who might get a benefit from root
               | without getting burnt. Since the two things can't exist
               | at the same time, they're going for the road that makes
               | the majority happy.
        
               | Wowfunhappy wrote:
               | > This is because apple feels its too easy to trick users
               | into elevating software permissions
               | 
               | Great, which is why I think offering a separate SKU to
               | people who want control over their devices would be a
               | wonderful compromise. Your parents can buy the normal
               | locked-down iPhone that's sold in the Apple Store, and
               | I'll buy the special one from the hidden page on Apple's
               | website.
        
             | xondono wrote:
             | My security model actually is "I don't trust myself".
             | 
             | One of the obvious conclusions from modern security
             | research is that the user has become the number one
             | vulnerability in pretty much all systems.
             | 
             | The corollary that a lot of people miss is that developers
             | and security researchers _are users too_. They get pwned
              | too. They sometimes give the wrong permissions to an
              | executable.
        
           | legulere wrote:
           | > As someone working in security, the phone is an extremely
           | leaky thing and very bad for privacy to begin with.
           | 
           | Compared with a computer where every program can steal all
           | your data?
        
             | gruez wrote:
             | A computer doesn't follow you around. It also doesn't have
             | a ton of built in sensors (accelerometer, dual cameras,
             | microphones, gps).
        
         | saagarjha wrote:
         | I see it as the opposite: these iPhones are rented to you, and
         | are clearly not what they want to "sell" to people. It's
         | certainly a huge surprise that this exists at all, and I would
         | certainly like more moves in the direction that you mentioned,
         | but I am not sure that this is it.
        
           | Wowfunhappy wrote:
           | Nitpick--I wouldn't say it's a huge surprise, since they said
           | last year they planned to do this.
           | 
           | I was actually wondering what was taking so long.
        
             | saagarjha wrote:
             | Huge surprise for me, because I personally expected them to
             | pull an AirPower with this and just forget about it
             | (especially since the demand was gone with checkm8).
        
         | jacobkania wrote:
         | Curious what you mean by "pushing the other direction." I would
         | say the opposite -- it seems like everything running on ARM is
         | exactly what it would take for your phone to run desktop
         | programs.
         | 
         | I think there are other downsides to switching off of x86, but
         | I think it strengthens the case for having one small portable
         | computer to do everything. The question is if that device will
         | allow real work like macOS, or if it'll be stuck as a fancy
          | consumer-only device.
        
         | dcow wrote:
         | Don't get your hopes up. This is not what you think it is,
         | sadly.
        
       | ebg13 wrote:
       | This involves an interesting set of assumptions about the
       | plausibility of deep-cover hacking operations.
       | 
       | > _If you use the SRD to find, test, validate, verify, or confirm
       | a vulnerability, you must promptly report it to Apple_
       | 
       | But let's say you pass their review, get a device, find a
       | vulnerability, and don't report it. Then what? You're breaching
       | the contract, but they have no way to know that, so there's no
       | consequence?
        
         | saagarjha wrote:
         | I would expect that if you put up jailbreakmeios14.com and they
         | find you have one of these devices, they will remove you from
         | the program.
        
           | ebg13 wrote:
           | Yes, but most exploits are deployed in secret by malicious
           | groups trying to hack your shit and steal your
           | money/identity/whatever, not publicized on consumer-facing
           | websites with your name attached.
        
             | mschuster91 wrote:
              | It's like with Al Capone and the taxman: at least that way
              | Apple has an angle of legal attack against shady
             | companies (Hacking Team, Finfisher, ...).
        
       | hendersoon wrote:
       | I agree this is theater; no serious whitehat researcher would
       | sign a deal forcing them to accept dates from the manufacturer.
       | It won't be useful for its intended purpose.
       | 
       | On the bright side, it will be very useful for jailbreak research
       | and in a way, those bugs _do_ get disclosed to Apple for them to
       | subsequently fix. Not necessarily the way Apple wants, but it
       | does shine daylight on their code.
       | 
        | These guys keep working exploits close to their chests and don't
       | release them specifically so they can get a look at new hardware.
       | That will no longer be necessary. You find an exploit, you can
       | release it right away.
       | 
       | And on the gripping hand, it will also be used by malicious
       | criminals and state actors to develop zero days for various evil
       | purposes.
        
         | saagarjha wrote:
         | > On the bright side, it will be very useful for jailbreak
         | research and in a way, those bugs _do_ get disclosed to Apple
         | for them to subsequently fix.
         | 
         | It's useless for jailbreak research because Apple will force
         | you to shut up about it at least until they patch it, so now
         | you can't jailbreak.
        
           | GekkePrutser wrote:
           | That is if people obey the NDA of course. I'm sure not
           | everyone will do so.
           | 
           | However finding a bug, reporting it and then 'suddenly' a
           | jailbreak appearing that would use it, would be highly
           | suspicious indeed. So they'd probably have to give up the
           | chance of getting the bug bounty.
           | 
           | PS: I'm certainly not signing that NDA myself :)
        
             | hendersoon wrote:
             | Yes. Once these devices exist they will be used by
             | everybody interested in that sort of access. Ironically,
             | pretty much everybody _other_ than whitehats.
        
       | guidovranken wrote:
       | > If you use the SRD to find, test, validate, verify, or confirm
       | a vulnerability, you must promptly report it to Apple and, if the
       | bug is in third-party code, to the appropriate third party. If
       | you didn't use the SRD for any aspect of your work with a
       | vulnerability, Apple strongly encourages (and rewards, through
       | the Apple Security Bounty) that you report the vulnerability, but
       | you are not required to do so.
       | 
       | So vulnerabilities found through this program are not eligible
        | for any reward. Then what would be the incentive to enroll (and
        | accept liabilities like losing the device, Apple suspecting you
        | of breach of contract, etc.)? Just bragging rights?
        
         | saagarjha wrote:
          | I think that is supposed to be read as "you must report any
          | vulnerabilities, which will be treated like any vulnerability
          | you chose to voluntarily submit".
        
           | guidovranken wrote:
           | You are correct because they have now added a bullet point:
           | 
           | > Vulnerabilities found with an SRD are automatically
           | considered for reward through the Apple Security Bounty.
        
       | dpifke wrote:
       | Has anyone here ever been paid a bounty for a vulnerability
       | reported to Apple?
        
         | GekkePrutser wrote:
          | Yes it does happen:
          | https://gizmodo.com/apple-pays-developer-100-000-for-finding...
        
           | saagarjha wrote:
           | Any for kernel bugs?
        
         | thamer wrote:
          | I don't know if the author is on HN (also not sure what
          | difference this makes) but a payout of $100k made the news
          | recently for
          | https://bhavukjain.com/blog/2020/05/30/zeroday-signin-with-a...
         | 
         | A few days later another researcher reported earning $75k for
         | webcam access vulnerabilities:
         | https://www.ryanpickren.com/webcam-hacking
         | 
         | These payments are not uncommon.
        
       | brutopia wrote:
       | As a long time iOS user this single aspect has made me look over
       | the fence to the android side the whole time. Not having full
       | access to my own devices is insane. The poor security on android
       | side has kept me away, but they've just recently been catching up
       | enough that the scales are almost tilted.
        
       | keyme wrote:
        | From experience, I'd advise all serious security researchers to
       | never, ever, sign any agreement with the company whose products
       | they are researching.
       | 
       | This particular case is also outrageous for other reasons:
       | 
       | 1) They are only doing this now because Corellium has been
       | selling virtually the same thing for a while already.
       | 
       | 2) They are doing this to try and hurt Corellium financially,
       | while they're already suing them in parallel.
       | 
       | 3) Agreeing to their terms here, effectively makes you a
       | glorified Apple QA engineer. Only you don't get a salary, but
       | rather, a bounty for whenever you find a bug. For most people
       | that would be way, way less money than just being employed
       | wherever.
        
         | ballenf wrote:
         | Kind of humorous to imagine a researcher suing apple under the
         | anti-gig California law. Would be a factual question of whether
         | the researcher has sufficient control over their work under the
         | agreement.
         | 
         | Apple would almost certainly win the suit, but I think there's
         | reasonable odds the suit would survive an early motion to
         | dismiss before factual discovery.
        
         | donarb wrote:
         | > They are only doing this now because ...
         | 
         | Apple announced these devices last year at Black Hat.
        
           | saagarjha wrote:
           | Right before they promptly sued Corellium.
        
         | ghshephard wrote:
         | I read the terms of the SRD [1] to suggest if you get one, and
         | use it, you aren't eligible for bounties on any bugs you find
         | while using it. So, you are an _entirely_ unpaid Apple QA
         | engineer. Knowledge is its own reward I guess.
         | 
         | [1] "If you use the SRD to find, test, validate, verify, or
         | confirm a vulnerability, you must promptly report it to Apple
         | and, if the bug is in third-party code, to the appropriate
         | third party. If you didn't use the SRD for any aspect of your
         | work with a vulnerability, Apple strongly encourages (and
         | rewards, through the Apple Security Bounty) that you report the
         | vulnerability, but you are not required to do so."
        
           | vngzs wrote:
           | The (and rewards) bit isn't saying that SRD users are
           | ineligible for rewards! Rather, it's trying to encourage non-
           | SRD users to report vulnerabilities they find. If Apple
           | explicitly stated that SRD users are ineligible for bounties,
           | I'd be pretty confident Apple has lost their minds, as the
           | SRD devices would be completely worthless to vulnerability
           | researchers - only serving to contractually restrict them,
           | and offering no practical benefit.
           | 
           | To show that "you aren't eligible for bounties on any bugs
           | you find while using it" is false, let's break Apple's quote
           | into two separate statements, and only consider things that
           | are explicitly stated in them.
           | 
           | First:
           | 
           | > If you use the SRD to find, test, validate, verify, or
           | confirm a vulnerability, you must promptly report it to Apple
           | and, if the bug is in third-party code, to the appropriate
           | third party.
           | 
           | If you use SRD in the process of discovering a vulnerability,
           | you have to disclose it to the software authors. Got it.
           | 
           | Second:
           | 
           | > If you didn't use the SRD for any aspect of your work with
           | a vulnerability, Apple strongly encourages (and rewards,
           | through the Apple Security Bounty) that you report the
           | vulnerability, but you are not required to do so."
           | 
           | If you don't use SRD, Apple strongly encourages you to report
           | the vulnerability. But they have no way to force it.
           | 
           | I understand how a casual interpretation of those quotes
           | could be seen to imply that SRD excludes you from bounties,
           | but that's not what Apple is saying.
        
           | xondono wrote:
           | It doesn't suggest that you aren't eligible for bounties, but
           | rather that you are not allowed to _not disclose_ a
           | vulnerability.
        
             | usmannk wrote:
             | It certainly suggests this.
             | 
             | Full bullet: If you use the SRD to find, test, validate,
             | verify, or confirm a vulnerability, you must promptly
             | report it to Apple and, if the bug is in third-party code,
             | to the appropriate third party. If you didn't use the SRD
             | for any aspect of your work with a vulnerability, Apple
             | strongly encourages (and rewards, through the Apple
             | Security Bounty) that you report the vulnerability, but you
             | are not required to do so.
        
               | xondono wrote:
               | Maybe it's me but what I read in that paragraph is:
               | 
               | If you use the SRD, you are required to report any
               | vulnerability. If you didn't, you are not required but
               | encouraged.
               | 
                | It doesn't say that if you used it you aren't eligible
                | for a reward.
        
             | wdb wrote:
              | It's probably hard to resist using the SRD while trying to
              | find a vulnerability if you have such a device. And if you
              | use it for anything, you won't be eligible for the bounty.
        
         | tptacek wrote:
         | To whatever extent these devices are distributed, my guess is
         | that they land predominantly in the hands of consultancies and
         | security product firms, where the bulk of bread-and-butter
         | security research is done. Those firms will all have their
         | legal vet the actual contract (which this page is not).
         | 
         | And, of course, that's the case with Corellium as well; it's
         | not like Hopper or Binja, a tool that random people just buy to
         | kick the tires on. The front page of Corellium's site is a
         | "contact sales" mailto; the term of art we use for that pricing
         | plan is "if you have to ask...".
        
       | gowld wrote:
       | Why do experts make absurd comments like ""Everybody thinks
       | basically that the method you learn in school is the best one" ?
        
         | saagarjha wrote:
         | Wrong thread?
        
       | saagarjha wrote:
       | I want to apply (not that I am sure that Apple would consider me
        | a security researcher) but am unsure to what extent they're going
        | to go with
       | 
       | > If you use the SRD to find, test, validate, verify, or confirm
       | a vulnerability, you must promptly report it to Apple and, if the
       | bug is in third-party code, to the appropriate third party.
       | 
       | I mean, if I find a bug I might report it, but I know people who
       | work on jailbreaks and stuff-if they tell me something will I
       | have to promptly report it? What if I find something on a non-SRD
       | device? If I ever hypothetically "write a jailbreak", will Apple
       | come after me even if I say I didn't use that device for it? I
        | can get 90% of the benefit from using a device with a bootrom
        | exploit, with none of the restrictions here...
        
         | jentist_retol wrote:
         | >if they tell me something will I have to promptly report it
         | 
         | according to the terms no, unless you use the SRD to verify the
         | information or vulnerability
         | 
         | >If I ever hypothetically "write a jailbreak", will Apple come
         | after me even if I say I didn't use that device for it
         | 
         | I imagine that if you sold a jailbreak for $$$$ that Apple
         | would probably take a close look at the telemetry the device is
         | sending. If you're confident in your ability to terminate all
         | telemetry, and keep good opsec, and defend yourself in court,
         | then maybe that avenue would be feasible. It certainly wouldn't
         | be ethical.
        
           | saagarjha wrote:
           | You're taking this question the wrong way: my scenario isn't
           | "I want to trick Apple", it's "will Apple believe me even if
           | I am being honest" and "even if Apple thinks I am being
           | honest will they hold it over my head anyways as a way to
           | control what I disclose".
        
         | phnofive wrote:
         | I'm not a lawyer nor your lawyer, but I read that to mean any
         | vulnerability you discover as a result of your research using
         | the SRD, not any vulnerability you otherwise discover or of
         | which you have knowledge.
        
           | saagarjha wrote:
           | Right, but is Apple going to believe me when I say that I
           | didn't? They could just revoke my access anyways. (I'm being
           | honest here, this isn't a question of "can I trick Apple into
           | thinking I didn't do this on the SRD".)
        
             | phnofive wrote:
             | I'd think it'd be difficult to run in both circles for too
             | long.
        
       ___________________________________________________________________
       (page generated 2020-07-22 23:00 UTC)