[HN Gopher] AI-designed camera only records objects of interest ...
       ___________________________________________________________________
        
       AI-designed camera only records objects of interest while being
       blind to others
        
       Author : m-watson
       Score  : 144 points
       Date   : 2022-08-24 14:27 UTC (8 hours ago)
        
 (HTM) web link (cnsi.ucla.edu)
 (TXT) w3m dump (cnsi.ucla.edu)
        
       | angrycontrarian wrote:
       | I'm not sure the authors appreciate the impact of their own
       | invention. This isn't a camera that censors things: it's a
       | passive image segmentation model that runs in real time and
       | consumes zero power. This would have huge implications for
       | robotics applications.
        
         | aaaaaaaaaaab wrote:
         | Optical pattern recognition is an established field.
         | 
         | Here's a simple version via Fresnel zone plates:
         | https://youtu.be/Y9FZ4igNxNA
        
         | [deleted]
        
       | tablespoon wrote:
       | > Since the characteristic information of undesired classes of
       | objects is all-optically erased at the camera output through
       | light diffraction, this AI-designed camera never records their
       | direct images. Therefore, the protection of privacy is maximized
       | since an adversarial attack that has access to the recorded
       | images of this camera cannot bring the information back. This
       | feature can also reduce cameras' data storage and transmission
       | load since the images of undesired objects are not recorded.
       | 
        | That seems overstated. In the third example image pair, I can
        | easily see a _shadow_ of the input 5 in the output. I'm pretty
        | sure the 9 is also there in the fourth pair, but the shadow is
        | not as clear.
        
       | Mockapapella wrote:
       | Earworm by Tom Scott comes to mind:
       | https://www.youtube.com/watch?v=-JlxuQ7tPgQ
        
       | jstrieb wrote:
       | Reminds me of this really cool video about using Fourier Optics
       | for optical pattern recognition.[1] The video happens to have one
       | of the best explanations of Fourier transforms I've yet
       | encountered.
       | 
       | 1: https://www.youtube.com/watch?v=Y9FZ4igNxNA
        
       | beambot wrote:
       | What guarantees do you have that information doesn't still bleed
       | through -- e.g. that compressed sensing techniques could still
       | recreate meaningful parts of the obscured image?
        
       | ksr wrote:
       | You're saying this as if it were a good thing.
        
       | [deleted]
        
       | oldstrangers wrote:
        | Given the rise of UFO / UAP sightings, I've always wondered why
        | there isn't just an army of cameras pointed at strategic regions
        | around the globe 24/7 (that aren't government owned). A camera
        | like this would be great for catching only what's really
        | interesting.
        
         | NavinF wrote:
         | Are UFO sightings on the rise? I thought they stayed constant
         | while https://xkcd.com/1235/
        
       | thow_away_soon wrote:
       | That's interesting, could it be used on medical imaging to erase
       | noise or somehow highlight tumors or fractures, without software
       | post processing?
        
         | [deleted]
        
       | staindk wrote:
       | No mention of Black Mirror in the comments yet - I'm surprised!
       | 
       | A lot of the other theorising here runs right alongside the
       | premise of one of the stories told in the White Christmas
       | episode[1].
       | 
       | IIRC the episode was very well done, as most of them are.
       | 
       | [1] https://www.youtube.com/watch?v=AOHy4Ca9bkw
        
       | fbanon wrote:
       | It's an optical MNIST recognizer.
        
       | themaninthedark wrote:
       | Get the digital equivalent of fnord in optical algorithms, feel
       | free to rob/~murder~ assassinate with no evidence. Bonus points
       | for when implants become widespread, then people won't be able to
       | see you either!
        
       | kzrdude wrote:
       | Is there an adversarial network that can take the redacted result
       | and try to reconstruct what the camera actually saw?
        
         | MauranKilom wrote:
         | That was also my first thought. They claim that the data is
         | "instantaneously erased", but "doesn't look like the input" is
         | different from "erased".
        
       | mojo74 wrote:
       | If this AI had designed an animal's optics it would have been the
       | frog's: https://indianapublicmedia.org/amomentofscience/eat.php
        
       | m-watson wrote:
       | The paper: To image, or not to image: class-specific diffractive
       | cameras with all-optical erasure of undesired objects
       | 
       | https://elight.springeropen.com/articles/10.1186/s43593-022-...
        
       | sudden_dystopia wrote:
       | Wouldn't the underlying data that the camera parsed to determine
       | what to record still be recorded or be otherwise retrievable
       | somewhere? Meaning everything is still recorded in some way?
        
         | bradyd wrote:
         | This system is fully optical, so the data is filtered before it
         | reaches the camera.
        
       | theanonymousone wrote:
       | Reminds me of a book with a 4-digit title.
        
       | apocalypstyx wrote:
       | The real problem is that humans have always deluded themselves
       | that some technology was a 'truth technology'. It's been done
       | with everything from typewriters to cameras. However, the camera
       | has always lied, always rendered a counterfactual to a
       | hypothesized truth state, denied access to fundamental reality.
       | That a camera may now lie in a slightly different fashion does
       | not alter that.
       | 
       | Are you going to believe your lying eyes?
        
       | JKCalhoun wrote:
       | I believe my own brain works this way.
        
       | whatshisface wrote:
        | This was not designed by an "AI"; it was designed through
        | gradient descent optimization. It is an interesting application,
        | but it has nothing to do with AI.
        
       | siculars wrote:
       | "...erasure of undesired objects"
       | 
       | This is not going to turn out well. Do we really want to edit
       | reality in this way? This is like the printer that automatically
       | watermarks your prints - for your security and protection! Coming
       | to a child protection law near you real soon.
       | 
       | Want to take a picture of that Ferrari? That'll be an extra $5.
       | 
       | No, you really can't take photos in airports.
       | 
        | That's a police officer(s). (Literally) Nothing to see here, move
        | along.
       | 
        | Vampires/Ghosts. A class of people whose faces are in a master
        | redact database. You know, like some real CIA Jason Bourne stuff.
       | 
       | Military installation? What military installation? Replace with
       | slave labor camp or, a more economically favorable rendition -
       | "sweatshop."
        
         | wongarsu wrote:
         | > This is like the printer that automatically watermarks your
         | prints - for your security and protection! Coming to a child
         | protection law near you real soon.
         | 
         | You mean like most (or maybe all) color laser printers sold in
         | the last couple decades?
         | 
         | https://en.wikipedia.org/wiki/Machine_Identification_Code
         | 
         | https://www.eff.org/pages/list-printers-which-do-or-do-not-d...
        
         | cerol wrote:
          | > Vampires/Ghosts. A class of people whose faces are in a
          | master redact database. You know, like some real CIA Jason
          | Bourne stuff.
         | 
          | Doesn't make much sense. If a person is a risk, you would want
          | to surveil them _more_, and not erase them from every camera
          | feed.
        
           | arcticfox wrote:
           | I think it's more like the CCP would load their secret police
           | database in.
        
           | omgwtfbyobbq wrote:
           | A government could do both.
        
         | bmitc wrote:
         | I was just told by Amazon that I can't sell a used book there
         | because the publisher, Morgan Kaufmann, is currently not
         | accepting applications to sell and denied my request to sell.
         | 
         | I was pretty stupefied. Amazon and the publisher have colluded,
         | for whatever reason, to police how used books are sold.
        
           | ddalex wrote:
            | What do you mean? It's your book. Why would you even ask
            | Amazon if you can sell it?
        
             | bmitc wrote:
              | You have to in order to sell on their platform. When I
              | tried to add it to my Amazon Seller inventory, it required
              | approval. Selling on Abebooks (owned by Amazon) requires a
              | subscription now.
        
           | Gh0stRAT wrote:
           | Perhaps Amazon is FINALLY starting to do something about
           | their rampant book piracy problem?
           | 
           | [0]https://mobile.twitter.com/martinkl/status/155114915940266
           | 80...
           | 
           | [1]https://nypost.com/2022/07/31/pirated-books-thrive-on-
           | amazon...
        
             | bmitc wrote:
             | Who knows? Although, I actually bought this particular book
             | off of Amazon.
             | 
             | In this case, after some research, I actually think this is
             | related to the book sometimes being used as a textbook. I
             | wouldn't really call it a textbook, as in the normal 30
             | editions type of textbooks, as it's just a really good book
             | on its topic (and only the second edition, which is the
             | latest). Apparently publishers want to funnel the used sale
             | of such books through certain approved sellers? I imagine
             | it's to keep the price artificially high and for the
             | publisher to recover some of the money back themselves.
             | Seems ridiculous, and it would be surprising if it's legal.
             | It's basically a racket.
        
         | pishpash wrote:
          | I've got news for you. "Reality" is already edited. It's
          | actually a model made up by the brain.
        
         | WaitWaitWha wrote:
         | "artisanal mining" for cobalt.
        
         | [deleted]
        
         | thatjoeoverthr wrote:
         | It's not that sophisticated and is a physical artifact that had
         | to be forged for some purpose. Effectively it's like having a
         | zero power neural network. You could make something like a
         | motion sensor that only spots human faces, but very low power.
        
         | _the_inflator wrote:
         | We should really consider conserving analog versions of taking
         | pictures.
        
         | jollyllama wrote:
         | It's bad, but it's just moving reality editing a few levels
         | down the stack.
        
           | autoexec wrote:
           | It's moving it away from your control. Right now we have the
           | option to edit the images and videos we capture. This kind of
           | technology allows those same choices to be made by someone
           | else without any regard for your wishes. Your options can be
           | limited to editing only what they allow to be captured in the
           | first place.
        
         | JofArnold wrote:
         | It's already happening with AR; there was a demo on Twitter
         | showing how using Apple's new AR SDK you can just plaster over
         | things you don't want to see. This for me puts AR right up
         | there with AI as a huge risk to society, for precisely the
         | reasons you point out. "Pay $9.99 a month not to see homeless
         | people" "Pay $2.99 a month to see enhanced street signs so only
         | you can find your way quickly" etc
        
           | edgyquant wrote:
              | What exactly is the problem with this?
        
             | kbenson wrote:
             | We have enough of people acting like their own experiences
             | are indicative of the norm or are evidence of something
             | happening or not happening, and using that to spread that
             | message, that making this even easier seems like it would
             | be a real problem.
             | 
             | Are news bubbles a problem? Imagine if people actually
             | block out reality on an even more direct level and what
             | that means to their perception of the world. What if people
             | can opt into trusty AR programs to "show" them the stuff in
             | the world they're missing (the conservative conspiracy or
             | liberal agenda), and those also selectively omit some other
             | things?
        
             | bee_rider wrote:
             | I mean, maybe some people are homeless by choice, but some
             | are due to misfortune and poverty. Using technology to turn
             | a blind eye to poverty in our communities seems bad. Also
             | you may trip over a homeless person.
        
               | 6510 wrote:
               | It would defeat the entire point of having them: As an
               | example why to obey. It is not like you cant scale a tax
               | with housing requirements. Could give them jobs too. It
               | would take a bit of getting used to but if the only thing
               | a person wants is drugs their potential productivity
               | could be 10x that of ours combined.
        
           | colejohnson66 wrote:
           | But you're not forced to use AR. No one is going around
           | slapping VR headsets on people to "hide the homeless".
        
             | jsharf wrote:
             | You're not forced to use your cellphone. But society might
             | hypothetically get to a point where it's extremely
             | inconvenient not to.
        
               | siculars wrote:
               | Hypothetically? We're already there.
        
               | ClassyJacket wrote:
               | You are absolutely forced to use a smartphone. To have
               | just about any kind of job, you require one.
        
               | andai wrote:
               | Would a dumb mobile phone suffice? Or do they need you
               | answering email while you commute?
        
               | guerrilla wrote:
               | It'd be impossible to live in Sweden without a mobile.
        
             | riversflow wrote:
             | Just like you're not forced to use a smartphone, or a car?
        
             | selfhoster11 wrote:
              | You're naive if you think this way. You're not forced to
              | use AR, _yet_. That's different from _never will be forced
              | to_.
             | 
             | Smartphones are already all but mandatory for certain
             | locations/demographies. Not owning one carries an immense
             | penalty when it comes to access to government services,
             | banking, and general daily functioning.
        
               | InCityDreams wrote:
               | ....3 hours with my bank and EUR20 for a token-generator
               | (pass-number generator ) says different.
               | 
               | Even the pass-generator is diffiicult to fiind on the
               | login page....keeps telling me 'not authorised' without
               | the phone ok.
               | 
               | *don't drop your phone!!* Especially when you'll suddenly
               | need it to buy a new phone.
        
           | sbierwagen wrote:
           | >Pay $9.99 a month not to see homeless people
           | 
           | Better hope that's context-sensitive. Street meth addicts
           | commit enough random assaults and smash enough car windows
           | now, just imagine what they would do if they were literally
           | invisible.
        
         | [deleted]
        
         | [deleted]
        
         | undersuit wrote:
         | > Do we really want to edit reality in this way?
         | 
         | Do you have a solution to stop this hypothetical future you've
         | envisioned that isn't also just as bad?
         | 
         | "Hey you can't code that feature!"
        
           | edgyquant wrote:
            | Not to mention that there is nothing wrong with allowing
            | people to see different things lmao. What even is this
            | conversation?
        
         | cerol wrote:
         | Would a government even _want_ that? Not even from the moral
         | stance, just the strategic point of view. We know from the
         | social media experiment how bad things can go if you hook an
         | entire population to some  "product".
         | 
          | Sure, you can make (even if through incentives) everyone wear AR
         | glasses (because they're so cool), and they'll censor out
         | undesired things. That's as much a form of control as it is a
         | form of being controlled. Hooking your entire population's
         | _vision_ to the internet means it could possibly be maliciously
         | used by bad actors.
        
         | stavros wrote:
         | If you don't want a camera that can't take photos of undesired
         | objects, don't buy one.
        
           | selfhoster11 wrote:
            | If you don't want a TV that tracks your viewing habits,
            | forces automatic software updates, shows ads, and does other
            | objectionable things while claiming it's "smart", don't buy
            | one.
        
             | stavros wrote:
             | Yeah, I don't.
        
               | ekianjo wrote:
               | Good luck finding one easily.
        
               | stavros wrote:
               | I only need one, and there are display monitors that
               | companies use for offices etc.
        
               | NavinF wrote:
               | Monitors still exist and they don't suffer from the
               | terrible input lag that most smart TVs have.
        
               | pelorat wrote:
               | Every modern TV has a gaming mode that disables post-
               | processing.
        
               | NavinF wrote:
               | I meant to say pixel response time.
        
           | WaitWaitWha wrote:
            | What about when there are laws preventing the manufacture of
            | such _desired_ cameras?
           | 
           | I am all for great ideas and tools and implementations. I am
           | just very leery of humans. ;)
        
             | stavros wrote:
                | Regimes don't generally bother to mandate the use of a
                | specific technology; they just make the act illegal.
        
               | autoexec wrote:
               | The example of yellow tracking dots in printers has
               | already been mentioned. Our governments had zero problems
               | mandating the use of that specific technology. Same with
               | kill switches in cars so that police can remotely disable
               | your vehicle.
        
           | siculars wrote:
           | Wait till some new laws show up. Wait till it is economically
           | incentivized to buy redactocams.
        
             | undersuit wrote:
              | Maybe we could have laws that protect us from these kinds
              | of cameras instead of enforcing them. I'm against saying
              | "No you can't do this" but I'm all for "You must show us
              | how you do this" or "This thing must be optional".
        
             | stavros wrote:
             | You can already make cameras do this in software, why
             | aren't we buying redactocams now?
        
               | olyjohn wrote:
               | If it's already in software, then your phone is just one
               | irreversible update away from getting that feature for
               | free! People love free stuff!
        
       | bell-cot wrote:
       | And, with a bit of poisoning in the image training data, all of
       | the security cameras at $Critical_Facility will be utterly blind
       | to anyone who wears a North Korean Military Intelligence full-
       | dress uniform...
        
         | bee_rider wrote:
         | I thought what I'd do was, I'd pretend I was one of those deaf-
         | mutes.
        
       | r-bryan wrote:
       | "Doesn't look like anything to me." --Westworld
        
         | kbns wrote:
        
           | jpbadan wrote:
           | Oh the irony
        
       | [deleted]
        
       | IanCal wrote:
        | Something really interesting here if you read the title and
        | comments first is that this is an _optical_ thing. It's not
        | software running on the camera, it's physical.
        
         | gonzo41 wrote:
          | Interesting article. Though I don't think the cryptography
          | angle will pan out. I wonder if it was added because crypto's
          | been a buzzword of late, and the researchers just really
          | wanted to build this camera.
        
         | culi wrote:
         | biology inspires software inspires hardware
         | 
         | I guess with more and more stuff coming out of nano- and bio-
         | tech we can append "inspires biology" and bring it back around
        
         | JKCalhoun wrote:
         | Also blurs the line (hrumf) between what is hardware and what
         | is software? I mean software designed the hardware to behave
         | with a certain ... algorithm?
        
           | dekhn wrote:
           | While that idea might seem somewhat out-there, it's fairly
           | straightforward once you think about it. We know the transfer
           | function for light through matter, and can calculate its
           | derivative. Therefore, we can use ML to design matter shapes
           | that have desired properties.
           | 
            | All computers are effectively physical systems that control
            | their noise levels to achieve logical operations. In this
            | case, it's an analog system with no moving parts, but I
            | imagine that given the existence of spatial light modulators
            | and MEMS mirrors, you could probably reprogram the system in
            | real time to erase what you wanted on the fly.
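
A minimal numerical sketch of the "use ML to design matter shapes" idea: pick a differentiable forward model of light through the element, then run gradient descent on its parameters. Everything below is a toy assumption (an orthogonal "propagation" matrix and a per-pixel transmission mask), not the paper's actual wave-optics model:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 16

# Toy forward model: input field x passes a trainable per-pixel
# transmission mask t, then a fixed "propagation" operator A.
# Output y = A @ (t * x) = B @ t, with B = A @ diag(x).
A = np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthogonal, well-conditioned
x = rng.uniform(0.5, 1.5, n)                      # input field, kept nonzero
B = A * x                                         # same as A @ np.diag(x)

# Desired camera behaviour: a target output, chosen reachable on purpose.
t_true = rng.uniform(0.0, 1.0, n)
target = B @ t_true

# Gradient descent on the mask. Loss L(t) = ||B t - target||^2 has
# gradient 2 B^T (B t - target); step size from the spectral norm
# keeps the iteration stable.
t = np.zeros(n)
lr = 0.4 / np.linalg.norm(B, 2) ** 2
for _ in range(2000):
    t -= lr * 2 * B.T @ (B @ t - target)
```

Because this toy loss is quadratic in the mask, plain gradient descent provably converges; real diffractive designs optimize phase profiles through a wave-propagation model instead, but the design loop is the same shape.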
        
             | daniel-cussen wrote:
             | This COULD provide privacy.
        
           | YetAnotherNick wrote:
            | ICs have been hardware designed by software to run arbitrary
            | software for decades.
        
         | fudged71 wrote:
         | This seems to be a new PR spin on the same technology that was
         | posted a while ago. 3D printed optical neural networks. I'm
         | surprised I haven't seen more interest considering the energy
         | efficiency and speed of computation.
        
           | robryk wrote:
           | The problem with them is that you don't get multiple layers
           | of nonlinear operations: wave functions in simple media form
           | a linear space after all.
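
That collapse is easy to check numerically: propagating through a stack of linear layers is identical to applying one precomposed matrix, so without a nonlinearity the extra layers add no expressive power. A toy sketch with arbitrary random layers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three "diffractive layers", each modeled as a linear operator on the field.
L1 = rng.standard_normal((8, 8))
L2 = rng.standard_normal((8, 8))
L3 = rng.standard_normal((8, 8))

x = rng.standard_normal(8)  # toy input field (real-valued for simplicity)

# Propagating through the stack, layer by layer...
y_stacked = L3 @ (L2 @ (L1 @ x))

# ...is exactly equivalent to one precomposed linear operator.
L_combined = L3 @ L2 @ L1
y_single = L_combined @ x

assert np.allclose(y_stacked, y_single)
```

This is the same reason a multi-layer neural network without activation functions reduces to a single linear map.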
        
             | dekhn wrote:
              | I don't see any technical reason you couldn't implement a
              | ReLU in optics. There is a whole area, nonlinear optics (I
              | think you may have intended that when you said 'simple
              | media'?). Let me see: https://arxiv.org/abs/2201.03787
              | 
              | From two years ago: https://opg.optica.org/ol/fulltext.cfm?
              | uri=ol-45-17-4819&id=... but again, this isn't simple
              | media.
        
         | GaryNumanVevo wrote:
         | Maybe HN should make the comment button only available after
         | first clicking the link. I see more and more of this "omg
         | title" behavior on HN these days
        
           | edgyquant wrote:
           | I'd prefer they enforced the rules and kick people who
           | mention reading the article
        
             | culi wrote:
             | like all 3 of us rn?
        
               | edgyquant wrote:
                | Low effort trolling is also against the rules.
        
             | 6510 wrote:
             | Let ML decide!
        
       | jhallenworld wrote:
        | This is how the "violet cusps" from The Eyes of the Overworld
        | were made.
       | 
       | https://en.wikipedia.org/wiki/The_Eyes_of_the_Overworld
       | 
       | "There, Cugel finds two bizarre villages, one occupied by wearers
       | of the magic violet lenses, the other by peasants who work on
       | behalf of the lens-wearers, in hopes of being promoted to their
       | ranks. The lenses cause their wearers to see, not their squalid
       | surroundings, but the Overworld, a vastly superior version of
       | reality where a hut is a palace, gruel is a magnificent feast,
       | and peasant women are princesses -- "seeing the world through
       | rose-colored glasses" on a grand scale."
        
         | jamesjyu wrote:
         | Also similar to Wizard of Oz where everyone has to wear green
         | spectacles to protect their eyes, when in fact, it was to make
         | the Emerald City seem greener and more spectacular than it
         | actually was.
        
           | [deleted]
        
         | adhesive_wombat wrote:
          | Similar thing with the MASS system in a Black Mirror episode
          | (_Men Against Fire_) where visual reality could be substituted
          | by your implants (or rather using your implants, and _by_ the
          | people in control of them).
         | 
         | And again in the Christmas special, which was more similar to
         | this device in that it would block out certain things, or
         | everything (though it was in software, and again under external
         | control). Which sounds horrifying enough but was far from the
         | worst thing in the episode.
        
           | mistermann wrote:
           | Also often the same with "The Facts", "is", etc in our world.
        
         | z2 wrote:
         | "Joo Janta 200 Super-Chromatic Peril Sensitive Sunglasses have
         | been specially designed to help people develop a relaxed
         | attitude to danger. At the first hint of trouble, they turn
         | totally black and thus prevent you from seeing anything that
         | might alarm you." --Douglas Adams
         | 
         | Maybe this tech is a continuum, but we've skipped past Adams
         | straight to 2.0, and Overworld is 3.0.
        
         | selfhoster11 wrote:
         | In the same vein, a short sci-fi film "The Nostalgist" [0].
         | This film really opened my eyes regarding why we may not want
         | devices that alter our perception of reality.
         | 
         | [0]: https://www.youtube.com/watch?v=ZzCQtoQ8ypk
        
         | dekhn wrote:
         | BTW, if you enjoy Tales of the Dying Earth, I recently read
         | Cage of Souls, which is drawn in the same vein:
         | https://www.amazon.com/gp/product/B07DPRW17S
        
       | Lramseyer wrote:
        | So this is really cool and useful, but it's important to keep in
        | mind that since this is a diffractive structure, it probably only
        | works with coherent light (what you get from a laser). Most
        | normal light sources produce incoherent light, and that tends
        | not to work so well with complex diffractive structures.
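
The coherence requirement can be illustrated with a toy calculation: all-optical erasure relies on engineered destructive interference, which needs amplitudes to add before intensity is taken. With incoherent light, intensities add instead and the cancellation disappears (the unit amplitudes and evenly spread phases below are illustrative assumptions, not the paper's design):

```python
import numpy as np

N = 1000
# A diffractive design can arrange N unit-amplitude contributions at an
# output pixel with phases spread evenly so they cancel (an "erased" pixel).
a = np.exp(2j * np.pi * np.arange(N) / N)

# Coherent light: amplitudes sum first, then intensity is taken, so the
# designed destructive interference survives and the pixel stays dark.
I_coherent = np.abs(a.sum()) ** 2        # ~0

# Incoherent light: random relative phases time-average away the cross
# terms, leaving a sum of intensities. The erasure is gone.
I_incoherent = np.sum(np.abs(a) ** 2)    # = N
```

So under incoherent illumination every contribution lands as positive intensity, and the "blind" pixel lights up anyway.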
        
       | freedude wrote:
       | "Privacy protection is a growing concern in the digital era, with
       | machine vision techniques widely used throughout public and
       | private settings. Existing methods address this growing problem
       | by, e.g., encrypting camera images or obscuring/blurring the
       | imaged information through digital algorithms. Here, we
       | demonstrate a camera design that performs class-specific imaging
       | of target objects with instantaneous all-optical erasure of other
       | classes of objects. This diffractive camera consists of
       | transmissive surfaces structured using deep learning to perform
       | selective imaging of target classes of objects positioned at its
       | input field-of-view. After their fabrication, the thin
       | diffractive layers collectively perform optical mode filtering to
       | accurately form images of the objects that belong to a target
       | data class or group of classes, while instantaneously erasing
       | objects of the other data classes at the output field-of-view."
       | 
        | This only works for the privacy-minded, naive among us. If you
        | want to exclude something from a picture or video, do NOT record
        | it, at all, EVER! If it can record anything, it can record the
        | wrong thing.
        
         | freedude wrote:
          | What happens when a camera can't be used in a location that
          | needs added "security" (which is really surveillance, not
          | security) because of expected privacy (bathrooms, locker
          | rooms, fitting rooms)? The claim is "proven" that it cannot
          | "see" your private parts because it is programmed not to. I
          | guarantee the AI will fail at some point or is vulnerable to
          | some attack.
          | 
          | Or what if it is connected to a radar/laser speed enforcement
          | camera and takes your car's photo because it detects the car
          | behind you speeding, but it cannot "take" that part of the
          | photo because it mis-detected you as the speeder?
          | 
          | This technology is fraught with problems when it comes to
          | evidence in a court of law. What is not there is just as
          | important as what is there, and if you are erasing parts of
          | what is there, you are destroying that evidence.
        
       | Linda703 wrote:
        
       | batuhandirek wrote:
       | Isn't "erased" a bit misleading from the paper? I understand that
       | the camera does not see anything but objects of interest in which
       | it was trained and manufactured.
        
       | MisterBastahrd wrote:
        | About 6 years ago I sat on a jury. I was told by the defense
        | that the plaintiff, who was suing after being critically injured
        | in a workplace accident, was overstating his injuries. They
        | showed video of the plaintiff washing his car. The plaintiff's
        | attorney pointed out that there were timestamps on those car
        | washing videos and each took place over 4 hours because he had
        | to rest due to the pain from his sustained injuries. Aside from
        | the facts of the case, which were clearly in favor of the
        | plaintiff, this attempt at deception pushed the jury to award
        | more money than it likely would have otherwise.
       | 
       | Now that storytime is out of the way, this particular AI reminds
       | me of a photo taken at a lynching.
       | 
       | https://lynchinginamerica.eji.org/report/assets/imgs/14_crow...
       | 
       | If you obscure the top half of this photo from view, how does
       | that change your perspective regarding what is going on at the
       | time? IMO, recordings need to record what is, not what we believe
       | we want to see.
        
         | notahacker wrote:
         | On the flip side, it was possible to make a lynching look like
         | a harmless social gathering with the technology of the era:
         | point the camera selectively or take a pair of scissors to the
         | image. Loss of timestamps is easily achieved by using any
         | recorded media without timestamps, or any third-rate video
         | editing software. Beria was airbrushed out of reproductions of
         | photos that had circulated for years before he fell out of
         | favour, but the revised images look convincing enough in the
         | absence of that context. Cameras have _never_ given a complete
         | picture of everything going on (and people start worrying about
         | panopticons with proposals that fall _well_ short of that).
         | 
         | Anybody that wants to show only what they want to see can
         | choose whether or not to record or edit the recording after the
         | fact. The actual use cases for tech that can decide whether
         | something is worth recording in real time are likely to be
         | comparatively mundane...
        
       | nullc wrote:
       | > AI-based camera design was also used to build encryption
       | cameras, providing an additional layer of security and privacy
       | protection. Such an encryption camera, designed using AI-
       | optimized diffractive layers, optically performs a selected
       | linear transformation,
       | 
       | Differentiable schemes do not generally make for secure
       | cryptography.
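
For the purely linear case the weakness is concrete: a known-plaintext attacker can recover the transform by least squares and then invert every other ciphertext. A toy sketch (the 16-dimensional signal and the random key matrix are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16

# Secret "encryption camera": an unknown (almost surely invertible)
# linear transform K applied to every input.
K = rng.standard_normal((n, n))

# Attacker observes n known plaintext/ciphertext pairs (columns of P, C).
P = rng.standard_normal((n, n))
C = K @ P

# Recover the key: solve C = K_hat @ P, i.e. P.T @ K_hat.T = C.T,
# in the least-squares sense.
K_hat, *_ = np.linalg.lstsq(P.T, C.T, rcond=None)
K_hat = K_hat.T

# Any fresh ciphertext now decrypts with the recovered key.
x_secret = rng.standard_normal(n)
ciphertext = K @ x_secret
recovered = np.linalg.solve(K_hat, ciphertext)

assert np.allclose(recovered, x_secret, atol=1e-6)
```

With n independent pairs the recovery is exact up to floating-point error; secure ciphers avoid this by being highly nonlinear, which is exactly what a differentiable (smoothly parameterized, near-linear) optical scheme gives up.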
        
       | renewiltord wrote:
       | Dude this is insanely cool. It's through light diffraction that
       | it censors. You could make an ad blocker camera haha!
       | 
        | Very cool. Though I wonder if we'll get a EURion censor camera
        | instead.
        
       | rexreed wrote:
       | "AI-designed optical filter blurs out areas of an image that
       | don't match the pre-trained network design" - seems to be a bit
       | more on point.
        
       | ortusdux wrote:
       | Previous discussion:
       | https://news.ycombinator.com/item?id=32469117
        
       ___________________________________________________________________
       (page generated 2022-08-24 23:00 UTC)