[HN Gopher] James Webb Telescope pictures didn't begin as images
       ___________________________________________________________________
        
       James Webb Telescope pictures didn't begin as images
        
       Author : tontonius
       Score  : 45 points
        Date   : 2022-10-17 08:36 UTC (1 day ago)
        
 (HTM) web link (www.thestar.com)
 (TXT) w3m dump (www.thestar.com)
        
       | j_crick wrote:
       | How am I supposed to read this if it's behind a paywall?
        
         | ipqk wrote:
         | Pay them money to get over the wall?
        
         | bguebert wrote:
         | I could read it after turning off javascript. Maybe try that.
        
           | Am4TIfIsER0ppos wrote:
           | lol the best kind of paywall is one that doesn't appear if
           | you prevent RCE
        
         | geuis wrote:
         | Odd I didn't hit one
         | 
         | https://archive.ph/ZijsH
        
         | throwaway742 wrote:
         | https://gitlab.com/magnolia1234/bypass-paywalls-firefox-clea...
        
         | bigbillheck wrote:
         | Money can be exchanged for goods and services.
        
       | geuis wrote:
       | https://archive.ph/ZijsH
        
       | seanalltogether wrote:
       | Quick question if anyone knows. One of the examples there is
       | showing the "Linear" images compared to the "Stretched" images.
       | I'm assuming that stretched means 0-255 RGB greyscale. But what
       | are the ranges of "Linear" and why is it so dark? Are those
       | floating point values of 0.0 - 1.0? Are they 12.0-18.0 like is
       | shown in the Rosolowsky dataset?
        
         | Cerium wrote:
         | I think the answer is that the raw data has too much dynamic
         | range. The stars are so much brighter than anything else that a
         | naive linear scaling from the native depth to 8 bit results in
         | all the shadows getting washed out and only the highlights
         | showing. Instead, the "stretched" seems to be compressing the
         | highlights to allow the shadow data to become brighter.
        
           | jcims wrote:
           | N00b astrophotographer here. I can't see the images but this
           | sounds correct. (Edit: paste the two images into an image
           | editor and look at the histogram. You'll see it)
        
           | mturmon wrote:
           | Probably true, and analogous to gamma correction
           | (https://en.wikipedia.org/wiki/Gamma_correction) although
           | they don't specifically say whether the range-compressing
           | transformation that they are using is a power law.
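            For reference, a gamma-style power-law stretch looks like
            this (gamma = 2.2 is the common display-encoding value,
            chosen here purely for illustration; as noted, the article
            doesn't say which transform was actually used):

```python
import numpy as np

def gamma_correct(x, gamma=2.2):
    """Map a normalized intensity in [0, 1] through a power law.

    With gamma > 1 this brightens dark values far more than bright
    ones, which is the range compression described above.
    """
    return np.clip(x, 0.0, 1.0) ** (1.0 / gamma)

dim    = gamma_correct(0.01)  # a faint pixel is lifted to ~0.12
bright = gamma_correct(0.5)   # a bright pixel moves only to ~0.73
```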
        
         | cconstantine wrote:
         | Amateur astrophotographer here. What I'm going to talk about is
          | true for my rig. The JWST is an astronomically better
          | telescope than mine, but the same basic principles apply.
         | 
         | The cameras used here are more than 8 bit cameras, so there has
         | to be some way to map the higher bit-depth color channels to 8
         | bits for publishing. The term for the pixel values coming off
          | the camera is ADU. For an 8-bit camera, the ADU range is
          | 0-255; for a 16-bit camera (like mine), it's 0-65535. That's
          | not really what stretching is about, though.
         | 
         | A lot of time, the signal for the nebula in an image might be
         | in the 1k-2k range (for a 16bit camera), and the stars will be
         | in the 30k to 65k range. If you were to compress the pixel
          | values to an 8-bit range linearly (i.e., 0 ADU = 0, 65535
          | ADU = 255), you're missing out on a ton of detail in the 1k-2k
         | range of the nebula. If you were to say 'ok, let's have 1k adu
         | = 0 in the final image, and 2k adu = 255', then you might be
         | able to see some of the detail, but a lot of the frame will be
         | clipped to white which is kind of awful. That would be a linear
         | remapping of ADU to pixel values.
         | 
          | The solution is to use a power law (i.e., apply an exponent
          | to the ADU, creating a non-linear stretch). (EDIT: The
          | specific math is probably wrong here.) That way you can
          | compress the high
         | adu values where large differences in ADU aren't very
         | interesting, and stretch the low-adu values that have all the
         | visually interesting signal. In the software this is done via a
         | histogram tool that has three sliders; one to set the zero
         | point, one to set the max point, and a middle one to set the
         | curve.
         | 
         | It's kinda like a gamma correction.
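          A rough sketch of that three-slider stretch (the exact
          transfer function varies by software; the black point, white
          point, and midtone exponent below are invented values, not
          anything JWST-specific):

```python
import numpy as np

def stretch(adu, black=1000, white=30000, mid=0.4, out_max=255):
    """Non-linear remap of camera ADU values to 8-bit pixels.

    black/white are the zero- and max-point sliders; mid < 1 is the
    curve slider that lifts the faint (low-ADU) signal.
    """
    x = (adu.astype(np.float64) - black) / (white - black)
    x = np.clip(x, 0.0, 1.0)      # below black -> 0, above white -> max
    return (x ** mid * out_max).astype(np.uint8)

# Faint nebula ADU (1k-2k) becomes clearly visible; stars clip to white.
result = stretch(np.array([500, 1500, 2000, 45000], dtype=np.uint16))
```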
        
       | intrasight wrote:
       | No digital "images" begin as images.
        
         | Maursault wrote:
         | Not so. In digital photography, _they all do._ The headline is
          | entirely wrong. JWST's mirrors are in fact reflecting _an
          | image_ onto its _image_ sensors which use tiny light-sensitive
         | diodes to convert light into electrical charges which are then
         | translated into digital information and recorded as pixels. But
          | the entire thing would be pointless _if there wasn't an image
         | to begin with._ The fact is we can never see anything, we only
         | see an image. But since this is part of how seeing is defined,
         | it is taken for granted that the thing itself is not pressing
         | on our retinas, only a reflection of light, aka an image, is.
        
         | nomel wrote:
         | Actually, unless you're doing something exotic, like using
         | encoded aperture or light field stuffs, most digital images do
         | begin as a _real image_ [1] focused on a sensor.
         | 
         | 1. https://en.wikipedia.org/wiki/Real_image
        
       | rwmj wrote:
       | https://archive.ph/ZijsH
        
       | supernova87a wrote:
       | I have friends/former colleagues who work on these pipelines, and
       | I can tell you that it's not a stretch to say that there are
       | dozens if not hundreds of people whose entire working lives are
       | about characterizing the sensors, noise, electronics, so that
       | after images are taken, they can be processed well /
       | automatically / with high precision.
       | 
       | (and after all, if these instruments/telescope were 30 years and
       | $10B in the works, you would hope there's a fairly well developed
       | function to make the data as useful as it can be)
       | 
       | The goal is to get the "true" physical measurement of the light
       | that arrives at the telescope. After those photons arrive, the
       | measurements get contaminated by everything to do with the
       | hardware, sensors, electronics, processing artifacts, and there's
       | a whole organization that exists to study and remove these
       | effects to get that true signal out.
       | 
       | Every filter, sensor, system has been studied for thousands of
       | person-hours and there are libraries on libraries of files to
       | calibrate/correct the image that gets taken. How do you add up
       | exposures that are shifted by sub-pixel movements to effectively
        | increase the resolution of the image? How do you identify when
        | certain periodic things happen to the telescope and add patterns
        | of noise that you want to remove? What is the pattern that a
       | single point of light should expect to be spread out into after
       | traveling through the mirror/telescope/instrument/sensor system,
       | and how do you use that to improve the image quality? (the 6
       | pointed star you see)
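        One classic piece of that correction chain, sketched with
        invented numbers (real pipelines apply many more steps, e.g.
        linearity, cosmic-ray rejection, and distortion corrections):

```python
import numpy as np

# Dark frame: fixed offset from the sensor/electronics, measured with no
# light reaching the detector. Flat frame: relative pixel-to-pixel
# sensitivity, measured from a uniformly lit target. Both come from the
# kind of calibration libraries described above. Toy values throughout.
raw  = np.array([[110.0, 220.0],
                 [310.0, 420.0]])   # raw science exposure
dark = np.array([[ 10.0,  20.0],
                 [ 10.0,  20.0]])   # per-pixel offset
flat = np.array([[  1.0,   2.0],
                 [  3.0,   4.0]])   # per-pixel sensitivity

# The standard correction recovers the uniform 100-unit "true" signal.
calibrated = (raw - dark) / flat
```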
       | 
        | Most fascinating to me is when some natural phenomenon that you
        | thought was a discovery turns out to be a really subtle effect
        | of the noise in the instrument. (An ADC readout noise spike
        | that subtly correlates with a high value having passed by
        | during readout of a previous pixel, making your supernova
        | discovery actually a fluke? I'm trying to
       | recall the paper discovering that the pixel value on one chip of
       | an instrument was related to the _bitwise_ encoding of the
        | readout on a neighboring chip's pixel...)
       | 
       | Then there's even a whole industry of how to archive data, make
       | it useful to the field, across telescopes, across projects, and
       | over time.
       | 
       | Lots of science and work here over decades.
        
         | kurthr wrote:
          | Once bytedance puts their filters on, they'll be so much more
          | sexy!
         | 
         | More seriously, I think so many people are only familiar with
         | using image processing techniques to make things look
          | subjectively "better" that they find it hard to believe that
         | scientists don't do the same things to their research images.
         | That is a bit corrosive to society, but real today.
        
       | chubs wrote:
       | Apologies if off-topic, but does anyone have a good source for
       | downloading the JWST images in good quality? They are inspiring.
        
         | anon_123g987 wrote:
         | https://webbtelescope.org/resource-gallery/images
        
       | awinter-py wrote:
       | ugh I mean yes, and yes when published these images should always
       | say how they were colorized / how the spectrum was compressed
       | 
       | but that title
       | 
       | this is like saying 'if someone pumped raw H264 into your optic
       | nerve you would dance like the guy in the avicii levels video'
       | 
       | like yes, we all know that's how that video got made, but nobody
       | does that to their eye
        
       ___________________________________________________________________
       (page generated 2022-10-18 23:01 UTC)