[HN Gopher] Entropy Is Fatal
       ___________________________________________________________________
        
       Entropy Is Fatal
        
       Author : sylvain_kerkour
       Score  : 120 points
       Date   : 2022-06-08 14:59 UTC (8 hours ago)
        
 (HTM) web link (kerkour.com)
 (TXT) w3m dump (kerkour.com)
        
       | pizza wrote:
        | There may be a slight nit to pick with this framing of entropy.
        | Actually, the article visualizes it quite nicely:
       | 
        | the assumption that the world is a closed system - depicted here
        | by the black border around the particles, showing the system is
        | closed off from everything outside.
       | 
       | Sure, in a closed system, eventually you get something like heat
       | death, within the box.
       | 
        | But life, and the world, are open systems - at least from the
        | human-scale experience of life. You can't say that heat death is
        | sure to happen.
       | 
        | Does entropy increase at the macro level? Pretty much, yeah. But
        | defining what 'macro' is is hard enough to make any answer
        | dubious or uninteresting - is it the entire universe? Is it the
        | solar system? In either case, the scale at which it appears
        | closed is much bigger than your life, which is coincidentally a
        | scale at which it may seem open - because the world is not
        | uniform and ergodic at the living-as-a-human scale. We each
        | experience a different life story (another debate for the
        | future, perhaps? :^) )
       | 
        | If you like, you can imagine that the entropy in your own
       | particular life could always _decrease_ while the entropy
       | somewhere else far away undergoes a commensurate simultaneous
       | increase.
       | 
        | As I remember it, Mihaly Csikszentmihalyi (author of Flow) wrote
        | in the book _The Evolving Self_ that basically the meaningfulness
        | of your life is:
        | 
        |     (the flow you experience, i.e. the entropy/time
        |      'life bandwidth rate' you experience)
        |     x (the good you do for others)
       | 
       | By going for a minimalist approach, you try to maximize the
       | Signal-to-Noise ratio. If you'll allow for some eyebrow-raising
       | application of mathematics to philosophy for a sec...
       | 
        | the Shannon-Hartley theorem tells us that the channel capacity C
        | depends on B (the channel bandwidth) and on
        | SNR = Power(Signal) / Power(Noise), taken as a linear power
        | ratio (not in decibels), as
        | 
        |     C = B log2(1 + SNR)
       | 
        | Here, C could stand for how good a life you're leading. So you'd
        | either want to increase the signal, decrease the noise, or
        | increase B (the 'underlying capacity for enjoyment of life'). I
        | guess it is a matter of debate whether you can improve the SNR
        | more by tapping into the signal more, or by tapping into the
        | noise less. Probably something you can try to adaptively improve
        | by 'gradient descent', i.e. 'trying out new ways of living', lol
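        | 
        | If you want to play with the numbers, here's a toy sketch in
        | plain Python (made-up figures; the mapping to "life" is, of
        | course, purely whimsical):
        | 
        |     import math
        | 
        |     def capacity(b_hz, p_signal, p_noise):
        |         # Shannon-Hartley: C = B * log2(1 + S/N), linear SNR
        |         snr = p_signal / p_noise
        |         return b_hz * math.log2(1 + snr)
        | 
        |     print(capacity(1.0, 10.0, 1.0))  # ~3.46
        |     print(capacity(1.0, 20.0, 1.0))  # ~4.39, double the signal
        |     print(capacity(1.0, 10.0, 0.5))  # ~4.39, halve the noise
        | 
        | Doubling the signal and halving the noise buy you exactly the
        | same capacity, which maybe says something for decluttering.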
        
       | jussivee wrote:
       | This got me thinking. Minimalism or minimalist systems are often
       | seen as systems with less energy. Or, we seem to think it takes
       | less energy to remove objects than to add objects (complex
        | systems). More stuff, more energy. But minimalism requires a lot
        | of useful work; less stuff is not the same as a minimalist
        | system. Using a music analogy, I'd argue it's much easier (takes
        | less energy) to fill an empty space with hundreds of notes than
        | with a few carefully selected ones :)
        
       | tdullien wrote:
       | Recommendation: People should read the original "Parkinson's
       | Law".
        
       | iamjbn wrote:
        | Integrate Bitcoin into your business and all the arguments in
        | the write-up just unwind.
        
       | nathias wrote:
       | This is what happens when people are philosophically illiterate,
       | they see important concepts out there but just don't have the
       | proper tools to think about them.
        
       | throw10920 wrote:
       | I like the idea presented. The application of it to various
       | constructs brings about ideas as to how to reverse the entropy:
       | 
       | Software: try to build orthogonal features so that maximum value
       | can be obtained with minimum complexity; build tools with well-
       | designed deprecation mechanisms, then have releases where
       | underused/redundant functionality is first deprecated and then
       | removed entirely.
       | 
       | e.g. all the major extension repositories for a particular tool
       | parse the machine-generated deprecation list, scan extensions,
       | and email owners when they use deprecated APIs. Maybe provide
       | "shims" where APIs are internally removed entirely but
       | transitional packages are provided that implement the removed
       | APIs in terms of still-present ones.
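        | 
        | A minimal sketch of what such a shim could look like (the names
        | here are hypothetical, just to show the pattern of implementing
        | a removed API on top of a surviving one):
        | 
        |     import warnings
        | 
        |     def fetch_items(limit=None):
        |         # the surviving, still-supported API
        |         items = ["a", "b", "c"]
        |         return items[:limit]
        | 
        |     def get_all_items():
        |         # removed API, kept alive as a transitional shim
        |         warnings.warn(
        |             "get_all_items() is deprecated; use fetch_items()",
        |             DeprecationWarning, stacklevel=2)
        |         return fetch_items()
        | 
        | Repositories scanning for the deprecated name can then point
        | extension owners at the replacement before the shim itself is
        | dropped.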
       | 
       | Companies: deliberately design policies and organizational
       | structures that are simple and minimize entropy; be prepared to
       | pay some upfront financial costs to do so (e.g. because your
       | policies aren't layers upon layers of hacks that grow every time
       | someone makes a mistake), but reap rewards in the future as
       | incidental costs of running your company are lower and less
       | bureaucracy means workers are more efficient.
       | 
       | e.g. instead of having a bunch of different rules about different
       | kinds of vacation time, consolidate them all into 1 or 2 kinds
       | (normal and medical?) and give everyone a bit extra to compensate
       | for some of the edge cases that were smoothed over.
       | 
        | Governments: legislators should spend some time attempting to
        | "refactor" old laws such that fewer words and less complexity
        | still result in effectively the same legal environment; reduce
        | government functions (as much as some of you may dislike that
        | idea).
       | 
        | e.g. instead of trying to provide hundreds/thousands of different
        | special-case tax breaks for low-income families, use a "sliding"
        | tax rate where e.g. at $1k of annual income your tax rate is
        | 0%, at $1M a year your tax rate is n%, with linear interpolation
        | between those two extremes (or whatever; see the sketch below).
        | Simpler, still somewhat
       | fair, people might actually be able to do their taxes by reading
       | the law, and no suspicious discontinuities[1]. Or something else
       | - with some thought and a little experience in government, I'm
       | sure _someone_ could come up with an income tax system that was
       | an order of magnitude shorter than what the US has now.
       | 
       | [1] https://danluu.com/discontinuities/
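        | 
        | The sliding rate above, as a toy sketch (endpoints from the
        | example; max_rate=0.40 is an arbitrary stand-in for "n%"):
        | 
        |     def tax_rate(income, lo=1e3, hi=1e6, max_rate=0.40):
        |         # linearly interpolate the rate between two endpoints
        |         if income <= lo:
        |             return 0.0
        |         if income >= hi:
        |             return max_rate
        |         return max_rate * (income - lo) / (hi - lo)
        | 
        |     print(tax_rate(50_000))   # ~0.02, i.e. about 2%
        |     print(tax_rate(500_500))  # 0.20, halfway up
        | 
        | Since the rate is a continuous function of income, there are no
        | bracket-edge discontinuities of the kind [1] describes.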
        
         | kubanczyk wrote:
         | > Governments
         | 
         | This would be a major constitutional change, by which I mean
         | that it introduces new limits on government. Politicians would
         | have some understandable inclination to oppose it. Not saying
         | it isn't needed, it obviously is.
         | 
         | In this context "be a minimalist" is a thought stopper in all
         | its grace.
        
       | cryptonector wrote:
       | https://www.physics.princeton.edu/ph115/LQ.pdf
       | https://en.wikipedia.org/wiki/The_Last_Question
       | 
       | Asimov was way ahead of TFA :)
       | 
        | I like to think that life is a thermodynamic mechanism that:
        | 
        |   - locally reduces entropy by consuming lower-entropy energy
        |     and casting out higher-entropy stuff
        |   - reproduces
       | 
       | by which definition stars are _almost_ alive. I say  "almost"
       | because stars only manage to keep local entropy from growing very
       | fast, but they don't manage to reduce it.
       | 
        | For example, life on Earth consumes low-entropy (because
        | collimated and short-ish wavelength) sunlight and molecules of
        | various types (e.g., CO2) and uses that to build locally-lower-
        | entropy things (plankton, plants, animals, ...), in the process
        | emitting higher-entropy things like detritus, feces, etc., but
        | especially longer-wavelength light in all directions. Because
        | Earth's climate is roughly in equilibrium, if you examine all
        | the light going in (collimated, short-wavelength sunlight) and
        | all the light going out (longer-wavelength IR in all
        | directions), the energy must balance, but the entropy must be
        | much higher on the outbound side. By that measure Venus must be
        | dead: it reflects so much sunlight that it fails to use any of
        | it to reduce local entropy.
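        | 
        | Back-of-envelope, treating each light stream as heat delivered
        | at its source temperature (dS = dQ/T, and ignoring the 4/3
        | photon-gas factor):
        | 
        |     T_sun, T_earth = 5800.0, 255.0  # K: photosphere, Earth IR
        |     E = 1.0                         # same energy in and out
        |     print((E / T_earth) / (E / T_sun))  # ~22.7
        | 
        | So for every unit of energy balanced, roughly twenty times more
        | entropy leaves than arrives.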
        
         | Lichtso wrote:
          | Inspired by Erwin Schrodinger - "What Is Life? The Physical
          | Aspect of the Living Cell" from 1944?
         | 
         | > Schrodinger explains that living matter evades the decay to
         | thermodynamical equilibrium by homeostatically maintaining
         | negative entropy in an open system.
         | 
         | https://en.wikipedia.org/wiki/What_Is_Life%3F
        
         | proc0 wrote:
          | This is negentropy (not my favorite word, but it is the term
          | used). Indeed it's the signature of life, although I think
          | there is a threshold that all living creatures meet but non-
          | living systems do not. In other words, life produces lots of
          | negentropy, probably exponentially, unlike other systems such
          | as celestial bodies.
        
           | jimmySixDOF wrote:
           | There was a recent podcast episode from Sean Carroll's
           | Mindscape [1] where they focus on and around the Krebs Cycle
           | and discuss it as an example of Entropy. Turns out Entropy
           | really is fatal.
           | 
           | [1] https://www.preposterousuniverse.com/podcast/2022/05/23/1
           | 98-...
        
       | superb-owl wrote:
       | I'm not sure "Entropy" is the right word for what the author
       | wants to talk about, but the article raises some interesting
       | points
       | 
       | > 1st poison: Unlimited and uncontrolled growth
       | 
       | IMO this is by far the biggest poison, and the one our society is
       | most prone to ignoring. Every company, government budget,
        | population, etc. is considered "successful" if it's growing, and
       | "failing" if it's shrinking. And the growth is always measured in
       | terms of percentage, meaning the growth is
       | compounding/exponential.
       | 
       | Sustainable living means getting comfortable with flat or even
       | negative growth.
        
         | toss1 wrote:
         | From the article: >> ...a program with more lines of code is
         | better...
         | 
          | Immediately reminded me of Bill Gates' comment that measuring
          | progress on designing and building a software project by a
          | Total_Lines_Of_Code metric is about as dumb as measuring
          | progress on designing and building an airplane by a
          | Total_Weight metric.
         | 
          | What you want in both is the least amount of code/material
          | that will actually do the job - being smarter than a simple
          | "more is better" approach. Yet so many supposedly intelligent
          | managers use such approaches...
        
           | lazide wrote:
           | Well, one is easy, the other is hard hah!
           | 
           | The big issue near as I can tell, is that defining what the
           | job actually is, and what accomplishing it actually looks
           | like is the hardest part most of the time.
           | 
           | Many of the most innovative solutions also come from changing
           | pre-built assumptions about what they can look like.
        
           | wildmanx wrote:
           | > Yet so many supposedly intelligent managers use such
           | approaches...
           | 
           | Yet so many supposedly intelligent engineers work for such
           | managers...
        
         | nightski wrote:
          | I don't agree. Growth is not the only thing valued. Value
          | companies are a big part of the US markets, for example.
          | Dividend stocks are a thing.
         | 
         | But growth isn't about one company ever increasing. It's about
         | new companies innovating and taking over old ones. As long as
         | there is innovation, there will be growth.
         | 
         | Change should be embraced. I personally see no reason to
         | advocate for stagnation of human progress with the same handful
         | of companies serving humans for all time.
        
           | superb-owl wrote:
           | I guess my point (wrt corporations) is that companies should
           | turn from Growth into Value a lot sooner. E.g. I would have
           | preferred Google to become a Value company once it became the
           | dominant search provider, instead of constantly looking for
           | new revenue streams.
        
             | nightski wrote:
             | I agree with that wholeheartedly. I'd go further and say
             | that I think companies are getting too big to the point
             | where different divisions have no relation to each other at
             | all. Any major tech stock has this problem. The problem is
             | I have no idea how it could be discouraged. It feels like
             | whenever something is regulated it just results in
             | something completely different than intended.
        
               | lazide wrote:
               | Part of the problem is the tax code.
               | 
               | Dividends (what a value company typically produces) are
               | taxed at a much higher rate than long term capital gains
               | (which a 'growth' company produces).
               | 
                | Dividends are usually taxed at the same rate as income,
                | from 10-37% at the federal level, and often likewise at
                | the state level if the state has an income tax.
               | 
               | Long term capital gains are 0-20%, and many states don't
               | tax them at all even if they tax income.
               | 
               | Everyone has a significant incentive to go 'growth' in
               | that environment.
        
               | kgwgk wrote:
               | Do you know what is a share repurchase or buyback? [I
               | find it unlikely that you don't - but your comment
               | doesn't make much sense if you do.]
        
               | lazide wrote:
               | A share buyback is a way to convert extra cash (or new
               | debt) into almost a dividend. It's gotten popular lately
               | for exactly this tax efficiency reason.
               | 
               | It is not a guaranteed way to do it however, as unlike a
               | dividend, there is no way to directly tie the repurchase
               | of shares to x amount of actual long term market value.
               | 
                | It often works though, and when markets were going up,
                | it helped.
        
           | ngvrnd wrote:
           | The author wrote "uncontrolled growth", not "any growth".
        
         | lr4444lr wrote:
         | Growth really means productivity. That doesn't mean simply
         | "more": it means "more with less".
        
         | buescher wrote:
         | "Growth for the sake of growth is the ideology of the cancer
         | cell." - Edward Abbey
        
         | IncRnd wrote:
         | > I'm not sure "Entropy" is the right word for what the author
         | wants to talk about, but the article raises some interesting
         | points
         | 
          | It's not the right word. Entropy is also a term of art with a
          | specific meaning that differs from its use among the general
          | populace or in thermodynamics. This website can't even be
          | loaded over HTTPS without sufficient entropy.
        
           | bogeholm wrote:
           | > Entropy is a "term-of-art" that has a specific meaning that
           | differs from that in the general populace or in
           | thermodynamics.
           | 
           | Would you care to explain that? The term 'entropy' originates
           | in statistical mechanics / thermodynamics:
           | https://en.m.wikipedia.org/wiki/Entropy
        
             | vincentmarle wrote:
             | There are many more definitions of entropy:
             | https://en.wikipedia.org/wiki/Entropy_(disambiguation)
        
               | tingletech wrote:
               | none of these seem consistent with how the original
               | article uses "entropy". The original article specifically
               | mentions thermodynamic entropy in its strained analogy.
        
         | Consultant32452 wrote:
          | I would add that the whole monetary system, via inflation, is
          | designed this way.
        
         | sumy23 wrote:
          | Growth is possible by increasing outputs from the same level
          | of inputs. Certain types of growth are unsustainable, yet
          | other types, e.g. productivity growth, are definitely
          | sustainable.
        
           | photochemsyn wrote:
           | Growth should probably be more precisely defined in the vast
           | majority of cases to avoid confusion and misunderstanding. In
           | terms of systems, the quantity that grows or shrinks should
           | be concrete (i.e. not a rate, certainly).
           | 
           | For example, human population. Let's say the birthrate in a
           | population is growing, so a naive conclusion would be that
           | the population, a concrete object, must also be growing. This
           | is not true if the deathrate is growing by the same amount.
            | Now, this is the standard kind of thing you see in an intro
            | to differential equations course: a river feeds a lake at
            | rate X, and another river drains the lake at rate Y, so is
            | the lake growing or shrinking? Oops, we neglected to take
            | evaporation into account, so we get the wrong answer.
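            | 
            | As a toy sketch of that exact trap (made-up numbers):
            | 
            |     # lake balance: dV/dt = inflow - outflow - evaporation
            |     def step(v, dt=1.0, x=10.0, y=9.0, evap=0.002):
            |         return v + dt * (x - y - evap * v)
            | 
            |     v = 1000.0
            |     for _ in range(100):
            |         v = step(v)
            |     print(v)  # shrinking toward 500, despite inflow > outflow
            | 
            | Naively X > Y says "growing", but the evaporation term flips
            | the sign until the lake settles at a smaller equilibrium.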
           | 
           | Economists are among the worst offenders in this misuse of
           | concepts. Take 'productivity growth' - this is actually
           | growth in the rate at which product is produced, right? If
           | productivity is flat and market demand is flat and human
           | population is flat, well, that means everyone is getting what
           | they need, if say the product is cell phones, for example. If
           | everyone has a cell phone, and cell phones are replaced every
           | five years, then what is productivity growth? Maybe you can
           | make the cell phone manufacturing line more efficient, by
           | recycling the materials from old cell phones into new cell
           | phones, or by automation etc. Nevertheless, the desired rate
           | of cell phone production is fixed, and everyone has a cell
           | phone.
           | 
           | Of course the market should be broken up between different
           | producers to encourage competition, but here growth in
           | production of a better cell phone with higher demand is
           | counterbalanced by shrinkage in market share by another
           | producer, as net demand is flat.
           | 
           | Unfortunately, if the cell phone makers form a cartel, and
           | raise their prices in unison, some economist will call that
           | 'economic growth' based on the fact that they're extracting
           | more money from a market with fixed demand, which is just
           | ludicrous - but that's modern neoclassical economics for you.
        
             | [deleted]
        
           | darkerside wrote:
           | In the long run, most J curves are actually S curves
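            | 
            | Toy illustration (made-up carrying capacity): an exponential
            | (J) and a logistic (S) are nearly indistinguishable early
            | on; the ceiling only shows up later.
            | 
            |     import math
            | 
            |     def j_curve(t):            # pure exponential
            |         return math.exp(t)
            | 
            |     def s_curve(t, k=1000.0):  # logistic, capacity k
            |         return k / (1 + (k - 1) * math.exp(-t))
            | 
            |     for t in (1, 5, 10):
            |         print(t, round(j_curve(t), 1), round(s_curve(t), 1))
            |     # t=1: 2.7 vs 2.7; t=5: 148.4 vs 129.3;
            |     # t=10: 22026.5 vs 956.6, the S has flattened near k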
        
             | sophacles wrote:
             | I question the "most" here, rather than "all". Examples of
             | J curves that aren't S curves?
        
               | tlholaday wrote:
               | Dark Energy astrophysics.
        
               | uoaei wrote:
               | You seem confident an "ultraviolet catastrophe"-like
               | scenario won't play out there, too.
        
               | DeathArrow wrote:
               | >Examples of J curves that aren't S curves?
               | 
               | Reindeer population growth in Alaska.
        
               | dron57 wrote:
               | That's absolutely an S-curve. Any animal population will
               | eventually run out of resources.
        
               | uoaei wrote:
               | Wait a few decades and it will probably plateau at, if
               | not shrink from, a maximum.
        
               | sumy23 wrote:
               | 2^x
        
           | toss1 wrote:
            | It seems that productivity growth is still necessarily
            | limited in the end by physics.
            | 
            | Remove the unnecessary actions to produce X, and you're down
            | to the bare minimum set of actions. Now speed up those
            | actions and you'll eventually reach some minimum time
            | requirement; the output of X is then a function of
            | Required_Actions x Required_Time, how many Producing_Units
            | you have, and the available time.
            | 
            | Seems it'd be asymptotic.
        
             | sumy23 wrote:
             | Everything is limited by physics. But I think the limit is
             | not close to where we are right now. Consider a smart
             | phone. Physically, what is it? Some silicon, glass, a
             | lithium-ion battery, and some other trace metals. If you
             | were to have the raw inputs in front of you, it would be a
             | small pile of dust. Yet, with just that small amount of
             | material, a person can get access to a near infinite amount
             | of information and entertainment. And smartphones can run
             | software, which allows the phone to be updated for near-
             | zero marginal cost. And this is only something invented in
             | the last few decades. There are so many amazing things
             | being created around us all the time. I don't know how you
             | can look at this situation and think "yup, we've reached
             | the end of human ingenuity."
        
               | Klapaucius wrote:
               | If you look at the weight of the tech product (phone) in
               | isolation, you are correct although not in a very
               | meaningful way. If you look at the amount of physical
               | material that went into the process leading up to
               | producing that product, the quantity would amount to many
               | tonnes of material in terms of crushed ore, fossil fuels,
                | water consumption, chemicals, packaging and so on. A
                | phone does not only represent its own few grammes of
                | material, but an enormous tail of environmental impact
                | in the form of waste, emissions and extraction remains.
                | (This is not to mention the human labor cost involved
                | in obtaining some of the rare earths used, from
                | countries with, ehrm, lax labor laws.)
        
               | toss1 wrote:
               | I don't think at all that we're near the limit of human
               | ingenuity.
               | 
                | The quibble I had was with "sustainable", which in that
                | context I read as indefinitely/infinitely sustainable
                | (and it seems other responders had similar issues).
               | 
               | I agree that there should be a lot more human ingenuity
               | ahead of us than behind us (assuming that those seeking
               | power over others, e.g., megalomaniacs and autocrats,
               | don't first destroy us).
               | 
               | That said, productivity of any one thing is certainly
               | never an x^y sort of curve but eventually flattens and
               | becomes asymptotic, if not declining.
        
               | biomcgary wrote:
               | Sustained innovation is finding a series of technologies
               | with S-curve growth that can be transitioned away from as
               | they approach their asymptotic limit. Then, society can
               | stay in exponential phase until it hits
               | https://en.wikipedia.org/wiki/Kardashev_scale#Type_III
        
               | lazide wrote:
               | That's a bit like saying 'we can only make horses so
               | efficient', which is true, but that's why we end up
               | coming up with automobiles, airplanes, etc.
               | 
               | As long as we have free brain cycles focused on solving
               | problems or understanding something we do not yet
               | understand, we'll continue to do better.
        
               | Terry_Roll wrote:
               | > Everything is limited by physics.
               | 
               | Everything is controlled by the maths of physics and
               | chemistry.
        
               | lazide wrote:
               | Some would argue everything fundamentally is physics,
               | including mathematical models of chemistry. I can't say
               | they are wrong.
               | 
                | Physics being math doesn't quite make sense to me yet,
                | if for no other reason than that a large body of
                | physics laws is "because that is what happens in real
                | life" when you get down to it.
                | 
                | It's clear the math is a tool to try to reason about
                | reality, not the other way around.
        
             | wildmanx wrote:
             | > It seems that productivity growth is still necessarily
             | limited in the end by physics.
             | 
             | Biology will put limits in place long before physics will.
             | 
             | Sadly, most techies completely ignore or miss this point.
        
       | jschveibinz wrote:
        | With respect to the author, the article fails to show that
        | "entropy" is in fact related to "complexity", or how the two
        | are related.
       | 
       | Entropy should not be thought of as "overhead" or "wasted
       | energy," which is what I believe the author is getting at.
       | Instead, entropy is the tendency to disorganization. So, the
       | analogy could be the more stuff you have, the more disorganized
       | it becomes; but this is a weak analogy in my opinion.
       | 
       | Here is a link to another article that discusses the link between
       | complexity and entropy. The two are indeed related: entropy is
       | necessary for complexity to emerge. Entropy is not, as this
       | author suggests, a result of complexity.
       | 
       | https://medium.com/@marktraphagen/entropy-and-complexity-the...
        
         | bob1029 wrote:
         | > entropy is necessary for complexity to emerge.
         | 
         | Something about this particular line does not sit well with me.
         | 
         | How would we define the relationship between entropy,
         | complexity and _information_?
        
           | pizza wrote:
           | Minor lightbulb went off in my head: you might be interested
           | in slide 17 here, from a presentation on Shape Dynamics [0]
           | 
            | tldr:
            | 
            |   - (Shannon) entropy [1]: expected description length of a
            |     state of a system
            |   - (Shape) complexity [0]: a 'clumped-ness' metric: clump
            |     sizes / inter-clump separations
            |   - information: not sure anyone ever really resolved this
            |     in a satisfying way :^) [2]
           | 
           | [0] https://philphys.net/wp-content/uploads/2019/Barbour-
           | Saig_Su...
           | 
           | [1] https://physics.stackexchange.com/questions/606722/how-
           | do-di...
           | 
           | [2] https://monoskop.org/images/2/2f/Shannon_Claude_E_1956_Th
           | e_B...
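            | 
            | "Expected description length" is very literal, by the way
            | (toy sketch):
            | 
            |     import math
            | 
            |     def shannon_entropy(probs):
            |         # H = -sum p*log2(p): expected bits per sample
            |         return -sum(p * math.log2(p) for p in probs if p > 0)
            | 
            |     print(shannon_entropy([0.5, 0.5]))  # 1.0 bit, fair coin
            |     print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits, biased
            |     print(shannon_entropy([0.25] * 4))  # 2.0 bits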
        
           | Comevius wrote:
            | Information thermodynamics is what you are looking for; it's
            | the unification of thermodynamics and information theory.
            | Bear with me because I'm not a physicist, but my
            | understanding is that information needs a medium, in which
            | way it is similar to heat, and you can use the same
            | statistical mechanics to describe it, or the fluctuation
            | theorem, which is more precise.
           | 
            | My understanding is that this is pretty cool stuff that
            | solves Maxwell's demon and also sort of explains biological
            | evolution, because apparently a system responding to its
            | environment is computation, performed by changing the
            | system's state as a function of a driving signal, which
            | results in memory about the driving signal that can be
            | interpreted as computing a model - a model that can be
            | predictive of the future. Now apparently how predictive that
            | model is corresponds to how thermodynamically efficient the
            | system is. Even the smallest molecular machines with memory
            | must thus conduct predictive inference to approach maximal
            | energetic efficiency.
           | 
           | https://www.researchgate.net/publication/221703137_Thermodyn.
           | ..
        
           | tingletech wrote:
           | thermodynamic entropy is to heat as Shannon entropy is to
           | information?
        
             | contravariant wrote:
              | Hmm, not entirely sure those terms fit exactly. It's
              | easier to point out that you can go from one to the other
              | by setting the Hamiltonian to the negative logarithm of
              | the probability density (or by using the Boltzmann
              | distribution to go the other way).
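              | 
              | Concretely, a toy sketch (k_B and temperature set to 1):
              | 
              |     import math
              | 
              |     E = [0.0, 1.0, 2.0]   # made-up energy levels
              |     w = [math.exp(-e) for e in E]  # beta = 1
              |     Z = sum(w)
              |     p = [x / Z for x in w]  # Boltzmann distribution
              | 
              |     # Gibbs entropy S = -sum p*ln(p) (with k_B = 1) is
              |     # the Shannon entropy of p, in nats instead of bits
              |     print(-sum(x * math.log(x) for x in p))
              | 
              |     # and -ln p(e) - ln Z recovers the Hamiltonian E
              |     print([-math.log(x) - math.log(Z) for x in p])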
        
           | jschveibinz wrote:
           | I agree it doesn't feel right. But complexity, like life,
           | does emerge even though the 2nd law holds. It is a matter of
           | scale. Entropy does not mean everything becomes disordered.
           | And now I defer to the physicists, because as an engineer I
           | am going out of my lane...
        
       | quadcore wrote:
       | As a side note, when John Carmack was asked why he always started
       | a new engine from scratch, he used to say: "to fight code
       | entropy"
        
       | ngvrnd wrote:
       | Also inevitable.
        
       | photochemsyn wrote:
       | > "The second law of thermodynamics states that the entropy of an
       | isolated system always increases because _isolated systems_
       | spontaneously evolve towards thermodynamic equilibrium... "
       | (emphasis added)
       | 
       | There are no isolated systems in the context being discussed,
       | i.e. houses and phones and so on. This is an important point:
       | steady-state thermodynamics is a far more complicated beast than
       | closed-system thermodynamics, and moving-state thermodynamics
       | even more so.
       | 
       | Furthermore, how does one distinguish between useful and useless
        | work? Work is work; the value of work is something humans decide
       | on socially. Say people are put to work building a pyramid, so
       | the kings and priests have a nice high place to sit. Egyptian
       | pyramids are impressive, but are they useful? Maybe in terms of
       | some abstract notion like consolidating the power of the state or
       | impressing one's neighbors.
       | 
       | Anyway, here are some solutions to the author's points:
       | 
       | 1) Unlimited and uncontrolled growth: match inputs to outputs.
       | Delete as many old emails per day as you receive new ones. If
       | it's important, copy and store securely offline. If you buy new
       | clothes, send an equal amount of old clothes to recycler or the
       | thrift store. If that's too much work, cut back on inputs
       | instead.
       | 
       | 2) Decision-makers have no skin in the game: If the decision-
       | maker wants to build a pyramid, the decision-maker should be
       | spending their days building that pyramid alongside their
       | employees. Then they might decide that building pyramids is a
       | useless activity, and perhaps building a bridge or a dam would be
       | a wiser undertaking. Yes, investment capitalism has this problem.
       | Put the shareholders to work on the production line or give them
       | the boot, that's the solution.
       | 
        | 3) Momentum is not a source of entropy, I don't think. Entropy
        | is more like diffusion than directed momentum. An object
        | impacting
       | another due to momentum could increase entropy, like a car
       | running into a brick wall. Maybe the author is talking about
       | something abstract like 'the momentum of bad habits is hard to
       | break'? Here is where an injection of entropy ('shaking things
       | up') might be helpful.
       | 
       | Physics analogies can be rather overused, in conclusion.
        
       | xkcd-sucks wrote:
        | Counterpoint, "Entropy is Life": diffusion is an entropy-driven
        | process, and is fundamental to most biological processes. Plus
        | other more specific entropy-driven reactions [0 + google it].
        | Lazy metaphors...
       | 
       | [0] https://www.nature.com/articles/srep04142
        
         | molbioguy wrote:
         | Counterpoint to your counterpoint :) I think local entropy has
         | to decrease first to create the concentration gradients that
         | are harvested by diffusion. So life must also rely on
          | decreasing the local entropy. It all depends on how you
          | define the system.
        
           | contravariant wrote:
            | You can't decrease local entropy without increasing it
            | overall. We can only live by moving towards higher entropy;
            | it's how we tell the past from the future.
        
       | akhmatova wrote:
       | _In one word: Entropy is fatal._
       | 
       | No it's not, and there's plenty of evidence that a certain amount
       | of disorder is needed to be creative and flexible, to adapt
       | quickly to change, and evolve.
       | 
       | The key is to find a balance.
        
       | Helmut10001 wrote:
        | Great article. I could directly translate these thoughts to
        | self-hosting. Having worked my way through linux, docker,
        | systems, networks (etc.) since starting in 2017, I can say that
        | the most important principle is Minimalism (and Reproducibility,
        | but the two go hand in hand). The points made by the author
        | apply equally: reject solutions that bring chaos; do not install
        | everything - select services carefully, but make sure those
        | services you host are stable and set up correctly.
        
       | thisisbrians wrote:
        | virtually nobody who doesn't actively participate in the space
        | understands NFTs. do with that information what you will.
        
       | hapiri wrote:
        | When I read "fatal" I don't picture "death" so much as
        | "inevitable".
        
       | Lyapunov_Lover wrote:
       | The author makes a mistake here.
       | 
       | It's fine to think of entropy as messiness; that's the Boltzmann
       | picture of statistical mechanics. The mistake is thinking that
       | lowering entropy, or getting rid of the mess, is a satisfactory
       | strategy.
       | 
       | Think of it as a negative feedback system, like a thermostat.
       | Keeping entropy low means continually correcting errors. This is
       | a successful strategy only if the world always stays the same,
       | but it notoriously does not. Some degree of messiness is needed
       | to remain flexible, strange as it may sound. There must be room
       | to make the good kind of mistakes and happy little accidents (as
       | Bob Ross would put it).
       | 
       | Because the author chose an analogy rooted in statistical
       | mechanics, here's another: supercooled water. Take a bottle of
       | purified water and put it in the cooler. It gets chilled below
       | freezing temperature without freezing. If you give it a shake, it
       | instantly freezes. The analogy may sound a bit vapid, but noise
       | is the crucial ingredient for the system to "find" its lowest-
       | energy state. The system crystallizes from some nucleation site.
       | 
       | It's the same with evolution. Mutations are a must. Keeping our
       | genetic entropy low isn't a viable option, because that means
       | we'll get stuck and die out. There must be opportunity for
       | randomness, chance, chaos, noise; all that jazz.
       | 
        | This is how China became an economic powerhouse under Deng
        | Xiaoping, for instance. They experimented with various policies
       | and if something accidentally turned out to work great, it became
       | something of a "nucleation site". The policy that worked in, say,
       | Shaowu, would spread all across China. But it would never have
        | worked if they had stuck with a rigid, Confucian strategy of
        | keeping "entropy" low at all times.
       | 
       | Entropy isn't necessarily fatal. Entropy can be used as a
       | strategy for adaptation.
        
         | formerkrogemp wrote:
          | Philosophically, many problems in our society might be
          | attributed to optimizing for local maxima or other short-term
          | goals. Incentives and goals aren't aligned, and rules are far
          | too rigid, in favor of too few of the people. Democratic
          | policies, as in benefiting the people, and democratic, as in
          | we elected this policy. Innovation and mutation are the spice
          | of life.
        
         | chrisweekly wrote:
          | Related: it can be challenging to strike the right balance
          | between efficiency at one pole and flexibility
          | (/agility/resilience) at the other.
        
         | [deleted]
        
         | bryzaguy wrote:
         | Perfect use of "all that jazz"
        
         | googlryas wrote:
         | See also: "Antifragile: Things That Gain from Disorder" by
         | Nassim Taleb.
        
         | anigbrowl wrote:
         | This is why I feel wary whenever I hear the phrase 'best
         | practices'. Although they're generally promoted with good
         | intentions, they're often accompanied by a utilitarian
         | certitude that rapidly hardens into dogmatic inflexibility,
         | followed by defensiveness or outright dishonesty in response to
         | unforeseen consequences.
         | 
         | Most 'best practices' are good defaults, but the superlative
         | rhetoric comes with the unstated assumption that any deviation
         | is necessarily inferior, and that the best cannot be iterated
         | upon. This drains agency from actors within a system, selecting
         | for predictable mediocrity rather than risky innovation.
        
         | agumonkey wrote:
          | noise is the way out of local optima?
        
           | lazide wrote:
            | Pretty much the only one, near as anyone can tell. Though a
            | stable habitat/environment is an easy way to get someone or
            | something stuck in a local optimum, as it avoids weeding
            | problematic noise out from helpful noise until it is too
            | late for all but the best luck to save it.
        
           | a-nikolaev wrote:
           | pretty much
        
             | titzer wrote:
             | Simulated annealing comes to mind.
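              | 
              | For the curious, the whole trick in miniature (a toy
              | sketch minimizing a bumpy 1-D function):
              | 
              |     import math, random
              | 
              |     def f(x):  # objective with many local minima
              |         return x * x + 3 * math.sin(5 * x)
              | 
              |     x, t = 10.0, 5.0
              |     while t > 1e-3:
              |         x2 = x + random.gauss(0, 1)  # the noise
              |         d = f(x2) - f(x)
              |         # accept improvements always; regressions
              |         # sometimes, less often as t cools
              |         if d < 0 or random.random() < math.exp(-d / t):
              |             x = x2
              |         t *= 0.99
              |     print(x, f(x))
              | 
              | The decaying temperature t is exactly "noise as a way out
              | of local optima", dialed down over time.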
        
           | orthecreedence wrote:
           | I've observed this personally! After finding a solution to a
           | problem and repeating it numerous times, I'll often randomly
           | change one parameter of the solution (I'm talking about
           | things like opening jars, not building complex systems, but
           | it could apply there as well) to see if it works better. This
            | often happens spontaneously, because I ask myself "what if I
            | did this?" as I'm performing the action.
            | 
            | The result is that, almost invariably, I find a new way of
            | doing something that's better than before. It often takes
           | multiple tries, but it's something that takes little energy
           | because it can be done throughout the day and the stakes are
           | small.
           | 
           | Applied to a larger scale, random adjustments to larger
           | systems can be exactly what's needed.
        
             | agumonkey wrote:
             | I can even see this applied to human existence. Thinking
             | out of the box is basically glitching your ideas.
        
       | metamuas wrote:
        | The main mistake you made was not realizing that artificial
        | complexity exists, that it is not natural, and that it is a form
        | of control, possibly the most important one. Exhibit A: C++.
        
       | GuB-42 wrote:
        | I disagree with the author. While I think there is value in
        | minimalism, I like to embrace messiness, and I can use the
        | "entropy" idea to argue the opposite viewpoint.
       | 
        | Entropy is a force of nature; it will always increase - second
        | law of thermodynamics. And yes, it is fatal: we will all die in
        | the end, and there isn't much we can do about it. But that's
        | where the author backs out, saying that "organizational
        | entropy" doesn't follow the laws of thermodynamics because
        | there is a magic spell called "minimalism"... Why make a
        | parallel with physics then?
       | 
       | I think that just like thermodynamic entropy, there is no
       | solution, we will all die, period. The only thing we can do is
       | make the best of the time we are alive.
       | 
       | And if we look at the author's ideal, it has zero entropy,
       | literally absolute zero, nothing moves, which is not the most
       | enjoyable situation...
       | 
        | Furthermore, the proposed solution (minimalism) involves
        | creating a small pocket of low entropy. In thermodynamics, that
        | would be a freezer. But while freezers can lower entropy
        | locally, they increase entropy globally: freezers need energy
        | to function. And the colder your freezer, the more energy it
        | consumes and the more entropy it creates. It means that
        | minimalism can be counterproductive: the more you try to make
        | things perfect, the messier everything around them becomes.
       | 
        | So, don't try to put every atom in its correct place - you
        | simply can't, absolute zero doesn't exist in nature. Just admit
        | that life isn't perfect, and that it is sometimes better to do
        | something useless than to do even more work trying to find out
        | whether it actually is useless. And low entropy (nothing moves)
        | is as boring as high entropy (just noise); the best is
        | somewhere in the middle. Life is in the middle.
        
       | sandgiant wrote:
        | I think the author misunderstands what entropy is. It's not a
        | measure of complexity. If anything, it's the opposite.
        
       | harshreality wrote:
       | I wouldn't call it entropy exactly, but this is also the theory
       | behind Joseph Tainter's _Collapse of Complex Societies_. He
       | proposes that too much governmental overhead through accretion of
       | laws and bureaucracy is the cause (aside from obvious
       | alternatives like being defeated in a war, etc.) of a society's
       | or country's collapse.
        
         | frouge wrote:
          | And more complexity, in laws and many other things, increases
          | inequality in our society. The more stuff (e.g. laws), the
          | more people (lawyers) you need to understand it, and the more
          | intelligent people you need (e.g. for cryptos). All this
          | creates a larger gap between social classes.
        
       ___________________________________________________________________
       (page generated 2022-06-08 23:00 UTC)