[HN Gopher] The End of the Beginning
       ___________________________________________________________________
        
       The End of the Beginning
        
       Author : nikbackm
       Score  : 85 points
       Date   : 2020-01-07 16:40 UTC (6 hours ago)
        
 (HTM) web link (stratechery.com)
 (TXT) w3m dump (stratechery.com)
        
       | christiansakai wrote:
        | I've been thinking along the same lines, albeit from a more
        | personal angle as a software engineer.
       | 
        | Basically, starting around 15 years ago, there was a
        | proliferation of bootcamps teaching fullstack development,
        | because software startups were the new hot thing and
        | desperately needed generalist engineers capable of spinning
        | up web apps quickly. Rails was the hot thing in those days
        | for the same reason. Hence we saw many new grads, and even
        | career changers, move into fullstack development, with
        | bootcamps churning out these workers at an incredible pace
        | (regardless of quality); the job market absorbed them because
        | it was desperate for fullstack engineers.
       | 
        | During that time, the best career move you could make was to
        | join the startup movement as a fullstack engineer and get
        | some equity as compensation. That equity, if you were lucky,
        | could really be life changing.
       | 
        | Fast forward to now: the low-hanging CRUD app search space
        | (e.g., Facebook, Twitter, Instagram) has been exhausted, and
        | even the new unicorns (e.g., Uber) don't make that much
        | money, if they make money at all. The companies that did win
        | have become big; they are the winners in the winner-take-all
        | field that is cloud software. And these days those companies
        | have no use for fullstack engineers anymore; they want
        | specialists who do a few things, but at a deeper level.
       | 
        | Today, even the startup equity math has changed a lot. Even
        | with a good equity package, much of the search space has been
        | exhausted, so joining a startup as a fullstack engineer
        | doesn't pay as much anymore. Instead, a better move is to try
        | to get into one of these big companies, because their pay
        | just dwarfs that of any startup or even medium / big size
        | company.
       | 
       | Just my 2c as someone who is very green (5 yrs) doing software
       | engineering. Happy to hear criticism.
        
         | onlyrealcuzzo wrote:
          | My understanding is that startups never paid well. Sure,
          | if you got lucky and were employee #7 at Facebook, it paid
          | off great! But even during that time frame, working at
          | startups instead of MS or Google was not a good
          | proposition. And even during the dot-com boom and the rise
          | of Microsoft, it paid a lot better to work at IBM than at
          | that class of startup.
        
         | timClicks wrote:
         | Isn't the perception of a saturated market persistent though? I
         | mean, there were many social media apps when Facebook started.
         | Twitter was created when microblogging had already become a
         | trend.
        
           | jonny_eh wrote:
           | Anecdotally, it seems like the rate of disruption in this
           | space has been steadily slowing down.
        
         | zozbot234 wrote:
         | Software is still eating the world, and there will be plenty to
         | eat for a long time. Cars (the foremost example in the OP) had
         | basically eaten the world by the 1950s (sometimes even in a
         | fairly literal sense).
        
       | mooreds wrote:
       | The essay reminds me of The Deployment Age:
       | http://reactionwheel.net/2015/10/the-deployment-age.html
        
       | tlarkworthy wrote:
        | That's a very bold claim, and it goes against Ray Kurzweil's
        | hypothesis that tech is accelerating. Maybe (though it's
        | unlikely) cloud/mobile is the end game for silicon. But what
        | about quantum? What about biological computing? What about
        | nanotech? What about AI? There are literally a ton of
        | potential generational changes in the making that could turn
        | everything on its head _again_.
        
         | the_af wrote:
         | Why is Ray Kurzweil's hypothesis particularly important to
         | contrast other hypotheses against? What sets it apart in
         | relevance and/or authority?
        
           | throw_14JAS wrote:
            | Because its evidence is pretty straightforward: you can
            | take Wikipedia's list of important inventions and plot
            | their frequency on a chart.
           | 
           | Of course, there are debates around which inventions count as
           | significant. And there is recency bias.
           | 
            | Never underestimate the power of something easy to
            | communicate.
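            | 
            | As a back-of-the-envelope sketch of that analysis (the
            | invention list here is a small hand-picked stand-in, not
            | Wikipedia's actual data):
            | 
            |     import matplotlib.pyplot as plt
            | 
            |     # Illustrative sample of "important invention" years.
            |     years = [1440, 1608, 1712, 1769, 1804, 1837, 1876,
            |              1879, 1903, 1928, 1947, 1958, 1971, 1989]
            | 
            |     # Bucket the years by century and count each bucket.
            |     centuries = [y // 100 * 100 for y in years]
            |     counts = {c: centuries.count(c)
            |               for c in sorted(set(centuries))}
            | 
            |     plt.bar([str(c) for c in counts],
            |             list(counts.values()))
            |     plt.xlabel("Century")
            |     plt.ylabel("Inventions in sample")
            |     plt.show()
            | 
            | The debates over which inventions count then become
            | arguments over that input list.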
        
             | beat wrote:
              | I think population growth also factors in. Population
              | is leveling off. In the past century, the global
              | population has quadrupled, so in raw numbers alone
              | there are four times as many people around to invent
              | things. But the global population will increase by no
              | more than 50% in the next century, which means we
              | won't be creating many more inventors than we have
              | now.
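              | 
              | As rough arithmetic (illustrative only, not a
              | demographic model), the implied average annual growth
              | rates are quite different:
              | 
              |     # 4x over 100 years vs. 1.5x over 100 years
              |     past = 4 ** (1 / 100) - 1     # ~1.4% per year
              |     next_ = 1.5 ** (1 / 100) - 1  # ~0.4% per year
              |     print(f"{past:.2%} vs {next_:.2%}")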
        
               | davnicwil wrote:
                | This is a good point, but also consider that the
                | proportion of the global population who have the
                | opportunity to become inventors will hopefully grow
                | over the next century. As a result, the absolute
                | number of inventors may grow faster than the overall
                | population does.
        
           | tlarkworthy wrote:
            | It's very well known, and it makes a compelling argument
            | that tech progress has been accelerating since the Stone
            | Age.
        
             | AnimalMuppet wrote:
              | "Well known" is irrelevant. "Has been accelerating" is
              | also irrelevant. _Will continue to accelerate to
              | something very close to infinity_ is the relevant
              | part. There are plenty of us who do not find
              | Kurzweil's argument on that topic to be at all
              | compelling.
        
             | the_af wrote:
              | I agree Kurzweil's hypothesis is well known... within
              | techie/Silicon Valley circles, somewhat like the
              | Singularity -- a related concept -- is common in those
              | circles as well. Regardless, there's no particular
              | weight to Kurzweil's hypothesis, tantalizing as it may
              | seem, and it's not reasonable in my opinion to use it
              | as a measuring stick for other hypotheses, as if
              | Kurzweil were a _proven_ and _acknowledged_ authority
              | on this topic.
              | 
              | Likewise, if someone said "human aging and death are
              | unavoidable", that wouldn't be bold just because
              | Kurzweil has written a lot about immortality.
        
               | gfodor wrote:
                | I think his hypothesis deserves more credit than
                | that -- many of his predictions have come to pass,
                | and many of those that have, at the time they were
                | made, seemed somewhat far-fetched, since they were
                | firmly in the realm of science fiction.
        
         | gfodor wrote:
         | I think it will, at best, be a semantic argument in retrospect.
         | The companies highlighted are all clearly defined as being
         | bolstered by computing technology. But what about next
         | generation, huge companies that are bolstered by computing and
         | _other_ technologies fused together? For example, if a company
         | manages to create a brain-computer interface that gains global
         | adoption and equivalent valuations to the existing tech giants,
         | but the software layer is a mashup of, by that time,
         | commoditized services from the existing tech giants who fail to
         | enter this industry, does it count?
        
           | davidivadavid wrote:
           | This myopia really puzzles me.
           | 
           | It seems like the whole analysis is predicated on the idea
           | that technology = software made in Silicon Valley, with
           | unimportant secondary factors. That 3M and ExxonMobil are not
           | "tech" companies because they don't make iPhone apps.
           | 
            |  _Every_ company is a tech company, not because we've
            | had computers for a while, but because technology is
            | what we build to get what we want.
           | 
           | These kinds of narrow, myopic, siloed takes miss the forest
           | for the trees.
           | 
           | If you think the epitome of human evolution is going to be
           | people looking at bright rectangles for eternity, you haven't
           | been paying attention to what technologists are doing.
        
         | dsalzman wrote:
          | I don't think the claim is about technology in general,
          | but about non-quantum computing.
        
           | jdmichal wrote:
            | I don't think that's accurate. The article takes pains
            | to discuss that sometimes castles are simply routed
            | around. Quantum computing would potentially be one of
            | those cases.
        
           | bduerst wrote:
           | It's a substitute computing product though - kind of like
           | electric cars for combustion cars.
        
       | legitster wrote:
        | I've been reading Zero to One, and one of the ideas the book
        | pitches is that monopoly and innovation are two sides of the
        | same coin. Only monopoly-like companies have the time and
        | money to dump into innovative products (Bell, GE, IBM,
        | Google). And people only invest in an idea if they think
        | they can profit from it (look at how crucial the patent
        | system was for the industrial revolution).
        | 
        | Competition is important, but for driving efficiency -
        | weeding out bad ideas and bringing down the costs of
        | already-created innovations. But the thing that usually
        | drives monoliths out of business is... new monoliths.
        | 
        | The somewhat contrarian takeaway is that some (key word)
        | amount of consolidation is good.
        
       | gz5 wrote:
       | >And, to the extent there are evolutions, it really does seem
       | like the incumbents have insurmountable advantages...
       | 
       | By definition, doesn't it _always_ seem like this?
       | 
       | Jim Barksdale (Netscape) said there are 2 ways to make money -
       | bundling and unbundling. What can be unbundled from the incumbent
       | bundles, in order to be offered in a more fit-for-purpose way, or
       | with a better experience?
       | 
       | How might that answer change if the world's political structure
       | changes? How might that answer change if processing, storage and
       | networking continue their march towards ubiquitous availability?
        
       | tudorw wrote:
        | I don't agree; comparing histories is not a reliable way to
        | predict the future. I think we'll see the growth of
        | governance-level disruption, a pushback that will encourage
        | home-grown solutions in countries that are not necessarily
        | aligned with US interests. That field is wide open and
        | growing!
        
         | camillomiller wrote:
          | Policy-driven disruption is the only option I see for
          | breaking the cycle. Let's see.
        
       | whatitdobooboo wrote:
        | I think if you abstract away the specific companies
        | mentioned and stick to the technology, the point about
        | people building on top of already "accepted" paradigms is a
        | good one, in my opinion.
       | 
       | The rest doesn't really seem to have enough evidence for such a
       | bold claim.
        
       | mirimir wrote:
       | > What is notable is that the current environment appears to be
       | the logical endpoint of all of these changes: from batch-
       | processing to continuous computing, from a terminal in a
       | different room to a phone in your pocket, from a tape drive to
       | data centers all over the globe. In this view the personal
       | computer/on-premises server era was simply a stepping stone
       | between two ends of a clearly defined range.
       | 
       | Sure, that's what happened.
       | 
       | But what jumps out for me is that, at both ends of that range,
       | users are relying on remote stuff for processing and data
       | storage. Whether it's mainframes or smartphones, you're still
       | using basically a dumb terminal.
       | 
        | In the middle, there were _personal_ computers. As in, under
        | our control. That's often not the case now. People's
        | accounts get nuked, and they lose years of work. And there's
        | typically no recourse.
       | 
       | As I see it, the next step is P2P.
        
       | oflannabhra wrote:
        | I'm not exactly sure where I fall on this. Ben is a really
        | smart guy (way smarter than me), but I feel like this could
        | be a classic case of hindsight bias.
       | 
       | Now, looking back, it makes sense that the next logical step
       | after PCs was the Internet. But from each era looking forward,
       | it's not as easy to see the next "horizon".
       | 
       | So, if each next "horizon" is hard to see, and the paradigm it
       | subsequently unlocks is also difficult to discern, why should we
       | assume that there is no other horizon for us?
       | 
       | I also don't know if I agree that we are at a "logical endpoint
       | of all of these changes". Is computing _truly_ continuous?
       | 
       | However, I think Ben's main point here is about incumbents, and I
       | agree that it seems it is getting harder and harder to disrupt
       | the Big Four. But I don't know if disruption for those 4 is as
       | important as he thinks: Netflix carved out a $150B business that
       | none of the four cared about by leveraging continuous computing
       | to disrupt cable & content companies. I sure wasn't able to call
       | that back in 2002 when I was getting discs in the mail. I think
       | there are still plenty of industries ripe for that disruption.
        
         | pthomas551 wrote:
         | Was it really that hard to predict the Internet? SF authors
         | picked up on it almost immediately.
        
           | oflannabhra wrote:
            | I'd say networking was not incredibly difficult to
            | predict, but the businesses and products it allowed for
            | (and how we use them) were very difficult.
        
         | marcosdumay wrote:
         | > it seems it is getting harder and harder to disrupt the Big
         | Four
         | 
         | Microsoft, IBM, Oracle... What is the other one?
         | 
          | Oh, right, wrong decade.
          | 
          | (My point is, it's completely not obvious that it's
          | getting harder to disrupt the incumbents.)
        
           | umeshunni wrote:
           | > Microsoft, IBM, Oracle... What is the other one?
           | 
           | Cisco, of course.
        
             | ghaff wrote:
              | It was actually Oracle, Sun, Cisco, and EMC who were
              | the four "horsemen of the Internet" in the run-up to
              | the dot-com bubble.
        
           | oflannabhra wrote:
            | The conclusion of the article is that it _is_ getting
            | harder to disrupt the incumbents. I'm saying that
            | regardless of whether it is or isn't, there are still
            | lots of new companies to come that can take advantage of
            | technology to disrupt other, old-guard incumbents.
            | 
            | That, I think, is where the metaphor Ben uses breaks
            | down. The automobile is a single idea (move people
            | around with an ICE). Tech is more like the ICE than the
            | car. So, there might not be much disruption to consumer
            | hardware (Apple) companies, or search (Google)
            | companies, or cloud computing (Amazon, Microsoft)
            | companies. But there will still be lots of disruption to
            | come as tech (just like the ICE) gets applied to new
            | fields.
        
         | myblake wrote:
          | Isn't that kind of his conclusion too, though? It matters
          | inasmuch as we're less likely to see new general-purpose
          | public clouds come into play, but he didn't seem to
          | predict there was no more room for change in the industry,
          | just that we're unlikely to see those incumbents toppled
          | from certain foundational positions in the ecosystem.
        
           | oflannabhra wrote:
           | Much of Ben's writing recently has been on the topic of
           | regulation and anti-trust, specifically in relation to tech
           | companies. If I had to summarize his thesis, I'd say it's
           | something along the lines of: "Previous antitrust regulation
           | prioritized price. Tech allows for better products by nature
           | of aggregation and network effects, and to promote
           | competition, we need a new prioritization in our regulation".
           | 
            | So, I see this article as being a part of that thread.
            | The conclusion is that the Big Four are _not_ going to
            | get disrupted, which is bad, and, _drawing some
            | conclusions_, we need a new framework of antitrust to
            | allow for it. I might be putting words in his mouth, but
            | I don't think it is really that much of a jump if you
            | read his body of work, especially recently.
        
         | jdmichal wrote:
         | Until I have something resembling Iron Man's Jarvis with at
         | least a subvocal interface, I think there's still a long way to
         | go for "continuous" computing. I currently still have to pull
         | out a discrete device and remove myself from other interactions
         | to deal with it. If I'm not on that device all the time, then I
         | don't have continuous computing. Maybe continuously _available_
         | computing is more accurate?
        
           | repsilat wrote:
            | Right -- today you need to remember to charge your
            | phone, you don't take it everywhere (and don't have
            | signal everywhere, especially internationally), you need
            | to take it out of your pocket to use it, and you type
            | into it with your thumbs (though voice "assistants" are
            | here, and some people get use out of them).
           | 
           | The end-goal is being able to talk to anyone at any time,
           | remember anything you've seen before, and know the answer to
           | any question you can phrase that someone has already
           | answered.
           | 
            | (Now, you might say that parts of that sound less than
            | ideal, but I think we'll get there by gradient descent,
            | though maybe with some legal or counter-cultural
            | hiccups.)
        
       | solidasparagus wrote:
        | Bah. He took three datapoints, built a continuum out of
        | them, and says that since the third datapoint is at the end
        | of his continuum, we must be at the end.
        | 
        | But this doesn't fit any of the upcoming trends. The biggest
        | current trend is edge computing, where cloud-based services
        | introduce issues around latency, reliability, and privacy.
        | These are big-money problems - see smart speakers and self-
        | driving cars. The cloud players are aware of this trend -
        | see AWS Outposts, which brings the cloud to the needed
        | location, and AWS Wavelength, where they partnered with
        | Verizon to bring compute closer to people.
       | 
       | But privacy in a world full of data-driven technology is still
       | very much an unsolved problem. And most of the major technology
       | players have public trust issues of one sort or another that
       | present openings for competitors in a world where trust is
       | increasingly important.
        
       | Animats wrote:
        | His graph conveniently stops in the 1980s. Since then, there
        | have been many new US car companies, mostly in the electric
        | or self-driving spaces. Lots of little city cars, and new
        | imports from China, too.
        
         | the_watcher wrote:
         | He specifically mentions excluding imports. Outside of Tesla,
         | what are the new American car companies that made any kind of
         | mark?
        
         | kevin_thibedeau wrote:
          | Most of those are NEVs (neighborhood electric vehicles)
          | with a 25 MPH max speed to bypass safety regulations. $20K
          | golf carts.
        
       | ropiwqefjnpoa wrote:
        | The dealership model really helps manufacturers keep a tight
        | rein on the market; look at all the trouble Tesla had.
        | 
        | In a similar vein, Apple, Google, and Microsoft control the
        | medium and have grown so powerful that I can't imagine there
        | ever being a new "Google" that comes about through the old
        | grassroots method.
        | 
        | Someday Apple will be bought, though, probably by Facebook.
        
       | streetcat1 wrote:
        | This is wrong on the merits, and I am not sure why it is
        | presented this way.
        | 
        | The difference between a car company and a software company
        | is economies of scale. Economies of scale dominate the
        | physical world but do not exist in the software world, since
        | I can replicate software at zero cost.
        | 
        | In addition, new tools and new processes for software have
        | increased productivity many times over, which means that you
        | need fewer developers for new software.
        | 
        | I predict two shifts in the tech world:
        | 
        | 1) A move to the edge. Especially for AI, there is really no
        | need for a central public cloud, due to latency, privacy,
        | and dedicated hardware chips. Most AI traffic is inference
        | traffic, which should be done at the edge.
        | 
        | 2) Kubernetes operators replacing cloud services. The value
        | add of the public cloud is managing complexity; operators
        | encode that management as software you can run in your own
        | cluster (see the sketch below).
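        | 
        | For a concrete (if minimal) sense of what an operator looks
        | like, here is a sketch using the kopf framework for Python;
        | the "example.com" group and ManagedCache resource are made
        | up for illustration:
        | 
        |     import kopf
        | 
        |     # React when a (hypothetical) ManagedCache custom
        |     # resource is created, much as a cloud provider reacts
        |     # to an API call asking for a managed cache instance.
        |     @kopf.on.create('example.com', 'v1', 'managedcaches')
        |     def create_cache(spec, name, namespace, logger, **_):
        |         size = spec.get('size', 1)
        |         logger.info(f"Provisioning {name} in {namespace} "
        |                     f"with {size} replica(s)")
        |         # A real operator would create the Deployments,
        |         # Services, etc. that implement the cache here.
        |         return {'replicas': size}
        | 
        | Run it with "kopf run operator.py" against a cluster where
        | the CRD is installed; the point is that the provider's
        | automation becomes portable code.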
        
         | zebrafish wrote:
         | Your point contradicts itself, does it not? Movement to the
         | edge necessitates edge hardware and edge personnel to manage
         | the hardware.
         | 
         | Network effects are THE factor in software because the marginal
         | cost tends towards zero with each incremental user in the
         | network. The edge adds cost per node.
         | 
         | Up until the point that users are paid to connect to the
         | network and/or the network is directly linked to the user with
         | the I/O line completely obviated, the economics of hardware and
         | management underlying the network will tend towards economies
         | of scale... which is the point Ben is trying to make.
        
         | mooted1 wrote:
         | If you read more of Ben's writing, he talks extensively about
         | how software companies dominate market share through network
         | effects and vertical integration.
         | 
         | You don't hear him talk about economies of scale because
         | marginal costs are negligible for software companies. Besides,
         | network effects and vertical integration are sufficiently
         | powerful to control the market.
         | 
          | > In addition, new tools and new processes for software
          | have increased productivity many times over, which means
          | that you need fewer developers for new software.
         | 
         | There are other barriers to entry besides the cost of writing
         | software, like product, sales, operations, and most
         | importantly, network.
        
           | streetcat1 wrote:
            | However, the network effect in tech can be leapfrogged
            | due to the zero marginal cost (as shown in this post).
            | I.e., what network effect do you get from doing ML
            | inference in the cloud?
            | 
            | The case for big tech today is still economies of scale
            | and not network effects (maybe Facebook has those, but
            | they exist only if the interface to Facebook does not
            | change).
            | 
            | The big tech players have economies of scale due to
            | their ability to use automation and offload the risk of
            | managing complexity (i.e., one AWS engineer can manage
            | thousands of machines with AWS software).
            | 
            | No wonder the software that manages the public cloud is
            | still closed source.
            | 
            | However, with Kubernetes operators, there is a way to
            | move those capabilities into any Kubernetes cluster.
        
             | mooted1 wrote:
             | Did you actually read the post?
             | 
              | > The case for big tech today is still economies of
              | scale and not network effects (maybe Facebook has
              | those, but they exist only if the interface to
              | Facebook does not change).
             | 
             | This is only true if you believe that the greatest cost of
             | developing software is running hardware. The greatest cost
             | of developing software is developing software. Not only are
             | economies of scale in compute management negligible except
             | at massive scale, the cost of compute has declined
             | dramatically as the companies you've described have made
             | their datacenters available for rent through the cloud. Yet
             | the tech giants persist.
             | 
             | Facebook, Google, Netflix, Amazon all have considerable
             | network effects that you're not considering. For each of
             | these companies, having so many customers provides benefits
             | that accrue without diminishing returns, giving them a firm
             | hold on market share. See
             | https://stratechery.com/2015/aggregation-theory/
             | 
             | Ben is saying that the only way to topple the giants is by
             | working around them and leveraging new computing
             | technologies better than them. He makes the (admittedly
             | speculative) case that this is no longer possible because
             | we can't bring compute any closer to the user than the
             | mobile devices.
             | 
              | > However, with Kubernetes operators, there is a way
              | to move those capabilities into any Kubernetes
              | cluster.
             | 
              | Kubernetes, at the scale of technologies we're
              | discussing, is a minor optimization. Introducing k8s
              | costs more than it helps until far into a company's
              | infra maturity. Even if most companies deployed k8s in
              | a manner that significantly reduced costs, it's not
              | enough to overcome the massive advantages existing
              | tech companies have accrued. Not to mention all of the
              | big tech companies have internal cluster managers of
              | their own.
        
               | mr__y wrote:
               | >this is no longer possible because we can't bring
               | compute any closer to the user than the mobile devices.
               | 
                | This is based on the very dubious assumption that
                | bringing compute closer is the only path for
                | innovation.
                | 
                | And even that is not true: you could imagine compute
                | being even closer with a direct brain interface
                | (actually, you could consider Google Glass to be an
                | attempt at bringing compute closer).
        
               | streetcat1 wrote:
                | I don't think that the number of current customers
                | is any indication of network effects or any other
                | kind of moat.
                | 
                | See: Walmart -> Amazon, Nokia -> Apple, MSFT ->
                | Android.
                | 
                | I mean, what more of a network effect could MSFT
                | have had in the '90s? It dominated both the OS layer
                | AND the app layer (Office). And yet, it does not
                | have ANY share in mobile.
                | 
                | Kubernetes is not a minor optimization if you think
                | about what it is. Yes, it is if you see it as mere
                | container orchestration. But it is the first time
                | that a widely deployed, permissionless, open API
                | platform has existed.
        
               | yodon wrote:
               | Your comment would work equally well without the initial
               | inflammatory "Did you actually read the post?" opening
               | line.
        
         | foobiekr wrote:
         | It really doesn't make a lot of sense to do AI at the edge (in
         | terms of the various edge providers).
         | 
          | But then, a lot of edge use cases don't make much sense.
          | The best edge use cases are fan-in (aggregation and data
          | reduction), fan-out (replication and amplification -
          | broadcasting, conferencing, video streaming, etc.), and
          | caching (which is just a variant of fan-out).
         | 
         | The rest of the cases are IMHO largely fictional - magical
         | latency improvements talked about in the same context as
         | applications that are grossly un-optimized in every way
         | imaginable, AR/VR, etc. Especially the AR/VR thing.
         | 
         | Beyond that the only thing left is cost arbitrage - selling
         | bandwidth (mostly) cheaper than AWS.
         | 
         | What's the use case for moving inference to the edge? Most of
         | the inference will in fact be at the edge - in the device,
         | which has plenty of capacity - but that's not the case you're
         | describing.
        
           | streetcat1 wrote:
            | Why would you run AI in the cloud? It is closed,
            | expensive, high-latency, etc. You might want to train in
            | the cloud, maybe.
            | 
            | For inference, I see 90% happening at the edge (i.e.,
            | outside of the clouds).
        
         | xapata wrote:
          | > 1) A move to the edge. Especially for AI, there is
          | really no need for a central public cloud, due to latency,
          | privacy, and dedicated hardware chips. Most AI traffic is
          | inference traffic, which should be done at the edge.
          | 
          | Inferencing is done at the edge, but training must be done
          | centrally.
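          | 
          | A minimal sketch of that split (scikit-learn and the tiny
          | synthetic dataset here are stand-ins, purely for
          | illustration):
          | 
          |     import joblib
          |     from sklearn.datasets import make_classification
          |     from sklearn.linear_model import LogisticRegression
          | 
          |     # "Central" side: train on pooled data, then ship the
          |     # serialized model artifact out to devices.
          |     X, y = make_classification(n_samples=1000,
          |                                n_features=8,
          |                                random_state=0)
          |     model = LogisticRegression(max_iter=1000).fit(X, y)
          |     joblib.dump(model, "model.joblib")
          | 
          |     # "Edge" side: load the artifact and predict locally,
          |     # with no network round-trip.
          |     edge_model = joblib.load("model.joblib")
          |     print(edge_model.predict(X[:3]))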
        
           | Slartie wrote:
           | Right now, the only market participant I see doing some
           | inferencing at the edge is Apple with its photo analysis
           | stuff that runs on the phone itself.
           | 
            | Everyone else is busy building little dumb cubes with
            | microphones and speakers that send sound bites into
            | clouds and receive sound bites to play back (heck, even
            | Apple does it this way with Siri). Or other dumb cubes
            | that get plugged into a wall socket and that can switch
            | lights you plug into them by receiving commands from a
            | cloud (even if the origin of the command is in the same
            | room). Or dumb bulbs that get RGB values from a cloud
            | server which somehow inferred that the owner must have
            | come home recently and which then set the brightness of
            | their RGB LEDs accordingly. Or software that lets you
            | record sound bites, send them into the cloud, and
            | receive transcripts back. Or software that sends all
            | your photos to a cloud library where they are scanned
            | and tagged so you can search for "bikes" or whatever in
            | your photos.
           | 
           | No matter what you look at in all that stuff that makes up
           | what consumers currently consider to be "AI", it does
           | inference (if it even does anything like that at all) on some
           | cloud server. I don't like that development myself, but
           | unfortunately that's how it is.
        
       ___________________________________________________________________
       (page generated 2020-01-07 23:00 UTC)