[HN Gopher] 2020 Mac Mini - Putting Apple Silicon M1 To The Test
       ___________________________________________________________________
        
       2020 Mac Mini - Putting Apple Silicon M1 To The Test
        
       Author : kissiel
       Score  : 595 points
       Date   : 2020-11-17 14:09 UTC (8 hours ago)
        
 (HTM) web link (www.anandtech.com)
 (TXT) w3m dump (www.anandtech.com)
        
       | anoncow wrote:
       | Can AMD/Intel pivot to ARM and provide the efficiency benefits to
        | the non-Apple ecosystem?
       | 
       | Can Apple's M1X/M2 outperform desktop CPUs?
       | 
       | Qualcomm tried their hand at desktop CPUs with Microsoft a few
       | years back. Is it time they tried again?
       | 
       | How comparable is a Surface Go with the performance/efficiency of
       | M1?
        
         | AgloeDreams wrote:
          | AMD has tried a bit; Intel probably won't. You can still get
          | great numbers on desktop x86 with better cores and processes.
          | Zen 3 is perfectly good so far. Apple's M1 already outperforms
          | most desktop CPUs, so it stands to reason that a model with
          | more cores and bigger L1 could outperform the whole industry.
          | 
          | The Surface Go is nowhere close: half as fast in single-core,
          | a fifth as fast in multi-core. Its 5W TDP is really a generic
          | number with no real meaning, as Intel doesn't really abide by
          | it. I would say it probably uses about the same power as the
          | M1, possibly much more under turbo, while also having a much
          | higher power floor (i.e. at idle the Surface Go uses much more
          | power).
          | 
          | Keep in mind that the Surface Go is very low-cost and its CPU
          | is built on a 14nm process.
        
           | anoncow wrote:
            | Would the M1 with more cores be able to beat the Threadripper
            | at the same wattage? Right now the M1 stands at a score of
            | 8000 vs the Threadripper's 25000. The comparison, I'm sure,
            | isn't just about benchmark scores, but is a prediction
            | possible given that the M1 is at a 24W TDP whereas the
            | Threadripper has a 280W TDP (a roughly 3x difference in the
            | benchmark alongside a more than 10x difference in TDP)?
            | 
            | Does Qualcomm or Samsung have an M1 beater in their kitty?
        
             | hajile wrote:
             | It costs more to move 2 bytes into the CPU than to actually
             | add them. As you go bigger, you spend an increasingly
             | larger amount of time and energy moving data as opposed to
             | actually calculating things.
             | 
             | Anandtech numbers showed 50-89% of total power consumption
             | for the 7601 being used for Infinity Fabric. With 89w
             | remaining spread among 32 cores, that's a mere 2.78w per
             | core or 1.39w per thread at an all-core turbo of 2.7GHz.
             | 
             | Oh, I'd note that the 7601 is a 14nm Zen 1 part.
             | 
             | https://www.anandtech.com/show/13124/the-amd-
             | threadripper-29...!
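              | 
              | A back-of-the-envelope sketch of that arithmetic (the 89w
              | core budget is taken from the comment above; treat it as
              | an assumption, not a measurement):
              | 
              |     #include <stdio.h>
              | 
              |     int main(void) {
              |         /* EPYC 7601: 32 cores, 2 threads per core.
              |            Assume ~89 W left for the cores after the
              |            Infinity Fabric takes its share. */
              |         double core_budget_w = 89.0;
              |         int cores = 32, threads_per_core = 2;
              | 
              |         printf("per core:   %.2f W\n",
              |                core_budget_w / cores);
              |         printf("per thread: %.2f W\n",
              |                core_budget_w / (cores * threads_per_core));
              |         return 0;
              |     }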
        
       | obblekk wrote:
       | I'm really curious what Xcode and IntelliJ compile times look
       | like for real world repos.
       | 
       | It's a single use case, but by far the most common one for me
        | where I genuinely feel my productivity slowed by my computer.
       | Hopefully good news there as well.
        
         | arvinsim wrote:
         | As someone using Jetbrains products, I would get no less than
         | 32GB for my machine :)
        
         | stu2b50 wrote:
          | Dave2D had Xcode benchmarks and it seems like... it's faster
          | than the iMac Pro, and even faster than his friend's
          | Hackintosh. It might be the fastest, or almost the fastest,
          | out of Apple's entire lineup.
         | 
         | https://youtu.be/XQ6vX6nmboU at minute 3
        
           | alfonsodev wrote:
            | I'm looking at this repo [1] for reference. Something must be
            | off, because a mid-2015 machine takes half the time on an
            | incremental build. Of course the project matters, but Dave
            | doesn't reveal the name of the app they are compiling, or I
            | missed it.
           | 
           | [1] https://github.com/ashfurrow/xcode-hardware-performance
        
       | mtgx wrote:
       | It seems that Apple kept its eyes on the goal of beating Intel,
       | while underestimating AMD (like just about everyone else).
       | 
        | Still, Apple now has the #2 fastest CPU on the market, and with a
        | different ISA. Intel... #3. Oh, how the mighty have fallen.
        | 
        | At least AMD won't get to rest on its laurels now, as Apple will
        | definitely try to surpass Zen 4 too, or at least Zen 5.
        
         | hu3 wrote:
         | I wonder what will happen when AMD launches 5nm processors.
        
           | wmf wrote:
           | Apple will be releasing 3nm M3 processors that will probably
           | still be faster.
        
           | ogre_codes wrote:
           | It's always a slippery slope comparing future products to
           | present day products. Apple has additional CPUs coming out
           | over the next 2 years as well. It's going to be an
           | interesting couple of years.
        
         | jmull wrote:
         | Well, Apple released the M1 as their low-end chip, putting it
         | in the entry-level slot of their low-power Macs.
         | 
         | They may have something with substantially more power in store
         | soon.
        
           | vel0city wrote:
            | The cheapest MacBook Air with an M1 chip is $1,000. The
            | cheapest MacBook Pro with an M1 chip is $1,300.
           | 
           | I don't consider a $1,000+ laptop "entry level" or "low end".
           | These are high-end machines.
        
             | Toutouxc wrote:
             | High-end, maybe. Not high-performance. There are segments
             | and use cases where Apple simply doesn't have an offering.
        
             | alwillis wrote:
             | _I don 't consider a $1,000+ laptop "entry level" or "low
             | end". These are high-end machines._
             | 
              | In the Apple ecosystem, $1,000 laptops are low-end
              | devices.
              | 
              | The iMac Pro [1] and the Mac Pro [2] are high-end,
              | professional-level machines. The iMac Pro _starts_ at
              | $5,000; the Mac Pro at $6,000.
              | 
              | The biggest difference is that Apple doesn't sell the
              | commodity hardware that virtually every PC OEM does. That
              | was a deliberate choice many years ago.
              | 
              | BTW, the M1 Mac mini starts at $699 and blows away all the
              | PCs in its class, including those that cost more.
             | 
             | Some Hollywood studios have already talked about replacing
             | much more expensive computers with the Mac mini because
             | it's so fast [3]. No joke.
             | 
             | To be clear, you're not going to render a full-length movie
             | in 4k on an M1-based Mac mini--that's what the Mac Pro is
              | for. But it can certainly handle less demanding 4K editing
              | tasks that would have been unthinkable on an under-$1,000
              | machine a year ago.
             | 
             | [1]: https://www.apple.com/shop/buy-mac/imac-pro
             | 
             | [2]: https://www.apple.com/shop/buy-mac/mac-pro
             | 
             | [3]: _Hollywood thinks new Mac mini 'could be huge' for
             | video editors_:
             | https://appleinsider.com/articles/20/11/12/hollywood-
             | thinks-...
        
             | ogre_codes wrote:
             | At some point you have to recognize that different product
             | lines and market segments have different entry points.
             | Otherwise you end up comparing everything to the Raspberry
             | Pi, and every computer is "High-End".
        
               | romanoderoma wrote:
               | Well, it depends
               | 
               | The Renault Zoe EV and Tesla Model 3 have the same price.
               | 
                | The Renault Zoe is very low end compared to a Tesla, so
                | what makes them cost the same?
                | 
                | An EV includes technology that is very costly, and even a
                | mid-range car ends up costing as much as a base offer in
                | a higher segment (because the Zoe EV is the premium offer
                | in its segment)
                | 
                | You can't get any lower than that
                | 
                | A Zoe with an ICE engine in fact costs 10k less.
                | 
                | The exact same car.
                | 
                | There is no equivalent for Tesla; Tesla does not make
                | cars in that segment, and even if they could, they
                | wouldn't do it.
                | 
                | Said another way: a low-end Mac costs as much as, and has
                | the specs of, a high-end machine
                | 
                | Heavily crippled by the manufacturer (only 16GB of RAM
                | tops?) but definitely not low end, not even for Apple
                | 
                | It's their base offer for the high-end segment
                | 
                | Which is very different from saying it's a low-end
                | machine.
                | 
                | They aren't low end, they are simply not premium (there
                | isn't going to be a big difference in performance
                | between the two, only different positioning, equipment
                | and fewer artificial limitations from the manufacturer)
               | 
               | They are like AMD K6 CPUs that you could overclock using
               | a pencil
               | 
               | The conclusions of this review support the idea that the
               | specs of the Mac mini are not far from what we could
               | expect from the pro models
               | 
               | > _In the new Macbook Pro, we expect the M1 to showcase
               | similar, if not identical performance to what we've seen
               | on the new Mac mini_
        
             | mikey_p wrote:
             | You may not consider it such, but they are the entry level
             | Apple machines at the lowest end of their product range.
        
               | acmecorps wrote:
                | You have the Mac mini, which is way below $1k, but for
                | mobile macOS? Yes, $1k is the minimum.
        
               | rowanG077 wrote:
                | That doesn't mean they are low-end laptops. A Rolls-
                | Royce is never a low-end car. They are the cheapest
                | products Apple offers, but they are still high-end.
        
       | mekkkkkk wrote:
       | I'm not a big fan of Apple, but this makes me genuinely happy.
       | Seeing discussions and benchmarks of CPUs where Intel isn't even
       | a contender is fantastic. What a lovely day!
        
       | akritrime wrote:
       | The last sentence on the first page has a typo, I think. I am
       | guessing it would be `tough competition`, not `though
       | competition`.
        
         | joshstrange wrote:
          | I never understand downvotes in this situation. Like, are you
          | mad someone is calling out typos? I mean, I'm not surprised; I
          | got downvotes for literally the same thing [0] a few days ago
          | when AnandTech published another article full of typos... I
          | like their reporting, but the typos and the multi-page nonsense
          | are a huge turnoff.
         | 
         | [0] https://news.ycombinator.com/item?id=25052892
        
           | MBCook wrote:
           | I would assume the down votes are because this doesn't really
           | add to the discussion at all. If you spot typos why not
           | notify the original website instead?
        
             | akritrime wrote:
             | That's what I hoped to do with the original comment. I
             | didn't want to send them an email for something this
             | trivial and I was just leaving the comment in case anyone
             | from Anandtech stumbled upon it. The typo is not really an
              | issue and my comment was not a criticism. I was just trying
              | to be helpful, but I can see how it might come across
              | otherwise when my comment is just about the typo and not
              | the content of the post.
        
           | akritrime wrote:
            | Honestly I don't mind. I was not commenting on the quality of
            | the post, nor was the typo that much of an issue. I have
            | seen people from Anandtech active on Hacker News and I just
            | thought this was a good way of reaching them and letting
            | them know about something trivial in their posts.
        
       | [deleted]
        
       | andy_ppp wrote:
        | I think I'm nearly ready to buy one of these... does anyone know
        | of any benchmarks showing a JavaScript test suite running? It'd
        | be interesting to compare to my current machine... the
        | performance of Node being good (and the sort of stop-start
        | choppy test suite stuff) would probably make me go for it.
        
       | Thaxll wrote:
        | I don't think there are enough tests yet; why is it only
        | Cinebench and Geekbench? Show us real tests with ffmpeg, gcc,
        | etc.
        
         | HatchedLake721 wrote:
         | There are more pages behind that first page, see at the bottom
         | of the article.
        
         | d3nj4l wrote:
         | They state in the article:
         | 
         | > As we've had very little time with the Mac mini, and the fact
         | that this not only is a macOS system, but a new Arm64-based
         | macOS system, our usual benchmark choices that we tend to use
         | aren't really available to us.
         | 
          | I think most other benchmarks hadn't been compiled for macOS
          | on ARM yet.
        
         | lizknope wrote:
         | There are benchmarks for SPEC INT 2006 which includes a gcc
         | benchmark.
        
         | yxhuvud wrote:
         | Does ffmpeg even compile on M1 yet?
        
         | zachberger wrote:
         | There are multiple pages in the review with more benchmarks
         | than the ones you mentioned.
        
         | apetrovic wrote:
         | WebKit compilation:
         | https://twitter.com/panzer/status/1328700636926332928?s=21
        
       | dtech wrote:
       | Impressive results from Apple, and another well-deserved kick in
       | the teeth for Intel after years of stagnation. The coming decade
       | is going to finally see some interesting developments in the CPU
       | market again.
        
         | taftster wrote:
         | Right, I think in the end, this is going to show just how bad
         | monopolies (or near-monopolies) can be for innovation. These
         | are super impressive results, just hoping that the rest of
          | Apple (software, developer relations) can turn away from the
          | draconian future they are currently heading toward.
        
       | formerly_proven wrote:
       | The Cinebench R23 results seem kinda weird to me. The 5950X would
       | have almost a 40 % clock speed advantage over the M1 (~5 GHz 1C
       | vs 3.2 GHz 1C), yet the M1 is only about 8 % slower in an
       | entirely ALU-limited SIMD benchmark? This suggests the M1 core
       | has like 50 % more EUs and achieves much higher throughput than
       | Zen 3.
       | 
       | The SPEC results are... decisive to say the least. Without Zen 3,
       | x86 CPUs would look, well... like shit. All Intel offerings,
       | including the Sunny Cove part (so not a 7 year old uarch), look
       | uniformly bad across all workloads.
        
         | phire wrote:
         | You overestimate how ALU limited Cinebench is.
         | 
         | I managed to find an AVX vs AVX off benchmark run for Cinebench
         | r20 [1]. Going from 128bit SSE to 256bit AVX and doubling the
         | ALUs only results in a 10-12% increase in performance.
         | 
         | I assume this has to do with how each SIMD lane of calculation
         | might need to branch independently, limiting the performance
         | speedup from just throwing wider SIMD ALUs at it.
         | 
         | [1] https://www.techpowerup.com/forums/threads/post-your-
         | cineben...
        
           | diimdeep wrote:
            | ELI5 please: how does this change things for the M1?
        
             | phire wrote:
              | There is more than one way to scale. Over the last decade,
              | Intel has been pushing wider SIMD.
              | 
              | Instead of making your CPU able to execute more
              | instructions per cycle, why not make each instruction do
              | more work? SSE packs four floats/ints or two doubles/longs
              | into a single 128-bit register, and then you can apply the
              | same ALU operation to each lane.
              | 
              | It works great on certain workloads.
              | 
              | With AVX, Intel increased the size of these registers to
              | 256 bits (eight floats) in 2011, and is currently pushing
              | AVX-512, which doubles the width again (16 floats).
              | 
              | Apple, and ARM in general, are limited to 128-bit vector
              | registers (though there are plans to increase that in the
              | future).
              | 
              | Cinebench is well known as a benchmark which takes
              | advantage of the 256-bit AVX registers, and some people
              | have speculated that Apple's M1 might be at a significant
              | disadvantage because of this, with just half the ALU
              | throughput.
              | 
              | But these numbers show that while Cinebench gets a notable
              | boost from AVX, it's not as large as you might think (at
              | least on this workload), allowing the M1's IPC advantage to
              | shine through.
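              | 
              | As a rough illustrative sketch (not code from Cinebench;
              | the function names are made up for this example), the
              | width difference looks like this with SSE vs AVX
              | intrinsics (the AVX version needs -mavx to compile):
              | 
              |     #include <immintrin.h>
              | 
              |     /* SSE: one instruction works on four packed floats
              |        (128 bits). */
              |     void add4(float *dst, const float *a, const float *b) {
              |         __m128 va = _mm_loadu_ps(a);
              |         __m128 vb = _mm_loadu_ps(b);
              |         _mm_storeu_ps(dst, _mm_add_ps(va, vb));
              |     }
              | 
              |     /* AVX: the same idea, but eight packed floats
              |        (256 bits) per instruction. */
              |     void add8(float *dst, const float *a, const float *b) {
              |         __m256 va = _mm256_loadu_ps(a);
              |         __m256 vb = _mm256_loadu_ps(b);
              |         _mm256_storeu_ps(dst, _mm256_add_ps(va, vb));
              |     }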
        
               | hajile wrote:
               | I'd note that both arguments have merit.
               | 
               | A SIMD is basically controller + ALUs. A wider SIMD gives
               | a better calculation to controller ratio. Fewer
               | instructions decreases pressure on the entire front-end
               | (decoder, caches, reordering complexity, etc). This is
               | more efficient overall _if fully utilized_.
               | 
               | The downsides are that wide units can affect core
               | clockspeeds (slowing down non-SIMD code too), programmers
               | must optimize their code to use wider and wider units,
               | and some code simply can't use execution units wider than
               | a certain amount.
               | 
               | Since x86 wants to decrease decode at all costs (it's
               | very expensive), this approach makes a lot of sense to
               | push for. If you're doing math on large matrices, then
               | the extra efficiency will make a lot of sense (this is
               | why AVX512 was basically left to workstation and HPC
               | chips).
               | 
               | Apple's approach gambles that they can overcome the
               | inefficiencies with higher utilization. Their decode
               | penalty isn't as high which is the key to their strategy.
               | They have literally twice the decode width of x86 (8-wide
               | vs 4-wide -- things get murky with x86 combined
               | instructions, but I believe those are somewhat less
               | common today).
               | 
                | In that same matrix code, they'll have (theoretically) 4x
                | as many instructions for the same work as AVX512 (2x vs
                | AVX2), so we'd expect to see the x86 approach pay off
                | here. In more typical consumer applications, code is more
               | likely to use intermittent vectors of short width. If the
               | full x86 SIMD can't be used, then the rest is just
               | transistors and power wasted (a very likely reason why
               | AMD still hasn't gone wider than AVX2).
               | 
                | To keep peak utilization, the M1 has a massive
                | instruction window (a bit less than 2x the size of
                | Intel's and close to 3x the size of AMD's at present).
                | This allows it to look
               | far ahead for SIMD instructions to execute and should
               | help offset the difference in the total number of
               | instructions in SIMD-heavy code too.
               | 
               | Now, there's a caveat here with SVE. Scalable vector
               | extensions allow the programmer to give a single
               | instruction along with the execution width. The
               | implementation will then have the choice of using a
               | smaller SIMD and executing a lot or a wider SIMD and
               | executing fewer cycles. The M1 has 4 floating point SIMD
               | units that are supposedly identical (except that one has
               | some extra hardware for things like division). They could
               | be allowing these units to gang together into one big
               | SIMD if the vector is wide enough to require it. This is
               | quite a bit closer to the best of both worlds (still have
               | multiple controllers, but lose all the extra instruction
               | pressure).
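                | 
                | For reference, this is roughly what width-agnostic SVE
                | code looks like in C (a sketch only; note the M1 itself
                | ships 128-bit NEON rather than SVE, so this illustrates
                | the ISA idea, not the M1; compile with an SVE-enabled
                | -march):
                | 
                |     #include <arm_sve.h>
                |     #include <stdint.h>
                | 
                |     /* The same binary runs on 128-bit or wider SVE
                |        hardware; svcntw() reports how many 32-bit lanes
                |        each vector holds on this implementation. */
                |     void add_f32(float *dst, const float *a,
                |                  const float *b, int64_t n) {
                |         for (int64_t i = 0; i < n;
                |              i += (int64_t)svcntw()) {
                |             svbool_t pg = svwhilelt_b32_s64(i, n);
                |             svfloat32_t va = svld1_f32(pg, a + i);
                |             svfloat32_t vb = svld1_f32(pg, b + i);
                |             svst1_f32(pg, dst + i,
                |                       svadd_f32_x(pg, va, vb));
                |         }
                |     }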
        
             | vdfs wrote:
             | ELI5: Some things that ARM can do in one instruction, can
             | be done with multiple instructions on x86
        
               | CyberDildonics wrote:
               | That's absolutely not the conclusion here.
               | 
               | One person thought that benchmarks were saying that the
               | M1 had strong SIMD performance, but the reality is that
               | cinebench (and in fact most renderers) doesn't use SIMD
               | very effectively when looking at the whole process, and
               | the assumption that it demonstrates SIMD performance is
               | not correct.
        
         | JAlexoid wrote:
         | Intel's CPUs have been getting remarkably worse for a long time
         | now.
         | 
          | They trail the software improvements. To give you an anecdote:
          | I got a ThinkPad T430s and it made my work feel 10x faster
          | (Java EE development in 2012). I got my next ThinkPad, a P51s,
          | in 2017, and it was just one huge disappointment. It felt like
          | Intel was stepping back. I now have a ThinkPad P1 and the
          | computing power is still just OK, though still better than the
          | U-class i7 in the P51s.
          | 
          | I'll be happy to knock Apple for marketing BS ("3 times
          | faster", etc). But Intel has shown that they just need to
          | crumble. I hope that my next laptop is not using Intel's ISA
          | or cores.
        
           | bigboii wrote:
            | > I got my next ThinkPad P51s in 2017 - and it was just one
            | huge disappointment. It felt like Intel was stepping back.
            | 
            | The microcode bugs cut performance by 20-30%, varying with
            | your workloads. And it comes with a 4k display? That would
            | also contribute to a performance loss, depending on what
            | you're doing.
            | 
            | That 2.8-3.9 CPU would be equal to a 2.3-3.3 before the bug,
            | afaik. That's _hardly_ faster than 10 year old duals, wow!
            | 
            | >> they just need to crumble.
            | 
            | > Less competition will only make things worse for us
            | consumers. ¯\\_(ツ)_/¯
        
             | fakedang wrote:
             | I'd say Intel has been enjoying the fruits of its pseudo
             | monopoly for far too long now.
        
             | JAlexoid wrote:
             | > Less competition will only make things worse for us
              | consumers. ¯\\_(ツ)_/¯
             | 
              | Intel is so large that it is using up too much of the
              | production capacity for anyone to enter the market. Intel
             | crumbling = more resources for new players to get lower
             | cost manufacturing capacities.
        
       | EmmEff wrote:
       | Is it safe to assume that future ASi CPUs for desktops will have
       | just Firestorm cores and no Icestorm, which should further
       | increase MT performance?
       | 
       | I know Apple was trying to get to market quickly, but I fail to
       | understand why we need Icestorm cores in a non-mobile CPU,
       | especially with this already (really) low TDP.
        
         | AgloeDreams wrote:
          | More likely they will ship more Firestorm cores and keep the
          | Icestorm cores. Their future chip designs will likely be
          | shared across desktop/mobile. Keeping Icestorm lowers the
          | overall cost by allowing them to ship more chips, and gives
          | about a 30% performance gain in multicore.
          | 
          | Far more interesting to me is the idea that in heavy use, the
          | Icestorm cores can run the OS, notifications and all that,
          | allowing full uninterrupted use of the Firestorm cores. Also,
          | when the Mac is idle it uses far less power.
          | 
          | Basically, I fail to see a reason not to keep them :).
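          | 
          | A minimal sketch of how software can cooperate with that
          | split: on macOS, work submitted at background QoS via
          | libdispatch is what the scheduler tends to keep on the
          | efficiency cores (actual core placement is the OS's call;
          | this only shows the hint):
          | 
          |     #include <dispatch/dispatch.h>
          |     #include <stdio.h>
          | 
          |     static void background_work(void *ctx) {
          |         /* Background-QoS work is a candidate for the
          |            efficiency (Icestorm) cores. */
          |         printf("running at background QoS\n");
          |         dispatch_semaphore_signal((dispatch_semaphore_t)ctx);
          |     }
          | 
          |     int main(void) {
          |         dispatch_semaphore_t done = dispatch_semaphore_create(0);
          |         dispatch_queue_t q = dispatch_get_global_queue(
          |             QOS_CLASS_BACKGROUND, 0);
          |         dispatch_async_f(q, done, background_work);
          |         dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
          |         return 0;
          |     }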
        
       | d3nj4l wrote:
        | Dave2D found the Air to be on par with the Pro, at least for
       | tasks that took under ~8.5 minutes. It only really throttled
       | after that point, according to him.
        
         | Aperocky wrote:
         | So.. if I place it on a slab of ice, it would work the same as
         | Pro?
         | 
         | Tbh, the only reason I didn't even think of buying a pro is
         | because I don't want the touch bar. I might still buy an air if
         | there's no touch bar on the pro, but the decision will be a lot
         | harder.
        
           | d3nj4l wrote:
            | Both currently released M1 MacBook Pros have the touch bar,
            | so it looks like your decision will be quite simple!
        
           | [deleted]
        
         | heipei wrote:
         | I watched and read multiple reviews and Dave2D seems to be the
         | only one who tried to quantify the throttling to some extent,
         | all the others only had useless statements like "The Pro will
         | probably be able to sustain unthrottled workloads for longer
          | thanks to its active cooling" - No shit, Sherlock. For me the
         | fact that it only throttles after 8-9 minutes (!) of heavy use
         | is going to be the deciding factor that will allow me to go
         | with the Air (and actual physical function keys) over the Pro,
         | so thanks Dave.
        
           | runeks wrote:
           | Couldn't the Pro just turn on its fans after 8-9 minutes (to
           | avoid throttling), thus giving the best of both worlds?
        
             | rsynnott wrote:
             | I assume it does. My 13" 2016 MBP doesn't turn on its fans
             | much unless it's busy.
        
               | manmal wrote:
               | My 16" MBP is running its fans basically all day (iOS dev
               | work and ARQ backups)
        
               | rsynnott wrote:
               | Yeah, I think the 45W laptops always run them, even if
               | sometimes very slowly. The smaller laptops have been able
               | to turn them off completely for a while, though, when not
               | very busy.
        
             | ashtonkem wrote:
             | That's exactly what it does.
             | 
             | But 8-9 minutes of full 100% CPU is a relatively rare
             | occurrence for the vast majority of users. Developers might
             | occasionally do that, but it will be _very_ language and
             | project dependent.
        
             | molszanski wrote:
             | Best of both of worlds is:
             | 
             | - active cooling
             | 
             | - lack of a touchbar
        
               | grovellogic wrote:
               | I've been wondering if someone could make an active
                | cooling dock for the MacBook Air. I was even thinking
               | the M1 wattage is low enough that you could have a
               | thermoelectric cooler lowering case temp down to room
               | temp.
        
               | 93po wrote:
               | I mean if you're desperate to get it to compile in 20
               | minutes instead of 25 for a particular occasion, you
               | could just grab a bag of peas from the freezer and set
               | the laptop on them.
        
               | Siira wrote:
               | The touchbar is pretty great if you program it yourself
               | using, e.g., BetterTouchTool. I especially love the
               | clipboard widget - works fantastic with VIM/EVIL.
        
               | xenonite wrote:
               | That would be the Mac Mini.
               | 
               | But seriously, I share your opinion on laptop keyboards:
               | regular function keys please.
        
           | masklinn wrote:
           | Somebody on twitter reported that during Rust compilation the
            | Air started throttling a bit (a 20-30% hit) after 3-4 min. The
           | Pro doesn't throttle.
        
         | ghaff wrote:
         | That's one data point that's particularly interesting to me. As
         | someone who (normally) travels a great deal, I'd probably go
         | with the Air unless there were real throttling compromises,
         | especially given that I use a different computer for multimedia
         | editing at home.
        
           | murukesh_s wrote:
            | Wish they'd re-introduce the discontinued 12-inch MacBook
            | with the same specs as the Air. It weighed only 970 grams vs
            | 1.29 kg for the Air. In fact the Air feels bulky compared to
            | other lightweight laptops like the LG Gram, not to mention
            | the design is outdated. Always wondered why Apple killed the
            | smaller model. Perhaps they wanted to push the iPad Pro, so
            | they killed off the netbook line. The wannabe traveller
            | inside me keeps drooling over the 12 inch whenever I see it
            | in someone's hands. It feels so light and compact. With the
            | new M1 silicon, it's the ideal time to bring it back. I would
            | grab it without a second thought.
        
             | Tagbert wrote:
             | The 12" MacBook could not be updated to newer Intel
             | chipsets due to thermal issues. The single port was also a
             | limitation. Once Apple upgraded the Air to retina, a large
             | part of the market for the 12" was lost. They were too
              | close to each other and cross-competed, except for the
              | super-portable use cases, a market which was not large
              | enough.
              | 
              | This model of Air is obviously a transition product with
              | new guts in an old shell. I suspect that as Apple
              | introduces fully redesigned, second-generation Apple
              | Silicon products, you might see something that is closer
              | to the 12" MacBook.
        
               | read_if_gay_ wrote:
               | I'm also hoping for thinner bezels as the current models'
               | ones are just huge compared to Dell's XPS line for
               | example. It's slowly becoming obvious that the design has
               | barely changed since 2016 or so. The 16" model was a step
               | in the right direction, but it's still not even close to
               | what Dell is delivering.
               | 
               | It'd be amazing if they managed to squeeze a 13" screen
               | into the old 12" form factor - you'd still get great
               | battery life thanks to the M1.
        
               | snowwrestler wrote:
               | It seems like Apple is capable of it--look at the bezels
               | on a new iPhone or iPad. But it would certainly require a
               | whole new shell, which probably takes a while for Apple
               | to design and ramp up because of all the machining
               | involved.
        
             | OkGoDoIt wrote:
             | I'm also surprised they didn't bring that back with an M1.
             | Here's to hoping it will be released next year to balance
             | out the higher end 16" pro and whatever others come out
              | next year.
             | 
             | I had the 12" MacBook for a couple years and the form
             | factor was amazing. I backpacked around the world with it.
             | But it was so underpowered, it was barely useful. I found
             | myself using my phone more and more because it was less
             | frustrating. I would love to see what an M1-powered 12"
             | MacBook could be like!
        
             | sooheon wrote:
             | The 12 inch is still my favorite MacBook experience, having
             | owned pretty much every form factor since pre-unibody white
             | plastic. Can't wait to see what they can do in that hyper
             | minimal portable niche with Apple Silicon.
        
               | murukesh_s wrote:
                | Haven't used it, but I can feel it. You are making me
                | want it more... wish Steve were alive; he would perhaps
                | have kept it around, at least for bragging rights as the
                | smallest, lightest laptop on the planet. I still remember
                | Steve Jobs introducing the Air by pulling it out of an
                | office envelope.
        
             | ghaff wrote:
             | I'm definitely part of the target market (well, depending
             | upon my mood) for a <13" laptop for travel. I've never been
             | able to make an iPad-based workflow work for me. If nothing
             | else I spend too much time with my laptop on my, well, lap
             | and nothing with a removable keyboard works for me.
             | 
             | Based on the data I've seen so far, I'm not sure why they
              | even did a Pro variant with a fan. Even if the market for an
             | 11-12" model is smaller I'm not sure why they didn't do
             | that instead. I was sure that was going to be the reason
             | they didn't refresh their 12" Intel system.
        
               | djrogers wrote:
               | > Based on the data I've seen so far, I'm not sure why
              | they even did a Pro variant with a fan.
               | 
               | The 'pro' variant released was the low-end 13, aka the 2
               | port, formerly the 'macbook escape'. The 13" line has
               | been bifurcated since 2016, with this one firmly lower-
               | spec'd and powered.
               | 
               | It's very likely that the '4 port', or high-end 13" pro
               | will make more use of the active cooling, so it was
               | likely worth it to develop the new laptop with it.
        
             | dogma1138 wrote:
             | They should probably release an 11" version of the air, I'm
             | not sure a 12" having the same specs as the Air would be
             | viable.
             | 
              | However, the interesting part would be what they're gonna
              | do with their iPad Pro line; at this point I don't see a
              | reason for it not to run Big Sur (or the Bigger Sur they'll
              | release next year) and compete directly with the Surface.
             | 
             | What I see Apple doing is the following:
             | 
             | iPhone/iPad non-pro continuing to use A series SoCs and run
             | iOS
             | 
             | iPad Pro migrate to M series SoCs and become what is
             | essentially Apple's Surface Pro
             | 
             | Macbook Air 13" and 11" (possibly drop to a single 12"
             | model) with M series SoCs this essentially will be the
             | Surface Laptop/Book competitor
             | 
             | Macbook Pro's will continue as they are 13" and 16" models,
             | if Apple goes for 11" and 13" MBAs they might move the MBP
             | to 14" and 16".
             | 
             | Without discrete GPUs and essentially no way to "upgrade"
             | the CPU to a higher model I don't really see the MBP 13"
             | being viable in the long term tbh, I think they'll need a
             | model that will differentiate it much more from the MBA and
             | unless Apple starts binning their future M series SoCs much
             | more in line with Intel and AMD I don't see them having too
             | much of a range here for upgrades.
             | 
             | So alternatively I also see them dropping the 13" MBP
             | altogether and having only a 15" or 16" on whilst the Air
             | will occupy the smaller form factors.
        
               | ghaff wrote:
               | Convergence can be overrated. Arguably Apple finally made
               | tablets mainstream because they didn't feel the same need
               | to maintain compatibility with their desktop/laptop line
               | that others did.
               | 
               | But it's hard not to see some sort of convergence between
               | mobile, laptops, and desktops over time.
        
               | dogma1138 wrote:
                | They are doing convergence now by allowing iOS apps on
                | Macs. I can definitely see the iPad Pro line being moved
                | closer to macOS from a UI perspective, especially since
                | the Pencil now works on all iPads.
        
             | em500 wrote:
             | There are several rumours about a return of the 12-inch in
             | 2021H1.
        
             | tinodotim wrote:
              | That would be a great device to also include a touchscreen
              | in a Mac for the first time... after all, macOS is getting
              | more and more touch-capable UI and got iOS app support. :)
              | 
              | But like you said, it would likely eat into the iPad market
              | - on the other hand, as long as they don't make it a
              | 2-in-1, the iPad should still have more than enough reason
              | to exist.
        
         | em500 wrote:
         | Here's one data point: a WebKit compile took 25min on the Air
         | vs 20min on the Mini/Pro. That 25min is still a bit faster than
         | the Intel 16-inch Pro, which took 27min and waaay faster than
         | Intel 13-inch Pro at 46min.
         | 
         | The crazy thing is that both M1 MacBooks still had 91% battery
         | left after the compile, vs 61% on the 16-inch Pro and 24% on
         | the 13-inch Intel Pro.
         | 
         | [1] https://techcrunch.com/2020/11/17/yeah-apples-m1-macbook-
         | pro... -> "Compiling WebKit"
        
           | nwlieb wrote:
           | Is this a 1-1 comparison? If the ARM compile is compiling to
           | ARM binaries then there might be less work/optimizations
           | since it is a newer architecture. Seems like a test with two
           | variables that changed. Would be interesting to see them both
           | cross-compile to their respective opposite archs.
        
             | mlyle wrote:
             | Maybe not, but A) it's close-- most of the work of
             | compiling is not microarchitecture-level optimizations or
             | emitting code, and B) if you're a developer, even if some
             | of the advantage is being on an architecture that it's
             | easier to emit code for... that's still a benefit you
             | realize.
             | 
              | It's worth noting that cross-compiling is definitely harder
              | in many ways, because you can't always evaluate constant
              | expressions at compile time the same way your runtime code
              | will, and you have to jump through hoops.
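              | 
              | A tiny made-up example of that: these constants are folded
              | at compile time, but their values depend on the target, so
              | a cross-compiler has to use the target's rules rather than
              | just evaluating them with the host's:
              | 
              |     #include <limits.h>
              |     #include <stdio.h>
              | 
              |     int main(void) {
              |         /* sizeof(long) and LONG_MAX differ by target
              |            (e.g. 8 bytes on x86-64 Linux, 4 bytes on
              |            32-bit ARM), so the folded values change
              |            with the target triple. */
              |         printf("sizeof(long) = %zu, LONG_MAX = %ld\n",
              |                sizeof(long), (long)LONG_MAX);
              |         return 0;
              |     }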
        
               | marmaduke wrote:
               | Hm my experience was that compiling C on arm was always
               | super fast compared to x86, because the latter had much
               | more work to do.
        
               | mlyle wrote:
                | Four runs of each (zsh `time`):
                | 
                | jar% time x86_64-linux-gnu-gcc --std=c99 -O3 \
                |     -c insgps14state.c -I inc -I ../../shared/api
                | 0.97s user 0.02s system 99% cpu 0.992 total
                | 0.93s user 0.03s system 99% cpu 0.965 total
                | 0.94s user 0.01s system 99% cpu 0.947 total
                | 0.92s user 0.04s system 99% cpu 0.955 total
                | 
                | jar% time arm-linux-gnueabihf-gcc --std=c99 -O3 \
                |     -c insgps14state.c -I inc -I ../../shared/api
                | 1.43s user 0.03s system 99% cpu 1.458 total
                | 1.46s user 0.03s system 99% cpu 1.486 total
                | 1.55s user 0.04s system 99% cpu 1.587 total
                | 1.44s user 0.03s system 99% cpu 1.471 total
        
               | throwaway894345 wrote:
               | As someone who knows relatively little about this, I'm
               | very curious why this is downvoted. It seems like a
               | rebuttal would be enlightening.
        
             | attractivechaos wrote:
             | Apple has been optimizing the compiler for a decade for
             | iOS.
        
             | freehunter wrote:
             | If everything else is the same, that seems like a solid
             | reason to prefer the ARM architecture even setting aside
             | 1:1 comparisons. Isn't faster compilation and execution the
             | whole point of a faster processor?
        
               | kag0 wrote:
               | The assertion is that compilation might be faster since
               | there are fewer optimizations, and therefore runtime
               | would be slower.
        
           | zaroth wrote:
           | How is that anything less than _mind blowing_?
           | 
           | Twice as fast, using 1/10th the battery life.... and that's
           | for a part that costs Apple $70 instead of, what, $400?
        
             | bengale wrote:
              | Can you imagine how frustrating it must have been at Apple,
              | knowing what you had and having to deal with Intel's crap
              | over the last year or two?
        
           | tonyhb wrote:
            | This is insane perf/watt. x86 backwards compatibility may
            | have gotten us to where we are, but it's certainly holding
            | things back. Arm is looking great, and maybe it's time for
            | x86 to die.
        
             | JAlexoid wrote:
             | It's time for Intel and x86 to die.
             | 
             | But I would also be a little wary, because ARM systems are
             | way more locked down than x86 systems today.
        
               | vmception wrote:
                | Is some of this because of those processor-level
                | flaws/exploits where the fixes resulted in disabling some
                | processor features, making them slower and less
                | efficient?
                | 
                | With only a completely new/different architecture getting
                | those advances back?
        
               | comeonseriously wrote:
               | Why does Intel need to die? Sure they're not exactly the
               | company they used to be, but would it be enough for them
               | to just move away from x86? I'm just thinking I don't
               | want just one or two or three companies doing procs.
        
               | AmericanChopper wrote:
               | Intel stagnated and at the same time started implementing
               | some rather anti-consumer practices. This allowed AMD to
               | take the performance lead off them with their latest
               | generation of products. It's fantastic that the market
               | for processors is so competitive. I've grown to not like
               | Intel very much recently, but I'm glad they're here.
               | They'll keep the pressure on for further innovation, so
               | AMD will either need to keep up or be overtaken again.
               | Either of which is a good outcome for consumers.
        
               | JAlexoid wrote:
               | Resource allocation.
               | 
               | Intel dying would free up resources for development by
               | other companies.
        
               | ashtonkem wrote:
               | They don't need to die, but if they don't begin to
                | compete they simply will die.
        
               | StreamBright wrote:
                | Absolutely not. We need more competition, because the #1
                | reason we got to this situation is monoculture and a
                | single platform (x86). We need Apple to succeed in
                | creating an alternative ARM-based desktop/laptop
                | platform, and for more competition we could add MIPS64
                | from China to the mix. I am really hoping that by 2025 we
                | are going to have 3 major platforms available for end
                | customers, so that there is real competition.
        
               | Hamuko wrote:
               | Isn't having a whole bunch of different processor
               | architectures at the same time kind of bad for end-users?
        
               | spijdar wrote:
                | This really depends. Once upon a time, at least in the
                | UNIX(tm) world, there was a plethora of ISAs, and this
                | was the environment where ideas like Java really made
                | sense. Write once, run anywhere.
               | 
               | Most OSes are still pretty well situated to handle this.
               | Java remains, and is easily cross platform. I can run
               | Java-based proprietary games like Minecraft on my POWER9
               | desktop, despite no-one involved probably ever
               | considering ppc64le a valid target.
               | 
               | The CLR on Windows is also pretty easily cross-platform,
               | although it won't help legacy x86 PE executables. Apple
               | has solved this for ages on the tooling side, encouraging
               | production of "fat" binaries with many arches since OS X
               | was NeXT, and your .app packages needed to run on x86 +
               | m68k + SPARC + PA-RISC.
               | 
               | Emulators like Rosetta (and qemu's usermode emulation)
               | can fill the gap of legacy executables, while these other
               | technologies can make the end-user experience good. Of
               | course, that's only if a) someone writes your platform's
               | equivalent of Rosetta, and b) developers write
               | crossplatform apps.
               | 
               | So, the answer depends on how cynical or optimistic you
               | are :-)
        
               | JAlexoid wrote:
                | And where are those chips going to be made? The issue
                | with Intel's dominance is its complete dominance on the
                | supply side as well.
                | 
                | You fail to realize that this isn't like 3D printing, or
                | other low-volume manufacturing. You can't just set up a
                | 100nm Si lithography lab in your spare room and churn out
                | RISC-V chips.
                | 
                | In 5 years, realistically, we will have a few high-
                | performance (non-mobile) ARM chips manufactured at
                | economic scale. Any other type of disruption would
                | require Intel and AMD to fail and relinquish the supply-
                | side capacity... or China investing billions into new
                | chipmaking facilities now (it takes a few years to build
                | that capacity).
        
               | sudosysgen wrote:
               | China already has 14nm online, and should have 7nm in a
               | year or two, so that means that we will probably see some
               | real RISC-V chips from there soon, if sanctions continue.
               | 
               | So I think that we will have a four way competition
               | between Intel, AMD, Apple, and Chinese RISC-V chips.
               | 
               | That being said, I don't see x86 dying, I think AMD and
               | eventually Intel when they wake up will be competitive.
        
               | StreamBright wrote:
                | I don't see x86 dying either; I think it will be dominant
                | in the desktop/laptop segment for a long time. I am not
                | sure why Loongson uses MIPS64 over RISC-V. Is RISC-V
                | generally available and ready for prime time?
        
               | StreamBright wrote:
                | That is a great question. I am not familiar with how much
                | the production of these CPUs is dependent on ASML, TSMC,
                | etc. I think China is kind of forced to have its own
                | supply chain after the Obama-era ban on Intel chips in
                | Chinese supercomputers.
                | 
                | https://www.theregister.com/2015/04/10/us_intel_china_ban/
        
             | Wowfunhappy wrote:
             | And that backwards compatibility may not even be necessary,
             | given Rosetta's performance. Sure Apple is using lots of
             | tricks, but if Microsoft or any Linux project could get
             | even somewhat close...
        
               | Rebelgecko wrote:
               | Based off of what happened with Rosetta1, I don't think
               | devs should count on Rosetta2 being around forever
        
               | romanoderoma wrote:
                | Just as testimony, it probably doesn't mean much, but
                | backwards compatibility you either have or you don't;
                | there's no middle ground.
                | 
                | Apple is one of the most capitalistic companies out
                | there; they want you to buy new stuff and they'll try
                | everything they can to force users to upgrade sooner or
                | later.
               | 
               | The story is this: a friend of mine is a well respected
               | illustrator and he has been a long time Mac user (at
               | least since I remember)
               | 
                | A few days ago he asked me for advice about a new laptop,
                | and he asked for a PC because "new Mac OS will not work
                | with my Photoshop version".
                | 
                | He owns a license for Photoshop 6, paid for it and has
                | no need to upgrade, especially to the new subscription-
                | based licensing.
               | 
               | MacOS Sierra doesn't even work with Photoshop CS6
               | 
               | The only option he had to keep using something he owned
               | was to switch platform (Adobe allows platform change upon
               | request)
               | 
               | End of story.
               | 
               | Backwards compatibility has no value until you need it.
               | 
               | Just like an ambulance or a pacemaker.
        
               | macintux wrote:
               | > Apple is one of the most capitalistic companies out
               | there, they want you to buy new stuff and they'll try
               | everything they can to force users to upgrade sooner or
               | later
               | 
               | The more charitable view is that by not being wedded to
               | backwards compatibility they can make their ecosystem
               | stronger, faster.
               | 
               | See https://medium.learningbyshipping.com/apples-
               | relentless-stra... for some discussion of those
               | tradeoffs.
        
               | Wowfunhappy wrote:
               | I've used Photoshop CS6 on both Sierra and High Sierra.
               | It's ever-so-slightly more crash-prone than on older
               | OS's, but totally usable.
               | 
               | It launches on Mojave as well, so I'm pretty sure it
               | works, but I haven't personally used it for any length of
               | time. Catalina is what killed it.
               | 
               | IMO, backwards compatibility in OSX/macOS was perfectly
               | decent for a long time. Most software compiled for Intel
               | that wasn't doing something weird continued to chug on,
               | frequently with significant glitches but not to the point
               | where the software was unusable. Then in Catalina Apple
               | just gave up or something.
        
               | klelatti wrote:
                | It's odd, isn't it? If they invested a little bit in
                | Catalina and Rosetta they could probably have had a
                | great backwards-compatibility story even in a few years'
                | time - but it's just not in the DNA, I guess.
        
               | djxfade wrote:
                | In Catalina, Apple dropped 32-bit support, and in the
                | same process dropped a lot of frameworks that had been
                | deprecated for ages. 64-bit software that didn't rely on
                | deprecated frameworks continues to function.
        
               | Wowfunhappy wrote:
               | Isn't that the meaning of breaking backwards
               | compatibility?
        
               | romanoderoma wrote:
               | Photoshop 6, not CS6
        
               | Wowfunhappy wrote:
               | The GP said:
               | 
               | > MacOS Sierra doesn't even work with Photoshop CS6
               | 
               | I'm not sure where they got that impression, but it
               | definitely works!
        
               | romanoderoma wrote:
                | I got it from Adobe's Web site:
               | 
               | > _Mac OS X v10.6.8 or v10.7. Adobe Creative Suite 3, 4,
               | 5, CS5.5, and CS6 applications support Mac OS X v10.8 or
               | v10.9 when installed on Intel-based system_
               | 
               | They work, maybe, they are not supported though
               | 
               | It means that if it doesn't work, Adobe won't provide any
               | support
        
               | chipotle_coyote wrote:
               | > He owns a license for Photoshop 6
               | 
               | Uh, I'm guessing you mean CS6 rather than Photoshop 6,
               | the program that came out in 2000.
               | 
               | In any case, Adobe's help page[1] currently reads, "As
               | Creative Suite 6 is no longer sold or supported, platform
               | or language exchanges are not available for it." Since
               | they're certainly not selling or supporting versions
                | _older_ than CS6, it's unlikely your friend is going to
                | be able to keep Photoshop CS6 by buying a new PC laptop.
               | (And he sure as hell ain't gonna be able to get a copy of
               | Photoshop 6 to run on Windows 10.)
               | 
               | > Apple is one of the most capitalistic companies out
               | there, they want you to buy new stuff and they'll try
               | everything they can to force users to upgrade sooner or
               | later
               | 
               | That's not wrong, but s/Apple/Adobe and the sentiment is
               | still true. I suppose he'll save money if he gets a
               | cheaper-than-Apple PC laptop, but I don't think he's
               | gonna avoid paying for Creative Cloud.
               | 
               | [1] https://helpx.adobe.com/x-productkb/policy-
               | pricing/exchange-...
        
               | bitL wrote:
               | CS6 runs just fine on Windows 10. Of course it's not
               | supported by Adobe as they were pretty aggressive in
               | canceling CS6 licenses if one mistakenly accepted CC with
               | the same account before in order to put everybody onto
               | their extortion scheme, but I use CS6 as before just fine
               | on PC, not on Mac.
        
               | romanoderoma wrote:
               | I'm talking about Photoshop 6
               | 
               | That's why I said "MacOS Sierra can't even run CS6"
               | 
               | Technically in Italy if you bought a license and the
               | manufacturer won't support it anymore, you can use it on
               | another platform even downloading an illegal copy.
               | 
               | As long as you have the original license.
               | 
                | That's the same reason why you can listen to mp3s if you
                | own the original record: you have the right to keep a
                | copy and the right to use it even if the manufacturer
                | stops supporting it, because you bought it in perpetuity
                | when you bought the product.
               | 
               | That's why I stay away from the new licenses that give
               | you none of those rights
               | 
               | And that's why backwards compatibility sometimes is what
               | drives people choices
        
               | lostlogin wrote:
                | > He owns a license for Photoshop 6, paid for it and has
                | no need to upgrade, especially to the new subscription-
                | based licensing
               | 
               | Sounds like the friend has a need to upgrade, and that
               | upgrade is going to require new software. I don't think
               | this situation is Adobe or Apple's fault, old stuff stops
               | working at some point.
        
               | heavyset_go wrote:
               | > _I don't think this situation is Adobe or Apple's
               | fault, old stuff stops working at some point._
               | 
               | Old stuff stops working due to deliberate design choices
               | made on both Apple and Adobe's parts. Apple deliberately
               | stripped Rosetta and 32-bit support from macOS, and Adobe
               | is deliberately making it nearly impossible to use older
               | versions of the CS suite on their end.
               | 
               | Meanwhile, I can run Photoshop 6 on Windows or WINE, and
               | I can still run binaries that were statically compiled
               | for Linux 20 years ago today.
        
               | romanoderoma wrote:
               | The hardware, which is not the main tool in his craft
               | 
               | He draws by hand on paper and the final preparation on
               | Photoshop is for printing
               | 
               | After almost 10 years he needed a new laptop (things wear
               | out with time and he could not install more RAM) but not
               | a new Photoshop version with a different and more costly
               | license
               | 
               | The need to upgrade software is an artificial one and
                | it's only needed because some platforms don't have good
                | backwards compatibility.
               | 
               | Windows does
               | 
               | For many people the OS doesn't make any difference, as
               | long as they can keep using the tools they already know
               | 
               | There is a limit on the improvements a new software will
               | provide if your workflow is already good as it is and you
               | already paid for the version that works for you
               | 
               | I know many small businesses that still use Office 2003
               | 
               | They can install it on new hardware on new Windows
               | versions, it's simply not possible to do the same on Mac
               | 
                | It's not better or worse; backwards compatibility is a
                | feature, and like any other feature some people value it
                | a lot and some don't care at all.
        
               | bitL wrote:
               | That CS6 issue was a major faux pas and a reason why many
               | people stay with Mojave or are forced to use VMs.
        
             | asimpletune wrote:
              | Honest question: does the ISA, as a language, really
              | matter? Or is it more a byproduct of who owns the ISA,
              | e.g. Intel sucks, ARM is more liberally licensed?
             | 
             | I used to work at intel, and no one I knew there thought
             | ISA mattered at all. That's just a few people though, so
             | I'm curious if people think there's something better or
             | worse about the different ISAs as a technology in their own
             | right, or if it's more about the business interests behind
             | them that matters.
        
               | sudosysgen wrote:
               | It really doesn't. AMD has essentially the same perf/watt
               | coming in a few months. ISA doesn't change anything
               | nowadays because it all gets decoded into a per-CPU
               | specific actual instruction set anyways.
        
               | spear wrote:
               | Exactly right. With today's transistor budgets, the x86
               | ISA decoder/translator is just noise.
               | 
               | This is not the difference between x86 and ARM -- it's
               | the difference between Intel's team and Apple's (also
               | AMD's). You don't see Qualcomm being competitive even
               | though they also use ARM.
        
           | tpetry wrote:
            | The interesting information would be after how much time
            | the Air gets throttled and how much performance is lost
            | when throttling.
        
         | tomaskafka wrote:
         | Having the experience (or, love/hate relationship - so
         | awesomely thin and quiet, so underpowered) with 12" Macbook,
         | one surprise is that throttle time really depends on
         | environment temperature and GPU use.
         | 
          | In a cool room it can last a few minutes before throttling,
          | while outside on a warm day it throttles almost instantly.
         | 
          | Also, the thermal budget is shared with the GPU, so once you
          | plug in an external display, or start Sidecar, you run out of
          | thermal headroom pretty much instantly.
         | 
         | I'd love to see these two factors tested.
        
       | ksec wrote:
       | 1. This is roughly a 20-25W SoC. Apple could easily scale this to
       | 50W, or an "M1X" with double of everything (likely not the Neural
       | Processing Unit and the Image Signal Processor, though).
       | 
       | 2. That would give you double the performance in MultiCore
       | Benchmarks, and Double the Graphics.
       | 
       | 3. They will need to double the memory bandwidth as well, so it
       | will either be quad-channel LPDDR4X or maybe LPDDR5.
       | 
       | 4. This hypothetical chip could be coming to MacBook 16" next
       | year.
       | 
       | 5. It is the nature of Chip and Devices that we are fundamentally
       | limited by Heat Dissipation. I call this TDP Computing.
       | 
       | 6. That is why in many, if not literally every explanation under
       | every graph, they will note the TDP difference so you _should_
       | get the correct perspective on what is being measured.
       | 
       | 7. That means you should not expect a 10W / 25W chip to
       | outperform a 32-core 250W chip in MultiCore Benchmarks. You are
       | basically comparing Apple to orange. And I don't know why there
       | are _many_ comments in this thread doing it.
       | 
       | 8. The M1, and SPEC scores ( no longer are we relying on
       | Geekbench ) are to showcase what Apple is capable of.
       | 
       | Edit: I just deleted a massive Rant specific to HN comments on
       | Hardware.
        
       | jeswin wrote:
       | This is as much a ringing endorsement of AMD Ryzen as it is of
       | the M1. The 15W 4800U is just as impressive as the M1, and the
       | performance per watt gap seems small enough to bridge with AMD's
       | upcoming 5nm switch.
        
         | Synaesthesia wrote:
         | It is an awesome CPU. But Apple have put in a sick GPU too.
        
         | matteopey wrote:
          | Not only that, but the 4800U is also Zen 2. The Zen 3 mobile
          | processors (5000 series) are not here yet.
        
           | andy_ppp wrote:
           | Yes, will Zen 3 give these mobile parts a 19% boost? That
           | would be incredible if so...
        
         | skavi wrote:
          | You are making the mistake of equating TDP with actual power
          | draw. The Yoga Slim 7, which uses a 4800U, has an average load
          | power of ~50W[0] vs the Mini's load power of ~30W.
         | 
         | [0]: https://www.notebookcheck.net/The-Ryzen-7-4800U-is-an-
         | Absolu...
        
         | bfgoodrich wrote:
         | AMD definitely is the best of the rest, but it doesn't seem
         | quite as close as it may be held to.
         | 
         | In single core tests the 4800u is running that core at 4.2Ghz.
         | Yet it gets soundly bested by the M1 @ 3 - 3.2Ghz (running at a
         | 50%+ advantage, 65%+ clock per clock). The M1 has an enormous
         | IPC advantage.
         | 
         | In a multicore test the 4800u has 8 performance cores with HT.
         | It only marginally beats an M1 with 4 performance cores and 4
         | efficiency cores (by the scaling the efficiency cores look like
         | they're 1/4 performance or worse -- these are very lightweight
         | cores).
         | 
         | Again, it's the best of the rest, but Apple clearly holds an
         | enormous lead here. Somehow everyone is focused on 5nm, but the
         | A13 on 7nm was still in a substantial lead. The 4800u on 5nm is
         | only going to be marginally better.
         | 
         | Apple clearly sandbagged this first entrant because they're
         | packing it into their "entry level" devices. In six months or
         | whatever they'll unveil the 6+2 core device in the mid range,
         | the 12+2 in the high range, etc, and we'll be back at these
         | discussions.
         | 
         | (Speaking generally) - This whole discussion about Apple
         | Silicon is fascinating because the goal posts have moved so
         | much. Looking back to HN discussions a year ago and everyone
         | was talking about some pathetically weak entrant that would be
         | a joke, etc. Now people are celebrating that it doesn't beat a
         | 24-core, 300W Threadripper. Now the narrative is that it isn't
          | impressive because the v1 didn't overwhelmingly destroy
          | everything else in the market.
        
           | kllrnohj wrote:
           | > The 4800u on 5nm is only going to be marginally better.
           | 
           | This is definitely not true. We already know Zen 3 is +20%
           | IPC over Zen 2 on the same process at the same power. So add
           | 20% to the 4800U without changing anything else as a starting
           | point.
           | 
           | Then toss in the process improvements from 5nm (which TSMC
           | says is either 15% faster or 30% less power) as well as any
           | further architectural improvements that AMD is doing in Zen 4
           | and there's going to be a very significant gap between the
           | 4800U and AMD's 5nm 6800U or whatever they end up calling it.
        
             | bfgoodrich wrote:
             | To be clear, I said that a 4800u @ 5nm (if one could simply
             | scale a design like that) would only be marginally better.
             | That the 5nm boogeyman is more incremental than the big
             | advantage it is held as.
             | 
             | You replied that if you take the 4800u, switched it to 5nm,
             | switch it to Zen 3...no actually switch it to to Zen 4 and
             | a completely different chip, it would be lots better so
             | what I said is "definitely not true".
             | 
             | I'm not sure this logic follows.
        
               | kllrnohj wrote:
               | A 4800U on 5nm would be 30% more efficient or 15% faster.
               | 5nm was a significant bump.
               | 
               | And that's before considering the density improvement
               | that came along with it (which is also substantial -
               | TSMC's N5 is up to 1.8x the density of N7). Which is why
               | I mentioned Zen 3 & Zen 4, because you don't make the
               | same chip across a shrink. You use the extra budget to
               | _do things_
        
       | michaelmrose wrote:
       | Why did the single threaded cinebench single threaded measure
       | against the Ryzen 5950x a high end desktop processor then the
       | multithreaded version of the same benchmark only list the 4900HS
       | a high end laptop part with a fraction of the thermal budget and
       | half the cores?
       | 
       | Other sites had the Ryzen 5950x pegged at 28,641 in the
       | multithreaded version vs 7833 for the m1 mac mini.
       | 
       | It's not really surprising that something with 4x as many high-
       | performance cores as the M1, and a much higher thermal budget, is
       | almost 4x faster than the M1.
        
       | [deleted]
        
       | HJain13 wrote:
       | This is a bittersweet moment for me. I hate Apple's walled garden
       | approach so much, but they have the hardware (CPU, speakers,
       | screens, etc.) side of things down (mostly)...
        
       | HJain13 wrote:
       | Interestingly, AnandTech is feeding into the well-deserved hype
       | by comparing the 5950X to the M1; Intel, despite being two node
       | generations behind, still gives good competition to the M1 while
       | also being a laptop chip.
        
       | dukeofdoom wrote:
       | Macbook air: $999 (2 ports, no hdmi)
       | 
       | vs
       | 
       | $699 mac mini (2 ports + hdmi) + ipad $329 = $1029
       | 
       | Thinking of upgrading, my current 2013 MacBook sits in a drawer
       | 99% of the time connected to a monitor and keyboard. That 1% of
       | the time when I travel, the macbook is too large to use
       | comfortably on an airplane seat. The ipad would work better for
       | this use. macbook air also only has two ports, so one would be
       | used for external monitor, the other for power. No place left to
       | connect external drives. Which I need to use for video editing,
       | and sometimes need to connect two drives to transfer files
       | between them. Seems like this would only be doable on macbook air
       | running on battery.
        
         | postalrat wrote:
         | I wish apple or some company would make a computer stick that
         | plugs into a usb c hub for power and peripherals. No battery
         | and no buttons other than power.
         | 
          | I don't see why it couldn't be about the size of a wallet and
          | offer at least as good thermals as a MacBook Air.
        
           | freehunter wrote:
           | Similar products do exist, although they're not very good:
           | https://www.amazon.com/dp/B07KKYZL66
        
             | postalrat wrote:
             | I've seen those. They all are made to plug into the back of
             | a tv through hdmi.
             | 
              | HDMI doesn't carry power and typically no peripherals like
              | keyboard and mouse. To get that to work the way I'd want,
              | it would need multiple cables plugged into it.
             | 
             | I just want to stick my computer into a hub like a flash
             | stick and have it boot up.
        
         | amwelles wrote:
         | I switched from a MBP to a Mac Mini and iPad Pro last winter
         | for my personal setup. I absolutely love it. I spend roughly
         | equal time on both, but I'm not doing a ton of programming
         | these days outside of work. Taking the iPad traveling is way
         | easier than taking a laptop.
        
           | arvinsim wrote:
           | Are you using the Magic Keyboard? Just asking if that is a
           | big part of your good experience with the iPad.
        
             | spike021 wrote:
             | Not who you replied to, but I have one of those non-numpad
             | small Apple BT keyboards and it has worked really well for
             | me when traveling with only my iPad. It fits in the same
             | carry sleeve I use for the iPad, holds a long charge, and
             | is usable when I ssh to a host for IRC or if I really want
             | to code in VIM.
        
             | amwelles wrote:
             | Nope, I'm using the Smart Keyboard folio, since it came
             | with the iPad. (Bought secondhand from a friend.)
        
           | read_if_gay_ wrote:
           | Similar setup (desktop Mac plus iPad Pro with keyboard), and
           | similarly happy with it, but I'm afraid once the COVID
           | situation has been resolved I'll need a MacBook again. The
           | iPad works surprisingly well as a laptop replacement, and I
           | can get things done on it, but it's not an optimal
           | environment for serious work.
        
             | amwelles wrote:
             | I'm curious what you consider serious work? I write a lot
             | on the iPad, can answer emails, get my shopping done, etc.
             | I definitely wouldn't use it for programming, but I know
             | some people have set it up to do so.
        
               | read_if_gay_ wrote:
               | Yeah, programming. I agree it's a pleasure to do the
               | other tasks you mentioned on the iPad, but coding is
               | cumbersome (although still possible). I'm using a VPS and
               | Blink, the upshot is that you become decently efficient
               | working with tmux/vim/etc.
        
         | cactus2093 wrote:
         | > macbook air also only has two ports, so one would be used for
         | external monitor, the other for power. No place left to connect
         | external drives.
         | 
          | You can do all this through one port. Most LG or Dell
          | Thunderbolt 3 monitors can supply 65 watts of power (some
          | models may go higher, up to 80 or even 100W), any of which
          | should be enough to run and charge this MacBook Air decently,
          | and they have 3 extra USB Type-A ports on the monitor.
        
         | lowbloodsugar wrote:
          | I use a CalDigit Thunderbolt 3 dock on my MacBook Pro. The website
         | says it works with M1 macs, but you only get one screen. It
         | delivers power (87W), network, and has a bunch of ports,
         | including allowing thunderbolt daisy-chaining. It has a 10Gb/s
         | usb-c port, and five 5Gb/s usb ports. That uses a single port.
         | So if you had, e.g. two TB3 external drives, you could plug one
         | into the dock, and one into the other port on the Mac. This
          | dock might be overkill for you (it's $250), but I used it to get
         | dual external screens on my MBP before I said "fuck it" and
         | bought an eGPU.
        
       | paulus_magnus2 wrote:
       | This means there is (probably) plenty of opportunity for seasoned
       | engineers to make serious money at Intel. But only for ones with
       | thick skin who can deliver despite rotten company culture.
        
       | jillesvangurp wrote:
       | I think this will get interesting if/when MS, Nvidia and others
       | start using ARM CPUs more widely as well. Nvidia is in the
       | process of buying ARM, so that would help them get rid of Intel
       | as a middleman for gaming hardware. MS already has Windows
       | running on ARM, but that seems to be a budget-laptops-only kind
       | of thing so far. Also, they are shipping AMD in the Xbox, which
       | is interesting. But you could potentially see Nvidia building an
       | SoC with graphics + CPU running Windows. Most game engines
       | already target iOS and Android, so there should be no issues
       | porting to ARM on that front either.
       | 
       | AMD ought to be paying attention. Risc V could be an alternative
       | at this point if they want to push the market in a different
       | direction. Having to license ARM from Nvidia would not be their
       | dream scenario, I imagine.
        
         | Tsarbomb wrote:
         | AMD already licenses ARM. Ryzen CPUs have an on die ARM CPU for
         | handling part of their platform security.
        
       | itg wrote:
       | Another source: https://wccftech.com/intel-and-amd-x86-mobility-
       | cpus-destroy...
       | 
       | At least in multicore, all of the Ryzen CPUs beat the M1.
        
         | d3nj4l wrote:
         | AnandTech (TFA) found the M1 performing very well compared to
         | even Desktop-class Ryzen in SPEC:
         | https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...
        
         | HatchedLake721 wrote:
         | Those are only Cinebench benchmarks.
         | 
         | Have a look at SPEC2006 and 2017 benchmarks, M1 beating desktop
         | class Ryzen 9 5950x, or just trailing behind (edit: in single
         | threaded performance), keeping in mind cost of each and that:
         | 
         | > While AMD's Zen3 still holds the leads in several workloads,
         | we need to remind ourselves that this comes at a great cost in
         | power consumption in the +49W range while the Apple M1 here is
         | using 7-8W total device active power.
        
           | TwoNineA wrote:
           | > the Apple M1 here is using 7-8W total device active power.
           | 
           | Anandtech showed almost 27W power draw under full load for
           | the M1 Mini.
        
             | HatchedLake721 wrote:
             | That's Anandtech's quote. M1 beats or trails behind Ryzen 9
             | 5950x in single threaded performance, hence they mention
             | 7-8W.
             | 
             | The 27W power draw comes from multi threaded performance.
             | Ryzen's multi threaded power drain is at ~130W (as far as I
             | know).
        
       | levesque wrote:
       | Am I reading this right? Is the new mac mini competing with a
       | Ryzen 5950X? The whole mac mini costs the price of that processor
       | alone. This is insane.
        
         | fxtentacle wrote:
         | Only in single threaded performance, which nobody actually uses
         | for rendering.
         | 
         | In multi-threaded, the Ryzen 5950X is at 28,641 while M1 is at
         | 7,833. So no, the Mac Mini is maxing out at 27% of the Ryzen
         | 5950X if you use it properly. And I was already friendly and
         | used the M1 number for a native port, while in reality you'll
         | likely need Rosetta and take a 33% performance hit.
        
           | geerlingguy wrote:
           | I think the overall point is that for the average user, who
           | doesn't need all those cores or could make good use of them,
           | the M1 may in fact feel / be faster.
           | 
           | For users like you or I, of course we'd see a huge
           | difference, but not everyone is running workloads that need
           | more than 2 or 4 cores.
        
             | heavyset_go wrote:
             | Just using a web browser these days requires many threads
             | and processes to run at once.
        
             | kissiel wrote:
             | I have a theory on why ST perf is always the most important
             | metric for me, and some other folks. When you're waiting
             | for something synchronously, like rendering a webpage,
             | stuff to open, etc. you're usually running a ST load. For
             | stuff that can benefit from multithreading it's usually
             | planned task. So does it make a difference if it takes 4
             | minutes compared to 3? You will still context switch.
        
               | sickygnar wrote:
               | Right now I have ~20 tabs open and a few apps, a workload
               | which is probably similar to the average user. My machine
               | currently has 510 processes running with 2379 threads,
               | though most of them are background. I'd wager core count
               | is more important than ST performance nowadays,
               | especially considering the fact that applications seem to
               | be multicore optimized.
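                | 
                | (If anyone wants to sanity-check their own numbers,
                | something like
                | 
                |   ps -ax | wc -l     # rough process count
                |   ps -axM | wc -l    # rough thread count
                | 
                | should get you in the ballpark on macOS; both counts
                | include a header line and plenty of idle background
                | work.)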
        
             | kllrnohj wrote:
             | An average user is going to buy a 5600X or whatever not the
             | 5950X, and the 5600X's single-threaded performance is
             | barely behind the 5950X. You only get a 5950X if you want
             | multi-threaded performance.
        
             | alwillis wrote:
              | _For users like you or I, of course we'd see a huge
             | difference, but not everyone is running workloads that need
             | more than 2 or 4 cores._
             | 
              | It's hard to imagine a regular person playing games or
              | editing the family photos or the kid's birthday party
              | videos isn't using multiple cores for almost everything
              | they do.
             | 
             | Even browsing the web these days uses multiple cores.
             | 
             | Apple wouldn't have made the investment if people couldn't
             | see and feel real world results.
        
           | snazz wrote:
           | Depends on what you're doing. For example, compiling is
           | multi-core, but linking is normally single-core. Many
           | workloads are still heavily single-core-dependent, so great
           | single-core performance is still a big asset.
        
             | jcelerier wrote:
             | > linking is normally single-core.
             | 
             | GNU gold was doing threaded linking 15 years ago, and
             | nowadays threaded linking is the default for new linkers
             | like LLVM's lld. Unless you use very specific GNU linker
                | hacks, there isn't any reason not to use lld; it works
                | fine for linking large software like LLVM/Clang, Qt,
                | ffmpeg...
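                | 
                | If anyone wants to try it, the opt-in is usually just a
                | flag on the final link step (sketch with placeholder
                | object names, assuming lld is installed):
                | 
                |   clang -O2 main.o parser.o -fuse-ld=lld -o app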
        
           | 3JPLW wrote:
           | Yeah, but this is a laptop chip at ~20W. Of course it's not
           | going to compete with a 16-core 120W monster.
           | 
           | Getting 1/4 of the performance with 1/4 of the (high perf)
           | cores and 1/6 of the power is very impressive.
        
           | runeks wrote:
           | But the Ryzen 5950X has 16 cores while the M1 has only 4 high
           | performance and 4 low performance cores. So the Ryzen gets 4x
           | multi-core performance with 4x the cores.
        
             | kristianp wrote:
             | I wonder if Apple will bother to produce a CPU with
             | desktop-level TDP. That would really compete with the
             | Ryzens.
        
         | [deleted]
        
       | a012 wrote:
       | For the first time in a long time, the Mac mini is attractive
       | again with this new M1 performance. I feel like my Ryzen 2 SFF
       | build is old even though it's less than a year old.
        
         | skavi wrote:
          | Ryzen 2000 or Zen 2? The former is definitely more than a year
          | old.
        
       | kzrdude wrote:
       | What kind of graphics APIs will it support? OpenGL?
        
         | oblio wrote:
         | The proprietary Metal API: https://developer.apple.com/metal/
         | 
         | I think they might support OpenGL but I think everyone
         | considers their support second rate.
        
         | maeln wrote:
          | Graphics API support is an OS/driver thing. OpenGL has been
          | deprecated on macOS for a long while now, being stuck on an
          | old version (4.1). Apple also refuses to support Vulkan, so
          | the only officially supported graphics API on macOS is Apple
          | Metal.
        
         | galad87 wrote:
         | Metal. Then there are OpenGL and Vulkan wrappers that run on
         | Metal.
        
       | fxtentacle wrote:
       | I'm a bit surprised by their tagline "Integrated King, Discrete
       | Rival".
       | 
       | I'm using an Acer Aspire V15 Nitro Black 15" from 2016. On Aztec
       | Ruins Normal Offscreen, I get 270fps. So my 4 year old $800
       | laptop is still faster than the brand new M1. It seems Anandtech
       | chose a very Apple-friendly set of laptops to compare to.
        
         | kissiel wrote:
         | So a 60W TDP dGPU (gtx 960M) is ~35% faster than this iGPU? I
         | think this is what they called a Rival.
        
           | fxtentacle wrote:
           | Agree. But don't you think this would come off as a lot less
           | impressive if Anandtech had included all of the old rivals
           | from 1-4 years ago that still rank above the M1?
           | 
           | "If you currently own a 2016 15" Acer, buying the new 2020
           | MacBook will be a downgrade." sounds pretty lame to me.
           | 
           | That's why I said they had a very Apple-friendly comparison
           | set.
        
             | kissiel wrote:
             | I don't think people are considering replacing a gaming
             | 2kg+ laptop with a fanless macbook. Authors included two
             | popular Turing dGPUs for comparison.
        
             | kllrnohj wrote:
             | > But don't you think this would come off as a lot less
             | impressive if Anandtech had included all of the old rivals
             | from 1-4 years ago that still rank above the M1?
             | 
             | Not really. Why would you compare against old rivals
             | instead of the current market? They had 1660 Ti's on the
             | charts, too, which both obliterated the M1 & are not at all
              | the high-end of discrete mobile GPUs.
             | 
             | The "discreet rival" was because the M1 was competing
             | favorably against the discreet 1650 & 560(X). As in, entry-
             | level discreet GPUs make increasingly less sense (they
             | already weren't making much sense with Intel's new Xe and
             | AMD's Vega 8 & 11 integrated, but more nails in that coffin
             | with the M1)
        
         | bluedino wrote:
         | That would have you scoring higher than the Acer Nitro 5 (2020)
         | with a 1650, so I doubt you're running the same benchmark.
         | 
         | This might be more accurate, 88fps:
         | 
         | https://gfxbench.com/device.jsp?benchmark=gfx50&os=Windows&a...
        
           | kissiel wrote:
           | Lol. So that would be a +100% upgrade for fxtentacle.
        
       | vermaden wrote:
       | We will have to wait for AMD Zen 4 based CPUs, which, like the
       | M1, will also be based on TSMC's 5nm process.
       | 
       | Currently it's apples (have to :>) versus oranges: 5nm M1 vs 7nm
       | Zen 3.
        
       | goatinaboat wrote:
       | 1) Can I run VirtualBox on M1 (yet)? 2) What is the overhead of
       | doing so with Rosetta2 vs native on Intel? 3) What is the
       | situation with VT-X?
        
         | goatinaboat wrote:
         | Thanks all
        
         | my123 wrote:
         | 1) VirtualBox is strictly x86_64 only, everywhere.
         | 
         | 3) Arm virtual machines only. For now, Parallels has a preview
         | that you can enroll to at
         | https://www.parallels.com/blogs/parallels-desktop-apple-sili...
         | or you might use https://github.com/kendfinger/virtual which
         | uses the high level Virtualization.framework, for Linux VMs.
        
         | tomku wrote:
         | The answer to all three of your questions is "If you are
         | worried about this, absolutely do not buy an M1 Mac." Rosetta 2
         | cannot magically turn VirtualBox from a virtualization
         | management system into a high-performance x64 emulator. The
         | long-term solution is probably going to be running ARM Windows
         | or Linux in a VM and leaning on Rosetta-style
         | compatibility/translation in the client OS to run x64 programs.
         | 
         | Edit: Since this is attracting downvotes, maybe it needs some
         | clarification. The things OP asked about fundamentally cannot
         | work. Rosetta 2 is designed exclusively for user-mode programs
         | and cannot cooperate with virtualization software to run
         | arbitrary OSes in VMs. VirtualBox has no plans to port to ARM
         | and will not work in Rosetta. None of this is negativity or
         | cynicism towards M1 Macs - it's just the reality of how
         | switching architectures affects virtualization. If your use
         | case for Mac hardware is to run arbitrary x64 code at high
         | speed in VMs, you should not buy an M1 Mac because that
         | capability does not currently exist.
        
           | TillE wrote:
           | Yeah, I figure the only realistic solution for my work needs
           | (running a bunch of x86/x64 Windows VMs) is to do that
           | remotely on a Windows workstation.
           | 
           | I probably won't buy an M1 anyway, but I'll be extremely
           | interested to see what everything looks like when the M2
           | rolls around.
        
         | runjake wrote:
         | You may be interested in this thread:
         | 
         | https://forums.virtualbox.org/viewtopic.php?f=8&t=98742
         | 
          | tl;dr: VirtualBox is an x86-on-x86 hypervisor; there's no
          | porting to do, it would be a rewrite.
        
         | HatchedLake721 wrote:
         | Benchmarks show M1 with Rosetta2 beats previous Mac iterations
         | in Cinebench benchmark. See page 2 here
         | https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...
         | 
         | >> What's notable is the performance of the Rosetta2 run of the
         | benchmark when in x86 mode, which is not only able to keep up
         | with past Mac iterations but still also beat them.
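          | 
          | For anyone wanting to reproduce that kind of native-vs-
          | translated comparison on their own universal binaries, the
          | arch wrapper can force either slice ("./mytool" is just a
          | placeholder here):
          | 
          |   arch -arm64 ./mytool     # native
          |   arch -x86_64 ./mytool    # translated through Rosetta 2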
        
       | [deleted]
        
       | bartread wrote:
       | These are impressive performance stats and, given that I'm
       | working from home all the time nowadays and with much less of a
       | need for a laptop, the Mac Mini is actually a fairly attractive
       | option.
       | 
       | Except for one thing: it's maxed out at 16GB of _unified_ RAM.
       | 16GB. In 2020 (nearly 2021). FFS.
       | 
       | Come on Apple: get your act together. The 16GB limit was
       | frustrating as hell when I bought my last MBP in 2015: now it's
       | absolutely unforgivable.
       | 
       | (The iMac obviously goes way beyond 16GB but isn't yet available
       | with Apple silicon, and obviously the attraction with the Mini is
       | the relatively ludicrous performance of that Apple silicon.)
        
       | CitizenKane wrote:
       | It's certainly an impressive achievement and makes it pretty
       | clear why Apple is transitioning away from Intel. I'm a bit
       | surprised that the fact that this is on TSMC's 5nm process seems
       | to be glossed over in the comments. Apple is benefitting from
       | what seem, on the surface, to be significant process
       | improvements. It will be interesting to see how other players
       | take advantage of it as well.
        
         | phire wrote:
         | No, the gains from Apple's A13 (TSMC 7nm) to Apple's A14 (TSMC
         | 5nm and same cores as the M1) really aren't that large. About
         | average for a node jump.
         | 
         | This is mostly about architecture, not silicon process.
         | 
         | From A13 to A14, Apple managed to increase the clockspeed by
         | about 15% and increase IPC by 5% all while keeping power
         | consumption the same.
        
       | perardi wrote:
       | That WebKit compile time is impressive.
       | 
       | https://techcrunch.com/2020/11/17/yeah-apples-m1-macbook-pro...
        
         | mciancia wrote:
         | Question is whether they were compiling for the same
         | architecture on both x86 and arm.
        
           | alwillis wrote:
           | That's a good question but I don't think it would make a huge
           | difference. Those details should have been included.
           | 
           | Safari is already a universal binary on my Intel Mac running
           | Big Sur; that means WebKit runs natively on Intel and M1
           | processors.
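            | 
            | You can check which slices an app ships from the terminal,
            | e.g. (assuming the standard install path):
            | 
            |   lipo -info /Applications/Safari.app/Contents/MacOS/Safari
            | 
            | which on Big Sur should list both x86_64 and arm64.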
        
         | w-m wrote:
         | Wow, the M1 MBP is on par with the 12-core Mac Pro from 2019
         | for the WebKit compilation. And even more impressive: "After a
         | single build of WebKit, the M1 MacBook Pro had a massive 91% of
         | its battery left. I tried multiple tests here and I could have
         | easily run a full build of WebKit 8-9 times on one charge of
         | the M1 MacBook's battery. In comparison, I could have gotten
         | through about 3 on the 16" and the 13" 2020 model only had one
         | go in it."
        
       | puranjay wrote:
       | This might get me back into the Apple ecosystem. I'll still wait
       | for the kinks to be ironed out in the first generation though.
        
         | zf00002 wrote:
         | I feel the same way. Very interested, but not going to go in
         | for first gen.
        
           | dev_tty01 wrote:
           | This is the 12th generation Apple Silicon processor design.
           | 
           | A4, A5, A6, A7, A8, A9, A10, A11, A12, A13, A14, M1
        
             | Brendinooo wrote:
             | Sure, but there are concerns beyond the chip itself.
        
             | pwthornton wrote:
             | I think this is 11th generation, where the A14 and M1 are
             | the same generation. I expect we will see a few other chips
              | from this generation, perhaps an A14X for iPad Pros and an
              | M1X for bigger laptops and iMacs.
        
             | kace91 wrote:
              | The issue is with macOS running on Apple silicon. There
              | was a thread today somewhere with Docker mentioning that
              | they are still working on support, for example.
        
             | lukeramsden wrote:
             | First generation of MacOS on Apple Silicon?
        
               | gayprogrammer wrote:
               | iOS has always used the same kernel as macOS.
        
               | heavyset_go wrote:
               | iOS XNU is compiled with different features than macOS
               | XNU.
        
               | AsyncAwait wrote:
               | And that helps i.e. Docker how?
        
               | yjftsjthsd-h wrote:
               | Sure, it's all Darwin, but userspace matters a lot - you
               | can do lots of things on MacOS that you can't do on iOS
               | (JIT, arbitrary web engines, assorted emulators, pop a
               | root shell and load arbitrary kernel modules)
        
             | JAlexoid wrote:
             | I'm absolutely not worried about the silicon.
             | 
              | I am worried about the other hardware and macOS being a
              | total POS right now.
        
         | Wildgoose wrote:
         | Likewise - so long as they don't ditch the headphone jack on
         | this as well.
        
           | terramex wrote:
            | This, but I also hope they understand that most studio
            | headphones have the cable attached on the left side, and
            | that they come back to the pre-Touch Bar era jack placement.
            | With new, miniaturised components there should be enough
            | internal space on the left side.
        
       | jeffbee wrote:
       | Intrigued by the single-thread main memory bandwidth being a
       | multiple of what you get from a single SKX. We also see this with
       | Graviton 2. The latency is not terrible, either. How would this
       | much available bandwidth change your choices when optimizing your
       | algorithms?
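       | 
       | A hand-rolled, STREAM-style triad loop is enough to get a rough
       | feel for that single-thread number (a minimal C sketch; the
       | buffer size, repeat count and the clang -O2 build line are just
       | my assumptions, not whatever AnandTech ran):
       | 
       |   /* bw.c: rough single-thread bandwidth probe.
       |      Build: clang -O2 bw.c -o bw && ./bw            */
       |   #include <stdio.h>
       |   #include <stdlib.h>
       |   #include <time.h>
       | 
       |   #define N    (32 * 1024 * 1024)   /* 32M doubles per array */
       |   #define REPS 10
       | 
       |   int main(void) {
       |       double *a = malloc(N * sizeof *a);
       |       double *b = malloc(N * sizeof *b);
       |       if (!a || !b) return 1;
       |       for (size_t i = 0; i < N; i++) { a[i] = 1.0; b[i] = 2.0; }
       | 
       |       struct timespec t0, t1;
       |       clock_gettime(CLOCK_MONOTONIC, &t0);
       |       for (int r = 0; r < REPS; r++)
       |           for (size_t i = 0; i < N; i++)
       |               a[i] = 3.0 * b[i] + a[i];   /* 2 loads, 1 store */
       |       clock_gettime(CLOCK_MONOTONIC, &t1);
       | 
       |       double s = (t1.tv_sec - t0.tv_sec)
       |                + (t1.tv_nsec - t0.tv_nsec) / 1e9;
       |       double bytes = (double)REPS * N * 3 * sizeof(double);
       |       printf("~%.1f GB/s (check %g)\n", bytes / s / 1e9, a[7]);
       |       return 0;
       |   }
       | 
       | It won't match a tuned benchmark (no non-temporal stores, no
       | prefetch hints), but it's usually enough to see whether a single
       | core can get anywhere near saturating the memory controller.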
        
       | blunte wrote:
       | This may be a bit of a stretch, but would the power savings
       | (value of which, Earth aside, could be measured in local
       | electricity costs) of an M1 Mac be significant enough in a year
       | to justify upgrading an otherwise functioning Intel Mac?
        
         | MagnumOpus wrote:
         | Nope. Say the differential under heavy CPU load is 10W, say you
         | run it under full load for 2,000 hours per year (which nobody
         | does on a laptop), then you saved about 20 kWh, or roughly $3.
        
       | jonplackett wrote:
       | I find it amusing people thought that apple silicon was going to
       | be crap. Or that they would lie about it being good.
       | 
       | They are not insane! They wouldn't jump ship and go through all
       | that expense and possibility of failure if they didn't know they
       | had something amazing at the end of the rainbow.
        
       | k__ wrote:
       | So, it's like AMD, but different?
       | 
       | I am whelmed.
        
         | cwxm wrote:
         | not exactly, the performance per watt is what's impressive
         | here.
        
       | mattlondon wrote:
       | Wow it must suck to be Intel right now.
       | 
       | It wasn't so long ago that the trope was while others had better
       | multi-core performance "...Intel still holds the lead for single
       | core performance"
       | 
       | Now not only do AMD have a better product, but also Apple now
       | offer equal or better performance than the best that Intel can
       | offer.
       | 
       | I wonder what is next for Intel now that their £1000+ CPUs are
       | firmly in third place. Looking forward to some new innovation and
       | competitive (incl. pricing!) products from them.
        
         | KoftaBob wrote:
         | That's the consequence of resting on your laurels and getting
         | complacent. They relied too much on being the large incumbent,
         | and they reap what they sow.
        
           | yborg wrote:
           | They reaped billions in profits. The issue is that
           | organizations can't turn it on and off based on competition,
           | once you are rich and lazy, the organization fills up with
           | coasters and before they know it they no longer have a higher
           | gear. Remains to be seen if Intel can come back, but I doubt
           | it under their current leadership.
        
         | bstar77 wrote:
         | Don't forget to add that Apple is now doing this on their
         | version of a "budget" laptop that has no active cooling, that
         | gets 18-20 hours of battery life, that runs emulated x86 code
         | with almost no performance hit and is a 1st gen product.
         | 
          | I don't think any of these details can be overstated. Even
          | AMD's 1st gen Ryzen kind of sucked, and look where that is now.
        
           | kllrnohj wrote:
           | > Don't forget to add that Apple is now doing this on their
           | version of a "budget" laptop that has no active cooling
           | 
           | The Anandtech tests were on an actively-cooled Mac Mini and
           | the power draw numbers they were observing were far outside
           | of what can be passively cooled in a laptop. You'd need to
           | wait for Air-specific results before drawing too many
           | conclusions on how it performs.
        
             | bstar77 wrote:
              | AnandTech isn't the only one providing benchmarks; they
              | are rolling in from all over the place now. People are
              | running 15-minute Final Cut Pro jobs and the fan isn't
              | kicking in on the MacBook Pro.
        
               | offtop5 wrote:
               | Any word on if Final Cut and Logic X are recompiled for
               | Arm ?
        
               | bredren wrote:
               | FCP 10.5 dropped on 11/2:
               | 
               | * Improved performance and efficiency on Mac computers
               | with Apple silicon
               | 
               | * Accelerated machine learning analysis for Smart Conform
               | using the Apple Neural Engine on Mac computers with Apple
               | silicon
               | 
               | Discussion: https://forums.macrumors.com/threads/apple-
               | updates-final-cut...
        
               | offtop5 wrote:
                | Thanks, my M1 MacBook Pro gets here tomorrow, so we'll
                | see what happens.
        
               | bonestamp2 wrote:
               | They said in the keynote that Logic had major
               | improvements under arm as well. I can't remember if it's
               | actually shipping yet.
        
               | andy_ppp wrote:
                | If they are Apple's own apps, they are universal apps, I
                | believe?
        
               | mlyle wrote:
               | I installed updates a few days ago and the release notes
               | say "support for Apple Silicon"...
        
           | benhurmarcel wrote:
           | > "budget" laptop
           | 
           | At this price it's more expensive than 80% of best-selling
           | laptops, so not quite budget. If you compare in price to Dell
           | for example, they only compete with their XPS line, which is
           | their high-end one.
           | 
           | Apple only does high-end products, which is fine but doesn't
           | make that model cheap.
        
             | danpalmer wrote:
             | To look at these CPUs a different way, it's fairly
             | competitive with Ryzen processors that cost $600-700 alone,
             | except that will buy the whole Mac Mini.
        
             | bstar77 wrote:
              | That's why I put the word budget in quotes. It's the
              | cheapest laptop they make, even though it isn't all that
              | cheap.
        
             | oflannabhra wrote:
             | > Apple only does high-end products
             | 
             | I know this gets repeated often, but this is simply not
             | true. Apple _does_ make high-end products, and they market
             | themselves as a high-end brand, but Apple has always filled
             | as many market segments as they can. There are plenty
             | examples that prove this statement wrong: iPod Shuffle,
             | iPhone SE, the $250 iPad. They never do deep discounts on
             | their products though, so when they age or go stale they
             | are far overpriced; and they do _not_ make value or budget
             | models.
        
           | tonyhb wrote:
           | This is really the 12th generation of Apple's own chips,
           | though - and the third of this particular design, if I recall
           | correctly.
        
             | JAlexoid wrote:
             | Are you claiming that A1 is in the same category as M1?
             | 
             | If so... then Intel's latest chip generations should be
             | traced back to 8088 in 1979.
        
               | seniorivn wrote:
                | that's exactly how people describe Intel's lineup
                | 
                | And the architectural similarity between their first
                | 14nm chips and their latest 10nm chips is at least as
                | close as the M1 is to the A12Z, maybe even to their
                | first 64-bit chip.
        
               | elteto wrote:
               | Intel CPUs _literally_ boot pretending to be 8088s [0]
               | 
               | [0] https://en.m.wikipedia.org/wiki/Real_mode
        
             | bstar77 wrote:
             | When I say first gen product, I mean the whole product, not
             | just the chip used. It would be a very different situation
             | if we were talking about an upgraded iPad with a new chip.
             | This is a platform defining moment.
        
         | MrBuddyCasino wrote:
         | Market share x86 overall (mobile + desktop + server), AMD vs
         | Intel:
         | 
          | Q3/2020: 20.2% vs. 79.8%
          | 
          | Q2/2020: 19.7% vs. 80.3%
          | 
          | Q1/2020: 17.5% vs. 82.5%
         | 
         | I don't know why the OEM business works that way, but it is
         | very slow to shift, so Intel still has time. Self-built
         | consumer PCs for gaming are already overwhelmingly AMD though.
        
           | JAlexoid wrote:
           | Discounts and design lead time.
           | 
           | Unlike modular desktops - you can't just drop in an AMD CPU
           | into a laptop chassis and expect everything to work.
        
         | stuff4ben wrote:
         | At some point one wonders if Intel will just cede the desktop
         | and enthusiast markets to AMD and/or Apple and just focus on
         | server and high-end computing? As an IBMer this feels familiar
         | for some reason...
        
           | LegitShady wrote:
            | AMD was on the market for decades, practically always second
            | in the performance tier. I'm not sure why Intel, who should
            | ostensibly have lower unit costs, would abandon a market
            | over a possibly temporary situation that might not even be
            | an issue one or two design nodes from now.
        
           | ginko wrote:
           | What would keep AMD or Nvidia from eventually eating Intel's
           | lunch in the server market as well?
        
             | breakfastduck wrote:
             | Nothing, they'd just die a lot slower
        
           | Aperocky wrote:
            | There's no reason why servers will always stick with Intel.
            | 
            | Amazon already has their own Graviton ARM chips - and that's
            | EC2; cloud-native workflows might have already migrated.
        
             | m4rtink wrote:
             | Not to mention HPC, with the top three supercomputers being
             | Power & ARM.
        
           | Wohlf wrote:
            | Not a chance, the consumer market is massive. Even among PC
           | enthusiasts, AMD is in the minority. It's not even close:
           | https://store.steampowered.com/hwsurvey
           | 
           | I'd sooner expect Intel to start making their own ARM chips
           | to compete with Apple.
        
             | dk1138 wrote:
             | Thanks for the link! Didn't know steam made that kind of
             | analysis public.
             | 
              | But I'd take a closer look at those numbers: in 5 months
              | Intel has lost 2.5 points that AMD has gained. Doing some
              | stupid, atrocious math of just taking the average point
              | gain over those 5 months (and not accounting for the fact
              | that my PC enthusiast friends are stating that their next
              | machine will be AMD), that puts AMD at 50% market share
              | around November 2023. That gives Intel very little time to
              | pivot.
        
         | patentatt wrote:
         | Not long ago at all, like just a few weeks ago right before
         | zen3 was out in the wild! A double whammy for sure for Intel,
         | tough times ahead indeed. Apologists can hand wave AMD off by
         | citing the huge lead in sales that Intel still enjoys, but that
         | argument falls flat with Apple, a trillion dollar company.
         | Maybe Intel will start to compete on price like AMD used to.
        
           | whizzter wrote:
           | I think they announced a short while back that they're
           | "looking at trying to outsource manufacturing of some high
           | end parts", ie they've known that they were falling behind in
           | too many areas due to their shrinkage problems so they're
           | taking in help from the outside to not become irrelevant.
           | 
           | M1 is running on "5nm", looking at specs Intel 10NM is
           | 100Mtr/mm2 vs TSMC's Apple 5nm chips being 173Mtr/mm2 (So
           | even if Intel nomenclature seems more conservative they still
           | lag by a lot in manufacturing capacity)
        
           | arvinsim wrote:
            | As someone who owns both a Mac and a PC, I am excited about
            | what Ryzen can offer on 5nm.
           | 
           | If Apple has these gains, I am sure Ryzen will have great
           | performance leaps too.
        
             | mcintyre1994 wrote:
             | I'm not massively familiar with CPU architectures, would
             | you expect to see similar performance gains going to 5nm on
             | x86 as you do on ARM?
        
               | hajile wrote:
               | Not strictly because of 5nm itself.
               | 
               | 5nm will be Zen 4 which should bring 10-20% IPC uplift if
               | AMD's current trend continues.
               | 
               | TSMC's N5 5nm transistors are 85% smaller than their N7
               | transistors which should lower power consumption
               | significantly though SRAM only shrinks a modest 35% (this
               | especially affects desktop Ryzen with tons of cache
               | compared to their laptop versions).
               | 
               | AMD currently makes the Zen 2/3 IO die on Global
               | Foundries 12nm for contractual reasons. When they finally
               | shrink that to 7 or 5nm, the power savings should be
               | significant.
               | 
               | Zen 4 is expected to bring DDR5 support which will both
               | drastically increase bandwidth _and_ lower RAM power
               | consumption. Likewise, it is expected to support PCIe 5
               | which doubles the bandwidth per lane to a little shy of
                | 4 GB/s.
               | 
               | All of these things together could mean a decent
               | improvement in IPC and total performance and a very big
               | improvement in performance per watt.
               | 
               | Meanwhile, I suspect we'll start seeing large "Infinity
               | Cache" additions to their APUs, shared between the CPU
               | and GPU, as the bus width of DDR just doesn't offer the
               | bandwidth to keep larger GPUs from fighting the CPU for
               | bandwidth. This should not only improve total APU
               | performance, but fewer trips to RAM also have a
               | significant effect on power consumption (it costs more to
               | move 2 bytes than to add them together).
        
         | blunte wrote:
         | This happens to every giant eventually (and to countries or
         | civilizations). They climb to the top, and then they hold
         | such dominant positions that they aren't forced to try. They
         | get lazy or sloppy (and in Intel's case, I'm not suggesting the
         | engineers were the sloppy ones... more likely strategic
         | decisions from management and quarterly earnings per share-
         | focused execs). Eventually they are dethroned, and some never
         | return to power.
         | 
         | Intel will never go away, but they definitely will become
         | laggards for the foreseeable future. In their industry it takes
         | years or even a decade to see the fruits of your effort.
        
           | amelius wrote:
           | > In their industry it takes years or even a decade to see
           | the fruits of your effort.
           | 
           | So how long has Apple been working on this chip?
        
             | ghshephard wrote:
             | Roughly 10 years on this particular processor line:
             | https://en.wikipedia.org/wiki/Apple-
             | designed_processors#A_se... (According to Anandtech, the M1
             | is a rough equivalent to the A14)
        
         | copperx wrote:
         | Why, exactly? As long as Apple keeps their chips to themselves,
         | Intel or AMD will have nothing to worry about.
        
       | mensetmanusman wrote:
       | The best option seems to be a hybrid.
       | 
       | Use the M1 Air to remote into a real, non-virtualized computer
       | running Linux.
        
       | aero-glide wrote:
       | I hope Apple allows us to install the OS of our choice. The
       | battery life is impressive but I refuse to not use Linux.
        
         | crazygringo wrote:
         | If you want to use Linux for the tools, then just use a VM.
         | 
         | But if you want full control over your hardware... Apple isn't
         | the way to go. I'm not even sure what the "OS of our choice"
         | means when we're talking about a custom-designed SoC. The
         | amount of reverse-engineering required to get _any_ other OS to
         | work would be staggering, no?
         | 
         | If you want to run a custom OS natively, you need to buy a
         | laptop with a commodity chip, not a custom one. Fortunately,
         | there are tons of them.
        
           | josteink wrote:
           | ARM actually has a defined architecture and UEFI equivalent,
           | which would have worked wonders here.
           | 
           | If Apple had decided to support it, that is.
        
         | simonh wrote:
         | You're likely to hit the common problems porters face with
         | putting Linux on an arbitrary ARM SoC. These chips have lots of
         | integrated components on them, requiring device drivers that
         | may not exist for Linux. Take the custom GPUs Apple developed
         | in-house, for example. Good luck finding any kind of Linux
         | device driver for those, open source or not. It gets even worse
         | for things that don't even have an external equivalent, like
         | the neural engines.
         | 
         | Even if Apple does nothing to stop you running whatever
         | software you like on the device, you're still likely to be out
         | of luck. I wouldn't be surprised if some enterprising folks
         | have a good run at it, but it's likely to be a massive
         | undertaking.
        
         | alwillis wrote:
         | _I hope Apple allows us to install the OS of our choice. The
         | battery life is impressive but I refuse to not use Linux._
         | 
         | Apple's hypervisor technology runs natively on the M1; Linux
         | running on that will be faster than Linux running on anything
         | else you can buy for the same amount of money.
         | 
         | They showed Debian running on Apple Silicon during the WWDC
         | keynote nearly 6 months ago.
        
           | lhl wrote:
           | Tuxedo Computers and Slimbook both sell Ryzen 4800H computers
           | that will outperform the M1 in heavy multithreaded workloads
           | and come with Linux preinstalled. These laptops aren't quite
           | as slick as the MBP but weigh in at 1.5kg, have huge 91Wh
           | batteries, and have a better keyboard (I have one from a
           | different OEM, but same ODM design). They also have user
           | upgradable memory and storage - I am running with 64GB RAM
           | and 2TB SSD at a total cost (with upgrades) of less than what
           | Apple is charging for their base 8GB/256GB MacBook Pro.
           | 
           | I expect a future "M2" to maybe take the performance crown,
           | but AMD isn't standing still. Cezanne has Zen 3 cores, which
           | should boost IPC by about 20%, and Rembrandt should get to
           | 5nm and have RDNA2 graphics.
        
             | alwillis wrote:
             | _Tuxedo Computing and Slimbook both sell Ryzen 4800H
             | computers that will outperform the M1 in heavy
             | multithreaded workloads and come with Linux preinstalled.
             | These laptops aren't quite as slick as the MBP but weigh in
             | at 1.5kg, have huge 91Wh batteries..._
             | 
             | 1. You're not going to get 20 hours of battery life.
             | 
             | 2. Don't forget it's not just the M1--it's the unified
             | memory, the 8 GPU cores and the 16-core Neural Engine. Most
             | CPU and GPU-intensive apps are going to run faster on the
             | M1 than on your machine. Even x86-64 apps using Rosetta 2
             | on an M1 Mac may run faster, since those apps are
             | translated to native code on the M1.
             | 
             | 3. Mac's SSD is probably faster; it's essentially a 256GB
             | cache for the processor.
             | 
             | 4. The Mac can run iOS/iPadOS apps too.
             | 
             | 5. If done right, Linux compiled for the M1 will likely run
             | faster on an M1 Mac than it does on a machine like yours,
             | especially if Apple provides a way to access certain
             | hardware features.
             | 
             | We'll have to see what happens, but expect these machines
             | to be pretty popular with users, even those who need to run
             | Linux, once the distros are updated.
             | 
             | We shouldn't forget that the underpinning of all of this is
             | Darwin, the BSD-derived Unix layer which is already running
             | natively on M1, including the compiler and the rest of the
             | toolchain.
        
         | ucha wrote:
         | You can't run Linux on Macbook Pros released after 2016 in any
         | meaningful sense anyways...
        
         | ghaff wrote:
         | I can pretty much guarantee you that trying to run anything
         | other than macOS on Apple's silicon is going to be an exercise
         | in frustration. You will presumably be able to run an Arm build
         | of Linux in a VM--given that Apple has demoed this--but if you
         | want native Linux, I'm not sure why you would pay a premium to
         | possibly get a bit more performance on a laptop while probably
         | having various support issues.
        
         | tinus_hn wrote:
         | They have already stated they won't:
         | 
         | "We're not direct booting an alternate operating system," says
         | Craig Federighi, Apple's senior vice president of software
         | engineering. "Purely virtualization is the route. [...]"
         | 
         | https://www.theverge.com/2020/6/24/21302213/apple-silicon-ma...
        
           | ravetcofx wrote:
           | This researcher has said it may be possible with PongoOS and
           | kernel signing:
           | https://mobile.twitter.com/never_released/status/13273981029...
           | https://mobile.twitter.com/never_released/status/13273946342...
        
           | easton wrote:
           | But you can disable Secure Boot and boot whatever OS you
           | want, so unless there's some other hardware gotcha it's not
           | like someone couldn't get Linux running if they wanted to put
           | the time in (which is a big if, considering there's no UEFI-
           | ish helper like on the Windows ARM devices).
           | 
           | https://support.apple.com/guide/mac-help/macos-recovery-a-
           | ma...
        
             | heavyset_go wrote:
             | There are quite literally millions of ARM devices out there
             | that will never have Linux support, and millions more are
             | being produced each year.
             | 
             | When it comes to ARM SoCs, Linux requires vendor support to
             | get it running. If you want mainline kernel support, that
             | requires even more work that many vendors just aren't
             | providing.
             | 
             | A locked bootloader is just one issue to overcome for Linux
             | support. A lot of the real issues come down to the lack of
             | an enumerable bus on ARM SoCs, along with a lack of
             | drivers.
             | 
             | Without vendor support from Apple to support Linux, these
             | devices will be like the millions of iPhones and iPads that
             | don't run Linux and will never run Linux.
             | 
             | Most ARM SoCs that are sold explicitly as mini Linux
             | computers also have this problem. Many of them are stuck on
             | old kernel forks, because vendors didn't give the proper
             | support their SoCs needed to run a mainline Linux kernel.
             | 
             |  _tl;dr_ : For Linux to be a viable option on Apple's SoCs,
             | Apple needs to put in a lot of work to explicitly support
             | Linux. Without that vendor support, you will never be able
             | to download a Linux ISO and install it like you can on an
             | x86 Mac.
        
           | dshpala wrote:
           | Challenge accepted :)
        
           | qz2 wrote:
           | Another item added to the list of why I'm not buying one of
           | these.
        
             | ramraj07 wrote:
             | Hopefully it's not just secret Apple sauce that makes these
             | powerhouses, and other chip makers will make ARM-based
             | processors soon enough, giving us the choice we deserve.
             | (Given Graviton's similar performance bump, this is likely
             | the case.)
        
             | foldr wrote:
             | It doesn't make a huge amount of sense to buy a Mac if
             | you're not going to use Mac OS as your daily driver. A lot
             | of the benefits (e.g. battery life, touchpad quality) are
             | dependent on software as well as hardware, and are greatly
             | diminished on Windows or Linux.
        
               | JAlexoid wrote:
               | Mac Mini doesn't raise those issues.
        
               | abainbridge wrote:
               | I've never been that impressed with the Mac Mini's
               | battery life or touchpad :-)
        
               | [deleted]
        
               | foldr wrote:
               | Touche. But seriously, most people who want to run Linux
               | on Mac want to do it because they like Apple's laptop
               | hardware. If you want a compact Linux desktop then a NUC
               | should probably serve you just as well. Or at least, this
               | was the case while Apple was still using Intel chips. If
               | Apple Silicon lives up to expectations then I suppose
               | there could finally be a compelling reason for running
               | Linux on a Mac desktop.
               | 
               | To be clear, I'm not saying that there couldn't possibly
               | be any good reason for wanting to run Linux on a Mac
               | desktop. But desktops are already a niche product for
               | Apple, and people who want to run Linux on Mac desktops
               | are arguably a tiny niche within a niche.
        
               | qz2 wrote:
               | Actually I had a Mac mini with the touchpad, and the damn
               | thing disconnected three times a day. All my input
               | devices have wires now, and they stick out of the right
               | places.
        
               | tonyhb wrote:
               | libinput's touchpad support has gotten pretty great
               | recently. I'm working on an XPS 17, and the touchpad is,
               | no joke, just like the touchpad on my previous MBP.
        
               | postalrat wrote:
               | You can speculate but we will never know for certain.
        
             | tachyonbeam wrote:
             | I feel kind of grossed out, as a developer (and tinkerer)
             | by how locked down Mac products are. It's not really your
             | computer, you're just renting. Apple has decided that they
             | know what you want and need better than you.
        
               | bscphil wrote:
               | It's really kind of tragic that so much incredible
               | research and engineering work goes into creating new
               | hardware like this only for it to be locked into one
               | particular company with very tight constraints on target
               | audience, income bracket, and technical limitations.
               | Think how incredible it would be if everyone could use
               | this new silicon.
        
               | jmagoon wrote:
               | It is, in fact, already used by everyone, because it's an
               | evolution of the chipset in basically every smartphone in
               | the world with widely divergent target audiences, income
               | brackets, and technical limitations.
        
               | bscphil wrote:
               | I don't know that that's a fair comparison. Just because
               | it's an ARMv8 chip doesn't mean it's directly comparable
               | to what's in smartphones. (I assume you aren't comparing
               | it to Apple made chips for iPhone specifically, since
               | then it wouldn't be true that it's in "basically every
               | smartphone in the world".)
               | 
               | In particular, this is the first 5nm chip to be widely
               | available, and by most accounts on performance it
               | competes with top of the line hardware at a small
               | fraction of the power use. Most existing ARM chips are
               | designed for the very-low-power market, e.g. in phones,
               | not to be used in a high performance laptop.
               | 
               | If there's a Dell or Thinkpad laptop with an ARM chip
               | that's comparable, by all means, let me know.
        
               | tachyonbeam wrote:
               | AFAIK you are correct. Apple has completely redesigned
               | their own ARM chip. It has the same instruction set (or a
               | superset of the instruction set) as what runs in a
               | cellphone, but the design is completely different from
               | say, Qualcomm chips.
        
               | jmagoon wrote:
               | I prefer for the class of device the Air fits into
               | (travel, work laptop) to have a nicely curated *nix
               | machine with working drivers out of the box. Apple has
               | continued to improve on this by making this product class
               | faster, more battery efficient, _and_ cheaper.
               | 
               | There is a massive marketplace for tinkering on
               | computers, from Arduinos to multi-GPU ML rigs. Trying to
               | optimize for both classes of things seems like a foolish
               | endeavor, especially when Linux users represent such a
               | small fraction of the desktop market.
        
               | tachyonbeam wrote:
               | I hear this all the time from people "drivers working out
               | of the box", but I've been running Linux machines for a
               | decade now, and I've run into very few issues
               | comparatively speaking. My work makes me use a MacBook
               | for work, and it has a lot of significant bugs that are
               | not getting fixed. The trick with Linux is to use a
               | popular distribution. The one thing I will fully concede
               | is that Linux laptops have poor battery life.
        
               | coldtea wrote:
               | > _I feel kind of grossed out, as a developer (and
               | tinkerer) by how locked down Mac products are_
               | 
               | That's part of the value proposition (take it or leave
               | it).
        
               | jb1991 wrote:
               | Just because it is locked down, why is that the same as
               | "renting"? Those are two very different concepts.
        
               | AsyncAwait wrote:
               | Because you are not the ultimate decider of what to do
               | with the machine. If you owned it, you could do anything
               | with it, short of causing harm.
        
               | coldtea wrote:
               | The whole concept of the machine is to be bought and
               | optimized for running macOS.
        
               | AsyncAwait wrote:
               | Right, the point is that it didn't use to be that way
               | exclusively and now it is, so the new machines are more
               | restrictive than previous Macs, which also ran macOS.
               | 
               | In fact macOS itself is more restrictive nowadays than it
               | used to be.
        
               | yjftsjthsd-h wrote:
               | I guess, but the "whole concept of the machine" that I'm
               | typing this on was to run Windows... 7 (I think?); that's
               | a completely artificial limitation, as shown by running
               | Ubuntu on it years after the hardware went out of
               | support.
        
               | sixstringtheory wrote:
               | I'm not sure what the problem is, then. You have a device
               | that does what you (or the GP) want, which is to install
               | any operating system, tinker, etc.
               | 
               | Is the worry that Apple and its practices will dominate
               | the industry to the point that you literally will not be
               | able to turn on your current machine and use it?
        
               | AsyncAwait wrote:
               | > Is the worry that Apple and its practices will dominate
               | the industry to the point that you literally will not be
               | able to turn on your current machine and use it?
               | 
               | I know you're joking, but I actually kind of am...
               | 
               | Apple has a tremendous amount of industry influence, just
               | see removal of the headphone jack.
        
               | heavyset_go wrote:
               | macOS deprecates support for Macs that are 5-7 years old
               | with every release. I put Linux on them when new macOS
               | releases no longer support them, and they're perfectly
               | good machines afterwards.
               | 
               | When macOS deprecates support for these ARM Macs in 5-7
               | years, Linux isn't an option for them unless Apple puts
               | in a lot of work to support a mainline Linux kernel on
               | their hardware. Apple has said they won't support running
               | other operating systems on these ARM Macs unless they're
               | virtualized.
        
               | jb1991 wrote:
               | But renting implies you are continuing to pay money and
               | will some day need to return it.
        
               | AsyncAwait wrote:
               | No, not necessarily. Renting just implies you're not the
               | owner and need to follow someone's rules, (that of the
               | actual owner), in order to make use of the rented item.
               | 
               | 'Purchasing' a Kindle book or video on Amazon is also
               | renting, for example: it does not mean you have to keep
               | paying, and yet you don't own the copy, as Amazon is
               | going to decide how you're allowed to consume it and
               | whether they're going to let you keep it[1][2].
               | 
               | 1 - https://en.wikipedia.org/wiki/Amazon_Kindle#Criticism
               | 
               | 2 - https://www.hollywoodreporter.com/thr-esq/amazon-
               | argues-user...
        
               | jb1991 wrote:
               | I don't think purchasing a computer is the same thing as
               | buying a movie from Amazon. The computer is always gonna
               | be yours, and you can do whatever you want with it, even
               | if Apple has made it very difficult to do so. There are
               | lots of objects in my house that would fall under that
               | category as well, and I still consider myself their
               | owner.
        
               | heavyset_go wrote:
               | There's a direct parallel you can draw between software
               | licensing and leasing.
        
           | js2 wrote:
           | I realize Federighi's reply seems to rule out Linux, but the
           | context of the question seemed to be with respect to Boot
           | Camp and Windows. My take is that Apple doesn't want to
           | continue to invest in Boot Camp, especially since Microsoft
           | apparently isn't willing to license ARM Windows for this use
           | case.
           | 
           | It's not clear to me that the new Macs won't allow booting
           | Linux if the Linux community can figure out how to do it. The
           | number of folks booting Linux on Mac via Boot Camp has to be
           | really tiny.
        
             | Synaesthesia wrote:
             | Getting drivers to work will be hard without Apple's help
             | or blessing. And there are a lot of drivers.
             | 
             | For comparison you can check the progress of Linux on
             | iPhones (which is actually a thing!)
        
               | js2 wrote:
               | Yeah, agreed, but my take isn't that Apple is going out
               | of their way to prevent it, just that they have no
               | interest in spending any resources on it. Some conjecture
               | here about what will be possible:
               | 
               | https://forums.macrumors.com/threads/running-linux-on-
               | apple-...
        
             | heavyset_go wrote:
             | > _It 's not clear to me that the new Macs won't allow
             | booting Linux if the Linux community can figure out how to
             | do it._
             | 
             | Mainline Linux support requires a lot of work from vendors.
             | Check out the ARM SoC Linux market for an abundance of
             | examples of this problem. Many of the devices will be
             | forever stuck an old kernel fork and will never run a
             | mainline kernel.
        
         | [deleted]
        
         | k2enemy wrote:
         | You can, but need to sign the OS image.
        
         | sabana wrote:
         | Why?
        
         | bstar77 wrote:
         | And that's your choice. I would start looking at AMD's Ryzen
         | offerings because supporting Linux is not going to be high on
         | Apple's list going forward.
        
           | JAlexoid wrote:
           | has it ever been?
        
             | bstar77 wrote:
             | I ran linux for years on my MacBook via Bootcamp. I'd be
             | surprised if Bootcamp ever comes back.
        
         | criddell wrote:
         | What can you do on Linux that you can't do on macOS?
        
           | toast0 wrote:
           | Have a TCP stack with synflood protection? (The mac stack was
           | copied from FreeBSD in 2001, before syncookies/syncache were
           | added, and not meaningfully pulled since)
        
           | choward wrote:
           | Automate the entire setup of my computer using a declarative
           | language. I use NixOS. Mac OS isn't even close.
        
         | jdlyga wrote:
         | I wouldn't pay for Apple hardware unless I wanted to use MacOS.
        
           | OJFord wrote:
           | Why? The hardware's the nice bit.
        
             | Aperocky wrote:
             | Yeah, this. Imagine if we had the same hardware but
             | designed for Linux; I'd pay a premium for that.
             | 
             | Although hardware-specific software from Apple is probably
             | a big part of that draw too. I don't think we're ever going
             | to see Linux prioritize particular hardware and put in the
             | effort to make it integrate as well as macOS does.
        
               | ogre_codes wrote:
               | I don't really get this. I switched to MacOS because it's
               | fundamentally BSD with a nice, well-integrated GUI.
               | Almost all of the good OSS I love is supported nearly
               | perfectly.
               | 
               | Perhaps I'm a bit jaded after running into too much
               | bullshit trying to get Linux running well on laptops in
               | the 90s and 00s. Since I made the move I never wax
               | nostalgic for the "Good Ole Days" of fighting for hours
               | to get Wifi working properly.
               | 
               | Even assuming Apple released the specs so you could port
               | Linux to M1, on top of the usual laptop driver issues
               | around the trackpad, wifi drivers, and video drivers, you
               | also have to deal with the Secure Enclave. Without that,
               | you are stuck with either a non-encrypted drive or
               | running drive encryption on the CPU which is likely going
               | to kill many of the performance gains from using the Mac
               | hardware. Likewise, without the Secure Enclave, you lose
               | fingerprint auth.
               | 
               | Not anti-Linux by any means, but dropping Linux on the M1
               | isn't going to get you the same performance or battery
               | life by any means. You are far better just going with a
               | laptop which was designed to be Linux friendly to start
               | with.
        
               | lhl wrote:
               | IMO, the BSD/Darwin stuff isn't the problem, but rather
               | all the recent additions that are just super
               | invasive/restrictive/bloated - Gatekeeper and trustd,
               | for example, which in my experience _often_ chewed
               | through CPU (not just when OCSP was down). IMO, even a
               | few years ago (when I mostly switched off from Macs) the
               | LaunchDaemon/Agent situation was getting totally out of
               | control, as were notifications and updates (worse than
               | Win10 even).
               | 
               | Here's a script (that apparently no longer works due to a
               | new system signing restriction) that disabled some of
               | those, to give an example of the amount of crap running
               | by default:
               | https://gist.github.com/pwnsdx/1217727ca57de2dd2a372afdd7a0f...
        
               | reaperducer wrote:
               | I think part of the aversion is that we're seeing a
               | generation come into being that doesn't understand that
               | Unix > Linux.
               | 
               | In the same way that, for Windows people, Unix was
               | "other" and bad and scary, we now have legions of
               | programmers who were brought up on Linux and think of
               | Unix as "other."
        
               | prewett wrote:
               | Having had the fun experience of compiling a fairly heavy
               | UI application on Unix, they all seem pretty "other" to
               | me. Solaris didn't do anything weird, so it was maybe the
               | only non-other. HP-UX had something really weird with
               | linking and I feel like it was lacking some shell
               | commands that were fairly standard. AIX did something
               | strange with shared libraries and their error messages
               | were decidedly non-standard, although they all had unique
               | code at the beginning so at least it was easy to search
               | for problems. I think AIX was the only one for which
               | malloc(0) returned 0; all the others at least produced a
               | valid pointer. I can't remember what the problems with Irix
               | were, I think it was just that by 2008 Irix was just old
               | so getting an up to date compiler was troublesome. Linux
               | was just as "other" compared to the rest, but it was
               | increasingly full-featured. Solaris kept up for a while.
               | 
               | And admining them was definitely very different aside
               | from the basic shell commands.
        
               | AsyncAwait wrote:
               | > Since I made the move I never wax nostalgic for the
               | "Good Ole Days" of fighting for hours to get Wifi working
               | properly.
               | 
               | I can assure you that you haven't had to do that for
               | quite some time, and it's not that which people are
               | looking for.
               | 
               | - I am looking for a system that lets me run any damn
               | thing I want without popups, blocks, firewalls, warnings,
               | requiring signed binaries, etc.
               | 
               | - I am looking to run and develop for the same
               | environment I end up deploying on.
               | 
               | - I want a system that has native docker support, systemd
               | and makes updating the whole system or installing pretty
               | much anything as easy as one terminal command.
               | 
               | - It's important for me to trust my system; where I know
               | no single entity has more power over the machine than
               | myself and no secret upgrades I didn't desire are going
               | to be pushed my way.
               | 
               | - There's no telemetry in my ideal system, certainly not
               | at the system level and patched out at the app level
               | where possible.
               | 
               | - I want a system that is open, configurable, respects
               | the four freedoms and is community-run.
               | 
               | macOS cannot give me this, no matter how "fundamentally
               | BSD" it is. I value the freedom that free software gives
               | that no closed-source BSD ever could.
        
               | heavyset_go wrote:
               | > _I switched to MacOS because it 's fundamentally BSD
               | with a nice/ well integrated GUI. Almost all of the good
               | OSS I love is supported nearly perfectly._
               | 
               | This is the reason I initially started using macOS more
               | than a decade ago.
               | 
               | However, I've been told that I'm the wrong kind of user
               | by Apple fans whenever I criticize Apple for transforming
               | macOS from a pretty Unix into a locked-down App Store
               | appliance.
               | 
               | > _Perhaps I 'm a bit jaded after running into too much
               | bullshit trying to get Linux running well on laptops in
               | the 90s and 00s_
               | 
               | Linux has gotten much better, and the problems of the 90s
               | and 00s have vanished for my use case.
               | 
               | These days, at least to me, Linux is the pretty Unix that
               | just works that macOS used to be.
        
               | JAlexoid wrote:
               | Well... That's why I have a ThinkPad that is certified
               | for Linux. (So your prejudice is dated.)
               | 
               | I'm literally trying to figure out how to install Python
               | 3.6 alongside 3.9 on macOS... right now, and it's not a
               | one-line command.
               | 
               | So... No. It has massive issues with developer
               | friendliness. New OSX stalls with Bluetooth mice and
               | randomly locks my keyboard (MBP 2020). The only thing I
               | can commend OSX on is battery life on a MacBook, and
               | nothing else.
        
               | wenc wrote:
               | Python: another way to do this is to install Anaconda and
               | then spin up virtual environments with specific Python
               | versions:
               | 
               |     conda create -n myenv python=3.6
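               | 
               | (Then, assuming the environment created fine, `conda
               | activate myenv` switches into it.)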
               | 
               | Having multiple versions of system Pythons can be
               | complicated. I've learned not to touch the system Python.
        
               | rootusrootus wrote:
               | > install Python 3.6 alongside 3.9 in MacOSX .... right
               | now, and it's not a one line command
               | 
               | To be fair, that's not easy on any OS (well, maybe
               | Windows). Certainly on CentOS it is a chore to get two
               | versions of Python installed simultaneously.
        
               | JAlexoid wrote:
               | I'm on Ubuntu - it's not as mindbogglingly hard as on
               | MacOSX.
               | 
               | Unsupported versions - harder, but still a few
               | commands...
               | 
               | Supported versions? sudo apt install python3.6 and done.
        
               | heavyset_go wrote:
               | pyenv[1] might help you out in this department. It's also
               | cross-platform.
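               | 
               | (For example, something like `pyenv install 3.6.12` and
               | `pyenv install 3.9.0`, then `pyenv local 3.6.12` in a
               | project directory to pin that version.)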
               | 
               | [1] https://github.com/pyenv/pyenv
        
             | eyelidlessness wrote:
             | Because the software is also the nice bit
        
               | OJFord wrote:
               | Well if you like both there's no problem is there.
               | 
               | Comment I replied to was 'I wouldn't pay for Apple
               | hardware if I didn't want the software' implying that
               | would be a stupid thing to do.
               | 
               | I prefer its hardware to anything else; I prefer Linux to
               | macOS. So that's exactly what I'd want to pay for.
        
           | rowanG077 wrote:
             | The hardware is the only good part, unfortunately. I would
             | have clicked "buy" faster than the Flash could if the M1
             | could run Linux.
        
             | foldr wrote:
             | Isn't this pretty much an impossible ask, though? The
             | hardware is great largely because Apple have invested so
             | much in developing a custom SoC. But as a result, you can't
             | easily run a generic OS on it. It's not like Apple just
             | need to bridge the Linux jumper on the motherboard.
             | Supporting Linux would require Apple to maintain millions
             | of additional lines of code, and either hold themselves
             | hostage to decisions made by the Linux kernel team, or
             | maintain their own fork of Linux (which, aside from being
             | based on BSD, is essentially how we got to Mac OS in the
             | first place!)
        
               | rowanG077 wrote:
               | Actually it would only require Apple to release internal
               | documentation. There are enough Linux nerds to write all
               | the drivers. Graphics will probably be the hardest.
        
               | foldr wrote:
               | "Only". It would probably be easier for them to maintain
               | the Linux drivers themselves than to thoroughly document
               | every feature of the SoC.
               | 
               | It's not just about individual drivers though, it's about
               | the surrounding kernel infrastructure and the whole
               | desktop experience. For example, getting instant
               | suspend/resume working on Linux is not (I'm fairly sure)
               | just a matter of writing a driver for a particular bit of
               | hardware.
        
               | rowanG077 wrote:
               | They don't need to do anything new. What they already
               | have is enough. Their own software engineers could handle
               | it fine. It's good enough for people who want this.
               | 
               | There are people who have clean-room implemented entire
               | Nvidia drivers. Without docs. We can manage fine with
               | whatever incomplete docs Apple already has.
        
               | ogre_codes wrote:
               | It's been a few years since I used Linux so forgive me if
               | I'm off base here. But last time I used Linux with
               | Nvidia, you had the choice of using FOSS drivers with
               | mediocre performance, or having closed source drivers
               | that performed on-par with Windows.
        
       | vbezhenar wrote:
       | Not very impressive considering the 5nm process. But a good start.
       | I expect impressive CPUs in the coming years with more cores and
       | more TDP. Hopefully the Mac Pro Mini rumours will be true. That
       | would be a strong candidate for my next computer.
        
         | HatchedLake721 wrote:
         | Not very impressive that a CPU at 7-8W power draw beats or just
         | trails behind a desktop-class $799 Ryzen 9 5950X at 49W+
         | consumption in single-threaded performance?
        
           | vbezhenar wrote:
           | The Mac mini is a plugged-in computer. I don't care whether it
           | draws 7W or 700W. Electricity is cheap. And the fact that it
           | trails behind AMD despite a better node means that its design
           | is inferior or not fully uncapped.
        
             | nottorp wrote:
             | Are you the kind that can't hear their computer fan because
             | they wear headphones?
             | 
             | And/or the kind that keeps their laptop plugged in all the
             | time?
             | 
             | Some people do care about cool, quiet and long battery
             | life.
        
         | singemonkey wrote:
         | Actually it's very impressive.
        
       | robotnikman wrote:
       | It's starting to look like ARM is the way forward in terms of
       | performance and battery life, and I feel PCs will follow in the
       | next few years.
       | 
       | My only hope is that this doesn't mean things get further locked
       | down (such as losing the ability to install Linux distributions or
       | dual boot), but I have a bad feeling they will.
        
       | Deukhoofd wrote:
       | Interesting that it is not able to outperform the Zen 3 CPUs. I
       | had expected it to do somewhat better, especially given that it's
       | a 5nm processor, and with all the hype around ARM processors.
       | 
       | I don't know how well it will hold up to its x86 competitors like
       | this, especially once they launch their 5nm CPUs next year.
        
         | HatchedLake721 wrote:
         | Have you seen the benchmarks past the first page? The M1 at
         | 7-8W power draw beats or just trails behind a desktop-class
         | $799 Ryzen 9 5950X at 49W+ consumption in single-threaded
         | performance. What did you expect?
        
           | kllrnohj wrote:
           | The 5950X's CPU cores at 5GHz consume around 20W each, not
           | 49W+. And the power scaling is extremely non-linear, such that
           | at 4.2GHz it's already down to half the power consumption, at
           | 10W/core.
           | 
           | The 5950X's uncore consumes a significant amount of power,
           | but penalizing it for that seems more than a little
           | unreasonable. The M1 is getting power wins from avoiding the
           | need for externalized IO for GPU or DRAM, but those aren't
           | strictly speaking _advantages_ either. I, for one, will
           | _gladly_ pay 20w of power to have expandable RAM  & PCI-E
           | slots in a device the size of the Mac Mini much less anything
           | larger. In a laptop of course that doesn't make as much
           | sense, but in a laptop the Ryzen's uncore also isn't 20w (see
           | the also excellent power efficiency of the 4800U and 4900HS)
        
           | Slartie wrote:
           | That doesn't say too much. There is a single-thread
           | performance ceiling that all CPUs based on currently available
           | lithography technology just bump against and can't overcome.
           | The Ryzen probably marks that ceiling for now, and the M1
           | comes impressively close to it, especially considering its
           | wattage.
           | 
           | But you cannot extrapolate these numbers (to multi-core
           | performance or to more cores or to a possible M2 with a
           | larger TDP envelope), nor can you even directly compare them.
           | The Ryzen 9 5950x makes an entirely different trade-off with
           | regard to number of cores per CPU, supported memory, etc.,
           | which allows for more cores, more memory, more
           | everything...and that comes at a cost in terms of die space
           | as well as power consumption. If AMD had designed this CPU to
           | be much more constrained in those dimensions and thus much
           | more similar to what the M1 offers, they would surely have
           | been able to considerably drive down power consumption - in
           | fact, their smaller units 4800U and 4900HS which were also
           | benchmarked and which offer really good multithreading
           | performance for their power envelope, even better than the
           | M1, clearly demonstrate this fact.
           | 
           | What I read out of these benchmark numbers is: the ISA does
           | matter far less than most people seem to assume. ARM is no
           | magic sauce in terms of performance at all - instead, it's
           | "magic legal sauce", because it allows anyone (here: Apple;
           | over there: Amazon) to construct their own high-end CPUs with
           | industry-leading performance, which the x86 instruction set
           | cannot do due to its licensing constraints.
           | 
           | Both ISAs, x86_64 and ARM, apparently allow well-funded
           | companies with the necessary top talent to build CPUs that
           | max out whatever performance you can get out of the currently
           | available lithography processes and out of the current state
           | of the art in CPU design.
        
             | SJetKaran wrote:
             | > What I read out of these benchmark numbers is: the ISA
             | does matter far less than most people seem to assume.
             | 
             | This was my conclusion too. Does this mean there is not
             | much possibility of desktop PCs moving to ARM anytime soon?
             | Perhaps laptops might move to ARM processors, but even
             | that seems iffy if AMD can come up with more efficient
             | processors (and Intel too, with its Lakefield hybrid CPU).
        
         | acchow wrote:
         | I fully expect the 16" MBP to launch with a 12 or 16-core Apple
         | chip.
        
         | matsemann wrote:
         | Yeah, as someone whose next laptop won't be a Mac again, this
         | was a good ad for what AMD has achieved lately. My Lenovo P1 has
         | an Intel Xeon of some kind, and while I'm otherwise very happy
         | with the laptop, the CPU is hot, uses way too much power and
         | constantly throttles.
        
         | fxtentacle wrote:
         | In multi-threaded mode - which is what Zen 3 is optimized for -
         | the M1 barely reaches 30% of the Zen's performance.
         | 
         | I mean, that's kind of expected if you compare a low-power CPU
         | with fewer cores against an unlimited-cooling desktop monster
         | with many more cores.
         | 
         | The M1 will likely be an amazing laptop chip, but still
         | unusable for demanding desktop work, e.g. CGI.
        
           | d3nj4l wrote:
           | I've posted this elsewhere in this thread, but the M1 on SPEC
           | reaches Desktop-tier performance, going toe to toe with the
           | 5950X: https://www.anandtech.com/show/16252/mac-mini-
           | apple-m1-teste...
        
             | Deukhoofd wrote:
             | In single core performance, yes, but as the next page on
             | the article shows, it's more comparable to the 4900HS, AMDs
             | mobile CPU in multithreaded performance.
             | 
             | https://www.anandtech.com/show/16252/mac-mini-
             | apple-m1-teste...
        
               | d3nj4l wrote:
               | Yes, sorry if that came off as misleading. I'll edit this
               | in elsewhere.
        
               | foldr wrote:
               | The 4900HS has a 35W TDP, though. The M1 in the Mac Mini
               | is estimated at around 20-24W TDP.
        
               | ip26 wrote:
               | We might be able to chalk that level of difference up to
               | process advantage.
        
               | sudosysgen wrote:
               | The 4900HS has way more I/O and is on 7nm, as well as
               | having more powerful graphics.
        
           | gruez wrote:
           | >I mean that's kind of expected if you compare a low-power
           | CPU with fewer cores against an unlimited-cooling desktop
           | monster with much more cores.
           | 
           | Are we looking at the same charts here? For cinebench
           | multithreaded, the AMD 4xxx series CPUs are zen 2 parts with
           | 15/35W TDP, hardly "unlimited-cooling desktop monster" like
           | you described.
        
             | foldr wrote:
             | From the article: "While AMD's Zen3 still holds the leads
             | in several workloads, we need to remind ourselves that this
             | comes at a great cost in power consumption in the +49W
             | range while the Apple M1 here is using 7-8W total device
             | active power."
             | 
             | Looking through the benchmarks, the zen 2 parts generally
             | seem to have lower performance than the M1. The cinebench
             | multithreaded benchmark is one exception. It's not that
             | surprising because the 4800U has more cores than the M1 has
             | high performance cores. The M1 wins the single threaded
             | cinebench benchmark.
        
               | kllrnohj wrote:
               | The Zen 2 4800HS outperformed the M1 in the SPECint2017
               | multi-threaded results, too.
               | 
               | The M1's float results are weirdly good relative to the
               | int results, though. Not sure why Apple seems to have
               | prioritized that so much in this category of CPU.
        
               | foldr wrote:
               | It's a higher TDP part (I think - it's 35W) and has more
               | high performance cores, so it's not surprising that it
               | would win some of the multicore benchmarks.
        
               | ric2b wrote:
               | Maybe because of javascript, where all numbers are
               | floats?
        
               | hajile wrote:
               | Not strictly true.
               | 
               | Taking a loop and adding a bunch of `x|0` can also often
               | boost performance by hinting that integers are fine (in
               | fact, the JIT is free to do this anyway if it detects that
               | it can).
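               | 
               | (A tiny sketch of what that hint looks like in practice;
               | `sumInt32` is just an illustrative name, not an API:)
               | 
               |     // `| 0` truncates to a 32-bit integer, which JITs can
               |     // treat as an asm.js-style "this stays an int" hint.
               |     function sumInt32(xs: Int32Array): number {
               |       let total = 0;
               |       for (let i = 0; i < xs.length; i = (i + 1) | 0) {
               |         total = (total + xs[i]) | 0; // keep accumulator in int32 range
               |       }
               |       return total | 0;
               |     }
               |     console.log(sumInt32(new Int32Array([1, 2, 3]))); // 6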
               | 
               | The most recent spec is also adding BigInt. Additionally,
               | integer typed arrays have existed since the 1.0 release
               | in 2011 (I believe they were even seeing work as early as
               | 2006 or so with canvas3D).
        
         | danaris wrote:
         | Interesting that your takeaway from all this is "oh, it can't
         | beat some of the top x86 chips in existence--it can only meet
         | them on even footing. Guess it'll be falling behind next year."
         | 
         | This is Apple's _first_ non-mobile chip ever. You think this is
         | the best they can do, ever?
        
           | fxtentacle wrote:
           | I'd expect NVIDIA to join the ARM CPU race, too. And they
           | have experience with the tooling for lots and lots of cores
           | from CUDA. So I'd expect to have 5x to 10x the M1's
           | performance available for desktops in 1-2 years. In fact,
           | AMD's MI100 accelerator already has roughly 10x the FLOPS on
           | 64bit.
           | 
           | That said, it's an amazing notebook CPU.
        
             | danaris wrote:
             | To quote from Ars Technica's review of the M1 by Jim Salter
             | [0]:
             | 
             | > Although it's extremely difficult to get accurate Apples-
             | to-non-Apples benchmarks on this new architecture, I feel
             | confident in saying that this truly is a world-leading
             | design--you can get faster raw CPU performance, but only on
             | power-is-no-object desktop or server CPUs. Similarly, you
             | can beat the M1's GPU with high-end Nvidia or Radeon
             | desktop cards--but only at a massive disparity in power,
             | physical size, and heat.
             | 
             | ...So, given that, and _assuming_ that Apple will _attempt_
             | to compete with them, I think it likely that they will, at
             | the very least, be able to match them on even footing, when
             | freed from the constraints of size, heat, and power that
             | are relevant to notebook chips.
             | 
             | [0] https://arstechnica.com/gadgets/2020/11/hands-on-with-
             | the-ap...
        
             | athms wrote:
             | >I'd expect NVIDIA to join the ARM CPU race, too.
             | 
             | Nvidia has been making ARM SoCs since 2008. They have been
             | used in cars, tablets, phones, and entertainment systems.
             | 
             | What do you think powers the Nintendo Switch?
        
               | fxtentacle wrote:
               | Agreed. Yeah, I should have thought about the Switch and
               | written things more clearly.
               | 
               | I meant that NVIDIA will start producing ARM CPUs
               | optimized for peak data-center performance, similar to
               | how they now have CUDA accelerator cards for data
               | centers, which are starting to diverge from desktop GPUs.
               | 
               | In the past, NVIDIA's ARM division mostly focussed on
               | mobile SoCs. Now that Graviton and M1 are here, I'd
               | expect NVIDIA to also produce high-wattage ARM CPUs.
        
           | dtech wrote:
           | > This is Apple's first non-mobile chip ever. You think this
           | is the best they can do, ever?
           | 
           | They have been making mobile ARM chips for quite some time,
           | so it's not like they are inexperienced.
        
             | canes123456 wrote:
             | Look at the trend lines. They've been able to keep
             | increasing single-core performance every year. There is no
             | reason to think that is stopping this year.
        
               | hajile wrote:
               | They increased IPC only around 5% with A14. The remaining
               | performance increase was from clockspeeds (gained without
               | increasing power due to 5nm).
               | 
               | Short, wide architectures are historically harder to
               | frequency scale (and given how power vs clocks tapers off
               | at the end of that scale, it's not a bad thing IMO).
               | 
               | 4nm isn't shipping until 2022 (and isn't a full node).
               | TSMC says that the 5 to 3nm change will be identical to
               | the 7 to 5nm change (+15% performance or -30% power
               | consumption).
               | 
               | Any changes next year will have to come through pure
               | architecture changes or bigger chips. I'm betting on more
               | modest 5-10% improvements on the low-end and larger
               | 10-20% improvements on a larger chip with a bunch of
               | cache tweaks and higher TDP.
               | 
               | Intel 10nm+ "SuperFin" will probably be fixing the major
               | problems, improving performance, and slightly decreasing
               | sizes for a final architecture much closer to TSMC N7.
               | 
               | I'm thinking that AMD ships their mobile chips with N6
               | instead of N7 for the density and mild power savings
               | (it's supposedly a minor change and the mobile design is
               | a separate chip anyway). Late next year we should be
               | seeing Zen 4 on 5nm. That should be an interesting
               | situation and will help resolve any questions of process
               | vs architecture.
        
               | canes123456 wrote:
               | I agree that most of the gains were due to the node
               | shrink. However, being able to stick to these tick-tock
               | gains for the last several years is impressive. They
               | could have hit a wall in architecture and been bailed out
               | by the node shrink, but I doubt they would have switched
               | away from Intel if that were the case.
        
           | mrweasel wrote:
           | I'd argue that the M1 is a mobile chip, and it's the low-end
           | model. You're still right: the M1 is nowhere near the best
           | Apple is able to deliver.
        
       | GiorgioG wrote:
       | I'm not buying into Apple's eventually-closed desktop computer
       | systems - I don't care what the performance is. They've been
       | slowly marching towards iOS's closed ecosystem model on the Mac
       | and with an in-house CPU, they can effectively lock users out of
       | alternate OS choices on their hardware. Buyer beware.
        
         | sschueller wrote:
         | I'm with you. I'm not going to support this crap.
         | 
          | We might be in the minority at the moment, but the harder
          | Apple makes it for 3rd parties to repair their machines, the
          | less likely people are to buy such an expensive machine in
          | the future, where a broken key means $500+ in repairs.
        
         | [deleted]
        
         | Findeton wrote:
         | Exactly, I don't care whatever they do, I'm not buying into
         | their closed garden of eden. I've already replaced Spotify with
         | Funkwhale, and I'm using Linux, I'm not their target and I'll
         | never be.
        
           | singemonkey wrote:
           | Appreciate you taking the time out of your busy day to let us
           | know that.
        
             | millzlane wrote:
             | Thank you, for letting us know you appreciate his opinion.
        
         | matvore wrote:
         | I get why this idea of becoming iOS-like is uncomfortable for a
         | lot of people, but will they take away the POSIX-ness of the
         | OS? What about all the people that spend all their time in Vim,
         | TMUX, Emacs, and/or zsh? I think these people will continue to
         | be pretty happy on macOS if they already are.
         | 
         | In my experience, installing alternate OS's on Mac hardware has
         | never been frictionless or satisfying anyway.
        
           | aero-glide wrote:
           | >In my experience, installing alternate OS's on Mac hardware
           | has never been frictionless or satisfying anyway.
           | 
            | It would be if they took some effort to support it.
        
             | GiorgioG wrote:
             | Bootcamp has always run fine for me. I've never tried to
             | install Linux on my Macs.
        
             | matvore wrote:
             | > It would be if they take some effort to support it.
             | 
             | Oh, definitely. But the Linux (or alternate OS) fans are
             | not really on Apple's radar. OTOH, they do a good job of
             | keeping some core binaries up-to-date, like zsh and Vim,
              | and they made a point of touting good compile times
              | during the M1 release event, so they consider POSIX users
              | part of their target market.
        
               | JAlexoid wrote:
                | There's no link between compile times and "POSIX users".
               | 
               | Last I checked, I can compile and deploy an iOS app
               | without the need for anything POSIX.
        
           | karteum wrote:
           | > will they take away the POSIX-ness of the OS?
           | 
           | It's not only about POSIX. After the X years of planned
           | lifetime (with proper software/OS updates), will there be any
            | solution to extend the lifetime (which is what I used Linux
            | for, on > 10-year-old laptops)? I guess there will be no
            | solution against planned obsolescence...
           | 
            | And with regards to control and privacy: will Apple finally
            | give up their policy of deciding (and tracking) "for your
            | own good" which apps you are allowed to install and launch?
        
           | nsxwolf wrote:
           | I don't see why they would do this. If they just want to sell
            | iPads, they could do that tomorrow. Just release Xcode for
           | Linux and Windows so that devs can create iOS apps and call
           | it a day.
           | 
           | Obviously they still intend for the Mac to remain a general
           | purpose computer or they wouldn't be putting this much effort
           | into it.
        
             | JAlexoid wrote:
             | I wouldn't count on Apple leaving the PC market.
             | 
              | The tablet-dominated world never materialized.
              | 
              | The post-PC era isn't here; the PC is still dominant. Even
              | the iPad Pro got more laptop-like than any laptop got
              | iPad-like. iPad sales are either stalling or declining.
             | 
             | Yes - Apple clearly wants to keep that laptop market and be
             | general enough to be useful. But general purpose is for
             | general public, not your average HN reader.
        
             | michaelt wrote:
             | When GiorgioG speaks of a closed, iOS-style ecosystem, I
             | don't think they mean literally running iOS on laptops.
             | 
             | Rather, they foresee OS X becoming a system where you can't
             | run programs that haven't received Apple's blessing and
              | been bought through Apple's store. Blessings that will be
             | denied to software like youtube-dl.
             | 
             | As to why they would do this? A combination of the good of
             | most users, who will enjoy protection from malware and
             | viruses; and the irresistible temptation of a 30% cut of
             | all sales.
        
             | FreakyT wrote:
             | Exactly. They still need something to develop iOS and MacOS
             | themselves on, so unless they want to move all their
             | internal lower-level development over to Windows or Linux
             | (which, IMO, doesn't seem like something Apple would do),
             | they'll need to continue producing something resembling a
             | general purpose computer.
        
         | rvense wrote:
         | I feel exactly the same way, but I do wonder how this gap is
         | going to close. If Microsoft team up with another ARM vendor
         | and make similarly closed-off, proprietary glue sandwiches as a
         | response, where does that leave Linux? I doubt PC-compatible,
         | x86 laptops are going to disappear off the face of the Earth
         | anytime soon, but... if there's basically two types of
         | machines, and ours have worse performance and half the battery
         | life at twice the thickness (plus a fan, as a free bonus), a
         | Linux machine is a tough sell to someone who hasn't already
         | bought in. For the Linux workstation experience to keep up, we
         | need more people coming in, and if new people can't dip their
         | toes on hardware they already own, that raises the bar
         | significantly.
        
           | massysett wrote:
           | Look at that new Raspberry Pi where everything is built into
            | the keyboard unit like the old Apple II and Commodore 64.
           | Linux's future is brighter than ever. Linux hardware that is
           | unique in its own right is much more interesting than trying
           | to install Linux on PC or Mac hardware and beat it into
           | submission.
        
           | rowanG077 wrote:
            | I'm guessing we are moving towards the same system as
            | embedded vendors use: patch the Linux kernel so it runs, but
            | never upstream anything, so you end up running an old OS
            | never to be updated again.
        
           | igneo676 wrote:
           | You just have to look at the phone community (XDA Developers,
           | etc) to see how this will (eventually) go.
           | 
            | A good example is the old Asus Transformer tablet. It was
            | a super niche device, but it still lent itself to Linux,
            | and so a small team of people managed to load Ubuntu on
            | it.
           | 
           | Another are Samsung phones. They try to lock people out, but
           | they have popular enough devices that people find a way to
           | put LineageOS on them.
           | 
           | Finally, even iPhones aren't immune. Small teams of people
           | have managed to load Android on them and get it (partially)
           | working. More people would give them even greater
           | functionality.
           | 
           | If laptop manufacturers lock things down with ARM, there will
           | be people who work around those mitigations and install their
           | own OS on that hardware. Tooling will be developed to make
           | that process easier and easier for the next round of people
           | with that device (or future devices). It'll suck up front
           | until the community grows large enough to work around issues
           | faster and faster.
           | 
           | And that's even supposing worst case scenario. I'm not fully
           | buying the idea that you _won't_ be able to change the OS on
           | these laptops. Microsoft has tried (and failed) to lock other
           | OS's out of their laptops. Chromebooks are (currently) the
           | largest market of ARM laptops and you're able to change the
           | OS on them. Apple might be the only company even remotely
           | able to hinder freedom on their devices.
           | 
           | Either way. In the war on general computing, I'm generally
           | optimistic for the users.
        
             | m4rtink wrote:
              | That's actually quite a depressing outlook - while the
              | community can often get undocumented & closed, user-
              | hostile hardware to run their OS of choice, it's hardly
              | ever as seamless as installing a modern Linux distro on
              | just about any x86 machine, usually without issues -
              | mostly thanks to standards such as BIOS/UEFI, ACPI &
              | others.
              | 
              | Also, even if you liberate a single device, it does not
              | mean your hacks will work on the next one - it's a never-
              | ending battle. And without making sure manufacturers
              | actually respect some standards, as they do on x86, it
              | might become a losing battle long term...
        
               | igneo676 wrote:
               | Sure, the lockdown situation is worse than things are
               | currently. You're likely to need per-device hacks that
               | unlock it and enable freedom for users.
               | 
               | Even in that scenario though, ARM devices use standards
               | too. There's a reason I can generally pick up any Android
               | device and know what needs to be done to build my own OS
               | for it. We just lack tooling that makes that incredibly
               | easy and lack maintainers who want to make those devices
                | work with the mainline Linux kernel.
               | 
               | Having open devices though (outside of Apple) is still my
               | bet. We still need to make that process smoother but that
               | just means there's lots of low hanging fruit :)
        
         | faitswulff wrote:
         | As someone who sympathizes with your perspective, I do believe
         | that this is a minority position. Most people don't care about
         | closed ecosystems - just look at Facebook's popularity. Look at
         | the Apple App Store. Government intervention would be needed to
         | break up these closed gardens.
        
           | GiorgioG wrote:
           | > I do believe that this is a minority position. Most people
           | don't care about closed ecosystems
           | 
           | You're absolutely right. I bought my first MacBook Pro (17")
           | in 2007 (and it's still running at my parents' house!) Over
           | time all the machines in the house save my work/gaming rig
           | have been replaced by Macs (iMacs, MBP, 12" MB, etc.)
           | 
           | Now I plan on reversing course. I'll keep my iPhone for now
           | because everyone in the family lives in the blue bubbles
           | (iMessage.)
        
             | electriclove wrote:
             | Re: Reversing course.. what will you replace your Macs
             | with?
        
               | GiorgioG wrote:
               | The iMac will be replaced with a custom-built PC.
               | 
                | The MacBook Pros are another story. I don't know. I
                | can't see myself buying any new Intel Macs since they'll
                | be phased out at some point.
                | 
                | I've never had much luck with the Dell XPSes. I may give
                | ThinkPads a look (I have a P50 at my current job, which
                | has been fine, apart from all the corporate antimalware
                | slowing it to a crawl).
        
               | electriclove wrote:
               | Thanks for the response. I'm hesitant to move my family
               | away from Macs because they have been relatively easy to
               | support.
        
               | GiorgioG wrote:
                | I'm hesitant as well, for the same reason. Having said
                | that, my personal PC (desktop) is still running well 3+
                | years after initially installing Windows on it. My in-
                | laws' very old laptop is still running fine (it's at
                | least 8 years old and was upgraded (by accident!) from
                | whatever version of Windows it was running prior to 10).
                | Aside from user errors (accidentally installing adware
                | toolbars in Chrome, etc.) there really haven't been any
                | real issues with Windows itself.
        
         | konart wrote:
         | >they can effectively lock users out of alternate OS choices on
         | their hardware
         | 
          | People buy a Mac to use macOS. Some will also use Boot Camp
          | for Windows if a VM is not enough for their tasks. And
          | installing Linux instead of macOS on a Mac even sounds
          | strange.
          | 
          | So - nothing to beware of. Macs were always built to run
          | Apple's OS.
        
         | breakfastduck wrote:
         | Going on internet comments they've been marching towards this
         | for years. I've yet to see anything actually happen.
        
           | matvore wrote:
            | Prominent Internet speculation about Apple is correct
           | surprisingly often.
           | 
           | EDIT: I'm getting downvoted, maybe because I was too cryptic.
           | To clarify, it seems like the rumors that gain traction with
           | mainstream sites and Apple-focused YouTube channels and
           | forums tend to have strong correlation with _something_ that
           | will happen later.
        
             | breakfastduck wrote:
             | This speculation has been going on for years.
             | 
             | They're not going to remove the ability to run arbitrary
             | code or the unix core. There would literally be no reason
             | to buy a mac over another product of theirs.
        
               | matvore wrote:
               | I suspect there is an element of truth to it. I agree
               | they will keep the Unix core and allow you to use clang
               | and ld however you want. I suspect, however, they're not
               | interested in supporting alternate OS's nor are they
               | interested in allowing typical users to download normie
               | GUI apps from anywhere.
        
               | breakfastduck wrote:
               | Oh absolutely, it's just not as extreme as people make
               | out.
               | 
               | Yes, they won't support dual booting linux - you can run
               | a VM, but if that's a dealbreaker - fair enough.
               | 
               | There's absolutely no chance however that savvy users
               | will not be able to continue running non app store apps.
               | 
               | Apple want control, but they're not stupid enough to
               | completely lock down the development machines for their
               | entire ecosystem.
        
           | xenadu02 wrote:
           | > Going on internet comments
           | 
           | So instead of going on evidence and public statements by
           | Apple executives - statements that have repeatedly said the
           | Mac is the Mac - you choose to believe random internet
           | comments?
           | 
           | There is a delicate balance between protecting average users
           | who have no clue what is safe software and what isn't vs
           | allowing power users and developers to do what they want.
            | Since the days of ActiveX controls we've known that if you
            | give users a "Please pwn me" dialog they'll just click
            | "OK". They've been trained that computers put up lots of
            | pointless dialogs they can't understand even if they take
            | the time to read them, so they just click until the dialog
            | gets out of the way. Even
           | with default security settings if opening an app from an
           | "unidentified developer" fails you can go into System
           | Preferences > Security and click "Allow".
           | 
           | macOS is trying to protect people by default while still
           | allowing the HN crowd to turn these protections off if they
           | so wish.
           | 
           | Apple Silicon Macs still allow you to disable SIP which turns
           | off a lot of modern protections. You can still downgrade boot
           | security. It is a deliberate decision to continue allowing
           | ad-hoc code signing. Software can still be distributed
           | outside the Mac App Store either with a Developer ID or
           | without. The vast majority of Mac users don't know or care
           | what any of these things are but the Mac has always allowed
           | them and as Craig has said several times over: the Mac is
           | still the Mac. It is still the system that supports hobbyists
           | and developers - people who sometimes want to poke at the
           | system, install their own kernel extensions, etc.
           | 
           | If your complaint is that things are not wide-open by default
           | anymore then I don't know what to tell you. We don't live in
           | the same software landscape we once did and there are far
           | more malicious actors out there. Protecting users by default
           | is the right thing to do IMHO.
        
             | breakfastduck wrote:
             | Are you sure you've replied to the right comment?
             | 
             | I was literally making the point that these rumours have
             | persisted for years and nothing has ever come of it.
             | 
             | I couldn't agree more with the rest of your comment!
        
       | xuki wrote:
       | Performance per watt of this thing is INSANE, take a look at the
       | battery left after compiling WebKit, compared to older Mac:
       | 
       | https://techcrunch.com/wp-content/uploads/2020/11/WebKit-Com...
       | 
       | https://techcrunch.com/2020/11/17/yeah-apples-m1-macbook-pro...
       | 
       | IMO the all day battery will be THE killer feature of the new
       | laptops.
        
         | stevehawk wrote:
         | will it? do people really spend that much time away from an
         | outlet while on their laptop?
        
           | xuki wrote:
            | An all-day battery and a very light laptop will change
            | people's behavior. Previously they might not have brought
            | the laptop out as much because of power constraints.
        
           | zarkov99 wrote:
            | Yes. Even at home it's so much nicer to be able to take the
            | laptop anywhere without worrying about wires.
        
           | SulfurHexaFluri wrote:
            | My school had a policy of no chargers at school due to
            | safety regulations (fires/tripping), so the MacBook Air was
            | an excellent choice.
        
         | interestica wrote:
          | These gains are just through improvements in processors. Now,
          | combine that with parallel leaps in battery tech... It's also
          | good that battery tech hasn't jumped too far ahead already -
          | the constraint has helped the development of these mobile
          | chips. The display panel is still the biggest energy draw on
          | my laptop. There's a lot of room for improvement there too. A
          | 2-day battery in 5 years isn't that crazy.
        
           | legulere wrote:
            | Apple is slowly moving to mini-LED displays and probably
            | micro-LED displays after that. That should also improve
            | energy use.
        
           | xuki wrote:
           | Apple is the kind of company that will choose to slim down
            | the laptop instead of giving it 2-day battery life.
        
             | Razengan wrote:
             | Apple is the kind of company that will choose to slim down
             | the laptop and improve the performance AND increase the
             | battery life.
             | 
             | Like they just fucking did with the M1 MacBooks.
        
               | xuki wrote:
               | Yeah, I'm not saying it's a bad thing. They know the
               | battery range they need and keep it there, to make room
               | for other features.
        
             | oblio wrote:
             | As much as I dislike them doing it while disregarding other
             | aspects, we need/want slimmer. A notebook PC should be as
             | thin as a real notebook or maybe even less. And it should
             | be sturdy enough, after all, a 5-6 mm plate of metal is
             | quite robust.
        
             | munificent wrote:
             | I think that's the right call, honestly.
             | 
             | I need to recharge my own wetware at least once a day. So
             | there is a nearly guaranteed several hour idle period where
             | I'm not using any technology and where laptops and phones
             | can be recharging too.
             | 
             | I don't see much end user value in not taking advantage of
             | that.
             | 
             | It's like if you parked your car literally at a gas station
             | every night anyway. Would you really care about a fuel tank
             | that could let you drive for more than 24 hours?
        
               | murukesh_s wrote:
                | I would want a 2-, 3-, even 7-day battery. Why not? I
                | don't want to bother with charging every day.
                | 
                | But my worry is that with more and more efficiency in
                | computing and battery tech, Apple might instead decide
                | to reduce the battery capacity. Especially with no
                | competition in sight, or with competitors trying to
                | follow Apple, we may end up with smaller batteries
                | instead of more battery life. They seem to be doing that
                | with the iPhone. Also, 18-hour battery life can quickly
                | degrade to a few hours if some process is spinning the
                | CPU continuously, which can happen knowingly or
                | unknowingly with several apps open (Docker occasionally
                | does that to my MBP).
                | 
                | With several days of battery life, I could even go on
                | short trips without bothering about how to charge it.
        
               | SulfurHexaFluri wrote:
                | The problem with multi-day battery life is you never get
                | into a routine and you get caught out every time. I had
                | a Pebble watch with a 10-day battery life, and almost
                | every 10th day it would go flat on me midday, or I'd get
                | a low-battery warning while I was out and then forget
                | about it when I got home.
                | 
                | Now I have an Apple Watch that lasts almost but not
                | quite 2 days. I charge it every night and it has never
                | gone flat on me, and I find it no hassle since I take it
                | off before bed anyway, so I just drop it on the charger.
        
               | jakear wrote:
                | External battery for the cases when you need it,
                | ultralight for the cases when you don't. The ancient
                | "carpet the world vs. wear slippers" debate.
        
               | Analemma_ wrote:
               | Especially now that thanks to USB-C, it's easier than
               | ever to augment laptop batteries. For $60 you can grab a
               | 100Wh RavPower battery pack that can double the lifespan
               | of a MacBook.
        
             | AgloeDreams wrote:
             | I'm not sure I or really anyone actually wants a 2 day
             | battery life. Like, we can do that on smartphones right now
             | but users have signaled that the 1 day device is fine for
             | them, notably because of the human gap.
             | 
             | You know the gap.
             | 
             | If you charge your phone every night it becomes a habit
             | tied to your daily routine.
             | 
             | If you were to charge your phone every other night, you
             | might lose track of what day you are on, not charge it and
             | then the perceived battery life experience is worse. This
             | is why smart watches with 3-4 days of battery have not
                | prevailed over those with one heavy day of battery. It's
                | annoying to keep track of what day you're on, so you
                | might just charge it every night - and if you do, the
                | platform has traded away so much power that the
                | experience is worse.
                | 
                | Plus, then you have to carry 2 days' worth of battery or
                | have half the power envelope of a laptop with one day's.
                | The concept all sounds great, but the reality of people
                | using things has really homed in on the fact that these
                | things need to fit into habits and use cases that make
                | sense.
        
               | twhb wrote:
               | Why are you assuming you need to charge a 2-day device
               | every other day? You charge it every night, and in
               | exchange you make it through heavy use days, late nights,
               | and the times you forget to charge it. I had a 2-day
               | phone and downgraded to a 1-day phone and my phone now
               | dies on me much more often, including in each of those
               | scenarios, and looking at the battery level and charging
               | have become a bigger part of my life.
        
             | Brendinooo wrote:
             | Do you think that's still true in 2020? Seems like thinness
             | hasn't been a major consideration for a couple of years
             | now.
        
               | xuki wrote:
               | They're doing it again with the new iPhone. All of this
                | year's iPhones have smaller batteries.
        
               | reaperducer wrote:
               | And they all added hardware features to fill that space.
               | Look at the iFixit teardowns and you'll see it wasn't to
               | make the phones thinner. It was to make room for wireless
               | charging, LIDAR, and other features.
        
             | reaperducer wrote:
             | _Apple is the kind of company that will choose to slim down
             | the laptop instead of making it 2 day battery life._
             | 
             | Hopefully the Era of Ive is over. There are promising signs
             | around, but I'm not ready to believe it yet.
        
               | threeseed wrote:
               | Ive doesn't unilaterally decide how thin the laptop is.
               | 
               | It's a joint decision made by Hardware Engineering,
               | Product Management, Design, Operations etc.
               | 
                | And frankly everyone wants thin laptops, just with top-
                | tier performance, which it looks like we will get.
        
             | ogre_codes wrote:
             | It gets 18 hours battery life. For most people, that is 2
             | day battery life.
        
             | ogre_codes wrote:
             | Apple just increased battery life on the MacBook Pro from
             | ~12 hours to 20 hours and kept the form factor the same.
             | Likewise the MacBook Air. Feedback from people using them
             | suggests gains are even bigger for people who use them on
             | battery under load.
        
             | samatman wrote:
             | ...maybe?
             | 
             | The Air form factor is already pushing the limit of what
             | you can do with aluminium and still have high confidence it
             | won't warp when you shove it into your bag, or have it fall
             | over when you open the screen past vertical.
             | 
             | I could see them slimming down the battery to get more
             | components in, maybe, rather than two day battery life.
             | 
             | Which would probably be the right call. There just aren't
             | enough circumstances where not plugging in your laptop
             | while sleeping is necessary to justify it.
             | 
             | Personally I'd like them to make a model with a cellular
             | modem, with all-day battery life even reasonably far from a
             | tower. That would be fun.
        
         | anuila wrote:
         | This metric is hilarious and perfect.
        
         | faitswulff wrote:
         | The takeaway quote I've been sending people from the TC article
         | is:
         | 
         | > And, most impressively, the M1 MacBook Pro used [just] 17% of
         | the battery to output an 81GB 8k render. The 13" MacBook Pro
         | could not even finish this render on one battery charge.
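          | 
          | (Back-of-envelope: 100% / 17% ~= 6, so call it five or six of
          | those renders on a single charge.)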
        
       | kowlo wrote:
       | Should I be annoyed about buying a 2020 iMac 27" i7 5700XT 32
       | days ago?
        
         | jakeva wrote:
         | I wouldn't be, the M1 isn't available in that form factor yet
         | and even if it was it would be first gen, and in the meantime
         | you still have a really great computer.
        
           | kowlo wrote:
           | Thanks for taking the time to reply! I will try to remind
           | myself that it's a good machine. Only real issue I have with
           | it is the audible fan at idle... not ideal when recording!
           | 
           | I suppose the M1 release has just amplified my buyer's
           | remorse
        
       | klelatti wrote:
       | It's interesting what this will do for the positioning of Mac
       | Mini. If you want a desktop with decent performance in a compact
       | package, the Mini might just be the first choice, having
       | languished with underpowered Intel CPUs for so long.
        
       | ChrisMarshallNY wrote:
       | I'm not surprised it's doing well.
       | 
       | Apple has deeper pockets than anyone else on the planet, and they
       | have considerable experience doing this kind of thing -
       | literally, _decades_.
       | 
       | Say what you will about Apple; this is a strong point for them.
       | 
       | But I'm still waiting for the M2 before I upgrade. I'm also
       | interested in new form factors. Right now, they are still relying
       | on the currently-tooled production line for their shells. They
       | now have the ability to drastically change their forms.
        
         | dtech wrote:
         | I'd say Intel has the deepest pockets regarding CPU R&D, and
         | yet they are being overtaken left and right.
        
           | mastax wrote:
           | Intel's architectures have been massively delayed by process
           | issues. They're still shipping Skylake (Aug 2015)
           | architecture processors on the desktop and server because
           | they waited too long to change strategy. About a year ago
           | they announced they're going to start decoupling the
            | microarchitecture from the manufacturing process. 2021 will
            | show the first fruits of that labor with Rocket Lake, which
            | is Ice Lake (Sept 2019) backported to 14nm. If they had done
            | that at the first sign of manufacturing trouble (2014?) they
            | could have had 2 more generations of IPC improvement and
            | still be ahead in every way except efficiency. I guess Intel
            | management was more concerned with not rocking the boat.
        
           | WrtCdEvrydy wrote:
           | There is a natural progression where companies go from
           | engineering-driven to finance-driven. It's usually a death
           | march.
        
           | AgloeDreams wrote:
            | I also feel like Intel's depth is limited by the breadth of
            | CPUs they must develop. With every release they are shipping
            | tons of specific sets of cores and clock speeds to meet
            | their market. Then you have the raw investment in fabs
           | that has turned out to be just lighting cash on fire for
           | Intel. They make all kinds of claims and then fail over and
           | over, plus they are hemorrhaging key talent. I think their
           | soul really isn't in the game.
           | 
           | Apple has the luxury of building two or three chips total per
           | year and simply funding TSMC fab. All of this is to fund the
           | largest grossing annual product launch. If their chips fail
           | at being world beaters, hundreds of billions of dollars are
           | on the table. All in, Apple spends an incredible amount of
           | money here, ~$1 billion. Per chip design shipped, Apple is
           | probably spending much more but also getting their return on
           | investment. It's such a tight integration that if TSMC were
           | ever delayed by say, four months, I have no idea what Apple
           | would do.
           | 
           | AMD is playing smart, fast and loose. Best chip CEO by a wide
           | margin. AMD's gains really are on Apple's back, their chip
           | design is brilliant and they get to reap the leftovers when
           | Apple turns out their latest chip. They don't have to fund
           | Fab, they don't have to make crazy claims to appear relevant
            | like Intel does. They just ship great bang for the buck,
            | and the fab gains and their own hard work have given them
            | the best performance title too. Going fabless was one of
            | the most
           | controversial choices ever made in the industry...and wow,
           | was it the right move.
        
             | ChrisMarshallNY wrote:
             | _> Best chip CEO by a wide margin_
             | 
             | Reminds me of this story:
             | https://www.theregister.com/2018/04/16/amd_ceo_f1/
        
           | ChrisMarshallNY wrote:
           | Probably right. I have family that works for Intel, and I
           | have heard stories about the amenities and infrastructure
           | (like "Air Intel," a fleet of corporate jets that take
           | employees between Intel campuses).
        
       | [deleted]
        
       | snazz wrote:
       | > The MacBook Air and MacBook Pro get chips with all 8 GPU cores
       | enabled. Meanwhile for the Mac Mini, it depends on the SKU: the
       | entry-level model gets a 7 core configuration, while the higher-
       | tier model gets 8 cores.
       | 
       | This appears to be wrong. From what I can see on apple.com, the
       | Air offers the choice of a 7 or 8 core GPU, while the Pro and
       | mini start with the full chip.
        
       | wincy wrote:
       | They mixed up the GPU configurations in the article, currently it
       | says the Mac Mini has 7 and 8 core configurations, but that's
       | actually the MacBook Air that has the lower gpu option.
        
       | fareesh wrote:
       | I feel like if one can hold out another year or two for the
       | second iteration that will probably be a better long-term bet. My
       | current macbook is about 6 years old.
        
       | Technically wrote:
       | It makes me very annoyed they aren't selling a minimal laptop
       | version. I really don't get why they're doubling down on forcing
       | the touch strip onto their laptops when nobody is demanding this
       | and, thankfully, no competitor wants to acknowledge such a
       | feature or the software burden that requiring such hardware
       | implies.
        
       | CapriciousCptl wrote:
       | From the article's conclusion... "The M1 undisputedly outperforms
       | the core performance of everything Intel has to offer, and
       | battles it with AMD's new Zen3, winning some, losing some. And in
       | the mobile space in particular, there doesn't seem to be an
       | equivalent in either ST or MT performance - at least within the
       | same power budgets."
       | 
       | This is the first in-depth review validating all the hype.
       | Assuming the user experience, Rosetta 2 things, first-generation
       | pains, and kernel panics are all in check, it's amazing. At this
       | point I'm mostly interested in the Air's performance with its
       | missing fan.
        
         | desireco42 wrote:
          | Yeah, me too. I expect that there will be another release
          | sometime next year with an updated "package". It is cool that
          | they put it in the old Air box, but I think I can wait for a
          | better camera and overall package. By that time the benefits
          | and issues that come with it will be clearer.
          | 
          | I wouldn't mind a plastic edition of the MacBook w/ M1.
          | Aluminum, and metal overall, is not the best for everyday use.
          | I prefer the "warmer" feel of plastic, like a ThinkPad for
          | example.
        
           | nicoburns wrote:
           | I suspect the aluminium frame is a key part of the passive
           | thermal cooling. I'd be very surprised if we see a plastic
           | version.
        
             | qz2 wrote:
             | I do miss the plastic ones myself. My sweat dissolves
             | macbooks and they give me a rash.
        
               | skavi wrote:
               | lol, clean your computer.
        
         | luto wrote:
         | I can't find that quote or even the words "undisputedly" or
         | "Zen3" in the article. Was it changed or, if it wasn't, can you
         | give me a pointer, please?
        
           | ajb wrote:
           | It's on page 7
        
           | juliand wrote:
           | You have to jump to the article's conclusion.
        
             | toxik wrote:
             | A jump to conclusions, if you will.
        
               | hyperdimension wrote:
               | Of course, we'd already know that if only we had a jump
               | to conclusions _mat_!
               | 
               | It's a shame it wasn't a commercial success.
        
           | [deleted]
        
           | alnorth wrote:
           | You have to use the dropdown to select the right page. As
           | they said, it's in the conclusion of the article.
        
           | mbrd wrote:
           | The article has multiple pages. You can find the conclusion
           | here: https://www.anandtech.com/show/16252/mac-mini-
           | apple-m1-teste...
        
           | [deleted]
        
           | HatchedLake721 wrote:
           | It's insane how bad the website's UX is for first time
           | visitors not seeing there's more content behind the first
           | page.
        
             | 727564797069706 wrote:
             | Wow, indeed! I thought the conclusion part in the article
             | was weird. Now I see why - I was at the end of the first
             | page!
        
             | freehunter wrote:
             | That's how websites used to fit more ads into an article
             | before constantly-updating Javascript ads became the trend.
        
               | HatchedLake721 wrote:
               | Yes, but with a big huge button "NEXT PAGE" or something
               | like that. Look at the number of comments here that
               | didn't even notice there's more content other than the
               | 1st page.
        
           | chrismorgan wrote:
           | AnandTech split some articles onto multiple pages. The print
           | view gets you the whole article on one page, so I rather
           | prefer it: https://www.anandtech.com/print/16252/mac-mini-
           | apple-m1-test...
        
             | porphyra wrote:
             | Wow, I never knew about the print view. It's way more
             | readable, and the lack of comments section makes it quite
             | fast to load despite the very long page.
             | 
             | The tiny drop-down menu in the default view is very hard to
             | discover and quite annoying to click on (many other review
             | sites, like Phoronix, have similar annoying drop-downs).
        
             | jdeibele wrote:
             | As I remember, they charged a membership fee for being able
             | to download the whole article as a PDF.
             | 
             | It seemed somewhat reasonable that an article that would be
             | passed around the department or on to your boss would
             | require a fee.
             | 
             | I can't find any mention of it on their website, though. Am
             | I getting my websites confused or did they drop it
             | altogether?
        
               | _djo_ wrote:
               | You're thinking of Ars Technica, I think.
        
               | jdeibele wrote:
                | Yes, you're right. Thank you. And they're still offering
                | PDFs only to members. Which I don't have a problem with.
        
         | testfoobar wrote:
          | Chrome and 20-50 tabs, and my Intel MacBook can be used as a
          | blow dryer. Assuming Chrome's power needs don't change, it
          | seems that the only way to keep an M1 from overheating is
          | going to be throttling down - slowing everything down. Curious
         | how M1 machines feel during day to day usage.
        
           | gnicholas wrote:
           | I have a 2017 MBP (base, no TB) and found that Chrome made my
           | fan rev like crazy. A friend told me about Brave and I tried
           | it out. Now my fan only kicks on when I'm doing serious work.
           | I know some folks don't like Brave for various reasons, but I
           | love it because my MBP is almost always silent.
        
           | pwthornton wrote:
            | The reviews I saw said that using Chrome gets good battery
            | life, but if you want great battery life you need to use
           | something like Safari.
           | 
           | I switched to Safari a few years ago, and I couldn't be
           | happier. Chrome's performance and battery life are atrocious.
           | I only use Chrome when I need something specific from it.
        
             | gnicholas wrote:
             | I saw that comment in a couple different places. Presumably
             | Chrome is running through Rosetta 2, whereas Safari is
             | native to the M1. I imagine once Chrome is available
             | natively, performance will be somewhat better, though
             | probably still not as good as Safari.
        
               | simonh wrote:
                | On the new machines, yes, but none of the people you
                | read about before today had AS machines; they were all
                | comparing Chrome and Safari on Intel, so it's unlikely
                | that comparison will change once Chrome is native on AS.
        
               | zitterbewegung wrote:
               | IIRC Chrome is a battery hog / memory hog on all
               | platforms.
               | 
               | Am I wrong in this regard?
        
               | rowanG077 wrote:
                | AFAIK it's a memory hog, but it doesn't really use more
                | or less battery than other browsers.
        
           | megablast wrote:
           | Chrome is just awful on a mac. I am not sure why anyone uses
           | it. FF is much nicer to use.
        
             | markholmes wrote:
             | I've been using Safari as my daily driver for some time and
             | it's quite nice to use. Don't be afraid to give it another
             | chance.
        
               | sumedh wrote:
               | Does Safari have extensions like Ad Blocker and does it
               | have good developer tools?
        
               | aculver wrote:
               | 1. Yes, it does. I use AdBlock Pro. 2. Yes, it does. I've
               | been using Safari as my primary browser as a Rails
               | developer for at least the past decade and have always
               | found the developer tools at least adequate. I don't use
               | the developer tools on other browsers, so I don't know
               | what I might be missing.
        
               | wildrhythms wrote:
               | Adblock Pro no longer exists for Safari (in the form of
               | an "official" extension).
               | 
               | Now there's "AdBlock for Safari" developed by BETAFISH
               | INC, which offers in-app purchases including "Gold
               | Upgrade" which "unlocks" some basic features that
               | gorhill's uBlock Origin already has for every other
               | browser.
               | 
               | https://help.getadblock.com/support/solutions/articles/60
               | 002...
               | 
               | Not switching until there are some better options for
               | this.
        
               | gnicholas wrote:
               | Safari is migrating to a new system of extensions that
               | will make it much easier to port from Chrome. However, I
               | understand it still requires Xcode (which non-Mac folks
               | can't run) and a developer license (which not everyone
               | wants to pay for). I hope to bring my Chrome extension to
               | Safari, but honestly it's not a priority because most
               | people who install extensions are not running Safari
               | (when you consider that most people are not on Mac, and a
               | large chunk of folks on desktop Safari are there because
               | it's the default -- and therefore would not likely
               | install extensions).
        
             | mattnewton wrote:
             | Speed mostly, though the last time I tried out Firefox
             | seriously was over a year ago, it was noticeably much
             | slower on pages (ab)using lots of javascript.
        
             | mycall wrote:
             | Does Edge Chromium for MacOS have the same awfulness?
        
             | testfoobar wrote:
             | I particularly like Chrome profiles. I have a few profiles
             | with their own bookmarks/histories/open tabs/etc. For
             | example, one of my profiles is "Shopping". Another is
             | "Work" and yet another is "Social Media".
             | 
             | Context switching profiles at a macro level - as opposed to
             | intermingling work/shopping/social - is beneficial to me.
             | 
             | When I switch over to "Shopping", I have my tabs on
             | whatever purchase I'm researching open. I can drop the
             | whole project for a few weeks and resume it later right
             | where I left off. None of it can bleed over into my "Work"
             | profile. I like the separation. Helps keep my head clear.
        
               | majormunky wrote:
               | Firefox has something like this called containers. The
               | best example is one for facebook, where any call to any
               | facebook servers only works in the facebook container. It
               | has similar setups as well, Work, Home, Commerce, etc.
        
               | testfoobar wrote:
               | Not quite the same. I want to set-up and tear-down entire
               | macro groups of windows and tabs while keeping others
               | active.
               | 
               | Opening my 'Shopping' profile brings up windows and tabs
               | from where I left off. Same with "Social". When I don't
               | want distractions, I just close those profiles. No
               | notifications, no updates, etc. I like the separation.
        
               | windexh8er wrote:
               | Simple Tab Groups [0] + Multi-Account Containers [1] are
               | my workflow for that exact case. Simple Tab Groups hides
               | the tabs based on the group you're in and the Multi
               | Account Containers can keep them segmented from a
               | contextual standpoint.
               | 
                | I can't stand Chrome either, so I've been using these
                | two together for about a year now, I believe. Using a
                | naked version of Chrome is jarring, given that my
                | browser, set up like this, feels like it fits how I use
                | it.
               | 
               | [0] https://addons.mozilla.org/en-
               | US/firefox/addon/simple-tab-gr... [1]
               | https://addons.mozilla.org/en-US/firefox/addon/multi-
               | account...
        
               | feanaro wrote:
                | Firefox also has profiles, though they're not a very
                | prominent feature and are a bit less polished as a
                | result.
                | 
                |     firefox --ProfileManager
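                | 
                | (If I remember right, you can also launch a specific
                | profile directly with firefox -P <profile name>, which
                | skips the manager dialog.)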
        
               | TsomArp wrote:
               | Is that a Chrome for Mac feature? Never seen it before.
                | Care to elaborate?
        
         | spacenick88 wrote:
         | As for kernel panics, with iOS likely sharing most if not all
          | of its kernel code with macOS, I'd be surprised if Apple hasn't
         | had an iPhone macOS build since before they released the first
         | iPhone.
        
         | paulpan wrote:
         | Unsung hero here is TSMC and their industry-leading 5nm
         | fabrication node. Apple deserves praise for its SOC
         | architecture on the M1 and putting it together, but the
         | manufacturing advantage is worth noting.
         | 
         | Apple is essentially combining the tick (die/node shrink) and
         | tock (microarchitecture) cadences together each year, at least
          | the past 2-3 years. The question, perhaps a moot one, is how
          | much of the performance gain can be attributed to each. The
          | implication is that the % improvement due to the tick is
          | available to other TSMC customers, such as AMD, Qualcomm,
          | Nvidia, and even Intel.
          | 
          | We'd have to wait until next year (or 2022), once AMD puts
          | Zen4 on 5nm, to see an apple-to-apples comparison of per-
          | thread performance. But of course by then Apple will be on
          | TSMC 3nm or beyond...
        
         | kllrnohj wrote:
         | > At this point I'm mostly interested in the Air's performance
         | with its missing fan.
         | 
          | I think the clear story here is that the Air will definitely
          | be slower than the rest over time. This isn't a 10W SoC,
          | clearly, so it definitely can't run at its best while being
          | passively cooled.
         | 
         | How it behaves when throttled will be interesting for sure
         | though.
        
           | pwthornton wrote:
           | It looks like it only begins to throttle after 8.5 minutes of
           | sustained high loads -- like exporting 4K video.
           | 
           | In a lot of day-to-day work that is much more peaks and
           | valleys, you may never see the throttling.
        
             | kllrnohj wrote:
             | Seeing reddit reports of gaming throttling more quickly
             | than that, which would make sense since the GPU is going to
             | sit at or near 100% pretty easily with a game and you'll
             | still be seeing a decent CPU load.
        
               | SulfurHexaFluri wrote:
                | The MacBook Air is probably the worst machine you can
                | think of for gaming. Even without the M1 you won't have
                | a great time. It's the perfect machine for students
                | because it can last all day in a browser and Word.
        
         | fxtentacle wrote:
         | That conclusion is quite misleading, in my opinion.
         | 
         | They write "outperforms the core performance" and the keyword
         | here is "core". What they mean is that if one had a single-core
         | Zen3 and a single-core M1, then the M1 would win some and lose
         | some.
         | 
         | But in the real world, most Zen3 CPUs will have 2x or more
         | cores, thus they'll be 2x to 4x faster.
         | 
         | So what they mean to say is that they praise Apple for having
         | amazing per-core performance. But it kind of sounds as if the
         | M1 had competitive performance overall, which is incorrect.
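          | 
          | Rough sketch of what I mean (Python, my own illustrative
          | numbers - ignoring the M1's efficiency cores and assuming
          | imperfect but decent MT scaling):
          | 
          |     def throughput(cores, per_core=1.0, scaling=1.0):
          |         # aggregate ~= cores x per-core perf x scaling factor
          |         return cores * per_core * scaling
          | 
          |     m1 = throughput(4)                  # 4 big cores
          |     zen3 = throughput(16, scaling=0.8)  # e.g. a 16-core part
          |     print(zen3 / m1)                    # ~3.2x
          | 
          | Per core they're neck and neck; the aggregate gap is all core
          | count.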
        
           | 314 wrote:
           | The Zen3 processor that they are comparing it to is the 5950x
           | - the fastest desktop processor with a TDP of 105W. The
           | entire system power of the M1 mini under load was 28W.
           | 
           | What the article is pointing out is that the mobile low-power
           | version of the M1 (as the mini is really just a laptop in a
           | SFF box) is competitive with the top-end Zen3 chip; the
           | benchmark gap is smaller than 2x.
           | 
           | We don't know yet how far the M1 scales up, e.g. a
           | performance desktop will presumably have a higher TDP and
           | probably trade the integrated GPU space for more CPU cores.
            | But we don't know if/how this will translate into
            | performance gains. Previous incarnations of the Mac Pro have
            | also used multiple CPUs, so it is not yet clear if "in the
            | real world, most Zen3 CPUs will have 2x or more cores".
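            | 
            | To put the perf-per-watt point in rough numbers (a Python
            | sketch using the figures above; note TDP and measured wall
            | power aren't the same thing, and I'm granting the full 2x
            | MT gap as the worst case for the M1):
            | 
            |     zen3_perf, zen3_watts = 2.0, 105.0  # 5950X, rated TDP
            |     m1_perf, m1_watts = 1.0, 28.0       # whole-system load
            |     ratio = (m1_perf / m1_watts) / (zen3_perf / zen3_watts)
            |     print(f"{ratio:.1f}x")              # ~1.9x perf/watt
            | 
            | Even spotting the 5950X double the throughput, the M1 still
            | comes out roughly 2x ahead on perf per watt.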
        
             | [deleted]
        
             | [deleted]
        
             | oefrha wrote:
             | Not to mention 5950X alone without cooling ($799) costs
             | almost as much as an entry level MacBook Air.
        
               | SulfurHexaFluri wrote:
               | Well that CPU has 16 cores / 32 threads while the M1 has
               | 4 high power cores and 4 low power ones.
        
               | mey wrote:
               | The single core performance between the 5600x and the
               | 5950x isn't significantly different. The charts have some
               | interesting gaps...
               | 
               | Edit: Putting it head to head with the 5600x would make a
               | lot of sense for price/core/desktop space comparison.
        
               | qz2 wrote:
                | Yes. I'd like to see a decent-ish Ryzen APU such as the
                | 3400G up against one of these as well.
                | 
                | I did notice that the Cinebench score for the M1 is only
                | about 10% higher than my Ryzen laptop's (T495s), which
                | is laughable as it's a 3500U and the whole thing cost me
                | £470 new!
        
               | alwillis wrote:
               | _Not to mention 5950X alone without cooling ($799) costs
               | almost as much as an entry level MacBook Air_
               | 
               | The M1-based Mac mini starts at $699.
        
               | oefrha wrote:
               | Yeah, forgot about that. Everything else being equal
               | (ostensibly), the M1 Mac Mini is $200 cheaper than the
               | crappy Intel i5 Mac Mini, more if you upgrade the Intel
               | CPU.
               | 
               | As an owner of a decked out 2019 Mac Mini, in hindsight I
               | made a shitty purchase decision.
        
               | jdeibele wrote:
               | I bought what I thought was a 2020 Mac Mini in April
               | direct from Apple. The only significant difference on
               | paper was that the base model came with 128GB for the
               | 2018, 256GB for the 2020.
               | 
               | As it turns out, that's true: About This Mac says "Mac
               | mini (2018)" even for the 2020.
               | 
               | I replaced the 8GB base RAM with 32GB of aftermarket and
               | have been thrilled with it. But then I was coming from a
               | 2018 MBP 4-Thunderbolt with only 8GB and the fan noise
               | with it drove me nuts.
               | 
               | I got the i3 because I thought the CPU wasn't the weak
               | point, the RAM was. And so far, for me, that's held up.
        
               | dylan604 wrote:
               | No matter what the purchase, I always force myself to
               | stop comparing for a bit of time after the purchase. By
               | the time I pull the trigger, I have shopped and compared
               | as best I can. Inevitably, as soon as I complete the
               | sale, one of the places I was looking will have lowered
               | the price or released the next-gen.
        
               | arvinsim wrote:
               | I sold my 2018 Mac Mini with high specs 1 week before the
               | keynote.
               | 
               | The guy must be feeling bad right now.
        
               | adolph wrote:
               | The Resulting Fallacy Is Ruining Your Decisions
               | 
               | http://nautil.us/issue/55/trust/the-resulting-fallacy-is-
               | rui...
               | 
               |  _There's this word that we use in poker: "resulting."
               | It's a really important word. You can think about it as
               | creating too tight a relationship between the quality of
               | the outcome and the quality of the decision. You can't
               | use outcome quality as a perfect signal of decision
               | quality, not with a small sample size anyway. I mean,
               | certainly, if someone has gotten in 15 car accidents in
               | the last year, I can certainly work backward from the
               | outcome quality to their decision quality. But one
               | accident doesn't tell me much._
        
               | alwillis wrote:
               | _As an owner of a decked out 2019 Mac Mini, in hindsight
               | I made a shitty purchase decision._
               | 
               | Probably not. If you need a machine to get work done, as
               | you probably did, it always makes sense to buy what's
               | current.
               | 
               | It's different if you can afford to wait for a particular
               | upgrade we know is coming.
               | 
               | I bought a 4k Retina iMac a little over a year ago
               | because I badly needed one, and it's been great.
        
               | JAlexoid wrote:
               | Why?
               | 
               | Today's purchase of Mac Mini will be a crappy decision in
               | hindsight in about a year... and that is true every year.
               | 
               | It would have been a crappy decision - if you got a worse
               | product at the time of purchase. So don't get FOMO.
        
               | secabeen wrote:
               | I actually just bought an Intel Mac Mini to run macOS VMs
               | using ESXi. I expect it will be quite a while before
               | stable Mac VM support is available for Apple Silicon
               | Macs.
        
             | kllrnohj wrote:
             | > The Zen3 processor that they are comparing it to is the
             | 5950x - the fastest desktop processor with a TDP of 105W.
             | The entire system power of the M1 mini under load was 28W.
             | 
             | This is a very misleading statement. They primarily only
             | used the 5950X in single-core tests, and in those tests it
             | doesn't come remotely close to 105W. In fact per
             | Anandtech's own results[1] the 5950X CPU core in a single-
             | core load draws around 20w.
             | 
             | Take the M1's 28W under a multi-threaded load, that's going
             | to be somewhere in the neighborhood of 4-5w/core for the
             | big cores probably (single-core was ~10w total, ~6w
             | "active" - figure clocks drop a bit on the multi loads, and
             | then the little cores are almost certainly much less power
             | draw particularly since they are also much, much slower).
             | In multithreaded loads the per-core power draw on a 5950x
             | is around 6w. That's a _much_ closer delta than the "105W
             | TDP vs. ~28W!" would suggest.
             | 
             | M1's definitely got the efficiency lead, but it's also a
             | bit slower and power scales non-linearly. It's an
             | interesting head-to-head, but that 105W TDP number of the
             | 5950X is fairly irrelevant in these tests. That's not
             | really playing a role. Just like it's about as irrelevant
             | as you can get that the 5950X has 4x the big CPU cores,
             | since it was again primarily used in the single-threaded
             | comparisons. Slap 16 of those firestorm cores into a Mac
             | Pro and bam you're at 60w. Let it run at 3.2ghz all-core
             | instead of the 3ghz it appears to now since you've got a
             | big tower cooler and that's 100w (6w/core @ 3.2ghz per the
             | anandtech estimates * 16). That'd be the actual multi-
             | threaded comparison vs. the 5950X if you want to talk about
             | 105W TDP numbers.
             | 
             | Critically though the M1 is definitely not a 10W chip as
             | many people were claiming just a few days ago. You're
             | definitely going to see differences between the Air & 13"
             | MBP as a result.
             | 
             | 1: https://www.anandtech.com/show/16214/amd-zen-3-ryzen-
             | deep-di...
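             | 
             | To put rough numbers on the estimates above, here's a
             | back-of-envelope sketch in Swift. Every figure is a guess
             | quoted in this comment or in Anandtech's estimates, not a
             | measurement, and the 16-core part is purely hypothetical:
             | 
             |     // All inputs are the estimates quoted above, not measurements.
             |     let m1Big = 4.0...5.0   // W per Firestorm core, MT load at ~3 GHz (guess)
             |     let m1Big3p2 = 6.0      // W per core guess at ~3.2 GHz with a big cooler
             |     let zen3Core = 6.0      // W per 5950X core, all-core (Anandtech estimate)
             | 
             |     // Per-core gap is ~1.2-1.5x, not the ~4x that "105W TDP vs 28W" implies.
             |     let gap = (zen3Core / m1Big.upperBound, zen3Core / m1Big.lowerBound)
             | 
             |     // Hypothetical (unannounced) 16x Firestorm part:
             |     let sixteenAt3GHz = (16 * m1Big.lowerBound)...(16 * m1Big.upperBound) // 64...80 W
             |     let sixteenAt3p2GHz = 16 * m1Big3p2                                   // ~96 W
             | 
             |     print(gap, sixteenAt3GHz, sixteenAt3p2GHz)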
        
               | 314 wrote:
               | > This is a very misleading statement. They primarily
               | only used the 5950X in single-core tests, and in those
               | tests it doesn't come remotely close to 105W. In fact per
               | Anandtech's own results[1] the 5950X CPU core in a
               | single-core load draws around 20w.
               | 
               | It would seem that the switching of AMD chips in the
               | various graphs has caused some confusion. I was
               | referring to the "Geekbench 5 Multi-Thread" graph on page
               | 2. This shows a score of 15,726 for the 5950x vs 7,715 for
               | the M1. This is about 2x. I do not see any notes that the
               | benchmark is using fewer cores than the chip has
               | available.
               | 
               | I don't follow your argument for why it is misleading to
               | characterize the 5950x as a 105W TDP in this benchmark.
               | Could you expand a little on why you believe this is
               | misleading? The article that you have linked to shows
               | over 105W of power consumption from 4 cores - 16.
               | 
               | Edit: I put in the wrong page number in the clarification
               | :) Also, I see later in the linked article that the 15726
               | score is from 16C/32T.
        
               | kllrnohj wrote:
               | If you're referring to the single time the 5950X's multi-
               | threaded performance was compared then sure, the 105W TDP
               | is fair. But you should also be calling that out, or
               | you're being misleading, as the _majority_ of the 5950X
               | numbers in the article were single-threaded results, and
               | it did not appear in most of the multi-threaded
               | comparisons at all.
               | 
               | But in multi-threaded workloads it also absolutely
               | obliterates the M1. Making that comparison fairly moot
               | (hence why Anandtech didn't really do it). It's pretty
               | expected that the higher-power part is faster, that's not
               | particularly interesting.
        
               | 314 wrote:
               | It's really not clear what you are trying to argue here.
               | The number of single-threaded benchmarks are irrelevant
               | to this point: when the M1 was compared to the 5950X in a
               | multithreaded comparison:
               | 
               | * The 5950X was 2x faster.
               | * The 5950X was using 4x the power (28W system vs 105W+
               |   for the processor).
               | * The M1 only has 4 performance cores; the 5950X has 16.
               | 
               | Even counting the high-efficiency cores as full cores,
               | the M1 with 8 cores provides 1/2 the performance of the
               | 5950X with 16 cores, i.e. it implies that even the
               | low-power cores are contributing roughly as much as the
               | 5950X cores.
               | 
               | That is certainly not the 5950X obliterating the M1, as
               | the article stated (and was the quote that started this
               | thread) the M1 is giving the 5950X a good run for its
               | money. If you think otherwise could you provide some kind
               | of argument for why you think so?
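               | 
               | To make the perf-per-watt arithmetic explicit, a quick
               | Swift sketch using only the numbers quoted in this
               | thread (with the caveat, raised elsewhere in the thread,
               | that 28W is whole-system wall power while 105W is a
               | CPU-only TDP, so the ratio flatters the M1 somewhat):
               | 
               |     // Geekbench 5 multi-thread scores and power figures as quoted.
               |     let m1Score = 7_715.0, m1Power = 28.0      // M1 mini, system watts at the wall
               |     let r9Score = 15_726.0, r9Power = 105.0    // 5950X, CPU-only TDP
               | 
               |     let speedRatio = r9Score / m1Score                            // ~2.0x
               |     let powerRatio = r9Power / m1Power                            // ~3.75x
               |     let perfPerWatt = (m1Score / m1Power) / (r9Score / r9Power)   // ~1.8x in the M1's favour
               |     print(speedRatio, powerRatio, perfPerWatt)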
        
               | kllrnohj wrote:
               | The 2x number you're claiming was only for geekbench
               | multithreaded, which was the only multithreaded
               | comparison between those two in the Anandtech article.
               | You're trying to make broad sweeping claims from that one
               | data point. That doesn't work.
               | 
               | Take for example the CineBench R23 numbers. The M1 at
               | 7.8k lost to the 15W 4800U in that test (talk about the
               | dangers of a single datapoint!). The 5950X meanwhile puts
               | up numbers in the 25-30k range. That's a hell of a lot
               | more than 2x faster. Similarly in SPECint2017 the M1 @ 8
               | cores put up a 28.85, whereas the 5950X scores 82.98.
               | Again, a lot more than 2X.
               | 
               | This is all ignoring that 2x the performance for 4x the
               | power is actually a pretty good return anyway.
               | Pay attention to the power curves on a modern CPU or what
               | for example TSMC states about a node improvement. For 7nm
               | to 5nm for example it was either 30% more efficient or
               | 15% faster. Getting the M1 to be >2x faster is going to
               | be a lot harder than cutting the 5950X's power
               | consumption in half (a mild underclock will do that easy
               | - which is how AMD crams 64 of these into 200W for the
               | Epyc CPUs, after all). But nobody cares about a 65w
               | highly multithreaded CPU, either, that's not a market.
               | Whatever Apple comes up with for the Mac Pro would be the
               | relevant comparison for a 5950X.
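               | 
               | For what it's worth, those ratios work out as follows
               | (Swift, using only the scores quoted above):
               | 
               |     let m1CB23 = 7_800.0                 // Cinebench R23 MT, M1
               |     let r9CB23 = 25_000.0...30_000.0     // "25-30k range", 5950X
               |     let m1Spec = 28.85, r9Spec = 82.98   // SPECint2017, M1 (8 cores) vs 5950X
               | 
               |     let cbRatio = (r9CB23.lowerBound / m1CB23)...(r9CB23.upperBound / m1CB23) // ~3.2x-3.8x
               |     let specRatio = r9Spec / m1Spec                                           // ~2.9x
               |     print(cbRatio, specRatio)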
        
               | sudosysgen wrote:
               | You're being obtuse. The only test you're using is
               | Geekbench, which just isn't useful for these kinds of
               | comparisons.
               | 
               | In other multicore benchmarks, the M1 gets beaten by AMD
               | parts with lower TDPs, and the 5950X has something like
               | 3 to 4+ times the performance.
        
               | michaelmrose wrote:
               | An author who deliberately switches which chip to test in
               | different versions of the same test in order to paint the
               | desired picture isn't much different than one who
               | literally makes up the numbers. The whole article ought
               | to be flagged and deleted.
        
               | JAlexoid wrote:
               | The M1 has a lot of great things about it and I'm excited
               | to see what it can bring. Intel needs to be humiliated by
               | something great, to remind them that they have been crap
               | for a long time.
               | 
               | But... other than ST performance, multi-core CPUs don't
               | scale linearly. At 16 cores, core-to-core communication
               | takes a hit that isn't nearly as bad with only 4 cores.
        
               | egsmi wrote:
               | > This is a very misleading statement. They primarily
               | only used the 5950X in single-core tests, and in those
               | tests it doesn't come remotely close to 105W.
               | 
               | That's true but keep in mind this is the power going into
               | the AMD CPU only. The power number measured for the mini
               | was the entire system power pulled from the wall, so that
               | 28W included memory, conversion inefficiencies and
               | everything. That's crazy.
        
               | adrian_b wrote:
               | Actually a significant amount of power, maybe around
               | 20 W, is consumed by the I/O chip, which draws a lot
               | because it is made on an old process.
               | 
               | In 2021, when the laptop Zen 3 parts are introduced, they
               | will have much better power efficiency, being made
               | entirely on 7 nm.
               | 
               | Of course, it will still not match the power efficiency
               | of M1, which is due both to its newer 5-nm process and to
               | its lower clock frequency at the same performance.
        
               | kllrnohj wrote:
               | > which consumes a lot because it is made in an old
               | process.
               | 
               | And also because it's doing a lot. Infinity fabric for
               | the chiplet design isn't cheap, for example. A single-die
               | monolithic design avoids that (which is why that's what
               | AMD did for the Zen2 mobile CPUs).
        
               | JAlexoid wrote:
               | When we get to the detailed comparisons - it's almost
               | impossible to compare without deconstructing the chips.
               | 
               | In the end it'll be a question of - can Apple scale it
               | without incurring massive costs?
        
             | saltminer wrote:
             | >the fastest desktop processor with a TDP of 105W
             | 
             | TDP is a useless marketing figure. Anand measures the AC
             | power consumption of the Mini, which is a good measure, but
             | that is not comparable against CPU TDP because TDP has a
             | tenuous relation to actual power draw at best [0]. A better
             | comparison would be ARM Mini vs Intel Mini AC power draw,
             | and a similarly spec'd AMD system for good measure.
             | Unfortunately, unless I missed something, the article only
             | measured AC power draw from the ARM Mini.
             | 
             | The M1 is certainly more power efficient than Intel or AMD
             | for the average user, but as far as performance per watt,
             | we cannot make any judgements with the data we have.
             | 
             | [0] https://www.gamersnexus.net/guides/3525-amd-ryzen-tdp-
             | explai...
        
           | piyh wrote:
           | It bodes really well for future chips with higher power
           | budgets. The Pro seems a bit underwhelming for what it could
           | be though.
        
             | Tagbert wrote:
             | That new MacBook Pro replaces the low end of the Pro line
             | which had a slower CPU and only 2 ports.
             | 
             | I would expect that, when Apple brings out their next
             | iteration of chips, they would target the higher end of the
             | Pro line with more cores and ports along with higher RAM
             | capacities.
        
               | SulfurHexaFluri wrote:
               | I'm guessing they also want devs of tools like Docker to
               | finish porting their software before they switch the rest
               | of the MacBooks over.
        
             | arvinsim wrote:
             | They only need to add 2 ports to the Pro to differentiate
             | it from the Air.
        
             | bonestamp2 wrote:
             | On performance, ya I agree. Although, they basically
             | doubled the battery life over the previous generation so
             | that alone might be worthwhile for some users.
             | 
             | I think we'll see an additional higher end Macbook Pro 13"
             | when they start to release Apple Silicon models with
             | discrete GPUs.
        
           | d3nj4l wrote:
           | I feel like most people here haven't seen the SPEC benchmarks
           | AnandTech performed (and they're partly to blame for that;
           | their UX is awful). But the M1 is toe-to-toe with desktop
           | Ryzen: https://www.anandtech.com/show/16252/mac-mini-
           | apple-m1-teste...
           | 
           | E: And multi-core SPEC:
           | https://www.anandtech.com/show/16252/mac-mini-
           | apple-m1-teste..., where they're on par with mobile Ryzen.
        
             | fxtentacle wrote:
             | I looked up multi-core Cinebench R23 and the AMD 2990WX
             | comes in at 33,213, vs. the 7,833 given for the M1 in the
             | article.
             | 
             | Apple markets this as a "Pro" device for professional video
             | editing. That's why I believe it is fair to take their word
             | and compare it against my other options for a professional
             | video editing rig. And in that comparison, which Apple has
             | chosen itself, the M1 comes out woefully inadequate at a
             | mere 24% of the performance.
             | 
             | Of course, for a notebook, the M1 is amazing. But I feel
             | irked that Apple and Anandtech pretend that it's
             | competitive with desktop workstations by having such a
             | misleading conclusion about it being on par with Zen3 -
             | which it clearly isn't.
        
               | ebg13 wrote:
               | > * the AMD 2990WX comes in at *
               | 
               | Oh, neat. What kind of battery life do you get on your
               | AMD 2990WX laptop?
        
               | read_if_gay_ wrote:
               | > Apple markets this as a "Pro" device for professional
               | video editing. That's why I believe it is fair to take
               | their word and compare it against my other options for a
               | professional video editing rig.
               | 
               | That's ridiculous. Threadripper has 8 to 16 times as many
               | cores, runs on hundreds of watts of power and such a CPU
               | alone costs the same as several Mac Minis. Them claiming
               | you _can_ use it for video editing doesn't mean you can
               | expect that a 1.5 pound notebook will measure up to
               | literally the biggest baddest computer you can buy.
        
               | nickpeterson wrote:
               | He knows it's ridiculous, but you're going to see a large
               | group of people who hate Macs take this turn of fortune
               | quite poorly. My hope is that it really puts pressure on
               | Intel to start firing on all cylinders, but who knows? A
               | MacBook Pro 16 with higher clocks and more GPU cores
               | would be a really hard system to not buy.
        
               | sudosysgen wrote:
               | It is ridiculous. That being said, on a per-core basis at
               | a similar wattage, Zen 3 is equivalent to the M1 chip,
               | but has an order of magnitude more I/O.
        
               | djrogers wrote:
               | > Apple markets this as a "Pro" device for professional
               | video editing.
               | 
               | No they don't. Claiming something is capable of video
               | editing and marketing it as a video editor are two very
               | different things.
               | 
               | The 3 Macs introduced this week are Apple's _lowest end_
               | devices, 2 of which still have 'big brother' Intel
               | versions for sale today.
               | 
               | If you're truly 'irked' that the lowest-end, lowest
               | power, first release devices aren't comparable in
               | performance to the highest end desktop chips, then you're
               | putting the wrong stuff in your coffee.
        
               | alwillis wrote:
               | _Apple markets this as a "Pro" device for professional
               | video editing_
               | 
               | No, they don't. Because Apple keeps raising the ceiling
               | on low-end devices like the 13-inch MacBook Pro, in many
               | aspects, it's more performant than a high-end laptop or
               | desktop Mac from just a few years ago.
               | 
               | Please read the best article so far that explains what
               | "Pro" means for Apple--it just means _nicer_ ; it doesn't
               | mean _for professionals._ https://daringfireball.net/2020
               | /11/one_more_thing_the_m1_mac...:
               | 
               |  _Wait, wait, wait, you might be saying, the MacBook Pro
               | is pro. But as I've written numerous times, pro, in
               | Apple's product-naming parlance, doesn't always stand for
               | professional. Much of the time it just means better or
               | nicer. The new M1 13-inch MacBook Pro is pro only in the
               | way, say, AirPods Pro are. This has been true for Apple's
               | entry-level 13-inch MacBook Pros -- the models with only
               | two USB ports -- ever since the famed MacBook "Escape"
               | was suggested as a stand-in for the then-still-missing
               | retina MacBook Air four years ago._
        
               | d3nj4l wrote:
               | That is an absurd comparison. AnandTech clearly mentioned
               | that the M1 was on par in _single core_, not in multi
               | core.
        
               | ksec wrote:
               | I am not even sure if you are serious or if you are
               | trolling.
               | 
               | Not only did Apple not compare a laptop CPU against a
               | _workstation_ CPU, Anandtech didn't pretend it was
               | competitive with a desktop workstation.
        
               | caycep wrote:
               | You can do toe to toe best of the best speeds and
               | feeds...
               | 
               | But I think the broader strategic outlook is: yes, the M1
               | loses on a few benchmarks, but the fact that it gets into
               | the ballpark of a monster rig costing multiple times as
               | much in price and power - is this not the whole picture
               | of the Clayton Christensen disruption curve?
               | 
               | The other point is - Apple's Logic and Final Cut software
               | are probably optimized for the M1, and they can likely
               | achieve much of the capabilities of the monster AMD rig
               | for a fraction of the cost/power budget.
        
           | nicoburns wrote:
           | IIRC the biggest Zen3 mobile CPUs are 8-core. So they'll have
           | at most 2x cores. And that's ignoring the low-power cores on
           | the mac which probably still count for half a core each.
           | 
           | AMD is likely to be faster in multicore overall, but not by
           | much it seems.
        
             | neogodless wrote:
             | There are no announced or released Zen 3 mobile CPUs at
             | this time. You are correct in that the Zen 2 mobile CPUs
             | currently top out at 8 cores, and up to about 54W TDP - the
             | top CPU is the "45W" Ryzen 9 4900H which can be configured
             | up to about 54W by the OEM. We might see Zen 3 mobile early
             | in 2021.
        
           | msie wrote:
           | The M1 has multiple cores too. Having 2x-4x as many cores
           | does not necessarily mean 2x-4x faster.
        
             | d3nj4l wrote:
             | The M1 has a big.LITTLE design with only half of the cores
             | being performance-oriented, so if there is a gap, it would
             | almost always be in Ryzen's favour.
        
           | Uehreka wrote:
           | They'll be 2-4x faster in some multicore tasks. CPU
           | benchmarks specifically break out single core performance as
           | a separate metric, because as of 2020 a lot of everyday work
           | is single core bound (stuff like 3D graphic design, video
           | editing or compiling large codebases isn't considered
           | "everyday work").
           | 
           | Not to mention that even in multicore tasks, you don't
           | usually scale perfectly linearly due to overhead. And also,
           | the biggest Ryzen processors are usually in desktops, and
           | Apple Silicon hasn't entered that market yet.
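           | 
           | One common way to model that overhead is Amdahl's law,
           | speedup(n) = 1 / ((1 - p) + p/n), where p is the parallel
           | fraction of the work. A small Swift sketch (the fractions
           | below are made up purely for illustration, not measured from
           | any real workload):
           | 
           |     // Amdahl's law: more cores only help the parallel fraction p.
           |     func speedup(parallelFraction p: Double, cores n: Double) -> Double {
           |         1.0 / ((1.0 - p) + p / n)
           |     }
           | 
           |     for p in [0.5, 0.9, 0.99] {
           |         print(p, speedup(parallelFraction: p, cores: 4),
           |                  speedup(parallelFraction: p, cores: 16))
           |     }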
        
             | JAlexoid wrote:
             | For most everyday work, a Raspberry Pi is fast enough, so
             | that's not much of an argument. The 8GB Raspberry Pi is
             | 10x cheaper.
             | There are mini desktops starting at $250 that will do
             | everyday work.
             | 
             | If you throw in "everyday work" - then we have passed the
             | need for new chips altogether.
        
               | rootusrootus wrote:
               | > For most everyday work - Raspberry Pi is fast enough
               | 
               | That really stretches the meaning of 'everyday work'
               | quite a lot. The pi is dog slow, even compared to an
               | Intel i9 ;)
        
               | hajile wrote:
               | That's a bit of an overstatement. Booting from SSD
               | instead of an SD card gives an enormous uplift in performance.
               | I have yet to hear of a Pi 4 that couldn't overclock to
               | 2GHz which is a pure uplift of 25%. Moving to 64-bit PiOS
               | gives another double-digit jump in performance too. Not
               | record-breaking, but not unusable either.
        
           | [deleted]
        
           | eyelidlessness wrote:
           | Single thread performance still matters a lot for personal
           | computer use. It's not everything, normal people do benefit
           | from some degree of parallelization, but there's a reason
           | _all_ of the major PC chip designs continue to push single
           | thread performance even as that becomes more difficult. Most
           | end users see more benefit from those improvements than from
           | more cores.
        
           | breakfastduck wrote:
           | A passively cooled first-generation MacBook Air chip isn't
           | quite as fast as an absolute monster-grade PC Ryzen chip in
           | its 3rd generation. Color me shocked.
           | 
           | I think you're just trying your hardest to convince yourself
           | that these chips aren't competitive.
        
       | Shivetya wrote:
       | Interesting for many gamers: Blizzard announced native support
       | for World of Warcraft[0] on Apple Silicon. This gives hope that
       | other games from Blizzard will come to the platform as well and
       | may encourage other developers to join in.
       | 
       | The M1 has been shown to run Civ6 and Rise of the Tomb Raider
       | through Rosetta faster than previous integrated-GPU Mac
       | hardware[1]
       | 
       | [0]https://us.forums.blizzard.com/en/wow/t/mac-support-
       | update-n...
       | 
       | [1]https://www.macworld.com/article/3597198/13-inch-macbook-
       | pro...
        
       | mciancia wrote:
       | It's funny to see Intel fall from 1st place in an essentially
       | two-player game just two or three years ago to 3rd place now.
        
       | hyperrail wrote:
       | In reviews of the new M1 Macs as a whole, I'd like to see a
       | comparison with Qualcomm's best: meaning the fastest current
       | Windows ARM laptops. Up to now, their performance has at best
       | only kept up with comparably priced x86 models, despite pressure
       | on Qualcomm by Microsoft and the device makers. Maybe some new
       | competition will shake Qualcomm out of its complacency...
        
         | alwillis wrote:
         | _In reviews of the new M1 Macs as a whole, I'd like to see a
         | comparison with Qualcomm's best: meaning the fastest current
         | Windows ARM laptops._
         | 
         | You may recall that Apple already embarrassed Qualcomm years
         | ago when they shipped 64-bit ARM-based chips at least a year
         | before Qualcomm could do it.
         | 
         | When the new iPhone ships, the next fastest phone is the iPhone
         | being replaced by the new flagship phone. The Android phones
         | based on Qualcomm's best processors are way behind Apple's mid-
         | level and entry-level phones.
         | 
         | The A series chips used in Apple's phones and tablets are way
         | faster than anything Qualcomm is shipping for laptops, never
         | mind the M series.
        
           | hyperrail wrote:
           | From what I've heard myself in the past, I don't think
           | Qualcomm is as far behind as you say, but I would not be
           | surprised if it was.
           | 
           | This puts me in mind of one time that I was ranting to a
           | Microsoft colleague over lunch about how MS shouldn't be
           | exclusive with Qualcomm for ARM chips given the very low
           | rewards over the years. They said to me that when Windows
           | Phone 7 first was under development in 2008-2010, choosing
           | Qualcomm exclusivity seemed best because Qualcomm was the
           | only one willing to make decent BSPs for us at Microsoft.
           | 
           | Upon reflection, I realized that this was still the case as
           | of our conversation years later. The Mediateks, Samsungs, and
           | Nvidias of the world either did not work with Microsoft at
           | all, or got spurned by Microsoft themselves, or gave up after
           | 1 or 2 high-profile failures (such as Surface RT). Texas
           | Instruments was a notable exception as they gave up on ARM
           | SoCs altogether, thus killing what would have become a TI-
           | based Windows RT tablet platform.
           | 
           | Now neither I nor my coworker was in a position to actually
           | know what was going on here, but I think this anecdote
           | illustrates the value of a trustworthy business partner even
           | when their products look mediocre.
        
             | ksec wrote:
             | Qualcomm was never far behind.
             | 
             | And the 64-bit ARM Apple chip surprised even ARM
             | themselves: ARM didn't even have a reference Cortex design
             | out when Apple shipped their first 64-bit SoC (Apple was
             | part of the early member programme), and no one thought
             | they would need 64-bit so early (which was also true at
             | the time).
             | 
             | Qualcomm has to optimise for cost: in the same die space
             | Qualcomm already includes a modem, while Apple keeps the
             | modem as a separate piece of silicon. It isn't that
             | Qualcomm is technically subpar; they just have different
             | objectives and goals. And vendors are already crying foul
             | over Qualcomm's continued price increases (which are
             | actually normal given the complexity of 5G, CPU, GPU and
             | leading-edge node development).
        
       | samgranieri wrote:
       | This is really impressive for (what I perceive to be) an entry-
       | level computer. I always enjoy reading the technical rigor of
       | AnandTech's deep dives, and seeing Apple's first chip go head to
       | head with Intel and AMD's chips indicates the future is bright. I
       | can't wait to see how the 16 inch MacBook Pro, the iMac
       | replacement, and the Mac Pro replacement test out.
        
       | pdpi wrote:
       | Someone posted these results on Reddit earlier:
       | https://www.reddit.com/r/macgaming/comments/jvrck7/m1_macboo...
       | 
       | As far as I know, DotA 2 is running on Rosetta.
        
         | faitswulff wrote:
         | Those are some significant improvements. I'm actually really
         | tempted to get a Macbook Air now versus my current plans to
         | build a cheap gaming rig.
        
           | joshstrange wrote:
           | Ehh, if you are going to game you are going to want the Pro
           | at a minimum but honestly the mac gaming scene is still
           | sparse. It all really comes down to what type of game you
           | want to play I guess but even if all the games you want are
           | on mac then I'd still say you want the fan in the MBP.
        
             | faitswulff wrote:
             | I'm still playing nearly decade-old games. I just need
             | something competitive at decade-old gaming so that my wife
             | and I can play together.
        
               | nottorp wrote:
               | Not Apple then. They ditched 32 bit applications and in 2
               | years they'll ditch 64 bit x86 applications.
               | 
               | If you want your old games to run forever, you sadly have
               | to do that on x86 Windows (or maybe Linux, with more or less
               | of a headache setting them up).
        
               | coder543 wrote:
               | Agree 100%. If you wouldn't buy an iPad to play your
               | games, don't buy a Mac to play those games either. Unless
               | by "games" you literally just mean WoW, which seems to be
               | one of the only major cross platform games that seriously
               | cares about Mac support.
               | 
               | Personally, I would lean towards suggesting the purchase
               | of a console. The new generation has some really nice
               | consoles, and the Nintendo Switch is still really fun in
               | other ways.
        
               | faitswulff wrote:
               | Rosetta 1 was around for 5 years, so I expect Rosetta 2
               | to last at least that long, but that's still a good
               | point. Thanks!
        
       | akritrime wrote:
       | I am confused about one thing: in multithreaded scenarios, can an
       | application use 8 threads or only 4? Also, how does the scheduling
       | work? Can I pin a task that I know will be demanding to the
       | Firestorm cores?
        
         | MBCook wrote:
         | Applications can use all 8 cores. There is one benchmark in the
         | article where they somehow turn off the efficiency cores to
         | figure out they add about 30% to the total CPU power in multi-
         | core.
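         | 
         | On the pinning part of the question: as far as I know macOS has
         | no public API for binding a thread to a specific core; the
         | supported knob is a quality-of-service hint, which the scheduler
         | uses to steer work toward the performance or efficiency cores
         | (a hint, not a guarantee). A minimal Swift sketch, with
         | arbitrary queue labels:
         | 
         |     import Dispatch
         | 
         |     // QoS is a scheduling hint, not core pinning. On Apple Silicon the
         |     // kernel tends to run high-QoS work on the Firestorm cores and
         |     // .utility/.background work on the Icestorm cores, but the mapping
         |     // is the scheduler's choice.
         |     let heavy = DispatchQueue(label: "demanding-work", qos: .userInitiated)
         |     let light = DispatchQueue(label: "housekeeping", qos: .background)
         | 
         |     heavy.async { /* the compile/export/encode step you care about */ }
         |     light.async { /* indexing or cleanup that can live on the small cores */ }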
        
           | akritrime wrote:
           | Oh I see it. I missed their SPEC2017 page.
        
       | CivBase wrote:
       | I wouldn't care if their laptops have 10x the performance and
       | power efficiency. Apple's closed ecosystem and "big brother knows
       | best" mindset are bad enough on their own - but together, they
       | are downright intolerable.
        
       | davidhyde wrote:
       | > "Whilst the software doesn't offer the best the hardware can
       | offer, with time, as developers migrate their applications to
       | native Apple Silicon support, the ecosystem will flourish."
       | 
       | I think that more developers would be excited about migrating
       | their apps to support native apple silicon if Apple wasn't so
       | developer hostile at the moment. I am referring to stuff like
       | Apple's Online Certificate Status Protocol
       | (https://blog.jacopo.io/en/post/apple-ocsp/) and their Apple Tax
       | war.
       | 
       | They need customers but they also need developers.
        
         | rimliu wrote:
         | You should quit HN for a day or two. The stuff the HN echo
         | chamber repeats often has very little to do with reality.
         | Apple has many thousands of developers. It is not hostile to
         | them in any way, shape or form. For some reason this cliche is
         | most often repeated by people who have never written a single
         | line of code for iOS or macOS.
        
           | millzlane wrote:
           | >It is not hostile to them in any way, shape or form. For
           | some reason this cliche is most often repeated by the people
           | who never wrote a single line of code for iOS or MacOS.
           | 
           | Maybe the word "hostile" is the wrong word. But I have apps
           | on windows that ran on 95 that still work to this day without
           | having to be "rewritten". It's no surprise it's often
           | repeated by people who never wrote a single line code for an
           | OS that they have come to expect will change things so
           | dramatically that they will have to spend more time and
           | effort supporting those OS changes and not creating software.
        
           | qz2 wrote:
           | Wrong. Apple's iOS/macOS development footprint is far far far
           | smaller than the footprint of generic software development
           | using macOS. I have seen a lot of developers shitting the bed
           | this week over the state of macOS both in person and in
           | various other places.
           | 
           | We already have a couple of people who have got so fucked off
           | with macOS their macs are running Ubuntu 24/7 and they're
           | buying Dell/Lenovo next time. Hell I sold my Apple kit
           | earlier this year because I was completely fed up with dealing
           | with broken shit all the time. It's just a horrible
           | experience.
        
       | x87678r wrote:
       | Is there a media embargo? I thought we'd get a bunch of MacBook
       | reviews today. I guess it won't be long; they'll be in the shops
       | soon.
        
         | MBCook wrote:
         | Today is the day the embargo drops it seems. So all the normal
         | sites are posting their reviews.
        
       | fithisux wrote:
       | Good incentive to warm to the RISC-V ecosystem.
        
       ___________________________________________________________________
       (page generated 2020-11-17 23:01 UTC)