[HN Gopher] GFXBench: Apple M1
       ___________________________________________________________________
        
       GFXBench: Apple M1
        
       Author : admiralspoo
       Score  : 131 points
       Date   : 2020-11-13 19:34 UTC (3 hours ago)
        
 (HTM) web link (gfxbench.com)
 (TXT) w3m dump (gfxbench.com)
        
       | quelsolaar wrote:
       | Why are all performance measurements of the M1 done against stuff
       | that is far below state of the art?
       | 
        | So it's faster than a three-generation-old budget card that
        | doesn't run Nvidia-optimized drivers, over (I'm assuming)
        | Thunderbolt. So?
        | 
        | So it's faster than the last MacBook Air, which was old,
        | thermally constrained, and had a chip from Intel that has been
        | overtaken by AMD.
        | 
        | Every test is single-core, but guess what, modern computers have
        | multiple cores and hyperthreading, and that matters.
        | 
        | Apple's presentation was full of weasel words like "in its class"
        | and "compared to previous models". Fine, that's marketing, but
        | can we please get some real, fair benchmarks against the best the
        | competition has to offer before we conclude that Apple's new
        | silicon is a gift from god to computing?
        | 
        | If you are going to convince me, show me how the CPU stacks up to
        | a top-of-the-line Ryzen/Threadripper and run Cinebench. If you
        | want to convince me about the graphics/ML capabilities, compare
        | it to an RTX 3090 running Vulkan/CUDA.
        
         | fastball wrote:
          | Do you see a lot of laptops with Threadripper CPUs and RTX 3090
          | graphics cards?
         | 
         | I sure don't.
         | 
         | "Best-in-class" isn't a weasel word - it's recognition that no,
         | this $1000 laptop is not going to be the fastest computer of
         | all time. Just faster than other products similar to it.
        
         | moogleii wrote:
         | Hey, take a breather if it helps. No one here is responsible
         | for convincing you of anything. But you sound a little upset.
         | 
         | If it helps, take into consideration performance:power ratios.
         | None of your scenarios are fair otherwise, and I personally
         | haven't seen anyone here claim the M1 will outperform
         | everything. Hence, "in its class."
         | 
         | Maybe you saw some errant comments on PCMag.com claiming the M1
         | was the be all end all of computing?
         | 
         | Good luck.
        
         | sschueller wrote:
         | It's a gift from God alright. It's the start of a full lock
         | down. You no longer own your device. God does.
        
         | marta_morena_28 wrote:
          | > If you want to convince me about the graphics/ML
          | capabilities, compare it to an RTX 3090 running Vulkan/CUDA.
          | 
          | Woot? So you're buying a new Fiat Punto and comparing it to the
          | latest spec of a Koenigsegg? What are you even doing?
          | 
          | What we need to know is how these perform against previous
          | MacBooks, and potentially the Microsoft Surface and Dell XPS.
          | Those are the competitors.
        
         | runjake wrote:
         | The M1 is a mobile chip. Why compare it against the high-end?
         | 
         | It seems pretty ridiculous to put the 10 watt M1 CPU up against
         | big time GPUs that require several hundred watts.
         | 
         | Regardless, all the benches anyone desires will be out next
         | week.
        
           | Dahoon wrote:
            | If it will only be available in phones then it should clearly
            | be compared to phone SoCs. If it will be in laptops too, it
            | should of course be compared to laptops/PCs as well.
        
         | matthewmacleod wrote:
         | I don't know why this particular comparison is noteworthy, but
         | this is not a top-of-the-line CPU or GPU, is not intended to
         | be, and those comparisons would be meaningless. It's a 10W part
         | for lower-end devices.
        
           | quelsolaar wrote:
            | Apple decided to call it "PRO", so I think it's fair to treat
            | it as such.
        
             | theodric wrote:
             | Ah yes, the famous Apple MacBook Air Pro, available never
             | in history
        
               | quelsolaar wrote:
               | https://www.apple.com/macbook-pro-13/
        
         | pertymcpert wrote:
          | AnandTech ran SPEC2006 on the A14, and the M1 should be faster.
        
         | kushan2020 wrote:
          | How is comparing an integrated GPU with a dedicated external
          | RTX 3090 GPU fair?
        
         | threeseed wrote:
         | Because M1 is their entry-level, laptop class offering.
         | 
         | It makes zero sense to compare them to high-end desktop CPUs
         | and GPUs.
        
           | banachtarski wrote:
            | The 1050 Ti is below entry level at this point, though.
        
             | dr_zoidberg wrote:
              | The comparison could be against an MX250/MX350 if they
              | wanted to compare to Nvidia (though those aren't integrated
              | in a SoC-like manner), or the AMD Vega on Renoir, or Intel
              | graphics on Ice/Tiger Lake (I've honestly lost track of
              | what Intel calls its iGPUs these days; they went back and
              | forth between confusing naming conventions, but it's the
              | CPU gen/model that matters anyway).
        
             | mikepurvis wrote:
              | The 1050 Ti was the premium discrete GPU option in the XPS
              | 15 in 2018. That only got upgraded to the 1650 _last year_.
              | Maybe that's as much a ding on Dell as anything else, but
              | either way, lots of us are still rocking those laptops and
              | they're hardly "below entry level".
        
               | duncanawoods wrote:
                | Nah. I have this laptop too, and at the time the 1050 Ti
                | was considered underwhelming, but "well, this laptop
                | isn't for gaming, it's a business laptop". The
                | contemporary Surface Book 2 had a 1060 with almost double
                | the performance and people were kind of pissed.
        
               | fastball wrote:
               | The Surface Book 2 with 16GB RAM and 512GB SSD: $2499.
               | 
               | The new MacBook Air with 16GB RAM and 512GB SSD: $1399.
               | 
               | Geekbench: https://browser.geekbench.com/v5/cpu/compare/4
               | 679429?baselin...
        
               | duncanawoods wrote:
               | Well thanks for posting CPU benchmarks in a discussion
               | about GPUs.
        
               | fastball wrote:
               | Wasn't the point.
        
             | pankajdoharey wrote:
              | Again, Nvidia had how many years of GPU evolution before
              | they reached the 1050 Ti? Everyone starts somewhere.
        
           | quelsolaar wrote:
            | Fine. Compare it against an Asus Zephyrus G14.
        
             | threeseed wrote:
             | It's marketed as a gaming laptop and has half the battery
             | life.
             | 
             | Not sure it's necessarily the best comparison.
        
               | dr_zoidberg wrote:
                | Half the battery on a higher-powered CPU (the 4800H is
                | 45W and the 4900HS is 35W, if I recall correctly), plus a
                | discrete GPU (on top of the iGPU), active cooling vs.
                | passive cooling, and a 120Hz display on the G14 (Apple's
                | is probably 60Hz).
                | 
                | So a more power-hungry notebook's battery lasts half as
                | long, but still in the double digits. Not quite a
                | surprise there.
        
           | lovelyviking wrote:
           | >Because M1 is their entry-level, laptop class offering. It
           | makes zero sense to compare them to high-end desktop CPUs and
           | GPUs.
           | 
            | Sure, it makes zero sense to compare them to high-end
            | desktops or laptops, since it lacks the main attribute of a
            | personal computer: the ability to control it and install an
            | OS of your choice.
            | 
            | Therefore it falls into another category, like phones, iPads,
            | and other toys, just with an attached keyboard.
        
         | Zetto wrote:
          | Also, Ryzen only got better with its 3rd-gen release last year.
          | Back in 2017, the R7 1700X couldn't compete with the 8700K on
          | single-thread performance.
        
         | robertoandred wrote:
         | This is how you know that Apple haters are running scared.
        
           | Dahoon wrote:
           | Please don't do that on HN.
        
         | paulpan wrote:
          | > So it's faster than a three-generation-old budget card that
          | doesn't run Nvidia-optimized drivers
          | 
          | This is a key point. Nvidia GPUs have not been supported in
          | macOS since Mojave, so this seems like an apples-to-oranges
          | comparison. Unless the benchmark for the M1 was also run on
          | Mojave (unlikely), there's 3 years' worth of software
          | optimization potentially unaccounted for.
          | 
          | Possibly a more realistic comparison is between the M1 and the
          | AMD Radeon 5300M. It shows a 10-40% deficit in performance:
          | https://gfxbench.com/compare.jsp?benchmark=gfx50&did1=798775...
          | 
          | That said, it's still an impressive showing given the TDP and
          | the fact that it's an integrated GPU vs. a dedicated one. It
          | hints that with enough GPU cores and a better cooling solution,
          | it's not unreasonable to see these replacing the AMD Radeon
          | 5500M/5600M next year in the MBP 16 and iMac lineups.
          | 
          | EDIT: pasted wrong compare link
        
         | dmlittle wrote:
          | > If you want to convince me about the graphics/ML
          | capabilities, compare it to an RTX 3090 running Vulkan/CUDA.
          | 
          | You can't really compare a laptop SoC with a dedicated graphics
          | card like the RTX 3090. One is running off a battery in a
          | laptop and the other is plugged into a power source with
          | dedicated cooling.
         | 
         | Yes, Apple is claiming this is a better solution but that's
         | mostly for laptops. While they did release a Mac mini, they
         | still haven't released an Apple SoC Mac Pro or iMac. Those
         | would be fair game for such comparison.
        
         | phoobahr wrote:
          | Probably because the announced hardware is clearly entry level.
          | The only model line that gets replaced is the MacBook Air,
          | which has been, frankly, cheap-ish and underpowered for a long
          | time.
          | 
          | So you have a platform that is (currently) embodied by entry-
          | level systems that appear to be noticeably faster than their
          | predecessors. Apple has said publicly that they plan to finish
          | this transition in 2 years. So more models are coming - and
          | they'll have to be more performant again.
          | 
          | It seems pretty clear that the play here runs: "Look, here's
          | our entry level; it's better than anyone else's entry level and
          | could be comparable to anyone else's midlevel. After taking
          | crap for being underpowered while waiting for Intel to deliver,
          | we can now say that this is the new bar for performance at the
          | entry level in these price brackets."
        
           | ta76893547 wrote:
            | It would be interesting to see the comparison to a Ryzen 7
            | PRO 4750U. You can find that in a ThinkPad P14s for $60 less
            | than the cheapest MacBook Air (same amount of RAM and SSD
            | size), so that seems like a fair comparison.
        
             | spockz wrote:
              | It's just a shame that the screens in the Lenovo AMD
              | devices don't hold a candle to the MacBooks'.
        
               | toolz wrote:
                | In what way? I've got a Lenovo IdeaPad with a Ryzen 7
                | 4800U that outperforms my 2019 16" by a long shot.
        
             | Veedrac wrote:
             | Mostly a sweep in Apple's favour, though multicore is only
             | 20-25% better.
             | 
             | https://browser.geekbench.com/v5/cpu/search?q=4750U https:/
             | /browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q...
        
               | StillBored wrote:
                | Assuming that Geekbench is reflective of actual perf (I'm
                | not yet convinced), there is also the GPU, and the fact
                | that AMD is sitting on a 20% IPC uplift and is still on
                | 7nm.
                | 
                | So if they release a newer U part in the next few months
                | it will likely best this device even on 7nm. An AMD part
                | with eDRAM probably wouldn't hurt either.
                | 
                | It seems to me that Apple hasn't proven anything yet,
                | only that they are in the game. Let's revisit this
                | conversation in a few years to see if they made the right
                | decision from a technical rather than a business
                | perspective.
                | 
                | The business perspective seems clear: they have likely
                | saved considerably on the processor vs. paying a ransom
                | to Intel.
        
               | refulgentis wrote:
               | Sad to see downvotes on this: it's like there's a set of
               | people hellbent on echoing marketing claims, in ignorance
               | of (what I formerly perceived as basic) chip physics -
               | first one to the next process gets to declare a 20-30%
               | bump, and in the age of TSMC and contract fabs, that's no
               | longer an _actual_ differentiation, the way it was in the
               | 90s.
        
               | toolz wrote:
                | Is a higher score worse? If not, it's a sweep in favor of
                | Ryzen.
        
               | CryptoBanker wrote:
               | Higher scores are better
               | http://support.primatelabs.com/kb/geekbench/interpreting-
               | gee...
        
               | easde wrote:
               | You're making completely the wrong comparison. On the
               | left you have Geekbench 5 scores for the A12Z in the
               | Apple DTK, and on the right you have Geekbench 4 scores
               | for the Ryzen.
               | 
               | The M1 has leaked scores of ~1700 single core and ~7500
               | multicore on Geekbench 5, versus 1200 and 6000 for the
               | Ryzen 4750U.
        
           | nwallin wrote:
            | > Probably because the announced hardware is clearly entry
            | level.
            | 
            | Yes, but why compare it to an entry card that was released _4
            | years ago_ instead of an entry card that's been released in
            | the past 12 months? When the 1050 Ti was released, Donald
            | Trump was trailing Hillary Clinton in the polls. Meanwhile,
            | the 1650 (released April 2020, retails ~$150) is
            | _significantly_ faster than the 1050 Ti (released October
            | 2016, retailed for $140 but _can't be purchased new
            | anymore_).
        
             | varenc wrote:
             | The 1050 is still a desktop class card. The M1 is in tiny
             | notebooks and the Mac Mini, none of which even have the
             | space or thermals to house such a card.
        
               | StillBored wrote:
                | The main point was that it's a three-generation-old
                | desktop card, which is obviously not as efficient as
                | modern mobile devices.
                | 
                | Let's see what a 3000-series Nvidia mobile design does on
                | a more recent process before declaring victory.
        
           | romanoderoma wrote:
            | My Xiaomi laptop is entry level and cost 300 euros including
            | shipping from China and taxes.
            | 
            | It still does 8 hours on battery after 3 years.
            | 
            | Being entry level, performance doesn't matter much, but it's
            | still a good backup.
            | 
            | Bonus point: Linux works great on it.
            | 
            | For $1,300 (30% more in euros) I can buy a pro machine, one
            | that at least will give me the option of using more than 16GB
            | of RAM.
            | 
            | The new Apple Silicon looks good and I love that they are
            | finally shipping some decent GPUs, but price-wise they're
            | still not that cheap.
            | 
            | ------
            | 
            | Are the downvotes because I said Xiaomi, or for something
            | else?
            | 
            | LG sells a 17-inch at 2.9 lb (same weight as the Air) with
            | 16GB of RAM (up to 40) and a discrete GPU (Nvidia GTX 1650)
            | for $1,699.
        
           | stefan_ wrote:
           | Apple doesn't make anything entry-level. The entry-level is a
           | $150 ChromeBook, it's not "the cheapest that Apple sells".
        
             | cosmotic wrote:
              | The Air _IS_ entry level; it's slow, low resolution, has
              | meager I/O, etc. It just happens to be at a price point
              | that is not entry level.
        
               | theodric wrote:
                | > low resolution
                | 
                | 2560 x 1600? Damn, you've got some high standards.
        
               | cosmotic wrote:
                | The competition has 4K displays on their thin-and-lights.
                | They also don't use macOS, which has problems running at
                | resolutions other than native or pixel-doubled. The
                | suggested scalings are 1680 by 1050, 1440 by 900, and
                | 1024 by 640. None of those are even fractions of the
                | native resolution, so the interface looks blurry and
                | shimmers. Also, all the suggested scalings are very
                | small, so there isn't much screen real estate.
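                | 
                | A quick sketch of the arithmetic, assuming macOS's usual
                | render-at-2x-then-downsample behavior (Python, with the
                | panel and modes from above):
                | 
                |     # macOS renders "looks like WxH" modes into a 2x
                |     # backing store, then scales that to the panel.
                |     # Only a clean 2:1 ratio maps pixels exactly.
                |     native = (2560, 1600)  # 13" panel
                |     for w, h in [(1680, 1050), (1440, 900),
                |                  (1280, 800), (1024, 640)]:
                |         backing = (2 * w, 2 * h)
                |         ratio = backing[0] / native[0]
                |         print(f"{w}x{h} -> {backing[0]}x{backing[1]}, "
                |               f"scale {ratio:.4f}")
                |     # 1680x1050 -> 1.3125, 1440x900 -> 1.1250,
                |     # 1024x640 -> 0.8000; only 1280x800 (the
                |     # pixel-doubled default) gives exactly 1.0000.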
        
               | MR4D wrote:
               | Depends on how you define "entry level". The Porsche
               | Cayman is an entry level Porsche, but starts at $60,000.
               | 
               | I don't know anyone who would call that an entry level
               | car, but any Porsche-phile would.
               | 
                | The new Air is fast and reasonably high resolution with
                | its 2560x1600 screen. But it is _Apple's_ entry-level
                | laptop.
        
               | cosmotic wrote:
                | I concede the new Air is likely faster than the ancient
                | i3 they used to put in them.
        
               | Klinky wrote:
               | No it's not. It was designed to be extremely light with
               | many compromises to make that happen. Yeah it got
               | outdated, but that doesn't mean it was entry level.
        
               | Kwpolska wrote:
                | Here are all the differences between the M1 Air and M1
                | Pro, from [1]:
                | 
                |     Brighter screen (500 nits vs. 400)
                |     Bigger battery (58 watt-hours vs. 50)
                |     Fan (increasing thermal headroom)
                |     Better speakers and microphones
                |     Touch Bar
                |     0.2 pounds of weight (3.0 pounds vs. 2.8 -- not much)
                | 
                | The SoC is the same (although the entry-level Air gets a
                | 7-core GPU; that's probably chip binning). The screen is
                | the same (Retina, P3 color gamut), both have 2 USB-C
                | ports, and both have a fingerprint reader.
                | 
                | [1]: https://daringfireball.net/2020/11/one_more_thing_the_m1_mac...
        
         | toiletfuneral wrote:
         | This comment doesn't seem serious
        
         | klelatti wrote:
         | I'm sure you understand the performance differences between a
         | 10W part with integrated graphics designed for a fanless laptop
         | and a desktop part with active cooling and discrete graphics.
         | 
         | This article from Anandtech on the M1 is helpful in
         | understanding why the M1 is so impressive.
         | 
         | https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...
        
           | devwastaken wrote:
            | The point of a computer as a workstation is that it goes
            | vroom. A computer that does not go vroom will not be
            | effective for use cases where the computer has to go vroom.
            | It doesn't matter if the battery life is longer or the case
            | is thinner; that won't improve compile times or render
            | performance.
        
             | fractionalhare wrote:
             | A laptop is not a workstation.
        
               | romanoderoma wrote:
                | I work with a Lenovo P70 with 64GB of RAM, a Quadro M300
                | GPU, and a 4K monitor, plus 2 external monitor ports
                | (HDMI and DisplayPort).
                | 
                | I did not choose it, they gave it to me at work, but it's
                | definitely a workstation.
                | 
                | It's not a lightweight laptop, but it's a laptop
                | nonetheless.
        
               | borski wrote:
               | Some laptops are workstations. This is not one.
        
               | drdaeman wrote:
                | A docked laptop is, with the benefit that if you want to
                | work on the road you can take it with you without having
                | to think about replicating your setup and copying data
                | over.
        
             | reaperducer wrote:
             | _The point of a computer as a workstation is it goes
             | vroom._
             | 
             | The M1 is not currently in any workstation class computer.
             | 
             | It is in a budget desktop computer, a throw-it-in-your-bag
             | travel computer, and a low-end laptop.
             | 
             | When an M series chip can't perform in a workstation class
             | computer, then your argument will be valid. But you're
             | trying to compare a VW bug with a Porsche because they look
             | similar.
        
               | Ahwleung wrote:
               | The "low-end laptop" starts at $1300, is labeled a
               | Macbook Pro, and their marketing material states:
               | 
               | "The 8-core CPU, when paired with the MacBook Pro's
               | active cooling system, is up to 2.8x faster than the
               | previous generation, delivering game-changing performance
               | when compiling code, transcoding video, editing high-
               | resolution photos, and more"
        
             | klelatti wrote:
             | Do not use MacBook Air when you need vroom!
        
           | jl6 wrote:
           | This is true, but I think a lot of folk are assuming that a
           | future M2 or M3 will be able to scale up to higher wattage
           | and match state-of-the-art enthusiast-class chips. That
           | assumption is very much yet to be proven.
        
             | klelatti wrote:
             | Indeed, but given they are on TSMC 5nm and the apparent
             | strength of the architecture and their team I think most
             | will be inclined to give them the benefit of the doubt for
             | the moment.
             | 
              | Actually, the biggest worry might be the economics - given
              | their low volumes at the highest end (Mac Pro etc.), how do
              | they have the volumes to justify investing in building
              | these CPUs?
        
             | pwthornton wrote:
              | There seems to be no evidence that Intel will be able to
              | keep up with Apple. The early Geekbench results show the M1
              | laptops beating even the high-end Intel Mac ones. And
              | that's with their most thermally constrained chip.
              | 
              | Apple will be releasing something like an M1X next, which
              | will probably have way more cores and some other
              | differences. But this M1 is incredibly impressive for this
              | class of device. Intel has nothing near it to compete in
              | this space.
              | 
              | The bigger question is how well Apple keeps up with AMD and
              | Nvidia on GPUs, and whether they will allow discrete GPUs.
        
           | jandrese wrote:
           | I think Apple brought this on themselves when they announced
           | it would be faster than 98%[1] of the existing laptops on the
           | market. They didn't caveat it with "fanless laptops" or
           | "laptops with 20hr of battery life", it's just supposedly
           | faster than all but 2% of laptops you can buy today.
           | 
           | You say something like that about a low power fanless design
           | and every tech nerd's first reaction is "bullshit". And now
           | they want to call you on your bullshit.
           | 
           | [1] https://www.apple.com/newsroom/2020/11/introducing-the-
           | next-...
        
             | IfOnlyYouKnew wrote:
             | You're misinterpreting what they say. Quote:
             | 
             | "And in MacBook Air, M1 is faster than the chips in 98
             | percent of PC laptops sold in the past year.1"
             | 
             | There is a subtle difference between "98 percent of laptops
             | sold" and your rephrasing as "2% of laptops you can buy
             | today".
        
               | refulgentis wrote:
               | You're both right, but he's more right because the subtle
               | difference you mention _is the problem_: they went out of
               | their way to be unclear.
               | 
                | After the odd graphs/numbers from the event, I was
                | worried it was going to be an awful ~2 year period of
                | jumping to conclusions based on:
                | 
                |  - Geekbench scores
                |  - "It's slow because Rosetta"
                |  - HW comparisons against ancient hardware, because
                |    "[the more powerful PC equivalent] uses _too_ much
                |    power", implying that "[M1] against 4 year old HW is
                |    just the right amount of power", erasing the tradeoff
                |    between performance and power consumption into only
                |    power consumption mattering
               | 
               | The people claiming this blows Intel/AMD out of the water
               | need to have stronger evidence than comparing against
               | budget parts from years ago, then waving away any other
               | alternative based on power consumption.
               | 
                | Trading off performance for power consumption is an
                | inherent property of chip design; refusing to consider
                | other chipsets because they have a different set of
                | tradeoffs means you're talking about power consumption
                | alone, not the chip design.
        
               | klelatti wrote:
               | I really would suggest reading the Anandtech article
               | linked above - I think it will help to clarify where the
               | M1 stands against the competition.
        
             | Spooky23 wrote:
             | >I think Apple brought this on themselves when they
             | announced it would be faster than 98%[1] of the existing
             | laptops on the market. They didn't caveat it with "fanless
             | laptops" or "laptops with 20hr of battery life", it's just
             | supposedly faster than all but 2% of laptops you can buy
             | today.
             | 
             | Exactly. It's a meaningless number.
             | 
             | They also conspicuously avoid posting any GHz information
             | of any kind. My assumption is that it's a fine laptop, but
             | a bullshit performance claim.
        
             | bitL wrote:
              | The old quad-core 2020 MacBook Air was probably also faster
              | than 98% of the laptops on the market, given which specs
              | sell in the most volume (<$500).
        
             | martin_bech wrote:
              | I don't know if you've actually done the numbers, but most
              | laptops on the market have low to mediocre specs. It would
              | surprise me if more than 2% are pro/enthusiast machines.
        
               | jandrese wrote:
               | Apple didn't specify if they're counting by model or
               | total sales, but virtually everything in the Gamer Laptop
               | category is going to be faster in virtually every
               | measure.
               | 
               | https://www.newegg.com/Gaming-Laptops/SubCategory/ID-3365
               | 
                | As a Joe Schmoe it's hard to get good figures, but it
                | appears the total laptop market is about $161.952B[1],
                | with the "gaming" laptop segment selling about
                | $10.96B[2]. Since gaming laptops are more expensive, this
                | undercounts cheap laptops, but there are other classes of
                | laptop that are going to outperform this Mac, like
                | business workstations.
                | 
                | There might be some way to massage the numbers to pull
                | out that statistic, but it is at best misleading.
                | 
                | [1]
                | https://www.statista.com/outlook/15030100/100/laptops-
                | tablet... [2]
                | https://www.statista.com/statistics/1027216/global-
                | gaming-la...
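                | 
                | A back-of-the-envelope check on those figures (Python;
                | the average prices are made-up guesses, only there to
                | illustrate the revenue-vs-unit-share point):
                | 
                |     total_revenue = 161.952e9   # laptop market, USD [1]
                |     gaming_revenue = 10.96e9    # gaming segment, USD [2]
                |     print(gaming_revenue / total_revenue)  # ~6.8%
                | 
                |     # Hypothetical average selling prices:
                |     avg_gaming, avg_all = 1500.0, 700.0
                |     units_gaming = gaming_revenue / avg_gaming
                |     units_all = total_revenue / avg_all
                |     print(units_gaming / units_all)        # ~3.2%
                | 
                | So gaming alone is ~6.8% of revenue but plausibly only
                | ~3% of units - which is why the claim only works if Apple
                | means unit sales.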
        
               | wmf wrote:
               | I assume Apple means unit sales. It makes sense that 98%
               | of all laptop units sold are not the fastest.
        
             | klelatti wrote:
              | '98% of laptops sold over the last year', not 'that you can
              | buy', i.e. not 98% of all models on the market (whatever
              | that would mean).
              | 
              | And their statement will have been through all sorts of
              | validation before they used it, so it's almost certainly
              | not 'bullshit'.
        
               | qz2 wrote:
                | Faster than 98% of the cheapest OEM laptops available!
        
               | jandrese wrote:
               | I just had the thought that the figures could be skewed
               | by education departments all over the country making bulk
               | orders for cheap laptops for students doing remote
               | learning.
        
               | qz2 wrote:
               | Very good point.
        
         | rowanG077 wrote:
          | How is it relevant how it stacks up to a Threadripper and an
          | RTX 3090? You are comparing an ultrabook with a large, high-
          | performance workstation PC. That makes absolutely no sense.
        
         | grecy wrote:
          | > _If you want to convince me about the graphics/ML
          | capabilities, compare it to an RTX 3090 running Vulkan/CUDA._
          | 
          | You're seriously suggesting we should compare the integrated
          | graphics in a fanless, bottom-of-their-line laptop that starts
          | at $999 to a $1,500 graphics card?
          | 
          | (Let's be brutally clear - that graphics card can do precisely
          | squat on its own. No CPU, no RAM, no PSU, no display, no
          | keyboard, etc. etc. etc.)
         | 
         | Surely that makes no sense in any universe.
        
       | banachtarski wrote:
       | For reference, the 1050 Ti card is considered the minimum spec at
       | many AAA game studios.
        
       | olliej wrote:
        | It's been a long time since I've actually followed graphics
        | benchmarks at all - where is this on the scale from Intel
        | integrated to a modern discrete laptop GPU, e.g. a Radeon 5500 or
        | similar?
        | 
        | [edit: to be clear, I mean modern feature-set type stuff, not
        | just raw framerate]
        
       | [deleted]
        
       | bjoli wrote:
        | So, it is slightly better than a 1050 Ti. What does that mean? I
        | do nothing to stress my graphics card at all, so explain it like
        | I am 5, I guess...
        
         | shmerl wrote:
         | Not really a serious gaming option, but OK for lower end
         | (ignoring that Apple doesn't care about gaming in general).
        
         | entropicdrifter wrote:
         | It's certainly 'good enough for government work' as the saying
         | goes, but it's pretty bad for gaming at retina screen
         | resolutions.
        
           | vmception wrote:
            | It hits 60fps steadily, which is a nice feat, but just
            | barely, and it's definitely not future-proof. Still, for a
            | low-power integrated part, that's amazing!
            | 
            | My skeptic hat has gotten smaller. Still there, just smaller.
        
         | mrbuttons454 wrote:
          | A 1050 Ti has a TDP of 75W, and the Apple M1 is somewhere
          | between 10-18W (not sure if actual numbers have been published;
          | I didn't see them).
        
           | davio wrote:
           | 10W published for MBA
        
             | mciancia wrote:
             | And that is for the whole SoC, not just GPU
        
         | jandrese wrote:
          | A 1050 Ti is a three-generation-old card and was bottom of the
          | line when it came out, so that's not too bad for integrated
          | graphics.
          | 
          | It's plenty for everything but modern gaming, and since those
          | games aren't likely to be ported to ARM on Mac anytime soon
          | it's not a huge problem. Apple has always had something of a
          | rocky relationship with game publishers, at least on the Mac.
          | Lots of older games will probably work fine, assuming the
          | driver situation isn't a nightmare. Apple is somewhat notorious
          | for neglecting graphics card drivers, unfortunately.
        
           | olyjohn wrote:
           | Saying that it's plenty for everything but modern gaming
           | isn't saying much either. Every other integrated graphics
           | solution has been fine for everything but modern gaming.
        
           | Zetto wrote:
            | The GTX 1050 2GB was the bottom of the line when the 1050 Ti
            | came out. Both were released in October 2016; there's also
            | the GT 1030, which was released a few months later.
        
           | kitsunesoba wrote:
           | It looks like World of Warcraft at the very least is getting
           | a day-one Apple ARM build, which suggests that porting isn't
           | too bad. It should be a relatively easy transition for any
           | game that already ran on macOS or iOS.
        
         | jamesgeck0 wrote:
         | The GTX 1050 Ti was a lower-mid-range gaming card a couple
         | generations ago. AFAIK it can still run most new games,
         | possibly with compromises for smooth performance (low graphical
         | settings, 30 FPS, or sub-1080p resolution).
        
         | kcb wrote:
          | It's very unlikely that real-world gaming performance will
          | match a 1050 Ti. People are assigning mythical properties to
          | this new SoC.
        
         | mrkstu wrote:
          | Considering that the devices being replaced had integrated
          | GPUs only, the fact that these devices now run close to
          | current gen discrete GPUs is a big jump.
          | 
          | The bigger question to be answered is whether this is a
          | baseline that will be surpassed handily by the higher-end
          | releases coming later, or whether this is about as good as it
          | gets for now.
          | 
          | I'm assuming that the reason these were released separately is
          | that the later-arriving devices have significantly different
          | SoCs with even better performance, and maybe even discrete
          | GPUs with variable, scaling performance.
        
           | olyjohn wrote:
            | The 1050 Ti is hardly a current gen GPU. It was a budget card
            | when it came out years ago.
        
           | MangoCoffee wrote:
            | > run close to current gen discrete GPUs is a big jump
            | 
            | The 1050 Ti is not current gen. The GeForce 10 series has
            | been out for more than 4 years now; the current gen is the
            | RTX 30 series.
        
             | mrkstu wrote:
              | I was trying to say 'close to a recent-gen discrete GPU',
              | not 'close (in performance) to a current-gen GPU'.
              | 
              | The 1050 is just one generation removed from most current
              | discrete GPUs; I don't believe the 3000 series is out yet
              | on laptops, and the 2080 just came out last year.
        
           | kllrnohj wrote:
           | > the fact that these devices now run close to current gen
           | discrete GPUs is a big jump.
           | 
           | The 1050 Ti (4 years old) has about the same performance as a
           | GTX 680. A card from 2012.
           | 
           | This comparison makes absolutely no sense. You'd want to
           | compare the M1 against either the current generation
           | integrated, such as the Vega 8 or 11 in the Ryzen 4xxx mobile
           | CPUs or the Intel Xe-LP in the current tiger lake CPUs, or
           | you'd want to compare it against last gen integrated.
           | 
            | Comparing it against a discrete card from 4 years ago with
            | the performance of a card from 8 years ago is just... weird?
        
       | _venkatasg wrote:
        | I'm beginning to think I can replace my MacBook Pro 2015 with the
        | new MacBook Air... the numbers are so tempting.
        | 
        | Let's see how the reviews turn out, once people use them for
        | real-world tasks.
        
         | banachtarski wrote:
          | What is tempting about this? This benchmark shows the M1
          | performing marginally worse than the low-end SKU from 3
          | generations ago.
        
           | DeRock wrote:
            | No, it shows it performs better? Go look again at the numbers
            | in the linked comparison. Also, calling it low-end is
            | misleading; the 1050 Ti was solidly mid-tier, and also a
            | dedicated graphics card used in desktop machines. This is
            | comparing it to integrated graphics in a lightweight laptop
            | (the M1 in the recently announced MacBook Air).
        
             | qw3rty01 wrote:
              | The 1060 is budget mid-tier; I don't really see how you
              | could call the 1050 Ti anything but low-end when looking at
              | either performance or price.
        
           | _venkatasg wrote:
            | For Mac users these numbers are amazing, lol.
        
             | banachtarski wrote:
              | lol, what's hilarious is that the numbers probably go up
              | just by installing Windows and using the actual first-party
              | Nvidia driver.
        
         | felipesoc wrote:
          | I would wait some time until more software and OS projects get
          | native support and the kinks are ironed out.
          | 
          | I don't know your workflow, but maybe some dependency in your
          | pipeline has issues, and then you are in a world of pain trying
          | to figure out why.
        
         | entropicdrifter wrote:
         | I'm especially interested to see how it deals with heavy loads
         | heat-wise. With no fan it could end up getting up to leg-
         | roasting temps
        
           | jandrese wrote:
           | It seems like Apple may err on the side of just letting it
           | get painfully slow instead of crisping your legs. The primary
           | differentiator between the Air and Pro is that the Pro comes
           | with a fan and a $300 premium.
        
         | mbesto wrote:
         | This is exactly the camp I'm in. I won't touch the new MBPro
         | because of the touchbar (and keyboard woes), but I'm willing to
         | concede the peripheral inputs if it means the thing is much
         | lighter/slimmer and has better performance.
        
           | dawnerd wrote:
            | FYI, the keyboard has since been fixed and is, apart from
            | some slight differences in key layout, the same as the Air's.
        
         | seanalltogether wrote:
          | I was tempted as well until I realized it's only 8GB of memory.
          | There's no way I could survive on less than 16 currently.
          | 
          | edit - ignore that, I'm an idiot :)
        
           | mbirth wrote:
           | For $200 you can upgrade to 16GB during the order process.
        
         | unclemase wrote:
         | You have 2 weeks to return it ;)
        
           | leecb wrote:
           | In the US, you have even longer during the holiday period:
           | 
           | > Items purchased at the Apple Online Store that are received
           | between 10 November and 25 December 2020 may be returned up
           | to 8 January 2021.
           | 
           | https://www.apple.com/shop/help/returns_refund
        
         | Zealotux wrote:
        | The idea of swapping my 2015 MBP for something else is a tough
        | one to accept. My biggest beef with the recent Air is the heat
        | issues: the fan on the previous Air is so useless it's
        | laughable[1], and they completely removed it on the new one, so I
        | remain skeptical.
         | 
         | [1] https://www.youtube.com/watch?v=iiCBYAP_Sgg
        
           | somehnguy wrote:
           | I just upgraded my 2015 MBP to a 16" 2019 MBP. I specifically
           | bought it now so that I have an Intel chip because I don't
           | want to deal with the headache while things slowly switch
           | over.
           | 
           | I hadn't realized how much faster computers had gotten since
           | 2015, this new machine runs circles around the old one. The
           | keyboard is great and I actually like the touch bar. 0
           | regrets, it's an upgrade in every single way.
        
       | jeswin wrote:
        | If you switch the OS to Windows and the API to DirectX:
        | 
        | Onscreen Aztec High Tier gives you 54fps on the M1 vs. 106fps on
        | the 1050 Ti. Normal Tier is capped at 60fps on OS X, while
        | Windows goes up to 156fps.
        | 
        | That's a better comparison, since it represents the best-
        | optimized platform for this hardware. Nvidia on OS X isn't a very
        | meaningful comparison.
        
       | PragmaticPulp wrote:
       | Compared to the AMD Radeon Pro 5500M, the base GPU in the 16"
       | MacBook Pro:
       | https://gfxbench.com/compare.jsp?benchmark=gfx50&did1=907542...
       | 
       | Compared to the AMD Radeon Pro 5600M, the +$700 upgrade in the
       | top of the line MacBook Pro 16":
       | https://gfxbench.com/compare.jsp?benchmark=gfx50&did1=907542...
       | 
       | Note that the Onscreen numbers are capped at 60fps on OS X, so
       | ignore any Onscreen results at 59-60fps.
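        | 
        | If you're sifting these results programmatically, a tiny sketch
        | of that filter (Python; the entries are hypothetical, just to
        | illustrate the cutoff):
        | 
        |     # Onscreen runs pinned at vsync (59-60fps) only tell you the
        |     # GPU hit the 60Hz cap, not how much headroom it has.
        |     results = {
        |         "Aztec Ruins Normal Onscreen": 59.8,  # capped, ignore
        |         "Aztec Ruins High Offscreen": 54.1,   # meaningful
        |     }
        |     meaningful = {name: fps for name, fps in results.items()
        |                   if "Offscreen" in name or fps < 59.0}
        |     print(meaningful)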
        
         | aardvarkr wrote:
          | Considering that there's no difference between the 1050 Ti in
          | the OP and the 5500M that PragmaticPulp posted, I'm inclined to
          | say this test sucks. Userbenchmark.com shows there should be a
          | substantial (38%) improvement between those two. Take these
          | early results with a HUGE grain of salt, because they smell
          | fishy.
         | 
         | https://gpu.userbenchmark.com/Compare/Nvidia-GTX-1050-Ti-vs-...
        
           | kllrnohj wrote:
           | Well, two things there.
           | 
           | 1: Userbenchmark.com is terrible and nobody should use it for
           | anything. At least their CPU side of things is hopelessly
           | bought & paid for by Intel (and even within the Intel lineup
           | they give terrible & wrong advice), maybe the GPU side is
           | better but I wouldn't count on it.
           | 
           | 2: The real question there isn't "why is the 1050 Ti not
           | faster?" it's "how did you run a 1050 Ti on MacOS in the
           | first place, since Nvidia doesn't make MacOS drivers anymore
           | and hasn't for a long time?"
        
             | Dylan16807 wrote:
             | > Userbenchmark.com is terrible and nobody should use it
             | for anything. At least their CPU side of things is
             | hopelessly bought & paid for by Intel (and even within the
             | Intel lineup they give terrible & wrong advice), maybe the
             | GPU side is better but I wouldn't count on it.
             | 
              | To provide some elaboration on this: their overall CPU
              | score used to be 30% single-core, 60% quad-core, 10%
              | multicore. Last year, around the launch of Zen 2, they gave
              | it an update. Which makes sense; the increasing ability of
              | programs to actually scale beyond four cores means that
              | multicore should get more importance. And so they changed
              | the influence of the multicore numbers from 10% to... 2%.
              | Not only was it a blatant and ridiculous move to hurt the
              | scores of AMD chips, you got results like this, an i3
              | beating an i9:
              | https://cdn.mos.cms.futurecdn.net/jDJP8prZywSyLPesLtrak4-970...
              | 
              | And there was some suspicious dropping of Zen 3 scores a
              | week ago too, it looks like.
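              | 
              | A toy example of how that reweighting flips a result
              | (Python; the scores are hypothetical, and the 49/49 split
              | of the remaining weight is my assumption - only the
              | 10%-to-2% multicore change is from the comment above):
              | 
              |     # Made-up per-category scores, higher is better.
              |     i3_like = {"single": 135, "quad": 130, "multi": 40}
              |     i9_like = {"single": 125, "quad": 125, "multi": 200}
              | 
              |     old_w = {"single": 0.30, "quad": 0.60, "multi": 0.10}
              |     new_w = {"single": 0.49, "quad": 0.49, "multi": 0.02}
              | 
              |     score = lambda cpu, w: sum(cpu[k] * w[k] for k in w)
              |     print(score(i3_like, old_w), score(i9_like, old_w))
              |     # 122.5 vs 132.5 -> the 18-core wins
              |     print(score(i3_like, new_w), score(i9_like, new_w))
              |     # 130.65 vs 126.5 -> the 4-core now "beats" it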
        
               | brokencode wrote:
               | I don't see that as evidence of blatant bias for Intel.
               | The site is just aimed at helping the average consumer
               | pick out a part, and I think the weighting makes sense.
               | 
                | Most applications can only make use of a few CPU-heavy
                | threads at a time, and a system with 18 cores will not
                | make any difference for the average user. In fact, the 18
                | core behemoth might actually feel slower for regular
                | desktop usage since it's clocked lower.
               | 
               | If you are a pro with a CPU-heavy workflow that scales
               | well with more threads, then you probably don't need some
               | consumer benchmark website to tell you that you need a
               | CPU with more cores.
        
               | Dylan16807 wrote:
                | But lots of things do use more than 4 cores. To suddenly
                | set that to _almost zero_ weight, when it was already a
                | pretty low fraction, _right when Zen 2 came out_, is
                | clear bias.
               | 
               | > In fact, the 18 core behemoth might actually feel
               | slower for regular desktop usage since it's clocked
               | lower.
               | 
               | It has a similar turbo, it won't.
        
             | mastazi wrote:
             | Regarding 2. I think that none of those benchmarks were run
             | on MacOS. Their benchmark tool seems to be Windows-only
             | https://www.userbenchmark.com/ (click on "free download"
             | and the MacOS save dialogue will inform you that the file
             | you are about to download is a Windows executable).
        
             | aardvarkr wrote:
              | 1. Today I learned something new. Still, I can't let
              | perfect be the enemy of good. It may be an imperfect source
              | but it's the one I used. Do you have a better source I can
              | replace it with?
             | 
             | 2. That's a good question and I don't have an answer for
             | that.
        
             | Rebelgecko wrote:
             | Re #2, the Nvidia web drivers work great if you're on High
             | Sierra
        
           | xfalcox wrote:
           | Please don't use userbenchmark for anything. Site is so
           | misleading that it's banned from both r/amd and r/intel.
        
         | deergomoo wrote:
         | Damn, I would happily take the 10-20% performance hit to avoid
         | having the laptop turn into a jet engine as soon as I connect
         | it to a monitor.
        
           | sod wrote:
           | You can have that trade off already by disabling turbo boost.
        
         | trynumber9 wrote:
         | I thought the 5300M was the base GPU. That's what I have
         | anyway.
         | 
         | https://gfxbench.com/compare.jsp?benchmark=gfx50&did1=907542...
        
         | ndesaulniers wrote:
          | Looks like there are interesting "offscreen" optimizations that
          | might need to be re-implemented for the M1, IIUC.
        
         | zaksoup wrote:
         | I think those are the same link
        
         | p0nce wrote:
         | Also the M1 is built on TSMC 5nm.
         | 
         | The AMD Radeon Pro 5600M is built on TSMC 7nm.
        
         | bredren wrote:
          | We need this compared with the RX 580 running in the Blackmagic
          | eGPU.
          | 
          | That's the most relevant GPU comparison, given it's the entry-
          | level, Apple-endorsed way to boost Mac GPU performance.
          | 
          | It also helps in understanding the value of the 580 and Vega on
          | pre-M1 Macs.
        
       | kushan2020 wrote:
       | I would be interested in knowing if I can play Civ VI on a Retina
       | display with this thing.
        
         | lalaithion wrote:
          | I use a GTX 970 to play Civ VI at 1440p.
          | 
          | So, probably!
        
       | pier25 wrote:
       | I've never been so excited and so turned off at the same time.
       | 
       | The ARM Mac hardware is looking fantastic, but OTOH macOS is
       | getting worse every year...
        
         | thijsvandien wrote:
         | Exactly this. The idea of seriously powerful machines with
         | great battery life is awesome. An even more proprietary, locked
         | down system with software that keeps getting worse? Not at all.
        
         | ajharrison wrote:
         | Yawn. So tired of the "macOS is going to shit" meme.
         | 
         | It is by far the most advanced operating system which serves
         | newbies and professionals alike. I know people who use it to
         | browse Facebook and use Messages/Music/Safari etc and those who
         | use it to manage servers, build apps, and more.
         | 
         | The progression of software of this scale is often slow and at
         | times annoying, but to write it off and say it's getting worse
         | is absolutely insane.
         | 
         | What a lot of people don't understand about the approach Apple
         | has to software, especially their own, is that they inch
         | towards the ultimate and ideal version, and yes, during some of
         | those transitions things may feel more broken than usual, until
         | they don't.
        
           | hocuspocus wrote:
           | > It is by far the most advanced operating system which
           | serves newbies and professionals alike. I know people who use
           | it to browse Facebook and use Messages/Music/Safari etc and
           | those who use it to manage servers, build apps, and more.
           | 
           | Advanced according to what metric exactly? And your second
           | sentence can be said about Windows or any modern OS really.
        
           | pier25 wrote:
            | It's true that many problems end up being solved, but don't
            | you see a problem with shipping broken software on which
            | millions depend for work?
            | 
            | I started using macOS back on Panther, and I don't trust
            | Apple to ship a reliable update anymore. I'm still on Mojave
            | because even today Catalina is broken for a lot of people.
        
           | po1nter wrote:
           | > most advanced operating system
           | 
           | Sounds like someone is regurgitating Apple's marketing
           | speech.
           | 
            | How do you define "most advanced"? Because an OS that you
            | can't use to run apps when Apple's servers are down is
            | anything but advanced to me.
        
         | ktpsns wrote:
         | Absolutely! For me it feels as if Mac OS repeats the errors of
         | MS Windows Vista and 10. For instance, the horrible telemetry
         | as well as the suboptimal "one interface for touch and mouse
         | pointer" paradigm.
        
           | LeoPanthera wrote:
           | Can you expand on "horrible telemetry" please? And since no
           | Macs have a touch screen, I'm not sure what you mean by the
           | "one interface" thing. Big Sur may resemble iOS but it is not
           | the same interface at all.
        
             | cosmotic wrote:
             | Big Sur incorporates apps from iOS using Catalyst. iOS
             | devices all have touch screens.
        
             | tannedNerd wrote:
             | I think the thought is since the tap targets all got so
             | much bigger for Big Sur it no longer feels like an OS
             | designed for a pointer, more one designed for a finger.
        
             | kllrnohj wrote:
             | > Can you expand on "horrible telemetry" please?
             | 
              | Phoning home on every executable launch. Both because it's
              | bad for privacy, and because the implementation of it is so
              | absolutely horrific that when Apple's servers went down it
              | basically locked up everyone's computers at the same time.
             | 
             | It's on the front page still, even:
             | https://news.ycombinator.com/item?id=25074959
        
               | my123 wrote:
                | Is it even really telemetry as we'd consider it? It's
                | OCSP digital signature verification... to check that an
                | app's signature wasn't revoked (or a website cert, or
                | anything really).
        
               | kllrnohj wrote:
               | It sends a hash of every executable you launch to Apple,
               | how would that not be considered telemetry?
        
               | my123 wrote:
               | Not the hash of the executable, but the certificate that
               | it's signed with.
        
               | dsr_ wrote:
               | What's the ratio between certs and applications on your
               | Mac? Is it pretty close to 1:1, excluding Apple's own
               | products?
        
             | jjoonathan wrote:
             | When installing Windows, I had to unplug the ethernet
             | during a particular setup screen to avoid having every
             | login checked against my cloud account. It put Candy Crush
             | and Farmville ads in my start menu without consent. I
             | remember having to spend effort to get Cortana to go away
             | and (maybe) not send my searches to the cloud.
             | 
              | In macOS, we've recently seen: pushy Siri, search results
              | sent to the cloud, and yesterday the OCSP failure that made
              | it obvious they were sending logs of every app launch to
              | the cloud :/ . It's the same direction, even if they aren't
              | yet quite as lost as Microsoft.
        
               | threeseed wrote:
               | You opted in to enabling Siri and they are simply
               | validating the signatures on the apps.
        
               | s_tec wrote:
               | I know, right? What's with all these people, expecting
               | their personal computers to respect their privacy? If you
               | want cool features, just be quiet and let Apple send
               | whatever data they want to their servers. It's fine!
        
               | filoleg wrote:
               | Did you miss the part where the parent reply said
               | "opted in to enabling Siri"?
               | 
               | I mean, if you enable a completely optional feature
               | that requires giving up a bit of privacy for its
               | literal intended functionality, how is it Apple's
               | fault? And unlike Cortana on Windows 10, you can
               | disable the Siri feature with the click of a button,
               | or simply never enable it in the first place. When you
               | start your new Mac for the first time, it asks you
               | very explicitly whether you want it enabled or not.
        
               | jjoonathan wrote:
               | Ever tried to opt out of Siri? I have. It nags you
               | constantly. You search, looking for a gist to nuke it,
               | but what you find doesn't work. Finally, you give in,
               | just to get the damn thing to shut up.
        
               | ink404 wrote:
               | I'm currently opted out and haven't been getting nagged
               | at all on macOS 10.15.
        
               | jjoonathan wrote:
               | How'd you do it? Any chance you remember the command that
               | worked?
        
               | internet2000 wrote:
               | Just uncheck the box on System Preferences:
               | https://i.imgur.com/FCuNZMO.png
               | 
               | Most of the time I don't even remember that Siri exists
               | on macOS.
        
               | retromario wrote:
               | How does one disable the signature validation?
        
               | Reason077 wrote:
               | In /etc/hosts:
               | 
               |     0.0.0.0 ocsp.apple.com
               | 
               | then:
               | 
               |     # flush DNS caches so the hosts entry takes effect
               |     sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder
               | 
               | Or alternatively:
               | 
               |     defaults write /Library/Preferences/com.apple.security.revocation.plist CRLStyle None
               |     defaults write /Library/Preferences/com.apple.security.revocation.plist OCSPStyle None
               |     defaults write com.apple.security.revocation CRLStyle None
               |     defaults write com.apple.security.revocation OCSPStyle None
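               | 
               | Afterwards you can sanity-check that the override took
               | effect:
               | 
               |     # should now resolve to 0.0.0.0
               |     ping -c 1 ocsp.apple.com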
        
               | Rebelgecko wrote:
               | I don't think I ever opted in to Siri. How can I turn it
               | off? I've disabled it from the menu bar, but it's still
               | on the Touch Bar, just waiting for me to slightly miss
               | the delete key.
        
               | threeseed wrote:
               | It's one of the installer screens you get every time you
                | install or upgrade macOS.
               | 
               | You would've seen it at least a dozen times by now.
        
               | ordinaryradical wrote:
               | To remove Siri from the Touch Bar:
               | 
               | System Preferences -> Keyboard
               | 
               | Click on "Customize Control Strip"
               | 
               | You can then drag and drop items on and off the Touch
               | Bar. It is a totally inane, unintuitive interface and
               | it took me forever to find. Also, I couldn't figure out
               | how to change it at first, because the option
               | DISAPPEARS if you try to customize in clamshell mode.
               | The Touch Bar has to be open.
               | 
               | I can't tell if it's deliberately bad UX, but I spent
               | months being asked if I wanted to turn on Siri typing
               | on this keyboard...
        
               | jooize wrote:
               | I feel like macOS asks me about Siri and privacy at
               | login after every major update, with an unskippable
               | setup window, and certainly after account creation.
               | Open System Preferences > Siri and disable Ask Siri.
               | You can edit the Touch Bar via a menu in Finder.
        
               | my123 wrote:
               | System Preferences -> Siri and then disable the Ask Siri
               | checkbox on the left.
        
               | Rebelgecko wrote:
               | Alas, that didn't remove it from the Touch Bar.
        
               | my123 wrote:
               | System Preferences -> Keyboard -> Customize Control
               | Strip
        
               | ngcc_hk wrote:
               | It is hard to find under System Preferences ->
               | Keyboard... and Siri isn't good at interpreting one's
               | voice either.
        
           | john_alan wrote:
           | They (Craig F.) stated that's not what they are doing.
           | They aren't merging paradigms.
           | 
           | They are bringing the best bits of iOS to macOS.
        
             | systemvoltage wrote:
             | UI changes in macOS Big Sur are totally uncalled for.
             | They're designed for touch displays.
        
               | sooheon wrote:
               | Agreed. Giant titlebars taking up vertical screen space
               | are a clear concession to fingerability.
        
               | tpush wrote:
               | No, it's a "concession" for legibility. And I think they
               | look very nice (much nicer than the old grey ones), too.
        
               | threeseed wrote:
               | It's designed to unify the Mac and iOS experiences.
               | 
               | Why? Because once everyone starts using iOS apps on
               | their Mac (e.g. Netflix, Outlook), Mac-only apps will
               | slowly disappear. Hence you will need a look and feel
               | that works with touch.
        
               | nkozyra wrote:
               | Do people really see this happening? It's been one of
               | those big promises for nearly a decade and we're really
               | no closer to it.
               | 
               | You can run Android apps on Chrome OS (/Chromium OS),
               | and other than as a novelty I don't know anyone who
               | does.
        
               | canofbars wrote:
               | The Android tablet scene is almost non-existent. With
               | the iPad Pro and Magic Keyboard you could realistically
               | use an iPad as a laptop, if the software you needed was
               | on it.
               | 
               | I imagine that eventually devs will target pro software
               | for the iPad and have it come to macOS for free.
        
               | threeseed wrote:
               | How are we nowhere close when you can run iOS apps
               | today on the M1?
        
           | threeseed wrote:
           | Not sure what you mean by telemetry.
           | 
           | Apple asks you every time you upgrade macOS whether you
           | want to send anonymous data to Apple and third parties.
           | You just need to click no.
        
             | gumby wrote:
             | They also check app signatures for revocation at first
             | launch (and maybe other times).
             | 
             | By the standards of modern disk and network capacity,
             | couldn't they download revocation caches the way they do
             | with malware definitions?
        
               | acoard wrote:
               | >By the standards of modern disk and network, couldn't
               | they download revocation caches the way they do with
               | malware?
               | 
               | The whole point is to check if a cert has been revoked.
               | If you have an out of date cache, you'll falsely approve
               | a cert that should be revoked. I'm not defending the
               | system as a whole, but if you care about revoking
               | authentication - which they clearly do - then a cache
               | directly undermines that goal.
               | 
               | A malware hash doesn't get revoked; new ones just get
               | added.
        
               | Dylan16807 wrote:
               | So update it every hour.
               | 
               | Or every time it feels the need to check a program,
               | instead of asking about that program, it could ask for
               | all revocations from the last day.
        
               | spullara wrote:
               | They are checking the certificate, not app signatures.
        
               | cosmotic wrote:
               | The certificate can be (and is) hashed.
        
               | [deleted]
        
           | k2enemy wrote:
           | Don't forget the incessant nagging and notifications. That
           | was what drove me to macOS back in the days of 10.3.
        
             | cosmotic wrote:
             | You cannot reasonably turn off all the security warnings
             | and permission requests.
        
             | dionian wrote:
             | turn them off
        
       | docsaintly wrote:
       | Why is everyone surprised by this? Fudging numbers, stealing
       | credit, and blatantly lying is what Apple has always done. Of
       | course they put a beautiful marketing layer on top of it all so
       | it's slightly less obvious.
        
       | bt3 wrote:
       | Without directly commenting on the performance of the M1 chip, I
       | still believe the biggest hurdle is software compatibility.
       | Apple's "universal binary" seems dubious, and I don't believe
       | Rosetta 2 is anything more than emulation software which will
       | have performance ramifications.
       | 
       | Microsoft has faced this same problem themselves. Releasing the
       | Surface Pro X is a great example of a machine that is limited by
       | software.
       | 
       | As others have noted in other threads, Apple's ability to run iOS
       | apps natively on the M1 chip seems like a great mechanism to
       | lower the switching costs, though I maintain the chasm left to
       | cross is software.
       | 
       | This is all, of course, notwithstanding the "locked down OS"
       | concerns from the front page for the past day or so. Does an
       | M1 MacBook Air with Big Sur make a competent development
       | machine?
        
         | xsmasher wrote:
         | "universal binary" is just a binary with multiple executables
         | for multiple CPUs inside. Hardly unusual, it's been used for
         | 32+64 bit before, and to package multiple flavors of arm.
         | 
         | Rosetta 2 is emulation in once sense, but it precompiles
         | applications in to M1 code for speed.
         | 
         | Running all x86-64 mac apps plus iOS apps plus any updated /
         | universal apps is pretty far from a software shortage.
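         | 
         | You can see this on any recent macOS (exact output varies by
         | OS version and binary):
         | 
         |     file /bin/ls          # "Mach-O universal binary with 2 architectures ..."
         |     lipo -archs /bin/ls   # e.g. "x86_64 arm64e"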
        
         | matthewmacleod wrote:
         | _Apple 's "universal binary" seems dubious_
         | 
         | There is absolutely nothing even remotely dubious about
         | universal binaries. They were used during the PPC->Intel
         | transition and again for x86->amd64. You can create them right
         | now.
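         | 
         | A minimal sketch (hello.c is any C source; building the
         | arm64 slice needs Xcode 12 / the macOS 11 SDK):
         | 
         |     # compile both slices and glue them into one fat binary
         |     clang -arch x86_64 -arch arm64 -o hello hello.c
         |     lipo -archs hello    # prints: x86_64 arm64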
        
           | myrandomcomment wrote:
           | NeXT shipped a single binary that supported 68k, SPARC, HP
           | PA-RISC, and Intel CPUs. This is nothing new.
        
           | kzrdude wrote:
           | They also used a version of that in the 68k -> PPC
           | transition.
        
         | john_alan wrote:
         | Wat.
         | 
          | They've done universal binaries before. They work. What's
          | dubious?
          | 
          | Rosetta is translation, not emulation.
        
       | joakleaf wrote:
       | OpenCL Geekbench results for the MacBook Pro:
       | https://browser.geekbench.com/v5/compute/search?utf8=&q=Macb...
       | 
       | About 60% of the Radeon Pro 5500M (in the 16" 2019 Pro).
        
       | KabirKwatra wrote:
       | The 1050 Ti is not supported on modern macOS running Metal.
       | See the difference when the card is on Windows or Linux
       | running the real NVIDIA drivers the card was built for.
        
       | trimbo wrote:
       | How do they have one? How do we know these are real numbers?
        
         | unclemase wrote:
         | User-submitted benchmarks from Geekbench:
         | https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q...
        
         | jeffbee wrote:
         | Perhaps it was run on the developer transition kit.
        
         | exacube wrote:
         | trust
        
         | oneplane wrote:
         | How do we know anything? Is this real life?
         | 
         | I believe we simply assume that the reputation and trust built
         | over time allows us to take the numbers as 'real enough' from
         | that program & website.
        
           | trimbo wrote:
           | But isn't this just another thing people download to run
           | benchmarks and then the result is uploaded?
           | 
           | Anyway, answering my own question: multiple results have
           | been uploaded in the past two days, all by anonymous users.
           | Maybe tech reviewers running it on devices they received
           | early?
           | 
           | https://gfxbench.com/allresults.jsp?order=date&page=1&D=Appl...
        
             | threeseed wrote:
             | Reviewers receive units at least a couple of weeks before
             | launch, with strict embargoes on talking about them or
             | publishing anything.
             | 
             | Also, large developers like Adobe and Microsoft would
             | have had retail units months ago to help with final
             | testing.
        
           | kushan2020 wrote:
           | That's how flat earth became a thing.
        
             | oneplane wrote:
             | No, that's because some people have no first principles
             | or trust at all, and instead of validating they just make
             | stuff up ;-)
             | 
             | I was trying to point out that disputing things is fine,
             | but the whole basis of a website where benchmarks are
             | uploaded is trust-and-reputation-over-time, to the point
             | where enough other people can re-run the same tests on
             | their machines (once they get them) to validate the
             | results. Heck, you might almost call that science!
             | 
             | Right now, there aren't additional results and you can't
             | easily reproduce them because the machines aren't
             | widespread or available. But we can take the track record
             | and reputation of the site and application and use that
             | to value the integrity of the published benchmarks as
             | 'likely correct'.
        
       | boardwaalk wrote:
       | We don't know if this is the 7- or 8-core M1 GPU, nor whether
       | this is the actively cooled MacBook Pro/Mac mini or the
       | passively cooled Air.
       | 
       | So this seems pretty useless.
        
       | dawnerd wrote:
       | It's still mind-boggling why the 13" MBP w/ M1 can't run dual
       | 4K.
        
         | txdv wrote:
         | Dual 4K, or dual monitors at all?
        
         | akhilcacharya wrote:
         | Can't the RPi 4 support dual 4K?
        
           | DeRock wrote:
           | Technically the M1 can support two displays too: one at
           | 6K/60Hz and one at 4K/60Hz. This is what the Mac mini does.
           | However, for the laptops, one of those is consumed by the
           | built-in screen.
        
             | monocasa wrote:
             | Which is weird, because every laptop I've had for 15
             | years has had a mux on the RAMDAC outputs, so you could
             | connect as many external screens as there are outputs and
             | just turn off the built-in screen.
             | 
             | Including Macs.
        
         | lambdasquirrel wrote:
         | It's the first release of a new product? Features take time.
        
           | rovr138 wrote:
           | I mean, all they're asking for (and me too) is feature
           | parity with what existed in the line before.
           | 
           | If they can't deliver that, it probably should have been held
           | back.
        
           | mywittyname wrote:
           | That's a pretty damn important feature, especially for
           | professional users.
        
             | boardwaalk wrote:
             | Agreed. If you think about what's common between all the
             | types of "professionals" that Apple targets, from coding to
             | music production to whatever... one of them is wanting
             | multiple screens to have everything laid out in front of
             | them.
        
         | oneplane wrote:
         | I suspect (and this is just speculation) that the M1 is very
         | closely related to their iPad architecture, which was never
         | designed for more than one external display. As such, the
         | architecture wasn't really suitable, and between building a
         | 'computer' based on that chip and re-architecting the
         | design, it just wasn't feasible or required to support it in
         | the first release.
        
           | dawnerd wrote:
            | That's exactly my thought too. Makes me wonder if it will
           | support two monitors in clamshell mode since the mac mini
           | supports two. I guess we'll have to wait and see. Maybe it's
           | something they can patch in.
        
             | oneplane wrote:
             | I'm also curious as to how the graphics connections /
             | routing work; would they have an internal bus and then
             | split and mux it or something? Or perhaps an eDP or LVDS
             | bus that can't be switched out to some HDMI or DP
             | transceiver in clamshell?
             | 
             | Let's see if iFixit can get us some nice high-res
             | pictures. Heck, someone might leak schematics, and any
             | normal user can probably show an IORegistry tree for the
             | system. Always nice to explore those on new machines.
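             | 
             | For anyone who wants to poke at this themselves (a
             | sketch; node and property names vary per machine,
             | especially on Apple Silicon):
             | 
             |     ioreg -l > ioreg.txt                  # full I/O Registry dump
             |     system_profiler SPDisplaysDataType    # GPU/display summary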
        
       | tripham wrote:
       | These numbers are quite good when you realize that the M1's
       | integrated GPU is comparable to a discrete GPU. Also, the
       | power consumption figure is for the entire package, not just a
       | discrete GPU alone.
        
         | j3i4j43JJJ wrote:
         | The 1050 Ti was released in 2016 (4 years ago), and even
         | then it was much slower than the 1060-1080 cards.
        
       | [deleted]
        
       | dang wrote:
       | Recent, related, and massive:
       | 
       | https://news.ycombinator.com/item?id=25065026 ("Apple Silicon M1
       | chip in MacBook Air outperforms high-end 16-inch MacBook Pro")
       | 
       | https://news.ycombinator.com/item?id=25049079 ("Apple unveils M1,
       | its first system-on-a-chip for portable Mac computers")
        
       | [deleted]
        
       | j3i4j43JJJ wrote:
       | The old 1050 Ti only has 2.1 TFLOPS. Compare the M1 against a
       | real beast like the RTX 3090 with 35 TFLOPS - about 17 times
       | more than the 1050 Ti. 2.1 TFLOPS doesn't impress me at all.
        
       ___________________________________________________________________
       (page generated 2020-11-13 23:00 UTC)