Their last keynote was clearly a gymnastics exercise: ignore Intel CPUs and dismiss laptop performance, then praise their own chips, which power a tablet with no software ready to use that much speed.
The fact that one of the most secretive companies executes a PR stunt by giving an exclusive interview to one of the most respected tech outlets only confirms this strategy. Expect similar moves in the coming months.
Now, it is only a matter of "when", not "if", Apple will start selling laptops with their chips.
As an aside, this strategy is extremely similar to the one they used when dropping the headphone jack on the iPhone: "leak" the news to a respected outlet, perform damage control before the keynote, and test the market's reaction. When they introduced the jack-less iPhone, the topic had been so beaten up that it got much less attention than a surprise revelation would have.
I’d love the ability to carry all my software and data around in a phone without lugging around a laptop or having to buy a desktop computer. And I’d love to never concern myself with transferring or syncing data again.
And why not let us connect an iPhone to an eGPU for desktop gaming?
If Apple invests in enhancing CarPlay, it might be a sign that they are scaling to a wider convergence market. If they don't invest or abandon it, then maybe convergence won't happen. They're famous for saying they're not working on projects that they are actually working on, so we have to read between the lines.
After upgrading from an iPhone 6 to an iPhone XR, I've been thinking about how close the latest-gen devices are to traditional computers anyway. The configuration/settings and features are so far beyond the first-gen devices that I think convergence will happen; nobody knows what it will look like yet.
That sounds like the most un-Apple thing I can possibly imagine. Their ethos was always to build a device that does something incredibly well; a phone that can be a phone but also a desktop computer is anything but that.
Because that would mean Apple doesn’t get to sell you two devices.
But attach a proper(!) keyboard with touchpad to an iPad Pro, put macOS on it instead of iOS, and there's your next MBP ;)
Sure - one device could do everything, but that doesn't drive sales or profit.
It also makes me not exactly eager to buy any of their current hardware.
In the Jobs days they'd promote the practical value of a product. For example, they might say the new iPod can hold 50k songs, not how big the storage is, let alone the storage type. They'd mention a smaller form factor or improved battery life, not the move from HDD to SSD.
So when they talk about SoCs, cores, GPUs, Intel or anything else hidden, who are they signaling to customers? Maybe it's signaling to investors that Apple is innovating and that ought to translate to profits; maybe to the inner geek in all of us?
The dirty trick with hidden tech's performance figures is that they don't directly translate to customer value. As you mentioned, you're dissuaded from buying because the new stuff will be so much better. Maybe it will, but the old Apple would tell you that the new Macs can process video in Final Cut 10x faster, or that you don't have to buy a separate gaming rig, i.e. you can do more stuff better.
The good part about focusing on what the products can do is that you can't fake it. You can't fake it to [geeks] like me, [non-geeks like] my mom, my kids, investors, etc.
E.g. this from 2015 (!): https://www.anandtech.com/show/9396/samsung-sm951-nvme-256gb... This from this year: https://www.anandtech.com/show/13438/the-corsair-force-mp510...
Sample size of one and all that, but for me it was a selling point. The PowerPC G5 (PPC 970) was intriguing when it first came out. Having used Intel, Cyrix and AMD systems all the way back to the XT (Intel 8088) days, it was nice to muck around with something new, and paired with OS X Tiger it was such a fun world to explore. The move to Intel felt like a bit of a letdown, but by that time I loved OS X, and Snow Leopard cemented it as my OS of choice.
Apple desktops/laptops moving to custom silicon would excite the nerd in me. I want competition.
1. It helped them jump from a failing CPU platform to a non-failing CPU platform. PPC was not keeping up with x86 anymore, and Apple's two PPC vendors were going in opposite directions because there wasn't enough of a market for CPUs for Macs. (There were even rumors of future Power Macs migrating to a full POWER CPU rather than PPC.)
2. It meant you could run Windows, and hence Windows apps, on your Mac if you wanted to, without CPU emulation. This is still a fairly important use case.
3. It may have also simplified matters even for Mac application development, since you didn't have to switch ISAs in addition to switching operating systems. Making matters worse, PowerPC defaults to big-endian while x86 is little-endian.
How does this apply to a potential ARM switch?
1. You can't really say x86 is "failing" if it's still the industry standard, but Apple might believe (rightly so, given the market size of iOS) that their A-series chips can finally, sustainably outperform x86.
Most of the PowerPC bet was that a newer, more elegant architecture would outperform x86 and give Apple a competitive advantage, and while that may occasionally have been true, it was never a huge deciding factor. Intel and AMD kept up because they were able to invest in keeping x86 afloat. Ironically, Intel themselves also bet that a newer, more elegant architecture would make x86 obsolete, namely Itanium, only for AMD to invent x86-64. Not even Intel could stop the x86 train.
With the rise of mobile devices, ARM now has the same market power as x86, if not more, simply because there are many more ARM-based devices manufactured and sold than PCs. Apple in particular has been able to invest heavily in their A-series chips and has full control of their CPU roadmap and destiny. Perhaps this time, x86 may finally be rendered obsolete. Don't count on it, though.
2. This is really mostly dependent on Apple's strategic priorities. With more and more application functionality moving to mobile and the web, being able to run Windows is less and less important. At the same time, being able to run Linux is more important; for many developers, running a Linux VM in Vagrant or Docker lets us develop in a similar environment to the servers our code will eventually run on. Sure, you can run Linux itself on ARM, and perhaps there will be more Linux distros that support ARM when and if Apple switches the Mac, but it won't actually be the same as the server unless ARM makes serious inroads in the server market.
Maybe they're betting they can surpass x86 enough that they could emulate x86 at respectable speeds. Since they would be migrating CPUs again, they will probably provide a CPU emulation layer again, like they did when migrating from 68k to PowerPC and then from PowerPC to x86. Keeping this emulation layer around would have more of a benefit this time because, after a while, nobody needed to run 68k or PowerPC code anymore. That has never been true for x86 code, and it won't be for a long time, so look for Macs to continue to run x86 code even if Apple switches.
3. I think A-series is also little-endian by default, and for x86, see above. Maybe Apple is banking on getting more value by running cross-platform iOS/Mac apps than cross-platform Windows/Mac apps. This will probably impact Mac gaming the most, but that's never been a priority for Apple.
I went back to remind myself of the details of the switch from PowerPC to Intel. It was announced at WWDC 2005 (June), when they released a developer transition kit. The announcement included a commitment to ship computers running x86 by WWDC 2006, so there was pretty much 12 months of lead time even for outside developers. Apple also committed to moving to Intel fully by the end of 2007, a 30-month total process.
I think Apple is further ahead of the game this time around for how quickly they can go from announcement to shipping product. OS X had been running internally on x86 for years, but this time tons of Apple software has been publicly running on AArch64 for many years. I do think they need more than 30 months to complete a transition this time around, as the user base is much larger and Apple may never want to invest the serious dollars it will take to build the giant chips they get from Intel. It's one thing to swap out the MacBook processor. It's a whole other world to do 130-watt dies.
I'm expecting an announcement at either WWDC 2019 or 2020, with products shipping after the OS release in the fall of that year.
Apple's laptop naming has been getting steadily worse since the introduction of the Retina MacBook Pro. The MacBook modifier words have lost all meaning when systems labeled Pro have major expansion and repair limitations, the device named Air isn't the smallest or lightest, and the model without a modifier isn't the cheapest. They missed an opportunity to restore some naming sanity this year, but a switch to ARM could present it again.
As great as the A12X is, it's not touching discrete GPUs and CPUs allowed to burn wattage approaching triple digits. If I were Apple I'd lean into this and restore "Pro" as a designation that means something. Pro devices would stay x86 and be marketed as supporting more software. Non-Pro devices would make the jump to ARM on their regular update cadence. That would give Apple tons of time to get their custom chips to Xeon-level scale for core count and interconnects. It even gives Apple the option to continue using x86 indefinitely, since investing in 100+ watt chips may not have the returns to make it worthwhile. Then the iPad Pro becomes poorly named, but I can't solve all of Apple's self-created problems.
If I were Apple, this is the Mac product matrix I'd have when the dust settles:
MacBook: First to move, as it's perfect for the A12X since it already has only one USB-C port and is due for a refresh. Drop the Intel tax and now it's around $1099. Apple could even use the exact panel from the 12.9-inch iPad, minus the touch gear.
MacBook Air: would need the next-generation A chip to support more USB-C ports and more RAM. New sub-$1000 price for the 128 GB model and outrageously long battery life for web browsing or note taking. No need to make it any thinner or lighter.
MacBook Pro: Kill the weird non-Touch Bar model that was clearly supposed to be the new Air but was priced way, way too high. Spec-bump the 13- and 15-inch, especially the discrete GPUs.
iMac: Switch to ARM, or kill it and replace it with a giant beautiful screen that the iPad and iPhone dock with.
iMac Pro: Spec bump, but this is as close to a perfect device as Apple has released in a long time.
Mac mini: Use the Apple TV case to build an ARM Mac with an A12X or higher that can be sold quite cheap. Use it as great PR: give away Xcode development systems to schools and developing nations and get Swift into the hands of people learning to develop applications.
Mac mini: Relabel as Mac mini Pro and pretty much keep as is.
Mac Pro: Make it unbelievably expensive but also user-repairable, with multiple GPUs on their own standard cards, tons of RAM, and big Xeon chips.
This eloquently summarizes much of the disconnect customers have been feeling about the Mac product lineup. Jobs never would have allowed this to happen.
I'm thinking you're going to have a hybrid architecture for the MBP. The T2 will expand to support all of Apple's own software plus all updated software sold through a revamped Mac App Store (plus, hopefully, your own compiled stuff when security settings are off). It can power down (all but ~2 cores) and instead power up an x86 coprocessor that supports everything else. If Intel doesn't deliver that, AMD will. For the MacBook I think you're on the right track, except that the Air will be replaced with a 13-inch MacBook model with the same architecture.
From an Apple perspective you could easily justify the price of the old Intel chips: 1. Intel has a 2+ year node advantage which you can't get anywhere else even if you pay. 2. Intel has the best performance-per-watt CPU on the market, and also the highest-performance core on the market; you can't get it anywhere else even if you pay. 3. x86 compatibility, which is more like an x86 tax, although you can get it from AMD.
Now the first two are gone. TSMC has edged out Intel on process node, Apple themselves have the best perf/watt work in the A12X, and AMD has proved to be very competitive at the high end. Yet Intel is still charging Apple the same while delivering far less value.
Apple is now being held up by Intel, but I don't think Apple can dump Intel just yet. There are two things holding Apple back.
Thunderbolt - TB is currently still an Intel-only technology. There is no host controller on the market other than Intel's, and they cost a fortune (relatively speaking). Apple has invested a lot in TB, but Intel is making all the same FireWire mistakes. Maybe Apple is working on a USB 4.0 solution and will dump TB once and for all in 2020. Intel promised to make TB an open standard in 2018 but has yet to do so.
Modem - Apple relies on Intel modems for the iPhone, which is Apple's bread and butter. Before any move on the Mac side, Apple will need to think about the consequences for the modem. In the worst case, Apple switches away from x86 and Intel decides to hike the price of its modems. I think the revenue from modems is roughly the same as the revenue from x86 for the Mac. Intel's 10nm isn't performing, and Apple is not happy with Intel on either front.
I hope I’m wrong though.
While I have seen many laud the iPad Pro for its power, I haven't seen any mention of heat or of how long it can sustain a workload. Personally I do not want to see the Mac line change processors again, for two reasons: as I mentioned before, many of us have a lot invested in software that runs on OSX as well as Windows that these machines can run; second, I don't need that wall to go any higher.
Would moving off of traditional desktop CPUs harm that? Is there a way to do compatibility at the OS level without sacrificing half of the performance gains?
I think this comment underestimates the importance of performance to the iPad and Apple's long-term vision for it. While "real Photoshop" won't be ready until next year, Apple is clearly aiming to make the iPad a real solution for compute-heavy graphics tasks. More will undoubtedly follow. Why not edit video on a film set on the iPad? I can see it happening. Your comment sort of makes it sound like Apple just threw these chips into the iPad as a PR strategy and the "real plan" is to transition the Mac. But Apple's plan is to make both of these pro product lines as beefy and power-efficient as possible and target real professional creative workflows.
Where's the example of software that truly shines on these chips? Where's the software like After Effects, Houdini, or Octane Render, where you can truly see the power of the machine rev up on the right hardware?
They're wheeling out Photoshop as proof of this device's power, yet as a designer I certainly don't consider Photoshop a heavy piece of software anymore, and the only reason it ever chugs is that it doesn't use the machine's power effectively: it's mostly single-core and disk-speed constrained.
This power has been available to iPad developers for a few years now, so shouldn't we be seeing truly powerful pro apps that take advantage of it emerging? Are these chips actually powerful for real-world pro tasks, or are they just talented at producing Geekbench scores?
I guess it depends on what you mean by "ready" how long the list is, but I'd suggest that Photoshop, console-quality games, AutoCAD, video editing tools like iMovie, all can take advantage of the speed.
What console-quality games actually ship on the iPad?
The problem with games is that to actually match an Xbox One S's graphics you don't just need to match its 5-year-old hardware in performance. You also need the capacity to actually fit the game and its textures, all 40-80 GB of it.
Who is going to ship an actual console-quality game at console-quality sizes on a device whose base model can't even hold it?
And Apple is notoriously stingy with RAM. This new one bumps it up to 6GB, which is nice, but still less than the 8GB in an Xbox One S. How much does iOS reserve of that, and how much do the games actually get?
CPU:
https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-re...
GPU:
https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-re...
There's little technical detail that wasn't in the iPad review's benchmarks[1] and previous speculation about using Apple chips in desktop machines[2].
Everything new, they gave no details on, and the only in-depth answers were to hard-hitting questions like "you could have made a slow chip, why did you decide to make a fast one instead?" and "why is Apple so good at teamwork?" (maybe not the questions that were asked, but they were the questions that were answered :)
[1] https://arstechnica.com/gadgets/2018/11/2018-ipad-pro-review...
[2] https://arstechnica.com/gadgets/2018/04/apple-is-exploring-m...
If I were to speculate, due to his decades of industry experience evaluating hardware platforms, his role in Apple is to provide strategic direction and guidance on how to build the best hardware platforms. I can't think of any other role they would have wanted him for... which could be a failure of imagination on my part.
So... no, it's not "Apple marketing". Anand Shimpi is involved!
That's the feeling I got, as well as perhaps being a human abstraction layer between the engineers and the executives. If you can describe in relatively understandable terms something that is technically difficult to thousands of laypeople (well, that's unfair - Anandtech was for nerds but being a nerd doesn't make you an IC/EE engineer), you'd be an asset to both the corporate and engineering teams.
And that translation works both ways - understanding the direction of the company with regards to future products vs what you want to get out of your silicon teams (e.g. when Anand mentions thermal envelopes, that might include an understanding of the limitations stemming from the potential form/design and material of a future product).
I wonder how that colors the reviews - if the outlet is too critical about a device, they can quickly lose these special privileges.
They're already shipping their custom T2 chips in their laptops. The compiler toolchain can build great binaries for their A chips. They've swapped CPU architectures before, and the modern Mach-O binary format can hold versions of the executable built for different architectures.
They will probably need a Rosetta equivalent to emulate x86 for all the applications that are slow to switch. That might be tricky because of the huge surface area of the x86 instruction set. But I think it's only a matter of time. It might also explain why they have kept both the MacBook and MacBook Air product lines: they might want one of them to stay with Intel's CPUs and the other to switch to their A-series chips going forward. Or maybe they'll just wait another generation or two and switch CPUs across their whole line in one go.
Well, and all the applications that won't switch. Also, even on the Mac, virtualization/containerization is not nothing. The Mac is a different market and use profile than iOS, and while Apple, based on past history, won't support an old arch indefinitely, neither are they likely to completely blow off backwards compatibility. Compared to previous transitions, dropping x86 would have extra complexities as well, so previous experience may not be entirely applicable. In particular, Apple would be moving away from the full-fat computer standard rather than toward one or staying put, which may change the payoff for users despite Apple being much bigger. The absolute performance differences (immediate and future) also aren't likely to be as big.
I don't want to underestimate them, and huge disruption is inevitably coming down the pipe anyway and Arm may well emerge a winner there regardless, but it's also just a really big challenge.
>That might be tricky because of the huge surface area of the x86 instruction set.
Transmeta was able to do a decent job, and I think Novafora is still around and licensing their IP? Granted, a lot of instructions have been added since then, but Apple certainly has a lot of expertise there as well, and a great deal of capital to aim at the problem.
To think that Apple doesn't already have A-series laptops running deep in their labs, and hasn't for a while, is not thinking like Apple would.
Anyway, Apple has never been one for smooth transitions. Their history is dotted with big, bold changes. If they kept x86 they would slow the adoption of their new architecture. Apple will likely take a "take it or leave it" attitude like they did with the CD drive and the headphone jack.
Despite their ongoing efforts to make the iPad more capable, I think and hope they'll recognize the value in keeping it simple enough for anyone to use, and thus having a separate macOS experience with more tools/flexibility in a laptop/desktop form factor.
Depends on what you mean by "much". They have made good backwards compatibility an important part of every single architecture transition so far, and on the Mac there were good 3-4 year official transitions at least (the 68k emulator still ran under Blue Box/Classic Environment, so it lasted through 10.4 Tiger; Rosetta lasted through 10.6 Snow Leopard). And that's official; in practice there have continued to be longer-lasting options.
It's certainly not the degree that Microsoft has traditionally cared, but it's not at all been blown off either.
When you strip away all the stuff a laptop doesn't need, you're left with... an x86 chip!
You're left with an ARM chip
FaceID is not solely dependent on ML, it's also managed by the secure enclave co-processor which is also used for Touch ID which is available on Macs now. ML helps to reduce false positives.
Apple's T2 chip is an ARM-based processor that's already in almost all of the newest Macs. It's used as a storage controller (which allows Apple to encrypt the drive quickly and transparently), for secure enclave processing (Touch ID on the MBA), Siri processing, and more. Every year, more and more of the processing moves to Apple's T-series co-processor.
Apple's custom silicon lets them integrate software and hardware at a deeper level. Intel develops CPUs for the mass market; Apple develops for their own customers only.
With Apple's focus on on-device ML, I would guess this will be the first part of the A-series trifecta (CPU, GPU, Neural) to be included on a Mac and exposed to developers. I can imagine a bunch of possibilities for such a chip, not just FaceID.
Because of the sandboxed nature of iOS and the current immaturity of the Files app, exchanging files between apps on the iPad is hard and inconsistent. As a result, you're still mostly stuck using one app at a time to do things in iOS. The workflow is still fragmented.
[1] https://boingboing.net/2018/11/06/ipad-pro-deemed-amazing-fo...
In other words: if you want a PC, get a PC.
But from a software usability standpoint, it's not.
On that MacBook Pro I could run several VMware sessions running Windows and Linux (I have run three at the same time in the past). I can run Handbrake encoding videos across cores while still browsing the net in Chrome with four windows open, each in a different profile, each with 5 to 20 tabs. Have four terminal windows open, at least one of them serving a dev webpage. Run VS Code and Unity and Visual Studio and other stuff all at once. I've also done things like compile Chrome from source. Run Xcode, run 2-3 iOS simulators.
I get that an A12X can't do those exact tasks as it's not the same instruction set but could it do the equivalent and get similar perf?
That's amazing if true. An iPad Pro weighs 1/4th of my MBP (2014). My MBP's fans spin like crazy when running a high intensity app and the case gets too hot to touch.
I'd love to believe a machine that has no fans and doesn't get hot and weighs 1/4 as much could actually have the same or more perf for real but when I actually use an iPad it rarely feels as fast and given it doesn't multitask well there's no way for me to check that perf is really comparable in real world use cases.
Anyone have any insight? Is it just that the chip was designed to be more efficient, so it can match or exceed the i7 in my MBP? Should Amazon be filling their AWS racks with A12X-based machines that get the same perf at much less heat and power? (Yeah, I know they can't buy A12X chips, but still.) Don't iPhones and, say, top Samsung phones generally show similar perf?
But your workload doesn't seem to include those things, so hard to say. Critically A12X is unlikely to have hardware virtualization support, so your use case of VMware would be slow even if it wasn't doing any binary translation.
Also your MBP's fans spin because it's trying to achieve higher sustained performance. Typically mobile devices will just instead thermal throttle hard. Like, lose half their performance hard. How well can the iPad Pro sustain its performance? That's a real big question.
Re AWS racks: No, they can't. The A12X in the server world would be a joke. It'd be competing against things like AMD's Rome, which is 64 cores / 128 threads and, this part is critical, up to 4 TB of RAM with 128 PCIe lanes. Even if the A12X could compete on raw CPU throughput, it can't compete on I/O, virtualization, etc. The A12X is also going to be pulling a lot more power than you might expect. It's not that much more power-efficient.
Best of both worlds.
Don't need x86? Don't spin up the Intel chip. Doing something that requires x86? Spin it up.
This way you get software compatibility along with the power sipping of the ARM CPU.
I had a PowerMac 6100/60 with the DOS Compatibility card that had its own sound chip, video controller and optionally RAM.
My 6100/60 had 24MB of RAM and the card had 32MB of RAM.
Before that, I had an LCII with a ‘//e card.
I doubt that modern Apple would ship a hybrid x86/Arm laptop though.
Now here's my prediction: Apple does not really want to build an ARM-powered MBP. Instead, they will eventually allow iPads to dual-boot into iOS and/or macOS.
Call me crazy, but this would be huge. Of course, Apple would still build traditional laptops, maybe even with ARM processors in them, but only as a byproduct of their iPhone / iPad product line.
You can't use OSX with your finger. There are millions of places across the OS and applications where the hit target is too small for a finger. Just compare the keys on the iOS keyboard to the traffic lights on OSX.
The MBP's are also battery-powered & thin?
But you seem to be taking Geekbench 4 here as gospel. I'd take that with a grain of salt. A really, really big grain of salt.
Even with that said I'm not even seeing any MBP 2017 results in the article...?
They give the impression, by calling it a custom GPU, that it's a from-scratch in-house design, but that's unlikely to be the case.
Moving to a "custom GPU" was basically Apple saying "Ok, thanks, we are taking over from here".
All this power sounds great, but I really don't know what else I'd do with it, beyond surfing an ad-riddled internet on my couch.
If any developer has a life-changing daily use case for their iPad, I'd love to hear it.