A reminder about how Hey Siri privacy works: https://techcrunch.com/2015/09/11/apple-addresses-privacy-qu...
> “In no case is the device recording what the user says or sending that information to Apple before the feature is triggered,” says Apple.
> Instead, audio from the microphone is continuously compared against the model, or pattern, of your personal way of saying ‘Hey Siri’ that you recorded during setup of the feature. Hey Siri requires a match to both the ‘general’ Hey Siri model (how your iPhone thinks the words sound) and the ‘personalized’ model of how you say it. This is to prevent other people’s voices from triggering your phone’s Hey Siri feature by accident.
> Until that match happens, no audio is ever sent off of your iPhone. All of that listening and processing happens locally.
The idea of having a device that listens to everything in your home is unnerving, at least for me.
The problem with saying "they would never do something like this" or "I have nothing to hide, so why do I care?" is that you're trusting people thousands of miles away from you who are heavily influenced by the government...
If you're not comfortable having a policeman sit at your desk while you work on your computer, then it might not be a good idea to support products like these.
Personally, Siri can screw off. I am more than happy to check the weather myself. :)
There is no conspiracy. If you don’t want your phone or other device to listen continuously, do not turn on a service that requires it to do that.
It exists to sell me shit and make people money.
Not nefarious, I accept capitalism. But at least let me turn it off.
How much longer before they release MB/MBPs with the same chip, doing the same thing? How much longer after that before we see SDKs released for writing apps that run on the Mac but are executed entirely within the ARM processor? I think this is coming far sooner than many people think.
This is quite a large technical undertaking, with a very small pay-off given the tiny percentage of users who develop on their Mac for iOS devices. I can't see this being something that happens until x86 CPUs have been removed entirely.
Why? iOS apps are designed for touch. Macs don't have touchscreens.
No need to speculate. Check out the TouchBar MacBooks.
...and making Macs even more closed and proprietary, moving away from the (semi-open, but also quickly closing[1]) PC again. This is huge, and scary.
Apple knows that on desktop computers, the genie is out of the bottle. They wouldn't block booting other OSs, just like they still give options to disable Gatekeeper and SIP despite the FUD.
They only became more open out of necessity.
For fun here's the "transitions" talk Steve Jobs gave announcing the switch from PowerPC to Intel processors. https://www.youtube.com/watch?v=ghdTqnYnFyg
For simple apps this is more or less true.
More complicated cases might not behave the same.
For example, ARMv8 and x86 have different memory models. So execution won't be equivalent. You can have concurrency bugs in your code that only manifest on ARMv8.
I'm sure there are a lot of other examples of differences as well.
Serious work is very inefficient without a proper windowed desktop environment, combined with a Finder and a Terminal that can access files across all apps.
Also, having an x86-compatible CPU at the core enables high enough performance for virtualbox/vmware, which is a huge enabler in that it allows running windows or linux VMs side by side (or even integrated) with the native desktop.
A converged ARM-based Macbook sounds very much like a toy computer; a facebook+youtube media consumption station with an option for perhaps doing limited creative work such as composing an iMovie or touching up some photos at the most.
What you describe as a "toy computer" is actually an indestructible and hella-great main business computer for a huge majority of people.
Most people aren't designers or videographers or programmers -- including most Mac users. What you dismiss as "Facebook" is actually the browser, which is how most business users do their business day in, day out (plus some office suite and so on).
So what you describe will be more like a Chromebook done right.
With all the competition from Microsoft's Surface lineup we'll probably see the iPad shift even more towards the Mac side of interaction as well.
To your second point about performance: have we really seen what a high wattage ARM CPU can do?
For development, what’s the advantage over the current system where your app is compiled for x86? It would still run in something like the simulator app and have similar drawbacks.
I’m more excited for possibilities like FaceID, not to mention other uses for the secure enclave.
Purely speculating from a position of ignorance, but I wonder if this might be able to take over for a variety of coprocessors.
Curious to see whether new Macs from now on will still support Touch ID, given that Apple has said they are abandoning Touch ID on the iPhone. It makes no sense to support Touch ID on just one platform, and the minor one at that.
I don't get this point. How does ARM translate to "walled garden"? I see no reason to tie ARM and the walled garden-approach together, just because that's the way it is on mobile. Non-certified apps would work just fine on OSX if the developers build for ARM.
Assumption 1: Most interesting OSX apps are active ones, where the authors have the source code, and are thus able to cross-compile and publish ARM binaries.
Assumption 2: Due to volume (and using an independent foundry), the A11 chip is at least 10x cheaper than an Intel CPU.
I'd love to see a Macbook with 10xA11s, for a total of 20 high-performance cores, even if it can only run non-legacy apps ("legacy" meaning apps that depend on x86 code somehow).
Does anyone know how much space the four inefficient cores on the A11 take up? Perhaps the A11 volume is just so large that Apple can spit them out at a cost so low that it doesn't matter if four of the cores remain unused? I don't think they'd be of much use in a Macbook -- at least not the 40 of them that would be available in a 10xA11 setup.
Have a look at the die photo at [0]. Not much.
[0]: http://techinsights.com/about-techinsights/overview/blog/app...
From software side they've already done similar things in the past (fat binaries, Rosetta).
A single A10 will not add 50% of computing power, even ignoring its GPU.
Also, according to Geekbench, the single-thread performance of the 2.4 GHz A11 is 70%[1] of the 3.7 GHz i7-8700K. I'm guessing Apple could bring the A11 even closer to the i7 by increasing the clock frequency, which may not make sense for mobile devices (because of disproportionally higher power consumption).
So, again according to the Geekbench figures, four A11s (at mobile clock frequency) are 50% faster than a 6-core i7-8700K[1].
[1] https://browser.geekbench.com/ios_devices/52 https://browser.geekbench.com/processors/2062
One could hope that, due to the negative PR, Intel would give up trying to prevent Apple from delivering 2x the performance at a lower cost. In any case, I think this patent needs to be challenged thoroughly, as it seems to favor only Intel while punishing consumers.
I'm glad they're trying this first in Macs that are expected to always be plugged in, rather than laptops.
Intel ME wasn't bad enough... so Apple builds a second backdoor you can't remove, with the explicit purpose of bugging your room?
No thanks.