Well, rarely completely forbidden, but e.g. I think OpenBSD has been W^X by default for quite some time (though IIRC with a WX allowed flag per… FS mount?). Now on FreeBSD it's not default but it's there, and if you turn it on, you have to mark WX-mapping binaries by running `elfctl -e +wxneeded`.
Firefox actually became W^X compliant all the way back in 2015: https://jandemooij.nl/blog/wx-jit-code-enabled-in-firefox/
> arguments about third party browser security on iOS
Well, W^X is just one mitigation technique. But also, the "security" arguments have always been kinda dubious. I don't think there's that much difference (at least philosophically) between an interpreter bug causing arbitrary crap to happen inside your app's sandbox and a JIT bug doing the same.
I tried porting a much simpler JIT to M1 and ran into the problem that Rosetta 2 was simply better at translating an AMD64 JIT than my attempt at a JIT. It could’ve been related to W^X performance, but I actually suspect the real answer is that Rosetta’s optimization passes were doing things the JIT did not do natively. I don’t know how to debug that, though, because from the debugger’s PoV, emulated processes look just like native Intel processes.
* The 9900K is boosting to 5 GHz, which sacrifices efficiency.
* The 9900K PC is delivering a much higher framerate, so it'd also have much higher GPU utilisation.
* AFAIK the RTX 3090 will have high power draw even at low utilisation (large card, lots of memory).
From anandtech:
>Should users be interested, in our testing at 4C/4T and 3.0 GHz, the Core i9-9900K only hit 23W power. Doubling the cores and adding another 50%+ to the frequency causes an almost 7x increase in power consumption.
https://www.anandtech.com/show/13400/intel-9th-gen-core-i9-9...
Look at the 3090s power consumption during media playback: https://www.techpowerup.com/review/zotac-geforce-rtx-3090-tr...
And the GPU is much faster than the Intel integrated graphics.
> We really didn't expect this to work or we probably would have tried it sooner.
The only game Rosetta beats native on is Rogue Squadron 2. Since Dolphin is a JIT, this seems to be a case where Rosetta's JIT is smarter about which ARM instructions to choose when translating Dolphin's x86 output than Dolphin's own backend is when translating the emulated PPC instructions directly.
Unless your comparison is the 8559h and not the "native" bar. I mean, the 8559h is a mid-range older Intel CPU, and it's hard to overstate how much Intel has stagnated since Sandy Bridge (and especially since Skylake).
According to the article, the AArch64 JIT isn’t as complete as the x86 one so some less common instructions are emulated, not JITed. I imagine a game that uses a lot of these is slower with the native ARM version.
From reading this blog, GameCube games often made heavy use of the memory-sharing capability of the hardware, which made emulation on PCs a performance challenge.
From my understanding it's not that useful for reading back either as the main bottleneck there is the fact that you need to sync gpu and cpu rather than transfer speed.
A problem that I don’t think applies to consoles is that GPUs don’t use the same texture format CPUs do - they swizzle them in proprietary ways and it needs conversion even if there are no memory transfers.
And why would software rendering be fast on the M1?
No one was competing for fastest single thread because no one needs it.
Well maybe marketers need it.
A big part of the reason why web applications are so darn fast on the M1 is its single-threaded performance. Remember that JavaScript itself is single-threaded.
Engineers making pronouncements like this bring me no end of amusement. Surely we've learned by now that we're pretty terrible at these sorts of guesses?
Ok, never played that game. I'd better not try playing it on my ancient Intel-powered laptop ...
Different games run differently well on Dolphin if you have older hardware. While Mario Kart: Double Dash runs perfectly fine in full screen, "F-Zero GX" suffers massive slowdown in some levels on my 7-year-old CPU/GPU combination. Interestingly, both games employ the "heated air" effect on similar-looking levels, but I still get 40 FPS vs. 60 FPS in that case. I wouldn't mind, but the sound needs to be in sync with the graphics subsystem on the GameCube; audio is broken at even slightly lower frame rates, unfortunately.
There isn't one. Apple's silicon team is at least 1-2 years ahead of all the other ARM vendors when looking at mobile performance, and none of those vendors are even trying to do anything in the desktop space (yet).
the m1 is the renaissance of laptops
This performance would've been available on iPads years ago if it wasn't for Apple's blanket ban on JIT and the like.
Apple is one of those companies whose hardware I'd love to have if it wasn't for their software and general corporate decisions. Until I can run a proper version of Firefox on iPad, I'll have to stick with the objectively inferior hardware for the coming years.
It is there to prevent apps circumventing the review process and security model ie. apps pretending to do X during the review process and then doing Y when in use or obfuscating their use of private APIs. Now you can argue these restrictions are unreasonable but many of us don’t want our iPads or iPhones to be like our computers.
And many of us do. I would never buy a locked-down piece of hardware like that. But I don't think it matters either way what either side wants, because it's what Apple wants that matters. They want to keep their walled garden's walls air tight, and there are apparently enough people that are OK living in that garden that it works.
I'm positive that they have done the calculus that they'll make more money in the long/short term by behaving this way. Google did a similar calculus, with a different set of values (if not an entirely different set of variables altogether) and came up with a different answer. Although it's interesting to see how their position has shifted over the years to be a bit more like Apple in some regards. Regardless, the point is they don't care what you want once they've gotten to the point of getting your money. Past that, they only care about maximizing their profit.
It seems to me a rather simple fix: give users an "unrestricted mode" just like Android has the ability to install from third party. By default keep it locked down, but allow the USER to make that decision, with ample warnings all over the place about what they're about to do.
Heck, for all I care make them go to an Apple store to have it "unlocked" so an employee can walk them through what it actually means and how dangerous it is so the average joe schmoe doesn't just click the button by accident.
You can run whatever code you want. Doesn't matter whether it has a JIT, or whether it loads all its code from a webserver dynamically, or anything else.
The sole criterion is "thou shalt not circumvent the app store review process." That means, do not change the functionality after they've reviewed it.
The ban on third party browsers and JIT is so that you cannot make fully-featured or competitive apps that don't go through the store. Microsoft tried something similar in Windows 8 (certain DirectX features only available for Metro apps, strict guidelines what a Metro browser can do, ...). This is the reason Safari on iOS is lacking certain features wrt. PWAs, and the reason Flash was banned outright (instead of saying e.g. it has to be made more reliable).
If web apps were as powerful on iOS as they are on Chrome or ChromeOS, then many iOS apps including games would be written as web apps, and Apple would not get their 30% share. If someone would port a JVM or .NET CLR to iOS, then you could sideload those apps and circumvent the app store, too.
Doing this is still dead simple and in no way requires a JIT.
Well, there is no ban on interpreted code, only on JIT (just-in-time) compilation, which in many cases produces high-quality (think gcc -O2) code.
If Apple was actually concerned about circumventing the App Store review process for the purposes of security, they would implement OS-level sandboxing and security models (e.g. something capability-like) - this is both far more secure and allows for more freedom to make apps.
But they don't, because it's not about security - it's about profit.
I think the more specific match to what Apple is blocking with these rules is anything that resembles an App Store-like experience. Apple doesn't want anything that can download and run arbitrary apps, because that would dilute their platform control and other advantages. There's an excellent piece about why Apple is so afraid of this (https://stratechery.com/2013/why-doesnt-apple-enable-sustain...).
This motivation provides a more specific match to preventing arbitrary code execution: An App Store-like experience is almost impossible without downloading and executing code. It also matches the exception that Apple provides for "educational apps designed to teach, develop, or allow students to test executable code may, in limited circumstances, download code provided that such code is not used for other purposes" (https://developer.apple.com/app-store/review/guidelines/#2.5...).
Furthermore, this perspective is supported by other policies as well:
1. This is why Apple doesn't allow third-party web rendering engines on the App Store. A third-party web engine could also be used to create an App Store-like experience.
2. See 4.2.7 (https://developer.apple.com/app-store/review/guidelines/#4.2...), the rules around what remote desktop apps can do. These restrictions seem specifically written to prevent remote desktop features from being used to create an App Store-like experience.
So, while I think rule 2.4.2 does help with the goals you listed, if it were just about those goals, these rules would be written differently (e.g., allowing downloading and executing scripting languages). And I think there's more evidence that rule 2.4.2 is more about preventing third-parties from providing App Store-like experiences.
That's completely irrelevant to anyone else's iPad, which has no impact on your iPad's security. Would you be in favor of Apple banning whatever macOS app you use because I don't want to use it?
And what even is the point of having a secure iPad if you're also going to run an insecure computer?
This nails it. I want full control and the ability to run anything on my computer. I want my phone to "just work" and I never want to fuck with it or worry about what's on it. They are different devices with different roles.
What I do wish for is open ARM hardware with similar performance. I am totally certain that it is coming now that Apple has demonstrated just what is possible. Ampere, Samsung, Marvell, etc. are surely working on high performance designs now if they weren't already.
There is nothing magic about what Apple did with the M1. They built a really high performance ARM core by applying a lot of the same things that have been done for high performance in the X86 world but without the X86 dead elephant strapped to their back. The M1 can be duplicated if not exceeded.
You can execute whatever arbitrary code you want on an M1 Mac, up to and including completely custom kernels, if the user sets their Mac to allow such code. It's not locked down, and it's not an iPad or an iPhone. I agree that the iPhone and iPad are unacceptably locked down, but the Mac is not, and there's absolutely no reason to group them together.
You can't run unsigned ARM binaries on M1 Macs. Hell, you can't run un-Notarized apps on any Mac running a recent macOS release without knowing some arcane trick to open them.
On the other hand, I see so many software developers vehemently refuse to notarize their Mac versions. Notarization is far less egregious and it pains me to see so many straight up refuse to even consider it as an alternative to the iron grip of the App Store.
Yes, Android does have some apps that can’t exist on iPhone, but I wouldn’t say that most people find them compelling or care about them. Those that do already have Android.
Good, I'm glad I'm not the only one who does this.
> Notarization is far less egregious and it pains me to see so many straight up refuse to even consider it as an alternative to the iron grip of the App Store.
Notarization requires developers to pay Apple $100 every year if they want to notarize their software.
Sure, Apple did it for selfish reasons and they'll keep their platforms locked up as much as they can get away with, but the end result is a benefit for all as powerful and open RISC systems proliferate.
No, not in the next few years. In 10 years? Maybe. https://news.ycombinator.com/item?id=26917136
The fact is that because Apple sells the iPhone at a high profit margin and earns from the services and software sold on each iPhone, they can afford to stick a big, expensive chip in there. In contrast, profit margins on most Android phones are razor thin. Qualcomm has to design a chip that performs relatively well at a price the market can bear; when their top chip is too expensive, OEMs will just build their phones with one of the lower-tier Snapdragons.
Once you adjust for transistor count, Snapdragons et al are much closer to Apple's A-series than you'd think at first glance.
As for the M1, what shouldn't be discounted is the fact that Apple controls the entire stack, which means they could build in special features into it that together with Rosetta 2 make running X86 relatively performant.
The problem you've got here is that the iOS model has worked incredibly well for Apple. Without that you wouldn't have had the investment that has delivered the M1.
I'd love to be able to run Firefox on the iPad. I also disagree strongly with some of Apple's decisions - especially on the App Store. However, where we are now is a better outcome than a hypothetical position where iOS is less successful and Apple is using inferior CPU designs. After all I can always buy an Android tablet if I want to run Firefox.
> Until I can run a proper version of Firefox on iPad
Firefox for iOS works just fine. The Gecko vs WebKit difference doesn't really matter in practice.
If you want general purpose computing, just get a Mac. You can run Firefox and any other program you’d like. It would be great if there was an opt-in developer mode on iOS that bypassed certain restrictions, but I also understand why Apple chose to go with security and simplicity as 99.9+% of their customers have no need nor desire to go beyond the security and platform restrictions.
I have both an iPad Pro with keyboard case and a MacBook. Even if I could run whatever I wanted on the iPad, I’d still be reaching for the MacBook because it’s just a better physical platform for doing anything other than simple touchscreen and stylus work.
There’s even a way to get W^X memory regions on iOS by abusing ptrace: https://saagarjha.com/blog/2020/02/23/jailed-just-in-time-co...
It can’t be submitted to the App Store or deployed with TestFlight, but you can build and install an app using that hack just fine on your own device.
Open source browser vendors, like Firefox and Chromium, could provide builds that enable a full browser engine experience on iOS devices, were they to think it worth the effort.
Yeah; sorry - it really does - I vastly prefer Gecko’s rendering engine and notice it’s considerably speedier and more responsive on my MacBook Pro side by side to the newest Safari. The app even opens faster.
Not only that, but I imagine there are a ton of web devs here in the comments who have a requirement to test on the actual Firefox, not some light skin on top of the existing WebKit engine with bookmark sync support.
Firefox for iOS is the farthest thing from 'Firefox'. It's Firefox in name only.
I do wonder how the 12” iPad Pro with Magic Keyboard would compare. I haven’t used one yet, but I suspect it would be pretty good. The 12” display seems a little large for a tablet, though.
The power cables are too short and the obsolescence is too planned in modern Apple hardware.
As for RAM, you can get 13" MacBook Pros with 32GB of RAM and 16" MacBook Pros with 64GB of RAM, and if that's not enough you're probably not going to have the easiest time finding options with more from other manufacturers, especially in a form factor that isn't a gigantic brick.
Do you mean that you want a matte screen? If not, Mac screens are generally among the brightest on the market and have very good anti-reflective coatings. Almost any competing laptop in the same price bracket is likely to have a lower quality screen.
I'm fairly confident Apple hardware has a much longer usable lifespan than competing products; that's why they maintain so much value in the second-hand market.
It's not in their interest to sell/provide and support an experience they didn't make. Their success shows that experiences are all that really matter despite what the vocal HN userbase routinely shares.
As though Apple is an open source company selling to developers.
This is the same reason I don't use Microsoft or Google's stuff unless I have to. Knowledge wants to be free.
It's like arguing that we should have no laws whatsoever because they impact your freedom to do whatever you want—but no one wants to live in that society. If you don't like Apple or Google or Microsoft that is still your privilege but arguing that is what everyone should have is disingenuous.
The Internet supports your freedom to say or do whatever but people every day show that without some limits everyone suffers. You might think you are smart enough to defeat all those who will try to take you out, but there are much smarter people than you or I out there and lots more of them, and many of them are evil.
As per the article, those visiting the comments were primarily interested in discussion about JIT performance, comparisons between ARM and x64 instruction sets, GameCube/Wii emulation, etc.
Instead, every single post on HN even tangentially involving Apple is taken over by these self-important haters and their mindless takes, which are often full of false assertions anyhow.
You are a platform war spammer and nothing more. It’s a shame the admins won’t put a stop to this, as it’s turning Hacker News into a vehement cesspool for discussion.
We've marked the GP comment off topic now. Certainly it was an example of a generic top subthread, which are the black holes of HN discussion: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que....
Between this and your other recent flamewar comment on the topic (https://news.ycombinator.com/item?id=26994894), I fear you may be falling prey to the community-bias fallacy that I wrote about in a completely different context earlier today: https://news.ycombinator.com/item?id=27268490. HN isn't pro-Apple or anti-Apple — it has users who fall into both of those camps, as for any $BigCo, but unfortunately they each perceive the community as dominated by the other side, which leads to a lot of very bad flamewar comments from both camps.
Please don't post like that any more. Instead, if you see a generic comment taking a thread badly off topic in a predictable way, let us know at https://news.ycombinator.com/newsguidelines.html so we can downweight it. Giving us a heads-up in such cases is one of the highest-leverage things people can do to improve the quality of HN threads.
The only reason Apple was able to pull off the M1 transition the way they did is BECAUSE they have such control over their ecosystem.
Windows / Microsoft tried with Itanium and ARM with much less success.
The platform war folks always take the most negative view possible, cannot even IMAGINE why apple might have chosen the approach they chose.
Well, the M1 is what happens when you control your platform. PowerPC -> Intel -> ARM: this control has let Apple evolve dramatically.
This is what I am looking forward to. In the case of Apple Silicon, the next generation will be even better, and Apple is not far off from announcing the newer processors that will supersede the M1 at WWDC.
The M1 only shows what's possible for Apple Silicon, and the newer generation of ARM-based Macs will impress further. So I'll skip this one for now and wait to see what WWDC has to offer for the next generation.