This sums up perfectly my thoughts about Apple hardware right now. I am annoyed by the fact that I won't be able to connect any of my devices to the new MacBook Pro, but I'll buy one anyway, because I need to get things done to earn money.
I also agree that the pro market is ripe for disruption (again). Interestingly enough, Apple began its rise to stardom from the pro segment, which it is now abandoning. Foolishly, I think, because it's a relatively easy disruption path for the next company.
Indeed they don't quit. Quite the opposite, pros dump sub-optimal tools ASAP and buy something decent to finish the job. The productivity of a tool that meets requirements hugely outweighs any additional cost.
In the UK Draper brand hand-tools were well regarded until the 2000s, when they started applying their brand to cheaply-manufactured tools from China. Word of insufficient quality and durability spread amongst 'pro' users who quickly stopped buying them.
Draper later tried to reintroduce a 'Draper Pro' line with higher quality, but the brand was irreparably tainted. Now they're mainly found in the hands of occasional DIYers who remember the brand name.
Nearly a century of brand value destroyed in a few years, to no positive purpose whatsoever. What a waste.
I kind of hope so.
I am not happy with Apple's latest moves with their Mac lineup. I have been using my current MacBook Pro for six years. I expected to buy one of the newly announced Macs, but they really aren't speaking my language anymore. So I ordered a refurbished MacBook Pro from 2015. It's still not what I would prefer: the battery is glued in, the RAM can't be upgraded, it doesn't have the latest processor, etc. But at least it still has USB-A ports, MagSafe, and a physical trackpad. I guess I've now bought myself another six years' worth of time to see if Apple comes up with something more to my liking.
If someone were to introduce an entirely new and exciting platform, with ways to make money in its ecosystem for a programmer like me, I would strongly consider it. I've done it before. Way back in 1999, I abandoned Windows and switched to BeOS. I worked for Be, Inc. for almost a year, before they went out of business.
Even though it didn't go so well last time, I would make the jump to a new platform, a second time, in a heartbeat.
My ideal situation would be to migrate to PC hardware, but the problem is I'm dissatisfied with the OS situation there. Windows is still annoying to use after all these years, the Linux desktop experience seems perpetually behind OS X and even Windows, and making a Hackintosh is an EULA violation (which matters to me since I use my computers in public, professional environments). I agonized over this more than three years ago before eventually succumbing and buying a MacBook Air, despite my dissatisfaction with Apple's trend toward non-upgradeable computers. I don't want to buy another non-upgradeable computer, but I don't like Windows or the Linux desktop, either.
I would be very interested in some sort of alternative OS, and I think the time is ripe for the development of one.
Microsoft Surface Book
Razer Blade Stealth
Dell XPS 13
Unless you are sticking with the macOS ecosystem for old time's sake, it's trivial to switch to a Windows workflow.
Or Linux for that matter.
* RISC-V
* Rust
* WebAssembly
The first, RISC-V, breaks with existing architectures. The spec is open, and public feedback is largely positive. Money has been committed to real implementations on silicon. There is common tooling already, and it's expected to grow more robust.
Rust sits in the middle, iterating on systems-level concerns with a much higher standard of compiler technology. Everyone praises its community and the responsiveness of its developers. It isn't the only contender trying to displace C and C++, but it has much of the momentum.
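To make those "systems-level concerns" concrete, here is a minimal sketch (my example, not from the comment) of the kind of aliasing bug Rust's compiler rejects at build time, where C or C++ would compile it silently:

```rust
// Sketch: the borrow checker forbids mutating a Vec while a reference
// into its buffer is still live, preventing the dangling pointer that
// a reallocating `push_back` could create in C++.
fn main() {
    let mut v = vec![1, 2, 3];
    let first = &v[0];          // shared borrow into the Vec's buffer
    // v.push(4);               // uncommenting this fails to compile:
    //                          // cannot borrow `v` as mutable while
    //                          // `first` still borrows it immutably
    println!("first = {}", first);
    v.push(4);                  // fine here: `first` is no longer used
    assert_eq!(v.len(), 4);
}
```

The error is caught before the program ever runs, which is the "higher standard of compiler tech" being praised: whole bug classes become compile errors instead of crash reports.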
And, of course, WebAssembly doesn't fix the Web, but it does return us to the idea of what Java was supposed to be 20 years ago: a common layer for sandboxed application code. Of the three, it is probably the least established, but it is also getting a level of care and cooperation well above average for Web technologies, and it shows early signs of adoption outside the browser. With such a capable client runtime, both the existing Web and desktop paradigms become open to disruption, as is already happening in nascent form with the current wave of "desktop framework, browser engine inside" apps.
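As a sketch of what "a common layer for sandboxed application code" means in practice (my example, assuming the usual `wasm32-unknown-unknown` workflow, not something from the comment), the same Rust function can run natively or be built into a WebAssembly module and called by name from any compliant host, in or out of a browser:

```rust
// Sketch: a function exported with an unmangled C ABI so a WebAssembly
// host (a browser, wasmtime, etc.) can look it up by name once the
// crate is built with `--target wasm32-unknown-unknown`. The same
// source is plain Rust and runs natively too.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    println!("add(2, 3) = {}", add(2, 3));
}
```

The host decides what the module may touch; the module itself has no ambient access to files, network, or memory outside its sandbox, which is exactly the property Java applets promised and rarely delivered.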
When you put together all three, you have a much more robust stack, something that you can really imagine the future of computing on. It has missing parts, but that might be where you and I come in.
I want better hardware, but as a pro software developer, I can't really adopt a fundamentally different operating system, even if it is "better."
I don't care how nice your hand-crafted kernel is; if I can't run a JVM, Node.js, and the Android toolchain on it, I can't do my professional work on it.
Sometimes a radical shift is what's needed. Apple is the new Microsoft and might just be ripe for a disturbance.
Then, on top, you can launch "classic" OSes like Windows, Linux, etc., as well as the new OS.
https://www.crowdsupply.com/raptor-computing-systems/talos-s...
I've had a chance to test some POWER8 systems, and they perform very well. Given the choice, I'd take one over a Xeon system for intensive professional work.
* Robustness
* Upgradeability
* Longevity
* Reasonable performance
Having the latest and greatest isn't essential. I've had my current pair of machines, a ThinkPad and a custom-built desktop, for a few years now, and after some upgrades (new SSDs, maybe a new GPU and extra RAM for the desktop, etc.) I expect to use them for a few more, maybe even until 2020, if nothing blows out. They perform reasonably well for my needs and are very versatile machines. Right now I can replace the HDD, upgrade the M.2 SSD, put a new screen on my laptop, and replace the battery myself. That is worth a lot to me, and I'm willing to pay more for a machine that isn't a sealed monolith; being sealed is the biggest anti-feature for me. I grew up back when you could just open up your PC case and swap components out, and I'm absolutely unwilling to give that up. I don't really want to buy a new laptop every two years; if I do now, it will be because I want one, not because I need one.
What laptop do you have where the screen is upgradeable? Or are you talking about doing it with a standard laptop, opening it up and replacing the panel? Is that usually viable with laptops?
The main reason I prefer my current MBP to my previous Toshiba Portege is, tbh, that the screen is not a shitty 1366x768. I never even thought about replacing it.
I know not all workloads can work this way, but a lot of it can.
There just isn't a need to do this stuff anymore. Even building your own computer is expensive, confusing, and generally not fun these days.
It's similar for laptops: replacing batteries, upgrading RAM, and upgrading storage are very common.
You really think that if you blow $2k or $3k on a new laptop, at no point during its life will you want a larger SSD or more than 16 GB of RAM?
Sure, it might have made it a millimeter or two thicker; that's definitely a price I'd pay for something that's easier to fix (or have fixed) or upgrade.
* Changed the RAM (on both)
* Changed the battery (both)
* Switched an HDD to an SSD
* Replaced a touchpad that died
* Did some work on a screen that is slowly dying
I'm quite happy to do that on computers that are between 6 and 9 years old; it means I can get the most out of them without paying for a new one each time I need just a little bit more. I would be completely unable to do that with a newer MBP.
Not everyone wants to build their own, of course, but saying it's expensive, confusing, and not fun isn't really true.
- Easier to repair / upgrade. Don't glue or solder in parts if it is not absolutely necessary. If the device gets a few mm thicker, it's not the end of the world.
- Specifically, make the battery removable. There is a battery capacity limit for flights in the US, and I've heard this is one reason the new MacBook Pro has the specs it has. A swappable battery would be a way around it.
- Be completely honest about your incentives, and then side with the customer. Say: "We would like to glue down everything, so you can't repair it and have to buy a new one when the battery fails - but we won't." This is a business disadvantage in the short term, but you gain trust and can charge more to customers who know what's important.
- Give it a matte, high-dpi touchscreen. Note, I don't mean mattED, where you stick a matting foil on top of a regular glass screen. I mean native matte, like good business desktop LCD screens.
- Here is an important, overlooked point: Make it "just work". In the sense of "software eats the world", you can do a lot by getting the software right. Have a dedicated team make sure that all popular OSes (Linuxes, Windows) work properly.
- A good keyboard is really important. Let the keys have enough travel, make sure that they have standard sizes, that cursor and home/end keys are easily reachable.
- Give customers the ports they need, or at least make the dongles cheap. Offer a docking station, or recommend one.
(- And if you want to create a Myth, source your components (graphics, WiFi card, touchpad) cleverly, and people will be able to make Hackintoshes out of your machines. Just be careful never to advertise this, or to give instructions :-D.)
I believe a smaller vendor could pull this off nowadays. Even if you don't have the economies of scale, you are selling to a pro segment that is willing to pay more for a "no-compromises" device.
(Edit: cleaned up)
The first reason is that a removable battery needs two enclosures instead of one, adding something like a fifth of an inch to the thickness. The second reason is that Apple has switched to shaped batteries that fill basically any available space, which makes removability as good as impossible.
A third reason is that nobody actually wants removable batteries. An MBP gets 10 hours of battery life, 13 if you're on a plane and turn off the wireless. Add an hour of food service to it and you're good to go around the world.
If that's not enough, you can just get an external battery.
Swapping day-to-day isn't the use case, avoiding a day in the shop when it's time to replace the degraded battery is.
[1] https://twitter.com/worrydream/status/791767756928462848
[2] https://twitter.com/worrydream/status/793501918790242304
If it took you two years to get that far, you may want to find some less stressful topics to ponder. Also, please enlighten me: what exactly is Microsoft's previous investment in "pro hardware"? And what "pro software" has Microsoft been investing in in the past that targets "video editors, 3D modelers, audio engineers, data scientists"?
WTF? This article is almost literally "The MacBook sucks, my friend agrees and we've been throwing around buzzwords and then we stopped"
Seriously: there's no coherent thought in this "article". I don't even know what these so-called "professionals" are missing in the author's view.
" fast machines with plenty of memory and myriad ways of moving data in, out, and around them." – Well, yeah, fast is great. But it's not Apple's fault that CPU speeds are stagnating. It's simply approaching physical limits, as well as CPUs having reached a level of performance where people prefer to invest resources into power efficiency.
As one of those so-called "data scientists", I'll also let you in on a trade secret: the stuff I do on a notebook could comfortably run on a phone. It's a text editor, a browser, and ssh. That's because we don't do number crunching on a notebook; that runs on a cluster, or sometimes a workstation with a couple of GPUs.
Everybody also seems to miss that we've seen an actual leap in notebook performance: SSDs had a huge impact because HDDs were (by far) the limiting factor for almost all workloads.
Regarding the "myriad ways to move data around" – no thanks. Nowadays I'd consider it quite a failure to ever have actual data on a notebook. But I'd guess even if you're working on local data, USB 3.1 and Thunderbolt are probably what you'd want to use?
"Pros don’t quit because their tools are suboptimal."
Yeah, they do. Give someone a shovel and ask them to dig a tunnel.
"That’s practically the definition of “professional” – a pro gets the damn thing done."
No, the definition of a professional is "getting paid", which, by the way, separates them from your little thought experiment. Alternatively, "professional" is slang for a prostitute, which actually does fit your definition of "getting the damn thing done", so maybe I've been reading this wrong.
"That cycle of dependence, along with the need for stability and predictability in one’s tools, makes product incrementalism the norm in pro computing."
I still don't know what "pro computing" is, but surely "pros" today are using the same operating systems as "non-pros"? So non-professional computing is also moving incrementally, right? Then I don't get why you're trying to derive some sort of causality ("need for stability...") that's specific to one of the two segments when they move in parallel.
"It should be no surprise as to why nobody has attempted the sort of ground-up overhauling of pro computing that we mapped out: it’s expensive, slow, and risky to do something big, new, and different."
Or maybe it's just stupid. Because our tools are pretty good (being the product of actual professionals "getting the job done") and there's no reason to throw them out for unnamed pie-in-the-sky fantasies.
"Getting the job done" is the pretty important aspect for me here. And the reason why I've (again) bought a new MacBook. I am a freelance software developer. I can use all the tools I need on a Mac and they work fine. I generally don't care about hardware or software failures - maybe I was lucky, but my previous Macs never disappointed me in the last years. I don't have to spend time configuring drivers or fighting updates.
Another aspect is the look and feel of a Mac. When I am onsite at clients, giving presentations or acquiring jobs, it makes a huge difference if I have a Mac or Dell/HP/Lenovo/whatever notebook standing on the table. People, especially managers, mostly value appearance more than competence - sad but true.
Limited technology (pre-HDD) accidentally gave us privacy (data on disconnected floppy disks) and security (OSs on ROMs).
We may not have corporate-sponsored basic research on the same scale, but we do have the Internet. We should be able to leverage that advantage to some effect :)
Your plan means I'm signing up for a period of pain of unknown duration. It could be four hours or two weeks. And for the next three years I might have a laptop that doesn't sleep properly when I close the lid, or that can't talk to the printer at the office but there's a forum thread somewhere where somebody thinks they solved it.
One of the characteristics of any "pro" market is that these are people who simply aren't going to waste their time messing around with something when there's an alternative that just works and lets them go back to doing whatever it is they do that earns money.
It depends on what kind of work you are doing, but for me this is not true. I need to do a lot of unix-ish development, so that means installing Xcode (OK), installing Homebrew (or MacPorts or Fink, and finding out which one is better), Sublime (or emacs or vi), and a bunch of other tools. I need a week or so until I get it where I want it. This is not too different from my experience on Linux or Windows. Installing Linux is the easiest part; in fact, in most workplaces, you just hand it to the IT department and they do it for you in a couple of hours.
1: E.g., System76.
Everyone calls themselves pros, but nobody wants to get their hands dirty with Linux. Basically everyone's a pro-sumer.
When I go to the mechanic he doesn't care if the tools get him dirty, he uses the best tools for the job, not for his clothes. I get that some people want to look at gorgeous UIs but nobody told me how that gets the work done.
I want to use the best tool, and I try to define "best" as objectively as possible. Faster CPUs are better. Native support for Docker is better. A walled garden is worse. macOS is better if you do iOS apps. UI look & feel is debatable. Preference in terminal emulator is debatable. More memory is better. If you do devops and ssh onto Linux boxes, Linux is better. Etc.
There is absolutely no need for someone writing software to ever open up his computer, or to compile the kernel they're running natively (if you're actually working at kernel-level, you'll do most of it in VMs). All that stuff can be fun, no doubt. But it's your hobby, and really no reason to feel like a superior "professional" vs. the lower classes of "prosumers".
I tried to switch, but there is basically no professional software on Linux for non-server tasks. Just replacing something like Fantastical (calendar) or OmniFocus (GTD) is hard or impossible. Other pain points are replacements for DEVONthink (reference manager), Evernote (note taking), Papers (bibliography/search), Lightroom (photo management and editing), and Keyboard Maestro (macros). The list just goes on and on.
If I'm getting paid, I'm not spending the time fixing all the small issues that pop up with Arch. I love it otherwise. Sometimes it's fun to blow a few hours on a weekend fixing some random bug that has only ever happened to you. Sometimes it's not fun and just really inconvenient. I always keep a backup Windows partition in case something bad happens and I really need to use a computer for a bit, or if I just need to run some random Windows-only app.
The surefire way I can think of right now is trying to drag a window from one desktop to another in the overview.
I'm quite good at breaking software; I'm considering doing a semi-regular video on my favourite things that break.
(Speaking as a current user of an old MacBook Pro which is my first and last Apple product. Also I had an iPad but sold it long ago.)