1. Laptop
2. Phone
3. Car
4. Washing machine
5. Handheld GPS
6. E-reader
7. TV
Is there some intrinsic difference between a device the manufacturer has programmed using an ARM/x86-based chip vs. a microcontroller vs. some other method that means in the first case I have the right to install whatever I want? Because that feels like what's happened with cell phones: manufacturers started building them with more capable and powerful components to drive the features they wanted to include, and because those components overlapped what we'd seen in desktop computers, we've decided that we have an intrinsic right to treat them like we historically treated those computers.
Phones get a lot of attention in this regard because they've replaced a large amount of PC usage, so locking them down has the effect of substantially reducing computing freedom.
> I'd say that if you figure out how to run software of your choice on them the manufacturer shouldn't be able to legally stop you.
That's already the case. The manufacturer can't come after you for anything you do to your device. They can:
1. Set up their terms of service so that things you do to alter the device are grounds for blocking your access to cloud/connected services that they host on their infrastructure
2. Attempt to make it difficult to run software of your choice.
3. Use legal means to combat specific methods of redistributing tools that defeat the measures in point 2.
For all intents and purposes, a laptop computer and a smartphone are the same kind of device. This is evidenced, for example, by the fact that we run general-purpose "applications" on them (not defined ahead of time), including the most general app of them all: a web browser.
For other device types you bring up, I would go with a very similar distinction: when you can run an open ended app platform like a browser, why not be able to install non-browser based applications as well? Why require going through a vendor to do that?
I'm not saying I dislike the concept of being able to run my own code on my devices. I love it. I do it on several devices, some of which involve circumventing manufacturer restrictions or controls.
I just don't think that because manufacturers started using the same chips in phones as computers, they magically had new requirements applied to them. Phones had app stores before they were built using the same chips. My watch lets me install apps from an app store.
Legislation like the EU Cybersecurity Act will hopefully push this toward a fundamental-rights footing by demanding that devices not go into the trash pile as soon as the vendor stops issuing security updates: that is, by mandating the ability to keep operating these devices without negatively affecting the Internet at large (by, for example, becoming part of a botnet).
This is already possible with many general-compute devices by putting an up-to-date version of GNU/Linux or FreeBSD or... on them. And for a smaller subset of general-compute smartphones, with AOSP-based Android.
I'm asking why taking a device that uses a microcontroller and making a new model with an ARM chipset and a Linux-based OS seems to suddenly make people treat the ability to install custom software on it as a fundamental right.
Then consoles started shipping with recognizable internals, and we had waves of people very frustrated at things like Sony's removal of OtherOS, or Nintendo's attempts to squash the exploits that enabled Wii Homebrew.
Likewise, I'd be fine with banking apps on phones requiring some level of trust, but it shouldn't affect how the rest of my phone works so drastically.
Similarly, the banking app on your phone should be representing your interests, 100%. It may need to keep secrets, such as a private transaction signing key, from your bank or from your boyfriend, but not from you. And it definitely should not be collecting information on your phone against your will or without your knowledge. But that is currently common practice.
My washing machine could be programmed to do all of those things you're worried about without any writable memory. Why do the parts the manufacturer puts into it turn it from an appliance that washes my clothes into a computer that I have a right to install custom code on?
Maybe in theory your washing machine could be programmed to do those things without writable program memory. Like, if you fabricated custom large ROM chips with the malicious code? And custom Harvard-architecture microcontrollers with separate off-chip program and data buses? But then the functionality would in theory be detectable at purchase time (unlike, for example, Samsung's new advertising functionality: https://news.ycombinator.com/item?id=45737338), and you could avoid it by buying an older model that didn't have the malicious code. This would greatly reduce the maker's incentives to incorporate such features, even if it were possible. In practice, I don't think you could implement those features at all without writable program memory, even with the custom silicon designs I've posited here.
If you insist that manufacturers must not prevent owners from changing the code on their devices, you're insisting that they must not use any ROM, for any purpose, including things like the PLA that the 6502 used to decode instructions. It's far more viable, and probably sufficient, to insist that owners must be able to change any code on their devices that manufacturers could change.