That would defeat the security purpose.
Anyone on your local network (which in practice often means whichever Wi-Fi network your device happens to join) could attack you.
chrome://flags/#unsafely-treat-insecure-origin-as-secure
I don't think Firefox has anything equivalent though? This bug on the topic is unassigned: https://bugzilla.mozilla.org/show_bug.cgi?id=1410365
> Mozilla will provide developer tools to ease the transition to secure contexts and enable testing without an HTTPS server.
https://blog.mozilla.org/security/2018/01/15/secure-contexts...
But the Bugzilla entry linked from that post has been unassigned for two years, so maybe they changed their minds, or figure the localhost exception is sufficient.
https://bugzilla.mozilla.org/show_bug.cgi?id=1410365
The last comment proposes a whitelist for development domains, but it has received no response.
This helped me (with the 68.0 Windows edition): in
about:config
set
media.getusermedia.insecure.enabled
from false to true.
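If you want that pref to survive profile resets, the same change can be made in a Firefox profile's user.js file. A minimal sketch (the pref name is the one from the comment above; note this disables a security check, so it's for local testing only):

```javascript
// In <profile directory>/user.js — applied every time Firefox starts.
// WARNING: allows getUserMedia on insecure (non-HTTPS) origins; dev use only.
user_pref("media.getusermedia.insecure.enabled", true);
```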
Specifically, you need HTTPS for WebRTC, but you're stuck with a self-signed cert because you're connecting to a local IP. You can ignore the cert error and load the page, but connecting to the websocket for signaling will still fail, because secure websockets on iOS require a cert that isn't self-signed.
A non-HTTPS websocket would work, but not from an HTTPS page. So you're in a situation where you need HTTPS because of WebRTC, but can't use HTTPS because of websockets.
In trying to push people to HTTPS by disabling features on HTTP, we're making development a _much_ worse experience. I'm not sure that's right.
I do wish there were a public service offering this kind of easy dynamic DNS with HTTPS. (Sharing the script I wrote could cost a lot in DNS hosting and increased server expenses.)
openssl genrsa -out key.pem 2048                                             # generate a 2048-bit private key
openssl req -new -key key.pem -out certificate.csr                           # create a signing request (interactive prompts)
openssl x509 -req -in certificate.csr -signkey key.pem -out certificate.pem  # self-sign the cert
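For scripting, the same three steps can be run non-interactively. A sketch under some assumptions: the CN value 192.168.1.50 is a placeholder for your machine's LAN IP, and -days is made explicit because the x509 default validity is only 30 days:

```shell
# Non-interactive variant: -subj pre-fills the fields the CSR would prompt for
openssl genrsa -out key.pem 2048
openssl req -new -key key.pem -out certificate.csr -subj "/CN=192.168.1.50"
# Self-sign; -days 365 overrides the 30-day default
openssl x509 -req -in certificate.csr -signkey key.pem -out certificate.pem -days 365
# Sanity-check the result
openssl x509 -in certificate.pem -noout -subject -enddate
```

Browsers will still warn on this cert until you add it to the device's trust store, which is the hard part on mobile.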
ip route add local 192.0.2.123/32 table local dev lo
which makes your system act like this is a local address without actually being one.
EDIT: nvm, I just realized that won't solve your issues with mobile development...
This developed over the years without any input or choice from the end user. The device manufacturers, platform owners (Apple, Google, Microsoft, Mozilla), and app developers joined together and forced this surveillance apparatus on all end users.
This power balance has to change.
(For video, anyway. I don't see any similar solution for audio.)
Correct. And that was my reason for NOT covering the camera: I would be able to tell if it was turned on by some malware. However, I did not expect a vulnerability like Zoom's, where a simple website could trigger the webcam. Combined with external monitors, the LED could go unnoticed for a good amount of time. So I've reversed my position since then.
What exactly is the risk? Have there been any actual cases of someone being spied on with their laptop webcam that would have been prevented by a switch? I'm only aware of cases where the webcam switch would not have helped (e.g. roommate sets up notebook to record owner naked). Even that is incredibly rare, or if not rare, almost never reported.
https://www.dailymail.co.uk/sciencetech/article-5228017/Hack...
https://www.dailymail.co.uk/news/article-2638874/More-90-peo...
https://globalnews.ca/news/2158281/what-you-need-to-know-abo...
https://www.telegraph.co.uk/technology/news/10131456/Hackers...
This site claims a guy made a business selling software to hack and remotely control webcams, complete with paid employees and $350,000 in income:
Also, there are many security programs that can surreptitiously take photos or videos using the camera. Usually this is to help with recovery after theft.
People would get someone infected, then share the credentials so everyone could watch. So I personally know of a handful of people who were spied on 20 years ago.
https://en.wikipedia.org/wiki/Robbins_v._Lower_Merion_School...
The use case is that you leave them turned off by default in case someone pwns you, and only turn them on when you need to use them.
Still miss the physical mic mute button on my old ThinkPad X230... it didn't have a matching webcam button, but we've _almost_ had all of the right features in the past...
https://en.wikipedia.org/wiki/Berkeley_Software_Distribution
Smartphones, for all their faults, are at least far less vulnerable to viruses than PCs.
Or at least iOS vs Mac.
In this case the camera or microphone is the least of your worries.
This will make it impossible for people to talk to each other without first being online to reach some certificate authority, or without an extraordinarily difficult pre-installation process, which often isn't even possible on a phone.
HTTPS was important, but now it's being used to shoehorn in dependency on a centralized, online-only authority. Perfectly ripe for censoring anyone.
The bigger problem is that there has to be a single server hosting the app in the first place, which IMO is a severe flaw in the Web's architecture. But this change doesn't really make the situation worse.
I want to be clear though, I need it so that the user doesn't have to install the cert themselves, or have to be online to approve.
Previously, a user would connect to the local wireless network, and the router would present a directory listing of the local apps available on the network (like the video/audio call). Clicking a link (which just points to the dynamic subnet IP of a static file server) loads the offline HTML page, which then lets them call anyone on the network, including users on neighboring and neighbor-of-neighbor routers.
Basically our own decentralized telecom!
What is the preferred way to include HTTPS in your development flow? Have an nginx or Apache instance running? What about automated tests against a running application?
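One common starting point is a single-command self-signed cert that includes the subjectAltName entries modern browsers insist on (they ignore the CN alone). A sketch assuming OpenSSL 1.1.1 or newer, which added -addext; the hostnames are placeholders:

```shell
# One-shot self-signed dev cert with SAN entries;
# -nodes skips key encryption so servers can start unattended
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout dev-key.pem -out dev-cert.pem -days 365 \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1"
# Confirm the SAN extension made it into the cert
openssl x509 -in dev-cert.pem -noout -text | grep -A1 "Subject Alternative Name"
```

You still have to trust the cert in your browser or OS to silence warnings; tools like mkcert automate creating and installing a locally-trusted CA, which also plays better with automated tests than clicking through interstitials.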
This is still mildly annoying.
How about making that work, first?