https://github.com/diggerhq/digger/issues/1179
Surely we aren’t the first open-source company to face this dilemma. We don’t want to alienate the community, but losing visibility into usage doesn’t sound great either. Give people the “more privacy” button and most are going to press it. Is there a happy medium?
We shouldn't have to pay with our privacy.
Be aware that the EU is working to make opt-in mandatory too. But you won't be their target, of course; it'll be the truly evil companies like Microsoft, Meta and Google.
> Give people the “more privacy” button and most are going to press it. Is there a happy medium?
This should really tell you enough. If you already know users wouldn't want to share the info if you asked them, you're doing the wrong thing by withholding the option.
The big techs spend so much money developing dark patterns exactly for this reason.
Also, the issue is not really about that - it's about the submitted data not being anonymised. That's the bare minimum you should be doing regardless of the information being opt-in/out.
And in practice, do you gain enough from active collection to justify the fight? What about instead searching GitHub for digger being used in public pipelines and building stats from that?
Anyway, their main beef appears to be that you are not properly anonymizing the telemetry.
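The bare-minimum anonymisation the commenters are asking for mostly comes down to never shipping raw identifiers. A minimal sketch of one common approach (all names here are hypothetical, not digger's actual code): hash any stable identifier with an install-local random salt before it leaves the machine, so events from one install can be grouped server-side but not traced back to a specific repo or user.

```python
import hashlib
import secrets

# Hypothetical install-local salt, generated once and stored only on the
# user's machine; because it never leaves the host, the server cannot
# reverse the hashes by brute-forcing known repo or org names.
SALT = secrets.token_bytes(16)


def anonymize(identifier: str, salt: bytes = SALT) -> str:
    """Return a salted hash of a stable identifier (repo name, org, etc.)."""
    return hashlib.sha256(salt + identifier.encode("utf-8")).hexdigest()


def build_event(repo: str, action: str) -> dict:
    # Only the hashed identifier and a coarse action name are submitted;
    # the raw repo name never appears in the payload.
    return {"repo_id": anonymize(repo), "action": action}
```

Worth noting that salted hashing is pseudonymisation rather than full anonymisation: the correlation and deanonymisation risks raised elsewhere in this thread still apply to the event stream itself.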
> I think oss developers that don't understand how users use their software can't maintain or improve it.
For trivial improvements, that's definitely not the case.
For example, if someone files a GitHub issue saying "please add new menu shortcut XYZ", that's the kind of thing that doesn't need an in-depth understanding of how users are using the software.
More major features will indeed benefit from understanding how users use it. That's not really a case for telemetry, though; rather, it's a case for having a wide enough user base that users ask for changes/features/fixes themselves.
Sometimes the users asking for changes/features/fixes is us ourselves. ;)
A medium? Ask people. FOSS is different. There is a level of expected transparency and trust along with it that you can’t get back when broken. Changes like this are a very easy way to break that trust. Trust is everything here.
I expect to be tracked by commercial tools for the most part, but I still try to control how.
How would you feel if a trusted tool changed something you deem important without telling you? The usual response is feeling betrayed.
The fact that someone skilled enough took time out of their lives to make this PR means that it matters enough to seriously reconsider.
FOSS has so much fragmentation trouble already.
If you want to discuss this in more depth I'll be happy to elaborate and give you ethical guidance, but presently that's probably just adding to the noise.
The big one with telemetry is unintended side effects due to correlation and deanonymisation - which are actually dead hard to anticipate and very easy to get wrong, like rolling your own cryptography :)
The other, around consent and defaults, is that even if your telemetry is perfectly anonymous, benign and beneficial to the end user, you may trigger a security alert and over-zealous investigation and reporting. This can have a massive impact on your reputation, as happened to Audacity. It's really not worth taking the risk.
Hope that helps.
[0] https://www.emerald.com/insight/content/doi/10.1108/S2398-60...
Another option that I've seen is to inform the user that there is telemetry, but make disabling it require editing a configuration file (e.g. a JSON file). This way you still allow the option to disable telemetry, but add a small amount of friction to doing so.
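The friction-based approach described above could look something like this (a sketch; the file path and keys are made up for illustration, not taken from digger or any other project):

```python
import json
import os

# Hypothetical config location; disabling telemetry means hand-editing
# this file to contain {"telemetry": false} - no CLI flag, just a small
# deliberate amount of friction.
CONFIG_PATH = os.path.expanduser("~/.myapp/config.json")


def telemetry_enabled(path: str = CONFIG_PATH) -> bool:
    """Telemetry defaults to on; a missing or broken config file falls
    back to the default rather than crashing the tool."""
    try:
        with open(path) as f:
            return bool(json.load(f).get("telemetry", True))
    except (FileNotFoundError, json.JSONDecodeError):
        return True
```

The design trade-off is exactly the one the comment names: the option exists and is documented, but the default-on behaviour means most users will keep sending data.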
Since the software in question is open source, if you don't allow disabling telemetry, I guarantee a fork will appear in the near future with it disabled altogether.
Ask yourself: is this a hill you want your project to die on?
Just ask the user on the first launch.
It's your project; you can manage it however you want. I'd recommend communicating this clearly at least, but otherwise listen to your users (in aggregate) and make your decision.
All software should, regardless of whether it's open source. It should also be opt-in.
Not even anonymizing the data is a MASSIVE red flag.
Unquestionably.
What I do with your software is quite frankly none of your business. You do not have the right to know what I do with my hardware. Your metrics are not my concern.
FOSS is a very personal thing. Try framing it as a conversation between you and your users:
"Hey, we work hard on this and would really appreciate it if you share telemetry so we can focus our efforts better"
Vs
"This application now collects telemetry. You cannot turn it off. You want privacy? Too bad."
The difference is between asking your users to share data and demanding they give up their privacy on your whim.
> Give people the “more privacy” button and most are going to press it.
You should have a good long think about what this means and why removing privacy options is making your users upset.
When I implement telemetry in a FOSS project, I go well out of my way to do it in the most respectful way I can. This usually takes the form of a detailed list of what is and is not being collected, along with why I need it. During first run, there is either a very prominent notification or a full-on blocking step where the user must choose whether to enable telemetry. This includes text explaining why I want telemetry and a link to the documents.
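A blocking first-run step like the one described above might be sketched as follows (file location, prompt wording, and URL are all illustrative, not from any specific project):

```python
import json
import os

# Hypothetical location where the user's answer is recorded.
CONSENT_PATH = os.path.expanduser("~/.myapp/telemetry-consent.json")


def ask_consent(prompt=input) -> bool:
    """On first run, block until the user explicitly answers, then record
    the answer so they are never asked again. Anything other than an
    explicit 'yes' is treated as a refusal."""
    if os.path.exists(CONSENT_PATH):
        with open(CONSENT_PATH) as f:
            return json.load(f)["enabled"]
    answer = prompt(
        "We collect anonymous usage counts to guide development.\n"
        "Details of what is and is not sent: https://example.org/telemetry\n"
        "Enable telemetry? [yes/No] "
    )
    enabled = answer.strip().lower() in ("y", "yes")
    os.makedirs(os.path.dirname(CONSENT_PATH), exist_ok=True)
    with open(CONSENT_PATH, "w") as f:
        json.dump({"enabled": enabled}, f)
    return enabled
```

Defaulting the prompt to "No" matches the comment's framing: the request is made once, with the information up front, and the user's choice is final.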
My philosophy is that users are not idiots and they are not children that need coddling. I make my request, give them the information, and let them make their choice. That's it, end of transaction. The user made a choice and the matter is totally out of my control.
Anything less is not respecting your users.