> My bet is that privacy experts bugging people about it would have been tolerated just fine,
But they weren't tolerated. Comment sections of such articles were full of passive-aggressive bullying about how "such a feature was only for nerds to care about and people shouldn't really do it". How much of that was ad-industry paid, who knows. Rumors and reports abounded that articles were deranked in one especially well-known search engine, and that how-to videos were demonetized and hidden from recommendations on the current biggest video site. (And to not just point fingers at Google properties, there were similar rumors about algorithm shenanigans with posts on Twitter and Facebook, both of which also have adtech arms deep into tracking.)
Obviously, privacy experts bugging people about it and browsers making it the default would get different responses: one is easier to deal with via skullduggery, and the other is "safe" enough to publish passive-aggressive PR releases about, because it gives you someone to point fingers at while you "take your ball and go home". I mentioned way above that it was a "good" excuse, and that's exactly how I think of it. They could pretend to be the victim and paint a company with much less adtech as the real villain (for doing what users wanted and what was good for users); win/win.
Sure, at this point we don't have clear evidence of skullduggery; it is mostly academic/hypothetical, and it's a small assumption how the adtech firms would have reacted if something had slipped their nets and actually gone viral. From there, I don't think it's a big assumption that they would have reacted just as quickly in "take their ball and go home" mode. The only difference is whatever excuse they come up with to blame it on, and I can imagine all sorts of excuses they might have invented. I'm "happy" for them that they found such a "good" excuse.
> But browsers changing the default is qualitatively different from the user being able to set it.
I think we're never going to agree here. It's not qualitatively different. People had plenty of choice of browsers at the time, and the two browsers that did it had tiny minority user bases. It was quantitatively different: a lot of people opting in to more privacy at once with a browser upgrade. You can claim all you want that some number of those people were simply lazy and made "no choice", but the statistics don't actually agree with that.
One, because they were already minority browsers.
Two, if we want to get into Edge specifics: at the time, Edge was already the browser with the highest adoption of Do-Not-Track, even before the version that turned it on by default. Edge put the feature front-and-center in the Settings window and made it easy to find. Edge also showed a very gentle prompt, "Hey, there's this new feature that could enhance your privacy. Do you want it on? [Learn More]", in the versions leading up to turning it on by default. Microsoft pointed out at the time that the leading feedback they kept getting from those prompts was "Obviously this is a good idea, why are you even asking, please turn it on by default." Most users who upgraded to the "on by default" version would have seen one of those prompts. A few were convinced by propaganda at the time that Microsoft was trying to "do some evil" (by asking if you wanted more privacy?) and switched to Chrome.
> And it's not like people were choosing Edge because it would opt them out.
I know I convinced a few people. Anecdata isn't data, but statistically Edge started briefly growing in users again right around that time. Not by much certainly (not enough to save Edge), but clearly some. (Some people were fed up with how hard Chrome made that setting to find.)
As a user, it was a qualitatively better experience with DNT during the brief windows where adtech actually abided by its "promises" (lol) and respected it. As an Edge user at the time, I don't think the web felt as nice again until Firefox added its "Enhanced" Tracking Protection years later. (And Apple's similar tools just recently.)
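For context on what "respecting it" meant mechanically: DNT was just an advisory HTTP request header (`DNT: 1`), and honoring it was entirely voluntary on the server side. A minimal sketch of what a cooperating server's check could look like (the function name and headers dict are illustrative, not any real framework's API):

```python
# Hedged sketch: what "respecting DNT" amounted to server-side.
# The browser sends "DNT: 1"; nothing enforces that anyone checks it.

def should_track(headers: dict) -> bool:
    """Return False when the request carries 'DNT: 1' (the opt-out)."""
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# A cooperating server would branch on this before tracking:
if should_track({"DNT": "1", "User-Agent": "Edge"}):
    pass  # attach tracking cookie / fire analytics beacon
else:
    pass  # serve the untracked response
```

The whole design hinges on that voluntary branch, which is exactly why it collapsed the moment adtech stopped cooperating.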
> That's a completely negligible percent of people.
I think by definition it was not negligible if it spooked the adtech companies into quitting so fast.
> Another way the two are different is very simple: If you had a big fraction of users manually flipping the switch, and the advertisers tried to cancel the feature because that was too many users, you could mobilize those tens of millions of people into a powerful political campaign to bake DNT into law.
I disagree. My entire point is that they never would have allowed it to get to "tens of millions" of people in the first place. Whether by skullduggery or passive-aggressive PR notes doesn't matter. They very calculatedly stopped it when it was a tiny fraction of people, just enough to impact the bottom line. It's not a big assumption on my part that no matter how we got to that small a fraction of people learning about the feature and actually using it, they would always have stopped it before it became popular (whether or not you believe the skullduggery to be real or a conspiracy theory), at the very least because their shareholders would have demanded it once it impacted perceived profits.
> As for the rest of your post, I'm confused about what you're accusing the Chrome devs in particular of doing? I agree that they're a big problem, see also flock, but with DNT they're not responsible for what the advertising group does.
I'm mostly admonishing all of Google for acting like an evil company. If we want to talk about the Chrome team's specific responsibilities: I believe that as a professional it is your job to make sure that you follow ethical standards. The Chrome team knows that their checks are signed by the adtech teams. That's a conflict of interest that makes it very hard to maintain professional ethics. I can't tell any individual developer on the Chrome team that they should stand up and walk away from that conflict of interest for the betterment of the web and the profession. How you navigate your ethics code is a personal matter. I can blame the team collectively for not standing up to the people writing their checks as a massive ethical failure. I doubt we'll see a Chrome dev team strike anytime soon, but that's within their rights.
> What could they have done better?
The obvious answer is a feature that was actually browser-enforceable, similar to today's Firefox Enhanced Tracking Protection (which works well enough that a lot of "publishers" call Firefox itself an "ad blocker" today, despite the fact that it blocks no ads, just trackers), on by default and with an "opt out of privacy" model rather than the "opt in to privacy" wish-it-were that DNT was.
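"Browser enforceable" is the key difference: instead of asking trackers to behave, the browser just refuses to load requests whose host matches a known-tracker list, so no cooperation is needed. A rough sketch of that matching logic, with a made-up two-entry list (Firefox's real list is Disconnect-based and far larger):

```python
# Hedged sketch of list-based tracking protection. The domains are
# illustrative placeholders, not any real blocklist's contents.

TRACKER_DOMAINS = {"tracker.example", "analytics.example"}

def is_blocked(request_host: str) -> bool:
    """Block a listed tracker domain and any subdomain of it."""
    parts = request_host.lower().split(".")
    # Check every suffix: "cdn.tracker.example" matches "tracker.example".
    return any(".".join(parts[i:]) in TRACKER_DOMAINS
               for i in range(len(parts)))

# The blocked tracker never even receives the request -- enforcement
# lives entirely on the user's side, unlike DNT's honor system.
```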
It may have been "impossible" to do, because they would have actually needed to confront that conflict of interest in their hearts. They would have needed to tell their owners and masters that they were going to have to eat a couple of down quarters in profits until they either adjusted the market to charge appropriately for untracked advertising or managed to build a propaganda machine big enough to convince users in bulk that privacy wasn't in a user's best interest and that they should opt out of tracking protection.
But it would have taken guts, and making that sort of ethical push would have been the right thing to do, for everyone. Doing it then, and doing it with the majority browser, would have been an impactful statement. That would have been a "Don't Be Evil" Chrome moment for sure, and it is obvious and easy to imagine that they had the power to do something like that; they just lacked the ethics or morality.