I think a code of professional ethics for software engineering is long overdue. Journalists adopted one in the 1920s[1] after a series of events, including the Spanish-American War, made the awful potential of ethical lapses in journalism obvious to everyone.[2]
We can't keep reflexively believing that technology is neutral and dangerous only in how it's used. At some point, people have to be willing to refuse to work on certain things because of the obvious social implications those things would have.
I don't know how anybody could be working on things like lethal drones, facial recognition, locked bootloaders, deep packet inspection, or other freedom-reducing technology without considering the consequences of their work.
And I recognize that not everybody thinks the technologies I mentioned above are categorically wrong, but it would be good to start a conversation about where those lines should be drawn.
1. http://www.spj.org/ethicscode.asp 2. Spare me, I know the profession isn't perfect and ethical lapses still abound, but at least we have some way of knowing when an ethical standard has been broken.
I can think of several existing codes of ethics that might apply to software engineering.
For starters, the ACM Code of Ethics was adopted over 20 years ago. http://www.acm.org/about/code-of-ethics
The IEEE Code of Ethics dates to 1963, which is when AIEE merged with IRE. http://www.ieee.org/about/corporate/governance/p7-8.html
IEEE's Computer Society also has its own code of ethics, adopted jointly with the ACM in 1999. http://www.computer.org/portal/web/certification/resources/c...
Finally, there's the Obligations of the Order of the Engineer, which has been around since 1970. http://www.order-of-the-engineer.org/?page_id=6
NCSU's Ethics in Computing website has links to most of these, and more. http://ethics.csc.ncsu.edu/basics/codes/
My own experience in the software industry is that professional society membership and conference attendance are relatively rare, especially compared to other fields I have exposure to, like the library world, where membership in at least one professional society is de rigueur. I wonder if the problem is not a lack of a code of professional ethics but rather a lack of exposure to them.
Even with "lethal" drones, it's not like there's one software developer who makes LethalDrone OS. They have many components with very positive possibilities, for example auto-stabilizing flight controls, which can and will end up being used in search-and-rescue drones.
To me, that use case for facial recognition has always felt like a front. It's an edge case, used as a justification for technology whose base case is surveillance agencies using it to identify whoever they might be after in public.
This shows your bias as much as it does the website's.
Also, opinions aren't things to be believed; you agree or disagree with them, and both are valid options.
Good luck Google!
Similarly, when people use smartphones, they could be recording with the back camera, but people are okay with that. They assume no recording is taking place simply because that's the more likely situation.
I'm not saying there's no privacy issue here. I'm saying this ship has sailed. Why pick on Google and not your local convenience store?
If there were, that would be way creepy in and of itself. However, this is worse: the world's most powerful advertising company is turning consumers into spies.
It was bad enough that Google tracked users' interests and whereabouts through its online services. Then, with Street View and its apps on Android, it started massively gathering physical information as well. With Glass, it will have a cheap workforce gathering information about others, too.
Even if you've never consciously used a Google product or service in your life, if you live in a developed nation, Google knows about you and it already uses that information to influence your behavior [1][2].
Imagine how your life will change if only 1% of the people around you start using Google-sponsored video cameras that are stealthy, have high-quality imaging, are always on, location-aware, and always connected to the Internet: that's what Glass is.
[1] http://www.ted.com/talks/lang/en/eli_pariser_beware_online_f...
[2] http://www.nytimes.com/2011/05/29/technology/29stream.html?_...
The Sophisticate: "The world isn't black and white. No one does pure good or pure bad. It's all gray. Therefore, no one is better than anyone else."
The Zetet: "Knowing only gray, you conclude that all grays are the same shade. You mock the simplicity of the two-color view, yet you replace it with a one-color view..."
I'm sorry, is this the same New Scientist that touted (arguably hyped) a piece of technology with no hope of working in the real world as a plausible replacement for other modes of transportation? OK, just thought I'd clear that up. Thanks for playing... next.
Note, for context: http://en.wikipedia.org/wiki/EmDrive#New_Scientist_article
This is no different than Google's "Try these too" feature when browsing for images.
When you're digging into New Scientist for panic fodder, you know you're desperate. You know what else is creepy, follows us around, and yet we all take for granted? Voice recognition.
I'd love to live in a small town where everything from mobile phones onwards had to be left at the gate.
Of course you then have all the usual monocultural problems with gated communities.
As people have said, we're already all watched by CCTV anyway, especially here, so if I really cared I'd be looking for that community already.
Glass will probably be adopted by adventure-sports enthusiasts and niches of skilled laborers, but if Google wants it to become accepted in everyday life, it needs to do a better job of assuaging people's concerns about privacy, fashionability, and information addiction.