https://www.telegraph.co.uk/technology/google/11710136/Googl...
Anyway, this part sounds outright illegal. It seems like it was just Randstad being greedy, but if anyone from Google knew about it then it's even worse. I doubt they couldn't have budgeted enough money to get the scans legally:
> They said Randstad project leaders specifically told the TVCs to (...) conceal the fact that people’s faces were being recorded and even lie to maximize their data collections.
https://www.nydailynews.com/news/national/ny-google-darker-s...
A friend went through the application process, and they wanted her to digitally sign 46 contracts, one after the other, without a chance to read the next contract before signing the current one, including one with an arbitration clause. She did see that the first contract offered to send the rest of the contracts printed, by mail, but when she talked to the rep, he acted like he didn't have access to the contracts he wanted her to sign (yeah, right; later he'd be like, "well, you signed Y, so you gave up the right to X"; he probably knows them by heart), and he said she should simply sign them and then go back and print them.
Presumably they have to offer to send them by mail for a contract based on online signatures to be binding, so it's interesting that the rep refused to do so. It was especially sad that they have a deal with unemployment offices that funnel workers to them using state funds.
The "and/or" in "it sounds like Google and/or its contractor may have been taking some extreme and unsavory shortcuts to cash in" is clearly a use of weasel words: the journalist was unable to substantiate the allegation that Google was aware of this unethical practice. Sloppy reporting demonstrating a lack of journalistic integrity.
How do you go from 1 to 2? With the premise "darker-faced people tend to be homeless".
This is not necessarily a false premise -- statistically, it is true, and it is a reflection of systemic injustice -- but the outrage is not about whether it's true or false; the outrage is that Randstad exploited this painful fact.
It's so unsupported, of course. Far more likely that homeless people tend to be available and amenable to the project.
I would happily sell anyone a picture or scan of my face for $5. But I would even more happily have that chance go to someone who needs it more than myself.
This article also mentions that the contractor may have lied to or misled the homeless, which is deplorable. But the behavior described by the title itself is nothing objectionable. The fact that many will object is a phenomenon I've seen called "Copenhagen Ethics": https://blog.jaibot.com/the-copenhagen-interpretation-of-eth...
Would you really? My gut feeling tells me that's not the case for most people, for privacy or ethical reasons. Just because those people are poor, should we expect them to have lower privacy or ethical standards?
The link you posted has the following example; I think you're referring to that:
> BBH Labs was an exception – they outfitted 13 homeless volunteers with WiFi hotspots and asked them to offer WiFi to SXSW attendees in exchange for donations. In return, they would be paid $20 a day plus whatever attendees gave in donations.
That's completely different. Offering WiFi has zero long-term effects; it's providing a "business opportunity" to people who wouldn't have access to one otherwise. Giving someone 5 bucks for a picture of their face (or other biometrics) is totally different and has long-term negative effects.
- Provides a link or method to create the scan that takes just a few minutes (on Ubuntu)
- Sends $5 to kauffj@gmail.com via PayPal or via BTC to 17h2GtaBzivnNtP24qoGg4a3pjgShkw7MD
I will complete the process and post the result in this thread.
Most people live publicly with their faces on display for all to see, and others take it a step further by participating in Facebook alongside billions of others.
It doesn’t scream facial identity being a major concern.
Your gut is sadly wrong. The majority of people still do not actually care about privacy when there is more than a few cents of value being offered.
And what are the negative effects of giving away biometrics? Is someone with no assets and no stable residence in danger of harm from someone getting a loan in their name? Of being rounded up by the government for their biometrics, rather than for the much more immediate threat of being criminalized simply for being homeless?
What if they could have bargained for $10 or even more instead? I don’t think either company would even blink at the sum, but many desperate people out there would be a lot better off.
I agree with you that some observers are never going to be satisfied and to them there’s always more an individual or a company can do. There is definitely an observer effect.
Similarly, if we took my line of questioning all the way to an absurd extreme, the best outcome would be if all these people got permanent shelter, jobs, and a stable life. But we can’t expect companies with profit targets to do this for them. Nobody would feel bad about that exchange, but it would be pretty unrealistic.
So I guess I need to reframe my original question: why do certain exchanges feel OK while others leave a sour taste in everybody’s mouth?
To me it seems like the answer is because the exchange felt unfair. Both parties stand to benefit but, instead of doing something genuinely beneficial for both, the party in power offered the (almost) bare minimum. That sense of unfairness is multiplied when you contextualize the exchange as Very Large Business vs. Small Homeless Person.
Similarly, the link about the phenomenon discusses our role as observers, but it doesn’t discuss the parties’ roles in the exchanges. They’re not only observers, they’re also actors. The people running the homeless study could, for example, have offered something to the control group at the completion of the experiment.
“Copenhagen Ethics” really just strikes me as a rhetorical tool to defend exploitation. “What, just because I offered this person a job I have to pay them a minimum wage?”
It could help some start-ups that need such a face for demo purposes or other experiments.
There is always a but
Google, et al, want to use my likeness to facilitate database lookups. They are welcome to a perpetual, exclusive license of that data at a quarter of a trillion USD. They know how to get in touch with me; I'm 100% serious.
1. The contractor targeted homeless people
2. They targeted people with darker skin
3. They may not have been forthright or truthful about what they were doing.
Number 3 is clearly wrong. But so long as the contractors were upfront and truthful about what they were doing, I don't know that 1 or 2 are problematic.
The only argument I can see for why they shouldn't pay homeless people money for an easy job is that the prospect of money might be so enticing that they're willing to give up personal rights or freedoms (the same argument why we don't allow selling of organs). But $5 neither seems high enough, nor the process invasive enough, that this argument would hold water.
As for ensuring that enough of a sample range is in the database as an attempt at avoiding data bias, this should be a no-brainer good thing.
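For what it's worth, checking dataset composition like this is a routine step in ML pipelines. A minimal sketch, assuming you have one demographic label per sample (the group names and counts below are hypothetical, purely for illustration):

```python
from collections import Counter

def balance_report(labels):
    """Return each group's share of a dataset, given one
    demographic label per sample."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Toy example: a face dataset heavily skewed toward one group.
labels = ["light"] * 800 + ["medium"] * 150 + ["dark"] * 50
report = balance_report(labels)
# report -> {"light": 0.8, "medium": 0.15, "dark": 0.05}
# A skew like this is exactly the sampling bias that a
# collection effort would (ostensibly) be trying to correct.
```

A model trained on the skewed set above would see sixteen times more "light" samples than "dark" ones, which is how under-representation turns into worse accuracy for the minority group.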
If you're asking folks on the street and happen to get a lot of unhoused folks because they're around, that's fine. Writing memos telling people to target vulnerable populations because they're vulnerable is gross and deeply unethical.
Are osteoporosis researchers unethical for “targeting” women?
They're asking for five minutes of their time and a scan of their face, in exchange for $5. It's a simple transaction, and whatever the HN crowd would like to think, the majority of people on the street would quickly take that deal.
The article injects a lot of the author's own feelings and opinions into the situation. Just because you have a fear of Google doesn't mean the whole world does, and no one was forced to do anything they didn't want to.
Also, the article says homeless people were targeted because they don't go to the media (avoiding leaks), not because they are vulnerable.
One part that is a bit confusing to me is, the original source makes no references whatsoever to any consent form. Usually you can't collect this sort of data without signed consent, and previous reports [0] do mention such a form. I know most people don't read the form, but I'm curious how you can get away with telling someone you're just playing a game and lie so much when the form should clearly state what you're collecting.
Still, there should definitely be better vetting of contractors and stories like this definitely look very bad, even if the intentions were actually to help reduce ML bias.
[0] https://www.engadget.com/2019/07/29/google-paid-for-face-sca...
EDIT: The original article does indeed mention and show a picture of the "agreement".
To me, it just sounds like the contractor tried to get done with it asap and just half-assed the work.
Mental illness, addiction, the constant 24/7 stress of being homeless, and potentially systemic issues starting from childhood that get in the way of developing the reading skills and background knowledge needed to understand the concepts involved are all factors that would make reading and understanding any consent form full of technological-legal terminology a very difficult task.
Since you’re curious: homeless people aren’t important to Google or to society in general, because they have so little and everyone has all but stopped caring about them. They’re poor, they’re unfortunate, and so they’re exploited. This has been happening since... *leafs through book* ...forever. Serfs used to toil away in fields until they perished, and nobody gave a damn about them either.
They did this because they could get away with it, because they (and probably Google) knew there wouldn’t be any consequences. It’s the same old song and dance: the poor get exploited for the benefit of the rich, and most people don’t seem to care.
If you hire a contractor, you are responsible for what the contractor is doing unless the contractor is operating outside of the terms of the contract.
If the contractor is within the terms of the contract, saying that "Google is doing this" is not deceptively inaccurate.
Who is going to enforce the law against Google in the name of homeless people? It's not like the government in the US has been a champion of the downtrodden in recent years.
This is not a case of that.
Google (or its contractor) could easily have done this in a way that was not objectionable. They simply decided not to.
I mean, there's no need to have Google's name in there, other than as click-bait to trick people into viewing their subpar journalism with ads.
But it's kinda shitty to cheat people no matter what. Obviously you cannot say, "Hey, the Pixel 4 is gonna have face unlock and I want your face scanned for it," but the contractor should have done a better job.
So, try to fix that and... there's hell to pay?
File under: "No good deed goes unpunished."
What seems to have been bad is the contractor misinforming people about what data would be collected (and for what use), and it's not clear what Google had in their contract to prevent that kind of unethical behavior. It is also very questionable IMO to target the homeless "because they won't talk to the media", which was allegedly in the instructions the contracting firm Randstad gave to its workers.
Disclaimer: While I work as a low level employee at an unrelated team in Google, my opinions are my own and do not represent those of my employer, and this is the first I am hearing of this.
To me, this just parses as "We have some new Politically Correct excuse to exclude poor people from our dataset."
Being so unimportant that the world wants you to remain invisible isn't generally a good thing.
There is always some excuse. There is no condition under which it is sufficiently respectful, politely handled, blah blah blah to be A Good Idea.
No matter what you make, some minority corner case will break your tech and generate outrage. ("How DARE your speech recognition not work on AAVE!", "How DARE your facial recognition not work on burn center victims!" etc.)
That's bound to introduce other kinds of bias into the data.
The problem isn't that they were offering money in exchange for photos of homeless people; it's that they were tricking homeless people into giving up their biometric data by telling them they'll pay $5 just to play with a phone for a few minutes.
If they were honest about what they were taking and why I wouldn't have a problem with it.
The content of the article is interesting enough, but this line at the end caught my attention.
Is it reasonable to expect someone to "immediately reply" before you publish the article? Because that doesn't sound like ethical journalism to me, unless I'm misunderstanding the meaning of "immediately" in this context.
Randstad are very much the former.
It's either investigative journalism or it's not. How long you wait for comment has nothing to do with that. Do you really think this is equivalent to tabloids posting faked photos of some movie star's belly?
Joy at Media Lab has been looking at this issue for a while and advocating for balance. https://www.technologyreview.com/s/612775/algorithms-crimina...
Also, I find it weird that Nvidia was able to simulate realistic-looking people last year while Google is struggling to find humans; can't they use that as ground truth?
Expanding your database? Great
Forgetting situational ethics? Disgusting
Human eyes also need to be trained on diverse data. It's the cross-race effect: https://en.wikipedia.org/wiki/Cross-race_effect
The main counterargument appears to be that those who sold data "didn't understand what was going on". It's hard to imagine moral convictions in which someone could consistently argue that the homeless don't understand money in exchange for photos, but it's acceptable to leave them to fend for themselves on the street.
Google is, at worst, helping people who need help.
"a contracting agency named Randstad sent teams to Atlanta explicitly to target homeless people and those with dark skin, often without saying they were working for Google, and without letting on that they were actually recording people’s faces"
How can this be the most ethical way to collect data?
The problem isn't in acquiring facial recognition data from homeless people, but in mischaracterising the nature of the experiment when doing so. If the reporting is accurate, they lied to vulnerable people and tricked them into selling their data for cheap.
Companies can't go around hustling people into giving away their private information. It doesn't matter if you think this is "for their own good", a homeless person may want to refuse being catalogued by Google for a variety of reasons.
I don't see how failing to get informed consent counts as "the ethically best possible way".
> “They said to target homeless people because they’re the least likely to say anything to the media,” the ex-staffer said. “The homeless people didn’t know what was going on at all.”
> Some were told to gather the face data by characterizing the scan as a “selfie game” similar to Snapchat, they said. One said workers were told to say things like, “Just play with the phone for a couple minutes and get a gift card,” and, “We have a new app, try it and get $5.”
Google (or their contractor, if you're going to fight about the semantics here) is, at worst, guilty of misleading people about what they were doing, targeting vulnerable people with the express idea that they would be less likely to create problems, and not actually improving anyone's conditions in a real way.
Here's the moral conviction I have: lying to people about what is happening to them in order to build a functioning business is bad business. That's entirely separate from the fact that small increments of money were given to some homeless people. I don't get to abuse homeless people as long as I give them 5 dollars afterwards. That's not how morality works. These people weren't lifted out of their conditions by this life-changing sum. They weren't put into treatment centers or given job training. They were purposefully misled and then compensated less than the price of a combo meal at McDonald's.