For the record, I’m highly critical of Plaid and hope the tech media catches on soon. They do not require developers to disclose which permissions they are requesting when onboarding new customers (I don’t think that is even an option if developers wanted it), and there’s no central UI for an end customer to review the permissions they’ve granted across developers and revoke them. I don’t think Plaid has any requirements to encrypt this data on the developer side, and I have no idea how they audit developers to make sure their use of various endpoints doesn’t violate the developer terms.
Jeez, that does sound terrifying. I guess that data is already in credit card databases, but at least (in the USA) I have some legal protections there.
The Age of the Smart Machine (1988) is truly visionary and well written.
edit:
I'm currently reading The Age of Surveillance Capitalism.
The book has well-developed concepts like 'behavioral surplus' and 'instrumentarianism'. There are also clever terms like 'radical indifference', 'observation without witness', and 'equivalence without equality'. They are just plain insightful; I can instantly recognize them as things I could not conceptualize before.
- (2019) https://vimeo.com/313429468
Some of my colleagues at work use the term 'digital native' to refer to (young) people who have grown up with ubiquitous computing. Next time someone says that, I should now perhaps say "oh, you mean, the wage slaves of the surveillance capitalists".
That phrase is just a series of boo words concatenated together, as in, "wage (booooo) slave (booooo) of the surveillance (I'll give you that one) capitalists (boooo)." It is too woven in with the anti-"capitalists (boooo)" movement to be effective as a rallying point for people who don't want to overturn all of society.
For me, the problems are lack of explainability and possible bias.
There are many great applications for deep learning and AI in general but some guard rails must be in place for public good.
Then it's back to human bureaucracy.
I see the problem of inexplicability as less salient than (1) responsible, informed deployments of models, and (2) ongoing measurement (especially against a human baseline).
You can deploy explainable models without (1) and (2) and end up with a much, much worse result.
We have had “smart”-everything, and it already sounds tired; hence “deep”-everything. Let’s see how long it lasts...
One of the best pieces of academic marketing was calling this set of techniques "deep" learning. The word is so rich with connotations, it immediately brings to mind all the synonyms: profound, complex, arcane, etc. It makes people ascribe far more complexity to the system than it actually has.
When in reality, it's just a "massively multi-layered and multi-stage" network. But that doesn't sound nearly as profound, and doesn't allow journalists to spin wild tales.
Deep State is a form of clandestine government made up of hidden or covert networks of power operating independently of a nation's political leadership, in pursuit of their own agenda and goals.
----
[How can we] Find a way to have a serious, objective talk with the greater community on the extraordinarily far-reaching issues of the impact of Silicon Valley on society, community, and culture as a whole.
Look at what has emerged in the last 1.5 decades alone from "unicorns" in Silicon Valley:
* US policy seemingly being set/disrupted via twitter
* Mental health studies coming out on the negative impact of Facebook
* Election manipulation through ad-powered platforms such as Google and FB
* Massive cultural dialogue and political revolutions being fueled through twitter
* Assassinations being corroborated through an Apple Watch
* Global spying and surveillance conducted through all our connected technology
Just to name a few of the globally impactful issues of our day that stem directly from Silicon Valley in particular and the tech industry in general.
As the preeminent VC firm in the minds of any young entrepreneur who wants to build the Next Big Thing, YC, I would posit, has a social responsibility to, at a minimum, foster a conversation on these issues in a meaningful, serious, and deep manner.
What are the consequences of MASSIVE success of a company?
----
> You’re not technically the product, she explains over the course of several hundred tense pages, because you’re something even more degrading: an input for the real product, predictions about your future sold to the highest bidder so that this future can be altered.
> it’s clear that surveillance capitalists have discovered that the most predictive sources of data are when they come in and intervene in our lives, in our real-time actions, to shape our action in a certain direction that aligns with the kind of outcomes they want to guarantee to their customers.
https://theintercept.com/2019/02/02/shoshana-zuboff-age-of-s...
Capitalism and values are fundamentally a means of resource allocation: if everyone is wrong about what really matters, then a thing is valued even when its true utility is nil or negative.