Like _what_ in the actual @#$%? What jackass Silicon Valley startup is funding this operation? It's this kind of stuff that's illegal, but because it's hard to enforce, people get away with it. These people have obviously trespassed hundreds of times.
The problem is that politicians and the law fail to recognize the importance of the quantitative difference between one human doing something, like reading a license plate, and a computer doing the same thing a billion times.
Humans venturing onto private property where they don't belong are regularly greeted by a property owner politely wielding a shotgun and offering to give them directions back to where they belong.
If anything, I think it's the addition of some computer contraption that creates the free pass, like putting on a hardhat: if you've got gear, people are more likely to assume that you're supposed to be there.
The vehicles are plain white and say "Scofflaw Patrol."
Here's an article about it https://www.cbsnews.com/newyork/news/seen-at-11-on-the-hunt-...
Welcome to the 21st century. Parking garages, theme parks, and malls routinely do this as well. Entering a private parking lot is surely criminal trespass, but most cops don't care so long as your trespasser is gone and no damage was reported.
Most highways, onramps, and many intersections include hidden or overt plate readers, and airports use them extensively. I've even seen them at a gas station car wash, of all places.
The DMV in many states also sells your personal information for profit. Through massive defunding and regulatory capture, the entire vehicle licensing process has largely become a thinly veiled clearinghouse to shore up random state departmental revenue.
Yep. https://www.eff.org/deeplinks/2018/07/eff-responds-vigilant-...
https://www.eff.org/deeplinks/2018/07/california-shopping-ce...
It is? Is it a California specific law?
It is private property, but there is public access. You could ask them to leave, but I wasn't aware a crime had been committed.
1) Did they break down a gate to get onto your private property?
2) Do ANY other folks use "your" private property? Are you falsely claiming ownership rights to something that has a form of common ownership?
3) When you claim it's your property and entry is forbidden - has ANYONE else given permission (explicit / implicit) for non-property-owners to access? Think of things like family members of an owner, guests, UPS drivers, delivery drivers, maintenance folks, etc. Is it clear that anyone who is not an owner can't access? This is normally managed by an attended gate system or similar.
4) Are you one of those folks who like to yell at minorities or others in your neighborhood who don't "belong"? We've seen videos of delivery drivers dealing with folks like you who (often falsely) claim ownership of property.
5) If you don't PERSONALLY own the property, have you been explicitly authorized to act as an agent of the property? You'd be surprised at the number of folks like you claiming personal ownership who actually have ZERO rights to trespass a delivery driver etc.
6) You've made a strong claim that it's illegal for someone to come onto the property; CA law protects folks unless they have been trespassed. Has that occurred?
Note: Many property management companies partner with third parties to help manage lots. If you have a shortage of parking or someone actively managing the lot (not you), they may contract for things like expired tags / stolen vehicles and other tow services. They may walk the lot. They can also pay for someone to come and see which cars may be parking regularly without paying required parking permit fees, etc. Many homeowners PREFER that someone sweep their lots.
Let the person answer the questions, and then start the crucifixion?
FTC Chair Lina Khan [further breaks it down](https://twitter.com/linakhanFTC/status/1557738539202531334):
>1. Firms track & collect data on Americans at a stunning scale—on our location, our health, what we read online, who we know, what we buy.
>@FTC is seeking comment on whether to issue rules aimed at commercial surveillance & lax data security practices.
Senator Markey (MA) has also [responded positively to it](https://twitter.com/SenMarkey/status/1557738870397353984):
>We have reached a crisis point for children and teens’ well-being online. I'm glad to see @linakhanFTC and @FTC working to give users the privacy protection they deserve. Congress must also pass my legislation to further protect young people online. We must act now.
Event details here: https://www.ftc.gov/news-events/events/2022/09/commercial-su...
Sign up now, because there is a limit to the number of comments they'll allow, and it's first come, first served.
At the bottom of the event page, they say a comment can be submitted at this link https://www.regulations.gov/ but I can't find the corresponding document for this topic. Am I missing something?
This is in principle a good thing: appointed bureaucrats shouldn't be allowed to create law by fiat. However, it means Congress actually has to do its damn job and stop delegating its powers through lazy and broad legislation. Good luck with that.
Device and OS makers should transparently and clearly define what features can be accessed by apps, and allow those features to be easily administered and disabled by device users... Period. Then penalties for abuse and misuse can be addressed for app makers with severe fines.
Any device a consumer buys should never be used to undermine them financially, nor to compromise their personal privacy beyond basic analytics. That's a well-known principle that should never be redefined.
With all of the other advancements in technology to monitor and track individuals that are out there, we should not be personally paying for devices that monitor us and report on us to private companies or anyone else for that matter.
It sounds like you expect them to do this without intervention.
Device and OS makers are not doing this now, and have had plenty of opportunity to do so. If anything, the device/OS situation is getting worse - just look at Windows requiring accounts, recommending apps, and sending telemetry, or ChromeOS embedding Google into your device.
So we do need an external force like the FTC to make change happen.
We've already been through several rounds of policy analysis for app makers, and a lot of time has been wasted. Companies already hold huge volumes of data on consumers, even if gathering were shut down now.
Regulation should also address deletion of the personally identifiable information they've already gathered, with a well-defined policy going forward. This issue is far beyond the point where regulatory action should have been taken.
You never know which comments get attention, but you might just see your upvote count bounce all over the place today!
The problem is that the government doesn't want you to have control over your phone either.
I'm okay with imaginary online point fluctuations, it's a small price to pay. Thank goodness it's not a reflection of anything real like my personal savings... hah.
The problem is that the device and OS makers are also app makers, and can often circumvent the protections forced down the chain with private APIs and hidden features. Yes, app makers can do evil things, but so too can those vendors below them in the stack.
Also, a blame-and-fine approach will eventually eliminate open source and free solutions, because it will force members of open source projects to accept liability.
That being said, even an open source developer can potentially conduct info gathering and/or do serious damage to any consumer who installs their app, so in many cases no offense would be cited if there's no malintent or negligence involved.
We're far past the point where regulation should have been in place. A serious example should be set to create a proper message about this type of activity by private companies and individuals. It is really not necessary for private companies to gather this personal info on individuals for any app. It should be in everyone's best interest to end this espionage-for-profit activity, even if it devastates the opportunistic industry that activity created.
I read the FTC press release and it talks about companies in general and data in general and doesn't even mention apps, devices, websites or advertisers.
Not sure why you think they are only going to be policing app makers?
What does it mean to "undermine them financially"? Offering just enough of a discount that they'll purchase a product that they probably shouldn't?
What are "basic analytics"? What other kind of analytics are they contrasted against, and what would make them no longer basic?
If Dunkin Donuts (a coffee shop) runs an app and gathers data about your purchases linked to your name and ID, it can at any later time sell that data to life and health insurance companies, which in turn can use that data to justify charging you higher rates when you sign up for a life insurance or health care plan. That may not be happening now, but it could easily be rampant in the future in thousands of ways, and it's just a minor example of how people can be undermined financially by personal data overreach within private companies.
Social media companies have a lot more data than that if they consistently track users by location (under the false guise of targeted marketing), and there's no real public awareness nor understanding of these issues to this day.
Looking through the list of apps that want to "Control my computer using Accessibility Features", the attack surface is just too damn high: https://imgur.com/a/1CBpSWQ
Needed measures include: encrypting files, creating proper storage segmentation/isolation for each individual app, ending the practice of adding "bloatware" to devices, allowing camera and microphone access to be concretely disabled for all apps (and ensuring that app makers don't break app functionality when those features are disabled), eliminating in-app purchasing, and ensuring that app stores clearly define app pricing and app maker credibility.
Those are just some of the first steps that need to be taken, and educating users is not one of them. The way apps are installed and operated these days is already confusing enough; protecting privacy by default would be simpler than asking users to make those changes themselves.
TikTok and Facebook don't need access to cameras and microphones when the apps aren't actively triggered by a user to record something... Somehow the apps require these permissions to be enabled the entire time the app is in use.
Every company should be required to comprehensively report what data they track, and be held responsible for upholding that to a government watchdog, with extreme punishment for misuse.
We also have to understand that when we speak about devices, we're not just talking about phones; we're increasingly talking about cars, thermostats, home security systems, TVs, and many other consumer-bought devices that give companies countless ways to wiretap consumers and then sell that data, or even later use it for more harmful purposes like corporate espionage and extortion.
Microphones and cameras were recently found hidden in televisions that consumers had no idea contained them, as an example of how far info overreach has gone.
I'd recommend looking deeply into the integration and use of LIDAR on phones... Most people don't even know whether it is a feature on their phone or on certain cars... It can be leveraged in many deeply invasive ways against individual device users if it is accessed by social media companies, or worse, if a data breach occurs.
First, they (the FTC) should hire proper consultants to properly present the issues involved (both cynics and optimists), and not just lean on a basic understanding; there is a wide range of devices and features, combined with tons of different apps and use cases for them. Resolution is not a simple issue that can be summarized within a few posts online.
Jokes aside, I'm really happy to see this. Privacy is really important and companies need to avoid collecting data they don't need as most of them end up 'hacked' at some point and every company shares their data with all the other SaaS they are built on. Also, I've needed a new desk for a while now.
If they have a problem with you (and they define what constitutes a “problem”) you can end up added to what is essentially a secretive blacklist that’s shared among businesses, and you have no visibility into or recourse against it.
I understand that I’m entering private property and I have the right to not go there. I don’t think that’s a valid argument against this being a terrible idea.
Scanning an ID to get back a confirmation of its validity and not retaining any data in the process isn’t something I have an issue with.
IDK, maybe the FTC should suggest that China may own some of these tracking companies/apps, that would get immediate bipartisan support.
That is a very interesting spread.
Statements from each (notable excerpts)
Chair Lina M. Khan: https://www.ftc.gov/system/files/ftc_gov/pdf/Statement%20of%...
>The data practices of today’s surveillance economy can create and exacerbate deep asymmetries of information—exacerbating, in turn, imbalances of power. And the expanding contexts in which users’ personal data is used—from health care and housing to employment and education—mean that what’s at stake with unlawful collection, use, retention, or disclosure is not just one’s subjective preference for privacy, but one’s access to opportunities in our economy and society, as well as core civil liberties and civil rights. The fact that current data practices can have such consequential effects heightens both the importance of wielding the full set of tools that Congress has given us, as well as the responsibility we have to do so. In particular, Section 18 of the FTC Act grants us clear authority to issue rules that identify specific business practices that are unlawful by virtue of being “unfair” or “deceptive.” Doing so could provide firms with greater clarity about the scope of their legal obligations. It could also strengthen our ability to deter lawbreaking, given that first-time violators of duly promulgated trade regulation rules—unlike most first-time violators of the FTC Act—are subject to civil penalties. This would also help dispense with competitive advantages enjoyed by firms that break the law: all companies would be on the hook for civil penalties for law violations, not just those that are repeat offenders. Today’s action marks the beginning of the rulemaking proceeding. In issuing an Advance Notice of Proposed Rulemaking (ANPR), the Commission is seeking comments from the public on the extent and effects of various commercial surveillance and data security practices, as well as on various approaches to crafting rules to govern these practices and the attendant tradeoffs. Our goal at this stage is to begin building a rich public record to inform whether rulemaking is worthwhile and the form that potential proposed rules should take.
Robust public engagement will be critical—particularly for documenting specific harmful business practices and their prevalence, the magnitude and extent of the resulting consumer harm, the efficacy or shortcomings of rules pursued in other jurisdictions, and how to assess which areas are or are not fruitful for FTC rulemaking. ... At minimum, the record we will build through issuing this ANPR and seeking public comment can serve as a resource to policymakers across the board as legislative efforts continue. ... [categories include (Procedural protections, Administrability, Business models and incentives, Discrimination based on protected categories, Workplace surveillance)]
Rebecca Kelly Slaughter: https://www.ftc.gov/system/files/ftc_gov/pdf/RKS%20ANPR%20St...
>Conclusion The path the Commission is heading down by opening this rulemaking process is not an easy one. But it is a necessary one. The worst outcome, as I said three years ago, is not that we get started and then Congress passes a law; it is that we never get started and Congress never passes a law. People have made it clear that they find this status quo unacceptable.46 Consumers and businesses alike deserve to know, with real clarity, how our Section 5 authority applies in the data economy. Using the tools we have available benefits the whole of the Commission’s mission; well-supported rules could facilitate competition, improve respect for and compliance with the law, and relieve our enforcement burdens. I have an open mind about this process and no certainty about where our inquiry will lead or what rules the record will support, as I believe is my obligation. But I do know that it is past time for us to begin asking these questions and to follow the facts and evidence where they lead us. I expect that the Commission will take this opportunity to think deeply about people’s experiences in this market and about how to ensure that the benefits of progress are not built on an exploitative foundation. Clear rules have the potential for making the data economy more fair and more equitable for consumers, workers, businesses, and potential competitors alike. I am grateful to the Commission staff for their extensive work leading up to the issuance
Alvaro Bedoya: https://www.ftc.gov/system/files/ftc_gov/pdf/Bedoya%20ANPR%2...
>Our nation is the world’s unquestioned leader on technology. We are the world’s unquestioned leader in the data economy. And yet we are almost alone in our lack of meaningful protections for this infrastructure. We lack a modern data security law. We lack a baseline consumer privacy rule. We lack civil rights protections suitable for the digital age. This is a landscape ripe for abuse. Now it is time to act. Today, we are beginning the hard work of considering new rules to protect people from unfair or deceptive commercial surveillance and data security practices. My friend Commissioner Phillips argues that this Advance Notice of Proposed Rulemaking (“ANPR”) “recast[s] the Commission as a legislature,” and “reaches outside the jurisdiction of the FTC.”1 I respectfully disagree. Today, we’re just asking questions, exactly as Congress has directed us to do.2 At this most preliminary step, breadth is a feature, not a bug. We need a diverse range of public comments to help us discern whether and how to proceed with Notices of Proposed Rulemaking. There is much more process to come.
Noah Joshua Phillips: https://www.ftc.gov/system/files/ftc_gov/pdf/Commissioner%20...
>Legislating comprehensive national rules for consumer data privacy and security is a complicated undertaking. Any law our nation adopts will have vast economic significance. It will impact many thousands of companies, millions of citizens, and billions upon billions of dollars in commerce. It will involve real trade-offs between, for example, innovation, jobs, and economic growth on the one hand and protection from privacy harms on the other. (It will also require some level of social consensus about which harms the law can and should address.) Like most regulations, comprehensive rules for data privacy and security will likely displace some amount of competition. Reducing the ability of companies to use data about consumers, which today facilitates the provision of free services, may result in higher prices—an effect that policymakers would be remiss not to consider in our current inflationary environment.1
Christine S. Wilson: https://www.ftc.gov/system/files/ftc_gov/pdf/Commissioner%20...
>Throughout my tenure as an FTC Commissioner, I have encouraged Congress to pass comprehensive privacy legislation.1 While I have great faith in markets to produce the best results for consumers, Econ 101 teaches that the prerequisites of healthy competition are sometimes absent. Markets do not operate efficiently, for example, when consumers do not have complete and accurate information about the characteristics of the products and services they are evaluating. 2 Neither do markets operate efficiently when the costs and benefits of a product are not fully borne by its producer and consumers – in other words, when a product creates what economists call externalities.3 Both of these shortcomings are on display in the areas of privacy and data security. In the language of economists, both information asymmetries and the presence of externalities lead to inefficient outcomes with respect to privacy and data security.
Does no one remember the NSA?
The NSA can also get all the information it wants from companies even while those companies are banned from selling it.