If you're a developer, please consider replacing reCAPTCHA on your site with an alternative. reCAPTCHA discriminates against people with disabilities and those who seek privacy, and it gaslights you into thinking you did not solve the challenge correctly, which is plain cruel.
Here are some reCAPTCHA alternatives: https://www.w3.org/TR/turingtest/
All of the "interactive stand-alone approaches" from that page can be beaten with run-of-the-mill OCR (other than perhaps the 3D challenge) and with almost any mobile phone speech recognition engine (and, if the attacker has the money, the audio can be sent off to Google's cloud speech-to-text).
All of the non-interactive approaches from the page require constant tuning and upkeep to make sure bots aren't able to sign up or abuse systems. They're also not *that* secure if your website is targeted and scripts are written specifically to avoid your anti-abuse methods.
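For context, a typical non-interactive check is just a handful of heuristics like the sketch below (the honeypot field name and thresholds are invented for illustration): a hidden form field that humans never see, plus a minimum time-to-submit. Both are trivially defeated once a script targets your site specifically, which is exactly the upkeep problem described above.

```python
def looks_like_bot(form: dict, seconds_to_submit: float) -> bool:
    # Real users never see the hidden honeypot field, so any value in it
    # means a bot blindly filled every input it found in the HTML.
    if form.get("website_url_hp"):  # hypothetical hidden honeypot field
        return True
    # Humans take at least a couple of seconds to fill a form.
    if seconds_to_submit < 2.0:
        return True
    return False

print(looks_like_bot({"email": "a@b.com"}, 8.0))        # plausibly human
print(looks_like_bot({"website_url_hp": "spam"}, 8.0))  # honeypot tripped
print(looks_like_bot({"email": "a@b.com"}, 0.3))        # submitted too fast
```

A targeted script just learns to skip the hidden field and wait two seconds, so each rule buys you time rather than security.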
"Your computer or network may be sending automated queries. To protect our users, we can't process your request right now".
Is there a solution for this?
It discriminates against people who value their time. Who in their right mind thinks spending several minutes on a captcha is OK?
Ticketmaster uses both reCAPTCHA and a pre-filtering solution they supply based on their own heuristics, as well as a complex user-activity tracking system that determines whether you're a bot based on the activity you present and the traffic you pass. So even if you pass all the CAPTCHAs, they still might tell you to pound sand when you try to reserve something.
In the last few weeks, for select sales, they've even required unique phone numbers to which they will text a code (or call and relay one) that you need to enter just to get a single place in line for a sale.
I'm not aware of any company more actively on the forefront of preventing automated access than Ticketmaster (which makes it kind of funny when everyone chimes in about how Ticketmaster doesn't do anything to prevent brokers from getting all the tickets).
The problem is that what Ticketmaster is up against is people running specialized software that's able to emulate a browser, which ties into services that are specifically designed to beat CAPTCHAs in an automated manner using mechanical turk type solutions, but at a very low cost.[1] I have reliable testimony that some people spin up the largest AWS instance for an hour or so as needed, run this software, use a proxying service, and make 8k connections to queue up for tickets on a sale. Each AWS machine is another 8k positions in the queue. Every new layer Ticketmaster throws into the verification process knocks these people out for a couple weeks, until the company providing the software (which I believe charges a small percentage for every ticket purchased, so they fix problems fast) works around it. The arms race metaphor is very apt.
That's just one of the companies trying to circumvent Ticketmaster's roadblocks for brokers. There are others that try to automate their purchasing to varying degrees. I myself work for a broker that takes a very different approach: we use (relatively) very minimal automation, have a person in front of a browser for every purchase (and we don't have many people at all), and instead try to make select purchases based on complex analysis and lots of data. Even that's gotten much harder in the last few years as venues and promoters have learned to play with the allocations of tickets and hold large chunks of the inventory back to be released later at higher cost. I don't really see anything wrong with that; it's a market response to supply and demand. But it is unfortunately hidden in a purposeful manner, which affects not only brokers but the end consumer, as market information is purposefully obfuscated (which makes the markets less efficient).
I've written on this multiple times before, so if anyone finds this interesting, just do an HN search for my username and Ticketmaster together.
1: https://anti-captcha.com/ (Scroll down and read their animated infographic for what is possibly the most amazing graphical metaphor of this I can imagine at step 4. It's so disturbing it's funny).
That's interesting. Unless you are talking about having to click through more than one "page" of tiles (as illustrated in the video in the OP), I guess I don't run into reCAPTCHA often enough to have noticed this phenomenon. Can you elaborate on what you mean by that?
I second this (for the same reasons that you cite), and it's fresh in my mind as I just recently began reimplementing authentication for my personal CMS. reCAPTCHA is not a nice thing to do to your users. And I also don't want to feed The Beast.
It's good to see some confirmation that you're not insane. Google's ReCAPTCHA is plain EVIL.
Originally it was an awesome solution based on OCR'ing books that usually worked quickly on the first try, and almost never took more than two.
Then it turned into a single checkbox (analyzing mouse movement), so it was even faster... and I remember some simple image-based ones like "select the images of cats" that were also easy to get right. So even better.
But THEN... in the past couple of years, the image matching started asking exclusively for analysis of street images, which has two huge problems:
1) The images are so blurry and ambiguous it's really hard to get right, it feels like a test designed to make you fail
2) You never know how far you have to go -- you keep clicking items, they keep replacing them with new ones, and there's zero indication of whether you're almost done or getting better or worse.
Once I did one for three minutes straight, neither passing nor failing, until I just gave up and left the page... if it's a bug, that should never happen. If that's supposed to be able to happen, that's the apex of asshole design. Either way, it's a failure in every way.
To me it constantly feels like I'm working for Google for free on their AI projects, which is very annoying compared to helping a smaller company OCR books.
1) Computer vision got a lot better over the past few years. It's also become way easier for the average Joe bot operator to run cutting-edge stuff. OCR tasks don't cut it for distinguishing people from machines any more. Every time I see a blog post about a new computer vision architecture or how some random developer trained a neural network to get an X% result on benchmark Y, I think to myself CAPTCHAs are going to get more annoying.
2) The frequency at which most people have to solve a CAPTCHA has gone way down. In the beginning, I remember having to solve a CAPTCHA every single time I did anything on some sites. Now, I can't even remember the last time I had to do more than just check the checkbox. So, the amount of annoyance is amortized over a larger number of sessions, and Google probably feels like they can ask the user to complete more tasks as a result.
On top of that, I think some of the training sets are wrong. Multiple times I've been asked to find traffic signs, but it would only let me pass when including street signs.
This data is a few years old but I imagine it's the same based on my experience.
They're using your cookie + IP + your account data to determine if you're probably a human.
A LOT of reCAPTCHA sites never prompt you. You only notice it's there when you're on Tor or something.
Today I feel like Google uses it mostly for their self-driving-car computer vision projects.
I wish more sites would implement a jigsaw-puzzle-style captcha similar to the Binance login captcha, but I can't speak to its efficacy in defeating bots.
People kept trolling it by typing the test word correctly, and random garbage instead of the OCR word. It was easy to spot which one was which. Source: I was one of these people.
Google is a hypocritical pile of burning . They use bots, right? They scrape websites, they infest everything from my banking website to console emulators with their tracking, and yet we little people are not allowed to scrape or interface with the web programmatically.
I want them to burn so badly, I hope the EU breaks them up. Screw captcha, screw AWP, screw them.
In some cases, the blame should be put on the site runners. I get a ReCAPTCHA when logging into my Patreon account. I've been paying them $10+/month for years now; they should know by now that I'm not a spammer.
they're just discriminating against Firefox users?
At least part of the behaviour shown in the video depends on factors like cookies, IP address, and whether you have features like anti-fingerprinting protection turned on.[1] Recaptcha is frustrating and I dislike it, especially the slow fade-ins and multiple challenges, but if you repeat the test shown in the video you won't find it 100% repeatable just because you're using Firefox.
[1] https://github.com/google/recaptcha/issues/268#issuecomment-...
Their discrimination against FF users has been fairly evident over the past year or so.
It's amazing how my identification abilities improve exponentially by using Chrome instead of Firefox.
While filing taxes, on several occasions I had to just give up and try again hours later, because the captcha wouldn't let me through, and after several attempts TurboTax would throw an error telling me to come back later.
It was literally a nightmare.
And I never figured out how to solve the traffic light riddle.
But I can't figure out why they add a delay. Why not just show the next damn image?
For a fair comparison, OP would need to use clean browser profiles on fresh IPs. As it is, this is just fan service for Google Captcha victims (like me).
It felt like staring into the soul of evil.
One thing for the antitrust probe, if it ever gets there, to look at would be how Google shares data between its browser, Chrome, and its other services.
Do other non-Americans get this as well?
The captchas are completely non-localised as far as I can tell; as others have pointed out, the "storefronts" tend to be non-American.
I’ve noticed that in the last week, Google no longer provides a link to the non-AMP version of pages. Previously, two button taps would get you to the non-AMP page, but now that ability has been removed. This sucks because AMP doesn’t always support all the features of a normal site, like commenting on Reddit or blogs.
I worry how Google will abuse this in the future. Right now they control the first page you visit after leaving Google through AMP, but you can usually find a link to the home page of a site. In the future, they may restrict it further.
"Speed Up Google Captcha"
"Makes Google Captcha works faster by removing slow visual transitions and unnecessary delays."
https://greasyfork.org/en/scripts/382039-speed-up-google-cap...
You're moving too fast; your mouse movements and clicks are "too good" to be human. Try solving the reCAPTCHA more slowly and you'll see wildly different results, or purposely fail one reCAPTCHA to get easier ones.
reCAPTCHA tech is crazy; reCAPTCHAs are not simple web forms and JavaScript, they're a sandboxed and monitored 'window' to a Google server. If you solve too many reCAPTCHAs too quickly (i.e., when you are testing a web page, or rotating your passwords on many websites), then Google's servers will try to rate limit you with slow animations and harder reCAPTCHAs.
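Nobody outside Google knows the real signal set, but the "too good to be human" idea can be illustrated with a toy heuristic: a script firing events on a fixed timer produces suspiciously regular inter-event gaps, while humans are sloppy. The threshold below is invented, not anything Google publishes:

```python
from statistics import pstdev

def timings_look_human(intervals_ms: list[float], min_jitter_ms: float = 15.0) -> bool:
    # Humans are sloppy: gaps between mouse/keyboard events vary a lot.
    # Automation driven by a fixed timer has near-zero variance.
    return pstdev(intervals_ms) >= min_jitter_ms

print(timings_look_human([120, 340, 95, 510, 230]))   # jittery, human-like
print(timings_look_human([100, 100, 100, 100, 100]))  # metronomic, bot-like
```

The flip side is exactly the complaint above: a fast, precise human can land on the wrong side of a threshold like this.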
Google should absolutely not be in a position where it can be inadvertently rate limiting your attempts to rotate passwords on different websites across the internet.
1) Try to login
2) Login doesn't show up--go to uMatrix and whitelist some crap.
3) Try to login again.
4) First phase of login completes, now blank when site tries to load Google captcha.
5) Whitelist Google captcha frames in uMatrix and reload again.
6) Login for the third time, Google captcha now displays properly.
7) Spend 10 minutes solving captchas. If I'm lucky, the first "Verify/Submit" will work. If not, I probably need to whitelist cookies for it within uMatrix and reload/try again.
8) Get notification from HumbleBundle that "You have not logged in from this browser before" and wait for a Verification email to hit my inbox.
9) Enter verification code. Site usually then logs me out for some reason, even though it was successful.
10) Login again. Solve Google Captchas again. Finally allowed to login.
11) Finally buy the goddamn thing I was there to buy.
12) Search Amazon for wig.
Edit: This is a joke, I am joking.
Aside from the obviously concerning censorship that happens if you try to access reCAPTCHA-locked sites over Tor, it is literally forcing internet users to do free labour for Google so that it can train its AI for whatever project they're doing.
So not only is it a tax on using the internet (paid in seconds to minutes of human existence each time -- I bet reCAPTCHA has collectively cost humanity thousands of lifetimes of wasted effort solving stupid puzzles) and a source of censorship, it is also an act of charity on our part: we provide Google free work with no benefit to ourselves. Given that they literally pay people to do (something similar to) what we are doing for free, I wonder if there are labour law arguments to be made (we aren't paid anything for this work, which Google is clearly willing to employ people to do).
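The "thousands of lifetimes" claim is easy to sanity-check with a back-of-the-envelope calculation. Every input here is a guess, not a measured figure, but the order of magnitude is plausible:

```python
# Assumed inputs -- all guesses, none are measured figures.
solves_per_day = 200_000_000   # guessed image challenges served globally per day
seconds_per_solve = 15         # guessed average time per image challenge
years_running = 10             # rough lifetime of image-based reCAPTCHA

# A "waking lifetime": 70 years at 16 waking hours per day.
waking_lifetime_s = 70 * 365 * 16 * 3600

total_seconds = solves_per_day * seconds_per_solve * 365 * years_running
lifetimes = total_seconds / waking_lifetime_s
print(f"{lifetimes:,.0f} waking lifetimes")  # on the order of thousands
```

Even cutting every guess in half a couple of times still leaves the total in the hundreds of lifetimes.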
I solved the problem by using an extension that toggles that flag: https://addons.mozilla.org/en-US/firefox/addon/toggle-resist...
I was thinking maybe something that has 10 different Google sessions and shards them depending on the website, deciding which to send to the Captcha. You'd build reputation at 1/10th the speed, but you'd still potentially build it. Or, one that allows you to create a random Gmail account and then use that as your identity across the different sites. Perfect privacy would be hard, but improved privacy should be doable.
Alternatively, getting something like blinded identity tokens widely used would be good.
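The session-sharding idea could be as simple as hashing the site's domain to deterministically pick one of ten isolated cookie jars, so each site always sees the same (partial) identity. A sketch, where `NUM_SHARDS` and the shard-as-index abstraction are invented for illustration:

```python
import hashlib

NUM_SHARDS = 10  # ten separate Google sessions / cookie jars

def shard_for_site(domain: str) -> int:
    # Deterministic: the same site always maps to the same session shard,
    # so reputation still accrues there -- just spread 10 ways overall.
    digest = hashlib.sha256(domain.encode()).digest()
    return digest[0] % NUM_SHARDS

print(shard_for_site("news.ycombinator.com"))  # always the same index for this domain
```

A browser extension would then route the reCAPTCHA iframe's cookies through the jar that the shard index selects.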
2016-2019: Worked for Google, analyzing street footage to implement AI for self-driving cars.
Maybe I should also invoice google for the effort.
It makes me sad that they are so pervasive or I would categorically refuse to engage with any site that uses reCaptcha.
This whole captcha joke and firefox made me hate Google more than anything else.
If it's your bank's site, move to another bank. You say, "oh, it's a lot of work just for some captcha"; yes it is, but this is the only way these clowns will learn. When 1000 people leave a bank for a competing one and say "I left because your site employs captcha", it will magically disappear. I've seen it happen.
For reference I post regularly on 4chan (not compulsively but maybe a dozen comments a day on average) and if you don't have a pass you have to fill the captcha every time. I only use Firefox. I definitely experienced what this video shows on Firefox in the past (the super-slow loading images) but it felt more like a bug than anything else and it doesn't represent the typical experience. Maybe I tripped one of Google's bot filters somehow and I ended up with a reinforced captcha, or there was a bug somewhere.
The Chrome section of the video is a lot closer to what I see usually, but they make me go through two challenges in a row typically (although that might be 4chan's settings at play).
I'm all for the Chrome hate if it means that people switch to Firefox but I think we need harder data than a short video to call shenanigans on that one.
Off topic rant: the fact that a post with such lack of substance manages to reach 700 votes in 3 hours is frankly depressing, it has no place on this website IMO.
The starting level, I suspect, is heavily influenced by browser settings and many other factors. With that in mind, and assuming that
1) trust inversely correlates with anonymity,
2) people using Firefox tend to be more tech-savvy and careful about their privacy, and
3) tech-savvy people using Chrome probably won’t bother locking it down, since it “talks to Google anyway”,
I’d be disinclined to believe Google actually discriminates against browsers—no matter how compelling a narrative this may seem—until I have a complete picture of OP’s setup (from browser settings to OS and connection).
[0] Last year there was a period I was getting many captchas (either my location or AWS VPN caused me to be considered “untrusted”); I actively tried to figure out how to get past it without giving the algorithm what it wants, so I could go through a dozen of these captcha screens in one browser window. I use Safari, Firefox and Chrome routinely.
When logging into an account I needed to log into, maybe a couple of years ago, they'd jerk me around in the manner of this grumpy.website example, but worse. One time it went on through several topics, for what seemed like around 10 minutes. I pay money for that account.
This obnoxious annoyance is on top of the offense of a company letting third-party code from a mass-surveillance company not only into their pages (which almost every company with a web site does, sadly) but into their authentication page. Much more important services on the web do not need captchas for logins to accounts that were paid for. Now, every time I log in to the account I pay for, I get hassled and directly leak that info to a surveillance company. It makes me regret paying money for the account; it's like the company is oblivious or doesn't care, and I won't have much loyalty when the right competitor appears.
On a different note, this also makes it difficult to use such websites if you block google domains in your adblocker for non-Google sites.
I honestly think this was the reason why Captcha's bot was so passive-aggressive :D
It’s because Google can’t read as much about you in more privacy-focused browsers, so you have to prove yourself.
Not saying it’s right, but that’s the reason. It needs to be changed.
We've seen this before. We'll probably see it again.
Here's an extension to use those services in the browser so you never have to solve one again: https://addons.mozilla.org/en-US/firefox/addon/recaptcha-sol...
That's assuming you can't get Buster to work.
But in fairness to Google, the promise of their new Captcha system is that it uses all of your previous browsing history across the web to determine how likely you are to be a bot. You can't do a fair apples to apples comparison unless the browsing history and behavior is the same across both browsers.
1) https://www.onlineaspect.com/2010/07/02/why-you-should-never...
I keep seeing reCAPTCHA installed on very low security sites that don't seem like targets for automated bots. I'm wondering if they have some external incentive to install it.
And by the way, I hate reCaptcha. Is it really the only option for fighting spam? When I see it on sites like DHL parcel tracking, I get mad. I always ask why. Can't they just block suspicious traffic, or at least not display a captcha on the first attempt?
I get the first few selections right, so the algorithm knows I'm trustworthy. Then I purposefully get the last ones wrong. This way, I'm still validated by the captcha and I get to show the middle finger to Google.
Now I smile every time I'm faced with reCaptcha :)
Highly recommend. It does take some time to figure out the patterns (when to get it right and when to get it wrong), but once you do, it just works.
This is why Google should be broken up -- it should be forced to spin off Chrome into a separate company with a business model similar to what Firefox has.
misleading
I am working on a micro-payments system (based on mutual credit) that should allow you to pay something like $0.001 instead of solving a captcha. If this introduced zero extra friction, would you consider using this kind of solution over a traditional captcha?
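To make the question concrete, here's a toy sketch of the mutual-credit idea: every account starts at zero, balances may go slightly negative up to a credit limit, and a page view moves $0.001 from visitor to site. The class name, credit limit, and amounts are illustrative, not the parent's actual design:

```python
class MutualCreditLedger:
    def __init__(self, credit_limit: float = 0.10):
        self.balances: dict[str, float] = {}  # account -> balance, implicitly 0
        self.credit_limit = credit_limit      # how far negative an account may go

    def pay(self, payer: str, payee: str, amount: float) -> bool:
        payer_bal = self.balances.get(payer, 0.0)
        if payer_bal - amount < -self.credit_limit:
            return False  # payer has exhausted their mutual credit
        self.balances[payer] = payer_bal - amount
        self.balances[payee] = self.balances.get(payee, 0.0) + amount
        return True

ledger = MutualCreditLedger()
print(ledger.pay("visitor", "site", 0.001))  # True: visitor is now at -0.001
```

The anti-abuse property comes from the credit limit: a spammer burning through thousands of requests runs out of credit, while a human browsing normally never notices the cost.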
Funny thing is, I haven't used Chrome in months, so it should be the other way round!
If you’re primarily trying to stop bots and similar take a look at https://www.kasada.io/
Site owners can choose not to use Google's reCAPTCHA v2, but it has become the de facto standard now, so no one cares.
I'm not sure whether I'm glad to find out it's (also? only?) because they hate Firefox.
Also, it's good to see that it's a more widespread issue with these captchas; I somehow thought I was just bad at solving them :)
https://developers.google.com/recaptcha/docs/v3
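For reference, v3 is score-based rather than challenge-based: your server posts the user's token to Google's `siteverify` endpoint and gets back a JSON verdict with a 0.0-1.0 score, and you choose your own threshold. A sketch of the server side (the 0.5 threshold and the "login" action name are choices the site makes, not Google's):

```python
import json
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def fetch_verdict(secret: str, token: str) -> dict:
    # Server-to-server call; never expose the secret key to the browser.
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        return json.load(resp)

def allow_request(verdict: dict, expected_action: str, threshold: float = 0.5) -> bool:
    # v3 never shows a challenge: you just gate on the returned score.
    return (verdict.get("success", False)
            and verdict.get("action") == expected_action
            and verdict.get("score", 0.0) >= threshold)

# Example verdict shaped like the documented siteverify response:
sample = {"success": True, "score": 0.9, "action": "login"}
print(allow_request(sample, "login"))  # True
```

Note this makes the tracking concern in this thread worse, not better: the score is computed entirely from how human your ambient behavior looks to Google.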
Of course, you need to have cookies enabled.
If you use any browser in incognito mode and/or use a VPN or Tor, you are going to get the persona non grata treatment, because it is likely your source network and IP address have caused a lot of problems before. The only way around it is to have some permacookie in your browser saying you are a good citizen.
Has anyone posted a technical analysis of the changes? I’d love to read more about it.
Maybe it's because I don't use uMatrix (I only use uBlock Origin)? Maybe it's because I'm always logged in to at least one Google account?
That's likely a primary reason.
Does this mean that Google knows enough about me (i.e., a privacy leak) that it's choosing not to show me the infuriating UI?
I feel like every captcha is about a street scene of some sort... house numbers, cars, motorcycles, hydrants, stop lights etc.
After reading the comments in this thread, I now realize this is an intentional thing against Firefox.
Damn, Google. What happened to your "Don't be evil" beginnings?
That said, it still forces you to do work for its self-driving car effort.
It's thanks to reCaptcha that I know what a 'crosswalk' is.
- is generally easier to solve (download the sound clip using curl or wget, type in the nonsense it says, done)
- does not turn me into a mechanical Turk training Google's AI
- works in 'any browser' by circumventing the browser (by using wget/curl), thereby not allowing Google to punish me for not using their dragnet/browser.
I filed a bug report; only one version of it was fixed, and later versions just displayed the same old pages.
It's not Firefox that's the problem; reCAPTCHA works just fine on Firefox. It's all those anti-tracking measures you installed and enabled -- they work by making your browser indistinguishable from a low-quality bot, kicking the website into self-defense mode. The slow fade is a rate-limiting measure. It's annoying to you, but it's more annoying to people trying to automate login attempts.
The site is attempting to protect your account by preventing automated attacks against it. Meanwhile, your browser is doing its best to look like a shell script, refusing to send any sort of behavioral feedback or distinguishing characteristics that might give away the fact that you're a human.
So the question is: is it really worth alienating those quirky, paranoid users who take extraordinary anti-tracking measures, just to protect your normal users from automated attacks?
Yes.
Of course it is.