WebAuthn is an open standard. It's a way for you to prove to a website that you have a specific private key. There's no lock-in, because the key is portable (unless you don't want it to be). There's no privacy issue, because the key is unique per website. There's no security issue, because it's unphishable and can be unstealable if it's in hardware.
If you don't like Google or Apple, use your favorite password manager. All it will have to keep is a private key per website, and you're done. No usernames or passwords. You visit a site and are automatically logged in with a browser prompt.
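The core flow is simple enough to sketch. Below is a toy Python simulation of the idea: one credential per site, proven via challenge-response, so nothing reusable ever crosses the wire. To be clear, real WebAuthn uses asymmetric signatures (e.g. ES256) and the server only ever holds the public key; the HMAC here is a standard-library stand-in so the sketch actually runs.

```python
# Toy simulation of WebAuthn's core idea: one secret per website,
# proven via challenge-response. NOTE: real WebAuthn uses public-key
# signatures; HMAC stands in here only so this runs with the stdlib.
import hmac, hashlib, secrets

class Authenticator:
    """Holds one secret per relying party (website); secrets never leave."""
    def __init__(self):
        self._keys = {}  # rp_id -> secret

    def register(self, rp_id):
        self._keys[rp_id] = secrets.token_bytes(32)
        # In real WebAuthn, only the *public* key would be handed out.
        return self._keys[rp_id]

    def sign(self, rp_id, challenge):
        return hmac.new(self._keys[rp_id], challenge, hashlib.sha256).digest()

class Website:
    def __init__(self, rp_id):
        self.rp_id, self.credentials = rp_id, {}

    def enroll(self, user, authenticator):
        self.credentials[user] = authenticator.register(self.rp_id)

    def login(self, user, authenticator):
        challenge = secrets.token_bytes(32)  # fresh randomness per attempt
        response = authenticator.sign(self.rp_id, challenge)
        expected = hmac.new(self.credentials[user], challenge,
                            hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)
```

Because each site gets its own independent random secret, a breach at one site reveals nothing about your credential anywhere else, and no two sites can correlate you by credential.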
This is amazing; it's the best thing that's ever happened to authentication. It's a credential that can't be stolen from the end user. Can we be a bit more excited about it?
EDIT: If you want to try it, I just verified that https://www.pastery.net/ works great with Passkeys even though I haven't touched the code in a year.
That means that django-webauthin also works great with Passkeys, for you Django users:
https://pypi.org/project/django-webauthin/
Also, the latest Firefox on Android seems to work great.
There is a certain fiddling-while-Rome-burns quality to this comment. The blog post is not about the open standard, it explicitly focuses on a specific company's products. People are naturally worried about this even though the standard may be open, because we are at historically high levels of platform lock-in from megacorps. Gmail is the new "Blue E". Getting locked out of your Google account in 2022 is probably much worse than not being able to use a different browser in 2001.
Sure, HTTP is also an "open standard". How many real browsers exist that can play DRM-encumbered media? You'll find that the answer is "very few – basically anything made by Apple, Google, or Mozilla" (perhaps Brave as well, which has an ex-Mozilla founder and uses Google-funded tech).
The best way to get people to adopt the open standard is to actually showcase uses of it that are not just a single company's product, not call them names for being worried about lock-in.
1. Absolutely nobody currently uses WebAuthn. I'm extremely excited about the popularization, which, unfortunately, requires big players to get behind it.
2. The comments feel very "oh my God this car I'm driving is heading towards the edge of the cliff even faster". Don't use Google, they're unreliable and evil. While I'm very excited about Passkeys popularizing WebAuthn, I don't think anyone should ever rely on Google for authentication, so just don't use them and use Bitwarden instead, if/when it supports Passkeys.
A vendor coming up with an implementation of a great standard isn't the problem; the fact that people use it is.
Unless the service you are trying to use requires that you use a particular model of authenticator, which the service provider can enforce via attestation.
Android's implementation of passkeys currently does not support attestation. https://groups.google.com/a/fidoalliance.org/g/fido-dev/c/nh...
Neither does Apple's, I believe.
So there seems to be no way for services to stop you from using a password manager with passkeys.
If you ask me what's one thing in 2030 people will look back on and say "I can't believe we did that!" I'd have to say "passwords". Passwords need to die and WebAuthn is a great step forward.
But push Google/Apple about solving the mud puddle problem[0] and it's curiously missing from their wallet implementations, and they stutter around it when people ask at their talks about FIDO2 and such. It's the lock-in direction they see everyone heading towards that makes people uncomfortable.
[0] https://blog.cryptographyengineering.com/2012/04/05/icloud-w...
> There's no security issue, because it's unphishable and can be unstealable if it's in hardware.
You mean, you can (in theory) choose whether you'd rather have lock-in or a security issue. The two properties are mutually exclusive; you can't have both at the same time.
It does not work with Chrome on Android.
There is a real privacy issue if online services now force you to store your "passwords" on a device, whether that's your phone or a password manager.
> Unless I can back it up and import it into a new device from a competitor, then there is no way I am going to use this unless forced. I do not trust one company anymore.
Which is the same sentiment as this thread. The first comment was just talking about the open standard behind Apple's implementation and the weakness of 2FA loss/recovery.
If it can be backed up, then a casual bystander or rogue process can also "back it up" and filch all of your credentials in a few moments, with you being none the wiser.
The protocol is open, so I can use one proprietary key from company A, one from company B, and a few open source keys. Keep one for regular use and the rest as backups.
Especially when that company is Google.
Now I'm not sure whether they can help you unlock your Apple ID if you prove to them that you're the owner of the account, but I can at least visualize Apple having the scale to do that.
Google, on the other hand, has a horrendous reputation for locking people out of their accounts totally and permanently. Of course everyone has concerns about handing all their account login responsibilities to a company with such terrible customer service.
Apple will happily tell you that if you want grandma to have a whatever-color text bubble, you should buy her an iPhone, to name a recent example, rather than adopt the standard everyone else is using. I bought a MacBook last holiday season and couldn't even set it up until my wife set up her iPhone on my account to activate the laptop. I bought my wife an Apple Watch last week and briefly thought of getting myself one, but you can't use it without an iPhone (they have a "kids" feature where a parent can activate it, but it basically has no smarts at that point AFAIK).
So, for making an authentication standard, I'd trust Google over Apple.
As an Apple user I find this as frustrating as it is wise. Mostly for future-me who may one day not be as savvy and manage to screw myself.
I don’t fully trust iCloud Keychain and Apple to never lose my data in a “I don’t concern myself with backups” manner. So I opt for using Passkeys where I can also add my FIDO2 tokens.
> Google on the other hand has a horrendous reputation for
Neither claim is entirely true or false, but both are definitely exaggerations. All you're doing is displaying personal bias by giving one of them the benefit of the doubt. Apple too has a reputation for locking people out, and is well known for turning data over, but it's one HN in general prefers to ignore.
They are solving a very real problem. WebAuthn uses private keys, but those private keys are tied to the device where they were created. This is a blessing and a curse.
It's a blessing because it eliminates a whole class of phishing attacks. After all, if no one can get the private key, they can't steal or share it. Of course, they could steal the actual device, but that's orders of magnitude harder than stealing online credentials (see https://haveibeenpwned.com/ ). That's a good thing.
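The unphishability comes from the browser binding every assertion to the origin it was made on, so a look-alike domain can't replay your login. A simplified sketch of the relevant server-side check (the clientDataJSON field names follow the spec; everything else is trimmed away, and a real verifier also checks the signature over this data):

```python
# Sketch of the server-side check that makes WebAuthn phishing-resistant:
# the browser embeds the true origin in clientDataJSON, and the relying
# party rejects any assertion produced on the wrong site. Simplified.
import json

def verify_client_data(client_data_json: bytes, expected_origin: str,
                       expected_challenge: str) -> bool:
    data = json.loads(client_data_json)
    return (data.get("type") == "webauthn.get"
            and data.get("origin") == expected_origin
            and data.get("challenge") == expected_challenge)

challenge = "server-issued-random-challenge"  # base64url bytes in practice
good = json.dumps({"type": "webauthn.get",
                   "origin": "https://example.com",
                   "challenge": challenge}).encode()
phish = json.dumps({"type": "webauthn.get",
                    "origin": "https://examp1e.com",  # look-alike domain
                    "challenge": challenge}).encode()
```

The user can't be tricked into "typing the password on the wrong site" because there is nothing to type: the browser fills in the origin itself, and the phishing origin simply fails verification.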
It's a curse because the same person logging in from their iPad, Android phone, and desktop PC needs to set up WebAuthn three times, for each domain/website (broadly speaking). If they only set it up once and lose the device, well, they are either locked out or need another means of account recovery (username/password, calling a customer service rep).
This curse is what passkeys managed by Apple/Google are attempting to solve.
I believe the WebAuthn Level 3 draft is going to try to address some of this: https://www.w3.org/TR/webauthn-3/ but that's based on what a co-worker said; a quick scan didn't turn up anything.
If you want to know more about WebAuthn, I wrote a lot more here (my company is going to release an implementation Real Soon Now): https://fusionauth.io/learn/expert-advice/authentication/web...
To clarify I am not talking about the issue of syncing the device's private key. I am talking about the artificial problem these walled gardens are creating by having every single domain getting its own randomly generated private key. The only practical way to keep all of these randomly generated keys synced across multiple devices is to use the "cloud".
If instead the per site key was generated using a private key and the domain name, users would only need to transport that one private key to another device and would get syncing for free without the requirement of the "cloud".
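A minimal sketch of that derivation scheme (my own illustration of the proposal, not how passkeys actually work today; it uses an HKDF-style extract-and-expand built from HMAC-SHA256):

```python
# Derive each site's key deterministically from one master secret plus
# the domain name, so syncing the single master secret syncs everything.
# NOTE: passkeys do NOT work this way; each credential is independently
# random. This is only a sketch of the alternative described above.
import hmac, hashlib

def derive_site_key(master_secret: bytes, domain: str) -> bytes:
    # HKDF-style: extract a pseudorandom key, then expand per-domain
    prk = hmac.new(b"site-key-v1", master_secret, hashlib.sha256).digest()
    return hmac.new(prk, domain.encode() + b"\x01", hashlib.sha256).digest()

master = hashlib.sha256(b"example master secret").digest()
k1 = derive_site_key(master, "example.com")
k2 = derive_site_key(master, "example.org")
```

The flip side of determinism is that leaking the one master secret leaks every site's key at once, which is presumably part of why the standard went with independent random keys per credential.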
Or just get some Yubikeys.
Instead, each website that accepts passkeys should allow you to register multiple devices and probably print out backup codes as well (for the especially important accounts).
If there's no reason to migrate anything then lock-in is irrelevant. Just add more login methods so that when you lose some, you have others.
With backup devices, whenever I upgrade or replace a device, I need to go to each of the 500+ online accounts and register the new device. This is much more work than a quick login to each site via my password manager (which can happen on-demand, only as I need to use the services).
Websites already have a hard time getting users to sign up, so requiring them to enroll backup authenticators (which they won't have) is not going to work. Printing or writing down backup codes is even worse from a UX point of view.
IIRC the spec has a flag hinting that the passkey is backed up (in iCloud or your Google account), so the relying party (website) knows whether the credential is synced; but that means the secret doesn't stay on your device and goes to the mothership. Then I don't see why the spec wouldn't standardize transferring secrets from one company to another.
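For the curious, that hint lives in the flags byte of the authenticator data: the BE (backup-eligible) and BS (backup-state) bits. A sketch of how a relying party might read them (bit positions as I understand the Level 3 draft; worth double-checking against the spec):

```python
# Parse the fixed-length prefix of WebAuthn authenticator data:
# 32-byte rpIdHash, 1 flags byte, 4-byte big-endian signature counter.
# BE/BS bit positions follow my reading of the WebAuthn Level 3 draft.
import struct

FLAG_UP, FLAG_UV = 0x01, 0x04  # user present / user verified
FLAG_BE, FLAG_BS = 0x08, 0x10  # backup eligible / currently backed up

def parse_auth_data(auth_data: bytes) -> dict:
    rp_id_hash, flags = auth_data[:32], auth_data[32]
    (sign_count,) = struct.unpack(">I", auth_data[33:37])
    return {
        "rp_id_hash": rp_id_hash,
        "user_verified": bool(flags & FLAG_UV),
        "backup_eligible": bool(flags & FLAG_BE),
        "backed_up": bool(flags & FLAG_BS),
        "sign_count": sign_count,
    }

# A synced passkey would typically set UP | UV | BE | BS:
sample = bytes(32) + bytes([FLAG_UP | FLAG_UV | FLAG_BE | FLAG_BS]) \
         + struct.pack(">I", 0)
```

So a website can tell the difference between a device-bound credential (BE clear) and a synced passkey (BE and BS set), and decide whether to insist on additional backup methods.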
What drives me nuts is how little discussion of this I've seen. People don't even seem aware of the implications of it. It's being pushed hard as a boon to security, which it is in some cases, but at a cost that nobody is even considering or talking about.
The implications are pretty profound: large companies having the power to lock you out of everything on a whim (even your own systems and unaffiliated third party services), levy taxes on the use of everything (e.g. Google starts charging you or sites to log in with Google), surveil literally everything (including logging into everything you have as you and sucking down data), and if a big identity provider gets seriously hacked it'll be an epic security apocalypse. Imagine someone stealing the master keys for a provider and pushing ransomware to millions of companies at once.
... and don't forget the obvious: "Oops I got locked out of Google and now I'm locked out of 50 SaaS services, my company's bank, my VPN, and my remote servers."
It just totally blows me away that these systems have no privacy protection at all, no portability provision built into the protocol for me to select or change my provider, no built-in support for a third auth factor that I control (e.g. FIDO2), no built-in provision for recovery codes, and so on. These things didn't even seem to be considered in the design of OpenID/OIDC. It's just a big "oh hey, let's give god-level access with no recourse to third parties and implement it so there's total lock-in... what could go wrong?"
Edit: yes some well-implemented systems offer their own built-in support for some of those things (recovery codes, changing your auth provider, reverting to password, etc.) but in my experience it's a minority and there is obviously nothing in the standard to encourage it or provide any guidance on how to do those things securely.
Why is everyone yelling about the sky falling down when this is the best thing to happen to authentication since ever?
If they could just use their fingerprint/faceID to login (after initial registration on the device) they would be super happy.
Rest of us should be happy there will be less exploits where people give up the keys to their kingdom by clicking on a random email.
At no time am I even likely to rely on Google for anything this important; every other week there's a thread about Google killing off accounts for no reason. No way would any sane person allow Google access to this with their track record. And this isn't even considering my suspicion that Google only wants to "help" with this so you're locked into their services and they are better able to track your activity.
https://blog.1password.com/1password-is-joining-the-fido-all...
If Google did an about face and started providing reasonable escalation mechanisms for when they lock you out of your account based on a faulty decision of their algorithm I'd consider it.
More or less the same, except that I haven't found good TOTP solutions for the desktop along the lines of KeePass (something that can run on Windows/*nix instead of making me use something like FreeOTP, Google Authenticator, or other Android/iOS apps; or in addition to the mobile apps).
That said, even with multiple Google accounts for different things (e.g. personal e-mails, file storage, cloud services etc.) it feels like eventually you might want something like Qubes OS, another way to run multiple separate VMs, or just use separate devices for separate use cases.
Much like how some orgs have separate laptops for accessing prod environments, that are more tightly controlled, even though that's not convenient enough for most people.
https://github.com/browserpass https://www.passwordstore.org/
It works pretty well with pass (the password manager), which stores each entry in a GPG-encrypted file. GPG is a pain, but if you happen to use it already, it works.
I’m curious though, what’s preventing you from using a password manager on your phone? I use KeePass, and I’m able to use my password DB on any device I want.
Also, are you worried about the security of the DB on your phone? My password DB's passphrase is a good 50+ characters long, which I can type quickly on my laptop, but I can't imagine pecking that out on a phone. And I feel like I would not want the DB unencrypted/unlocked all the time on my phone; given the possibility of my losing it or it getting stolen, I'd want it to re-lock immediately after each use.
Granted, Apple has fewer services and less surface area for your account to get banned, but it's still a valid concern.
"Only on the user's device", right.
Of course, one doesn’t need to utilize this, but you’re SOL without a recovery mechanism of last resort (unless individual sites and services have their own recovery processes to re-provision a user who no longer has access to their cryptographic credentials).
For all the talk of "one app to rule them all" (which is an awful idea) this is a step closer to that.
For all its faults, crypto got one thing right: not your keys, not your stuff. I get that managing keys/passwords is hard, but the best thing in the long run is for them to stay in the hands of the user.
And if not, the holder of the keys needs to be someone you can easily hold accountable, i.e. either fire, or arrest, or sue if they get it wrong.
Erm, this isn't really an aspect of cryptocurrency, per se. It's more of a general rule that informed the initial thinking around cryptocurrency. In fact, most users of cryptocurrency seem quite content to give up cryptographic custodianship.
If you went back a similar time to the nascent web/cloud/etc, you'd find plenty of similar sentiment about remote software and storage. It's just that individual autonomy loses out over time due to convenience created by the massive investment in the surveillance economy.
I use Krypton but that's not maintained (and already broken on some websites like Github). I trust the secure storage module of my phone and I trust my computer's TPM, unlike many other Linux users; surely it should be possible to integrate with the OS somehow to make it secure, right? The last example I saw used USB over IP to inject a virtual FIDO device, which works great, but the implementation is clearly not ready for prime time.
How do you back them up locally?
The code is pretty simple and a good place to start (as is AuthCompanion) if you wanted to roll your own library in your language of choice, or something very custom. I found both useful recently.
> Since passkeys are built on industry standards, this works across different platforms and browsers - including Windows, macOS and iOS, and ChromeOS, with a uniform user experience.
I see no mention of Linux in these examples, which tells me that users having access to their keys is not a primary concern for these implementations?
https://github.com/bulwarkid/virtual-fido/ https://news.ycombinator.com/item?id=32881956
Since I already use a phone capable of doing the same thing, let my phone be my main authenticator, and then I can use a Yubikey as a backup.
It's not like one is necessarily better than the other, except that you already carry a phone, and phones are capable of acting as a hardware device that works with WebAuthn. No need to carry a second device, or pay for one for that matter, since at least with Apple's solution it'll sync over iCloud Keychain.
If you're happy with Yubikeys, nothing changes. But for the average person, this makes WebAuthn an option without having to buy any hardware, or carry something they're more likely to lose because they don't understand the intricate details of how the thing works. I wouldn't expect my parents to understand a Yubikey well enough to know it should be used as a pair for backup purposes, but that's a barrier to entry they don't need to worry about now.
Once passkey support comes to bitwarden I'll be a little more comfortable I think.
Thus, unlike a FIDO2 key, you don't have to visit every online service to tell it about the new redundant keys you add.
The rest of the security article linked by madjam002 goes into detail how Google implements their version of that backup. It's a bit like Keybase in the sense that your other devices act as keys to unlock the backup for new devices.
Yubikeys are great but they're super niche. Among Android users alone there might be a billion people who will never buy one.
Right now, I offer a classic login and a few social providers. You'd think this is straightforward to support, but about 70% of support requests consist of the endless ways in which users can mess this up.
"Can't get in"
Try recover password. Email didn't come. Because they entered the wrong email. Correct email this time. No wait, think I signed up with a social account, not sure which one, have many. Login worked. Wait now it doesn't again (saved browser password did not update).
This is just the tip of the iceberg. This new solution, whatever merit it has, is going to be additive. It won't replace anything, it's yet another way to log in, if at all, as it depends on websites implementing it and about 90% of the web is basically not maintained.
So it's only adding complexity/confusion specifically to these users, which I consider to be the vast majority. In turn leading to more support headaches.
Passkey? What's that? New word thus meaning unclear.
Doesn't seem to ask for an actual pass-anything, so more confusion.
No email identifier or thing to remember. How can I now log on at my other device?
With a QR code? What on earth is that?
I believe in letting the user choose whatever way is best for them to log in, and in taking that burden off of the developer. If you want to learn more, check out the Show HN post on Hellō I wrote this morning. https://news.ycombinator.com/item?id=33177705#33182379
For a while this was largely built around XMPP but now the stock Google implementation is custom.
I'd love a refresher crash course on what's in Chrome that's not in Chromium. It's been a long time since I used Chromium but I think when I did it seemed to have a as-best-I-could-tell working Google Sync implementation.
It's hard to imagine a scarier project to fork. I don't think there are a lot of resources out there for DIY'ing a Chromium fork.
https://github.com/w3c/webauthn/issues/1255 https://github.com/w3c/webauthn/issues/1616
https://www.imperialviolet.org/2022/09/22/passkeys.html https://news.ycombinator.com/item?id=32946750
[1] https://phys.org/news/2005-12-biometric-expert-easy-spoof-fi...
The first thing to remember is that your fingerprints / face scan are not the identifier for your passkey. They are used by the local device to unlock its secret store but the actual keys are regular crypto keys and the remote website never sees any of them. The interface also does not provide access to the private keys ever, and it should be rate-limited so it's not “get all of the keys” but the much slower “use the phone I stole to hammer out requests to different websites, mashing that sensor every second or two”. That means that whoever stole your phone & forged your biometrics is in a race with you revoking their access, but when you do it won't matter that they have your biometrics unless they can also steal your new phone (stop pissing off the Mossad).
The other thing to consider is what your threat model is. If you're worried about someone stealing your phone and building a realistic model of your fingerprint or scan of your facial structure, you have to ask what the alternatives are. For example, it'd be a LOT easier for an attacker to use a hidden camera or drone to record you entering your password — not using biometrics means you're typing it frequently, for example — and you're also at risk for all of the scenarios which passkeys are immune to (credential reuse, phishing, weak passwords), which happen to be by far the most common way people are compromised. Very few of us have to worry about targeted attacks by skilled adversaries, and if you are worried about that you probably need to move or hire a bodyguard more than anything involving infosec.
Both statements appear all over this thread: "the private key is inextricably bound to the device" and "the private key can be backed up". This seems to be an attempt to declare both the theft problem and the lost-device problem solved. Except they are mutually exclusive: if the key is bound to the device, I can't back it up; if I can back the key up, it isn't bound to the device and can be stolen.
There is one way to get both, which I assume Google is offering here: Have the key inaccessible to the user of the device, but allow some cloud provider access, so it can be automatically backed up to the cloud.
That's all fine and good, but now you're dependent on the cloud provider not playing games with you. And you still have to authenticate to the cloud provider in some way, e.g. to add a new device. (And have to do that in a way that someone who stole your phone couldn't just do the same)
> The other thing to consider is what your threat model is. If you're worried about someone stealing your phone and building a realistic model of your fingerprint or scan of your facial structure, you have to ask what the alternatives are. For example, it'd be a LOT easier for an attacker to use a hidden camera or drone to record you entering your password
You can also view this from a different angle: Phones get lost all the time, sometimes stolen. It's statistically not that unlikely that your phone will get lost at some point and the people who find it may not always have the best intentions.
If a phone is in the wrong hands, it's much easier to gain access to the phone using fingerprints than it is using passwords: There is a good chance some usable fingerprints from you are still on the device cover itself. Sure, an attacker would need some resources to take the fingerprints and build something that can be used with the phone's sensor, but if enough people use fingerprints for auth, attackers will streamline this step quickly.
That's bad enough if it grants "only" access to the phone itself, but now it would also grant the attacker access to each and every online account.
Sure, you could race to deactivate keys, if you remembered to set up alternative login methods beforehand, because otherwise it's you who cannot log in anymore.
I'm struggling to see your complaint being a valid one. This is basically webauthn, so use a Yubikey or similar device if you wish.
> provides organizations deploying FIDO Authentication with a centralized and trusted source of information about FIDO authenticators.
The aim of this centralized system is to allow revocation of hardware that doesn't meet their unchallengeable opinion of whether you've spent enough money on your device or not. They can similarly require that devices do biometric scanning[1], and be issued by your government, and require you to agree to lengthy (and self-updating) terms of use.
There are actually (at least) two different types of device attestation that FIDO support[2]. One uses a hardcoded on-device private key, that's common between 100,000 devices of the same model, which means that an attacker can brick 99,999 other people's devices just by extracting the key from their own device. The other method requires a certificate from a "trusted third-party Attestation CA", which presumably allows a malicious (perhaps government-mandated) CA to spy on (and filter) every login request you make.
This system is like a dystopian parody of the traditional model of web security, which had no need for "authenticators that have a Trusted Platform Module (TPM) onboard", and which only required CAs to be on a list that the user agent is in control of (and users can add their own CAs to). Instead, what FIDO are building is basically DRM for human identity, with all the corruption and failure modes that entails.
[0] https://fidoalliance.org/metadata/
[1] https://fidoalliance.org/specs/biometric/requirements/
[2] https://research.kudelskisecurity.com/2020/02/12/fido2-deep-...
Passwords suck. Password managers make passwords more manageable but they still suck. Why not move on?