Does external cert validation for onion domains even make sense? I thought the "domain name" was already the hash of some public key that is used in the normal encryption of the onion router - so there is already a mandatory cryptographic proof that the service you're talking with "owns" the domain. What additional security benefit would CA-signed certs bring?
>I thought the "domain name" was already the hash of some public key
With v3 it's the ed25519 key with a checksum.
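Roughly, per the v3 address format in Tor's rend-spec, the address is just base32(pubkey || checksum || version). A minimal sketch of the derivation, assuming you already have the raw 32-byte ed25519 public key:

    import base64
    import hashlib

    def onion_v3_address(pubkey: bytes) -> str:
        """Derive a v3 .onion address from a raw 32-byte ed25519 public key."""
        assert len(pubkey) == 32
        version = b"\x03"
        checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
        return base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"

    # All-zero key, just to show the shape: 56 base32 characters + ".onion".
    print(onion_v3_address(bytes(32)))

So there's no hash of the key in the name at all; the key itself is right there in the address, which is why owning the name and owning the key are the same thing.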
For something like a Cwtch address or your personal dissident blog criticizing Emutopia, it's enough your contacts get the address from you personally or that they find it some other way and pin the site to bookmarks for TOFU.
But with public services like the DuckDuckGo onion service, it's trivial for someone to spin up their own unique per-target MITM proxy instance and then share that link with friends, bookmark it in their SO's Tor Browser to MITM their connections, poison link repositories, swap out official links on Wikipedia pages, etc.
Having a CA validate that you own the clearweb site first helps mitigate this stuff to some extent. The problem, of course, is whether the user knows they should be expecting a cert for a page they're visiting for the first time.
(I wonder if Tor browser could have a list of pinned onion addresses with "clearweb_equivalent_of" field for this, and you could easily check that from the site security badge.)
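Just to make that idea concrete, the pin list could be as simple as a small mapping shipped with the browser. To be clear, this is entirely hypothetical: neither the data structure nor a "clearweb_equivalent_of" field exists in Tor Browser today, and the onion address below is a placeholder:

    # Hypothetical pinned-onion list for the idea above; the address is a
    # made-up placeholder, not a real service.
    PINNED_ONIONS = {
        "exampleonionaddressxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxad.onion": {
            "clearweb_equivalent_of": "duckduckgo.com",
        },
    }

    def clearweb_equivalent(onion_host):
        """Return the pinned clearweb domain for an onion host, or None."""
        entry = PINNED_ONIONS.get(onion_host)
        return entry["clearweb_equivalent_of"] if entry else None

The security badge could then show "pinned: duckduckgo.com" instead of leaving the user to eyeball 56 characters of base32.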
This seems like a general problem of using search on onion. I don't really understand how this is supposed to work at all, honestly.
Either you already know the domain you want to visit or you don't.
If you do, you don't need search.
If you don't, how could you be sure that any search results are for the real site and not an MITM proxy?
A bit circular. How do you know the domain? Trust.
I like this “onion pinning” possibility:
There is value in exposing the existence of an onion site via CT Logs. If someone navigates to the plain web version of a site and is presented with a certificate containing a Subject Alternative Name (SAN) for both the plain web and the onion site, that provides a strong cryptographic guarantee that they are the same site. Effectively this would replace the Onion-Location header with something more authenticated.
Item 9 on their list of reasons is going to be applicable for the set of Onion services that overlap things that should exist conventionally. It says you can use this proof-of-control for an .onion name, then use conventional methods for any other name, then bind both names to the same TLS public key and thus to each other.
So that means e.g. www.facebookwkhpilnemxj7asaniu7vnjjbiltxjqhye3mhbshg7kx5tfyd.onion can have a certificate which also names www.facebook.com, giving you confidence that it's not just some crook who wanted to impersonate Facebook and also generated a key to match the name prefix.
The site where you can hire assassins for Bitcoin presumably does not have a legit web presence, but the site where you can order abortion pills in unmarked packaging sure does even if you must use Tor to access it from where you live.
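As a quick way to poke at this, you can pull a clearweb site's certificate and check whether any .onion name shows up in its SAN list; a small sketch using only the Python standard library (needs network access, and whether the live certificate actually carries an .onion SAN is up to the site):

    import socket
    import ssl

    # Fetch the clearweb certificate and list its DNS SANs; a dual
    # clearweb/.onion certificate like the one described above would show
    # an .onion name here alongside the regular domains.
    host = "www.facebook.com"
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()

    sans = [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]
    print([name for name in sans if name.endswith(".onion")] or "no .onion SANs")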
Well said. That’s kind of what this is about. Redundancy of access.
They write the following reason in the article:
> But as the web and other internet technologies mature, certificates are starting to be a requirement in order to unleash functionalities, especially in web browsers, such as the faster connection protocol HTTP/2 and payment processing.
This seems really sad. But I guess it depends what the goal is. If you want to integrate onion purely on a DNS resolver and network interface level and then use a stock browser for accessing the services, yes, you'd need that.
(Then you'll also have to fight the stock browser to make it use your special DNS resolver, not leak info to Google, Cloudflare or whoever else, etc. etc., though.)
But don't most people use custom browsers with built-in support for onion anyway? If that's the case, the easiest solution would seem to just declare .onion a "secure origin" like localhost and patch the browser accordingly.
> But don't most people use custom browsers with built-in support for onion anyway? If that's the case, the easiest solution would seem to just declare .onion a "secure origin" like localhost and patch the browser accordingly.
Indeed, the use of the .onion TLD has been standardised in RFC 7686 [1], so browsers should really treat it as secure and stop the usual plaintext HTTP shenanigans.
[1]: https://datatracker.ietf.org/doc/html/rfc7686
The article has a long list of reasons for certificates. Here's another reason:
>4. It also opens up new opportunities such as payment processing, "as current PCI DSS requirements do not allow non-standard TLS" [2] and may only work with certificates having some sort of validation [3]. Payment card networks require HTTPS for a payment to be taken. So if someone wants to do that over an onion site they would need a TLS certificate.
> Does external cert validation for onion domains even make sense? […] What additional security benefit would CA-signed certs bring?
Yes, and the page/documents explain some use cases:
> The two ACME-defined methods allowed by CA/BF described in Sections 3.1.2 and 3.1.3 (http-01 and tls-alpn-01) do not allow issuance of wildcard certificates. A ".onion" Special-Use Domain Name can have subdomains (just like any other domain in the DNS), and a site operator may find it useful to have one certificate for all virtual hosts on their site. This new validation method incorporates the specially signed Certificate Signing Request (CSR) (as defined by Appendix B.2.b of [cabf-br]) into ACME to allow for the issuance of wildcard certificates.
* https://datatracker.ietf.org/doc/html/rfc9799#name-new-onion...
> Some Hidden Services do not wish to be accessible to the entire Tor network, and so they encrypt their Hidden Service Descriptor with the keys of clients authorized to connect. Without a way for the CA to signal what key it will use to connect, these services will not be able to obtain a certificate using http-01 or tls-alpn-01, nor enforce CAA with any validation method.
> To this end, an additional field in the challenge object is defined to allow the ACME server to advertise the Ed25519 public key it will use (as per the "Authentication during the introduction phase" section of [tor-spec]) to authenticate itself when retrieving the Hidden Service Descriptor.
* https://datatracker.ietf.org/doc/html/rfc9799#name-new-onion...
A certificate is just a fancy way of saying one key pair signed a message containing another public key. It is a link in a cryptographic chain of trust. Given that you already trust a public key, you should also trust some other public key with certain caveats because the first public key signed a message containing the second key and whatever else. A list of allowed domain names is a possible caveat.
An onion address is more like an IP address, except it is stable for hosts across time, and it contains enough information to cryptographically prove identity. It may be true that the browser interprets it as a domain name, but it is really operating at the network level where Tor is the network.
A certificate for a Tor address means you can go from a DNS name to a Tor address, because the certificate contains a list of allowed domains and the Tor address contains a public key, same as usual.
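To make the chain-of-trust framing concrete, here's a toy sketch of that "signed message containing another public key plus caveats" idea. It's not X.509, it assumes the pyca/cryptography package is available, and the names in the allowed list are made up:

    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    ca_key = Ed25519PrivateKey.generate()    # key you already trust
    site_key = Ed25519PrivateKey.generate()  # key being vouched for

    # The "certificate": the trusted key signs a blob binding the second
    # public key to a list of allowed names (the caveat).
    claims = json.dumps({
        "public_key": site_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw).hex(),
        "allowed_names": ["example.com", "exampleonionplaceholderxxxxxxxxxxxxxxxxxxxxxxxxxxxxxd.onion"],
    }).encode()
    signature = ca_key.sign(claims)

    # Verification: anyone who trusts ca_key's public half can now extend
    # that trust to site_key, limited to the listed names.
    try:
        ca_key.public_key().verify(signature, claims)
        print("trusted for:", json.loads(claims)["allowed_names"])
    except InvalidSignature:
        print("untrusted")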
"Doesn't make sense for us but mandated by policy" is a super common phenomenon that you'll sadly encounter all the time in the industry. Especially when it comes to security. In this case it's at least motivated by something as peripheral as onion services wanting to fit in with the browser ecosystem, which, fair, maybe it doesn't make sense for browsers to bloat their designs by taking onion services into account, and then onion services have to adapt to modern browser standards.
>"Doesn't make sense for us but mandated by policy"
It's way worse in the physical world than in the software world IMO.
The section Benefits (after Introduction) lists 9 reasons why it makes sense. Some of them are about working around a mismatch with existing standards, but not all.
"Automated Certificate Management Environment (ACME) Extensions for ".onion" Special-Use Domain Names", June 2025:
* https://datatracker.ietf.org/doc/html/rfc9799
This is another example of why requiring TLS everywhere doesn't make sense. Onion traffic is already encrypted, but because software demands TLS everywhere, we add TLS on top, even when unnecessary.
The same happens with 1:1 tunnels, or even localhost. None of these need TLS, and I should be able to tell my browser "enable all features on this site, consider it fully secure".
TLS everywhere as a sane default isn't designed to coddle power users and security pros, but to protect your everyday netizen. You suffer because of your competency.
For your arcane Wizardry you might want: --unsafely-treat-insecure-origin-as-secure
https://peter.sh/experiments/chromium-command-line-switches/...
Like using the inherently unsafe language, this only makes sense when you are the wizard, and that's not always the case, so perhaps better not.
I think a lot of wizards have bad days. On Thursday morning, with a fresh cup of coffee and a gleam in their eye they can write a thousand lines of tricky x86-64 assembler and every single instruction is perfect like God wrote it.
But on Friday evening, after getting only one hour's sleep because Theresa is teething and won't settle, and a screaming match with the CFO who says we have to re-use the old secretary's Dell for the new hire because "money is tight", the wizard just typo'd their own email address twice when filling out a form. On Monday when somebody else looks at it, it will be apparent that neither of the SSE instructions the wizard just wrote actually exists or has ever existed, which reminds us that the wizard might also have forgotten to check their new code even builds...
Aye, even wizards need protection. And we are back to the start.
Can't this be used to ensure you're communicating with who you think you are? Either in a TOFU (trust on first use) approach like SSH fingerprints are in practice, or with external verification like SSH fingerprints can be in theory.
The .onion name can't exist without someone having the private key for it; that's kind of the point.
There is already a private key needed to prove that whoever you're talking to is the right party: otherwise the request can't even be routed to the service. That's pretty fundamental to how Tor hidden services work, actually.