Encryption over Hidden Services
Nathan Freitas
nathan at freitas.net
Fri Aug 6 16:35:44 UTC 2010
Thanks for the reply, Mike. More below...
On 8/06/2010 06:07 AM, Mike Perry wrote:
>> Is transport or message layer encryption redundant between a tor client
>> and a hidden service?
> The short answer is yes, we believe it is redundant and unnecessary.
Yay!
> The long answer is "My god man, hidden services? Nobody touches that
> code. Who knows what lurks in the deep..."
Boo! Well, considering the power that hidden services bring to mobile
devices, I think it may well be worth getting someone to return to the
deep.
Here's a quick rundown of why I want to use them on Android:
- As a mobile device is always switching between carrier networks, wifi,
moving around the world, it is obviously constantly changing its IP
address. Having an onion address serve as a consistent way of addressing a
device is a very useful tool.
- Currently, to get "push" updates, a mobile device has to either poll,
keep a socket open, or receive some sort of SMS or other network native
push. With a hidden service, we can hold a server socket open on the
device, and let Tor handle the inbound traffic.
- Tor has proven insanely more resilient than traditional VPNs when it
comes to switching between 3G, 2G, wifi and no network at all. I have
been very impressed in field testing. The fact that hidden service based
solutions would seamlessly become available on top of this reliability is
another good reason to use them on mobiles.
- I/we/us mobile developers would prefer to spend our time focusing on
the specific usability and user experience issues unique to these
devices, and not on reinventing or rolling our own secure messaging
protocols. For us to be able to rely on Tor for this is a great benefit,
but obviously we'd want to make sure we weren't worshipping false gods.
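The "hold a server socket open and let Tor handle the inbound traffic" idea above can be sketched in a few lines. This is a minimal local illustration, not real app code: in production the listener would sit behind a `HiddenServicePort` line in torrc (the port and names here are assumptions), and the inbound stream would arrive from the local Tor client rather than from the demo thread below.

```python
import socket
import threading

# Bind the listener first so the demo connection cannot race it. In a real
# deployment, torrc would map the hidden service port to this local socket.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

received = []

def accept_one_push():
    # In production this stream would be delivered by Tor from the
    # rendezvous circuit; the device never needs a routable IP address.
    conn, _addr = srv.accept()
    received.append(conn.recv(4096))
    conn.close()

t = threading.Thread(target=accept_one_push)
t.start()

# Stand-in for an inbound rendezvous stream from the local Tor client.
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"new-message")
cli.close()
t.join()
srv.close()
```

The point of the sketch is the shape of the design: the app only ever listens on localhost, and address changes across 3G/wifi handoffs are Tor's problem, not the app's.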
> "usability" reasons in onion urls. These urls are the only real means
> a user has of authenticating their destination.
Right.
> In an ideal world, users are perfectly capable of memorizing every
> character of the 80bit base64 key hash in the .onion url.
What if we used QR codes or some other machine-readable format to
exchange the full key hash, so that we don't have to shorten, vanitize,
or otherwise weaken it?
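For context on what a QR code would actually carry: a (v2) onion address is just the first 80 bits of the SHA-1 of the service's public key, base32-encoded, so the full name is a fixed 16-character label that a scanner can transfer exactly. A rough sketch of that derivation, with placeholder key bytes standing in for a real DER-encoded Tor key:

```python
import base64
import hashlib

def onion_address(der_public_key: bytes) -> str:
    """First 80 bits of SHA-1(pubkey), base32-encoded, per the v2 scheme."""
    digest = hashlib.sha1(der_public_key).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"

# Placeholder bytes, not a real key; only the output format matters here.
addr = onion_address(b"\x30\x81\x89" + b"\x00" * 32)
```

Since 80 bits is exactly 16 base32 characters, the full address fits comfortably in even a small QR code, with nothing truncated for humans to memorize.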
> In the real world, it is disturbingly practical to compute .onion urls
> that have a significantly large number of characters in common with an
> arbitrary target url, in arbitrary positions of the url.
We would never do this... our focus is not on the hidden service web, so
to speak, but more on the hidden service as infrastructure for
messaging. This removes a ton of issues around friendly URL schemes.
> So, while in theory, hidden services are probably just barely beyond
> the horizon of the cryptographically insecure, in practice they are a
> usability nightmare to use for things like chat, where humans need to
> establish visual continuity for the endpoint they are communicating
> with.
> Sadly, this issue may just be the tip of the iceberg in the hidden
> service design, and there may be other edge cases that could be a huge
> problem for uses like chat. Denial of service conditions are
> particularly susceptible to gaming for chat, and the hidden service
I need to think about this more... in terms of availability of services,
this is definitely an increased concern with a mobile device that goes
up and down quite a bit. However, our perspective has been to treat that
churn as a feature rather than a bug.
As for the visual continuity and social engineering impersonation
aspects, this has been a huge problem with services like Skype, and they
use human-readable, friendly names.
The other approach, used by Drop.io, is to create temporary hash-based
URL "rooms" or "drops", that are an anonymous place to temporarily
connect, share files, talk, etc, and then move on. Less permanent, more
transient, and more difficult perhaps to impersonate, since no one is
claiming a permanent .onion. For every chat and every session, a new
hostname and port would be used for the hidden service.
We just need to figure out how to bootstrap the discovery bit..
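The Drop.io-style idea above could start from nothing more than a fresh, unguessable token per session, so there is no permanent name to squat on or impersonate. A hypothetical sketch (the function name and the 80-bit size, chosen to mirror an onion label, are my assumptions):

```python
import base64
import os

def new_room_token(nbytes: int = 10) -> str:
    """Fresh random 80-bit room token, base32-encoded like an onion label."""
    return base64.b32encode(os.urandom(nbytes)).decode("ascii").lower()

# Each session mints its own token; nothing persists between chats.
room_a = new_room_token()
room_b = new_room_token()
```

The open problem is exactly the one noted above: bootstrapping discovery, i.e. getting the token (and the session's .onion) to the other party out of band.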
> In the real world, secure distributed dictionaries for this may not
> exist, and/or may have subtle vulnerabilities of their own. They probably
> do exist for our centralized directory model, but we haven't really
> devoted enough thought into the hidden service design to really bother
> with them. Perhaps you could build one as part of your chat protocol.
Would love to, with some help. We have initially been thinking about a
centralized directory server, a basic DNS type service, mapping
nicknames to .onions. This would be optional.
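The optional directory described above is essentially a DNS-like table from nicknames to .onion addresses. A toy sketch of the server-side data structure (class name, first-come-first-served policy, and validation rule are all hypothetical choices for illustration, not a design decision):

```python
import re

# v2 onion label: 16 base32 characters followed by ".onion".
_ONION_RE = re.compile(r"^[a-z2-7]{16}\.onion$")

class NicknameDirectory:
    """Toy centralized nickname -> .onion mapping."""

    def __init__(self):
        self._names = {}

    def register(self, nickname: str, onion: str) -> None:
        if not _ONION_RE.match(onion):
            raise ValueError("not a valid .onion address")
        if nickname in self._names:
            raise ValueError("nickname already taken")  # first come, first served
        self._names[nickname] = onion

    def resolve(self, nickname: str):
        return self._names.get(nickname)  # None if unregistered

directory = NicknameDirectory()
directory.register("alice", "abcdefghij234567.onion")
```

Even a toy version makes the trust question concrete: whoever runs the directory can lie about mappings, which is why it has to stay optional and why signed entries would be worth considering.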
> All that said, hidden services do have one significant advantage over
> regular Tor usage that is probably worth mentioning: Hidden services
> overall are generally more secure than using Tor to contact an
> arbitrary Internet endpoint using an arbitrary application.
YES! That is what I am most excited about, especially paired with the
P2P nature of what we are designing. No XMPP/Jabber hosted Gmail chat
service to worry about, no exit nodes... just two Tor clients talking to
each other in the most direct manner possible.
> Have you seen torchat? https://code.google.com/p/torchat/
Yup we are checking it out. It is an interesting start, and we are
trying to decide if we want to be interoperable with it or not.
Currently, we have a small embedded HTTP server that runs in the app, and
we were planning on using more of a connectionless, text-message-style
model of interaction. We would also be able to support
file transfer.
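The connectionless model amounts to: each message is one short HTTP POST to the embedded server, with no session held open between messages. A minimal sketch using the standard library (the `/message` path and handler shape are illustrative assumptions, not our actual app code; the client side here stands in for a peer connecting over Tor):

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

inbox = []

class MessageHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # One request == one message; nothing persists between POSTs.
        length = int(self.headers.get("Content-Length", 0))
        inbox.append(self.rfile.read(length))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), MessageHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# A peer (over Tor, in the real design) delivers one message and disconnects.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("POST", "/message", body=b"hello over the hidden service")
resp = conn.getresponse()
server.shutdown()
```

Because no connection state is assumed between messages, the model degrades gracefully when a mobile hidden service drops off the network mid-conversation.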
> Probably your primary threat at this point is user error, more so than
> interception or the direct subversion of the hidden service protocol,
> which most likely is end-to-end secure...
Right, and that is where we want to focus our energy, while also
encouraging Tor core to continue work on improving hidden services for
applications just like these.
> If your secondary crypto system could somehow improve upon the utterly
> broken usability model of hidden services, then it would be a good
> idea to add it. Most likely though, a secondary system would just add
> confusion, and therefore insecurity. Ultimately it depends on the
> userbase and the UI design.
We have already implemented the OTR chat protocol using the OTR4Java
library, and it seems to work well enough. My experience has been that
users can be trained to press the "encrypt chat" button, but they very
rarely bother to verify their key fingerprints via a separate channel.
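Since fingerprint verification is the weak link, the UI helper matters as much as the crypto. A small sketch of the kind of helper involved, assuming the usual OTR presentation of a SHA-1 fingerprint as five groups of eight hex digits (the sample key bytes are made up):

```python
import hashlib
import hmac

def format_fingerprint(pubkey: bytes) -> str:
    """SHA-1 fingerprint as 5 groups of 8 hex digits, OTR-style."""
    digest = hashlib.sha1(pubkey).hexdigest().upper()
    return " ".join(digest[i:i + 8] for i in range(0, 40, 8))

def fingerprints_match(shown: str, read_back: str) -> bool:
    """Compare two fingerprints, ignoring spacing and case."""
    norm = lambda s: s.replace(" ", "").upper()
    return hmac.compare_digest(norm(shown), norm(read_back))

fp = format_fingerprint(b"sample key material")
```

Grouping the digits is exactly the sort of small affordance that makes reading a fingerprint aloud over a separate channel tolerable, which is where users give up today.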
Thanks again for the reply. A lot to think about here. I will share a
larger design document for comment next week, and perhaps a prototype.
+n