[tor-dev] Memorable onion addresses (was Discussion on the crypto migration plan of the identity keys of Hidden Services)
George Kadianakis
desnacked at riseup.net
Sun May 19 11:37:22 UTC 2013
> adrelanos:
>> George Kadianakis:
>> > If we move to the higher security of (e.g.) 128-bits, the base32 string
>> > suddenly becomes 26 characters. Is that still conveniently sized to pass
>> > around, or should we admit that we failed this goal and we are free to
>> > crank up the security to 256-bits (output size of SHA-256), which is a
>> > 52-character string?
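
(For the curious: those character counts are just base32 packing 5 bits per
character, plus the fact that the current addresses are the first 80 bits of
a SHA-1 digest of the service's public identity key. A quick Python sketch,
purely illustrative; the dummy key bytes obviously aren't a real key:)

    import base64
    import hashlib
    import math

    # base32 packs 5 bits per character, so an n-bit truncated hash needs
    # ceil(n / 5) characters (no padding in .onion addresses).
    for bits in (80, 128, 256):
        print(bits, "bits ->", math.ceil(bits / 5), "chars")   # 16, 26, 52

    # The current 80-bit address: first 10 bytes of SHA-1 over the
    # DER-encoded public identity key (dummy bytes used here).
    digest = hashlib.sha1(b"dummy DER-encoded identity key").digest()
    print(base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion")
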
>>
>> If in doubt: if possible, maintainable, not too much work, you name it...
>> If the less secure version becomes the default, please let hidden
>> service hosts decide, via an option, whether they want to use the more
>> secure version.
>>
>> I don't know if the petname system is a completely orthogonal issue or
>> if it could be considered when you decide this one.
>>
>> >> Or have an option for maximum key length and a weaker default if common
>> >> CPUs are still too slow? I mean, if you want to make 2048-bit keys the
>> >> default because you feel most hidden services have CPUs which are too
>> >> slow for 4096-bit keys, then use 2048 bit as the default with an option
>> >> to use the max. of 4096 bit.
>> >>
>> >> Bonus point: Can you make the new implementation support less painful
>> >> updates (anyone or everyone) when the next update is required?
>> >> (forward compatibility)
>> >
>> > I was also trying to think of a solution to this problem, but I failed.
>
> I think you were heading in the right direction with the petname idea.
> What if we deployed a potentially shitty naming layer that "probably"
> won't break within the next 6-12 months, but *might* last quite a bit
> longer than that, for backward compatibility purposes?
>
> This naming layer could allow interested parties to sign registration
> statements using their current onion key with an expiration time,
> satisfying our deprecation desires for the 80-bit name. If the naming
> layer actually survives without visible compromise until that point, we
> could allow it to store signed statements about translations between the
> new keys and their desired name (first-come, first-serve; names are
> reserved for N months until re-signed).
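
(Just to make the first-come-first-serve / expiry bookkeeping concrete, here
is the sort of state I imagine the naming layer keeping. Everything here is
made up for illustration, including the value of N; it is not a proposal for
an actual interface:)

    import time

    RESERVATION_SECONDS = 6 * 30 * 24 * 3600  # "N months", N = 6 picked arbitrarily
    registry = {}  # name -> (registrant's onion key, reservation expiry)

    def register(name, onion_key, now=None):
        """First-come, first-serve: a name is granted if it is free or its
        reservation has lapsed; re-signing by the same key renews it."""
        now = now or time.time()
        if name in registry:
            holder, expiry = registry[name]
            if now < expiry and holder != onion_key:
                return False  # still reserved by someone else
        registry[name] = (onion_key, now + RESERVATION_SECONDS)
        return True
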
>
> A more specific version of this question is: How readily could we hack
> Namecoin or some other similar consensus-based naming system[1] into
> Tor?
>
> Such a mechanism would obviously provide enumerability for hidden
> services that choose to use it, but hopefully it would be optional: you
> can still use IP addresses in browsers, after all.
>
>
> In terms of verification, it would be trivial to alter the browser UI to
> display the actual key behind the hidden service (i.e. through a control
> port lookup command and some kind of URL icon that varies depending on
> consensus naming status).
>
> We could also provide a hacked version of CertPatrol that monitors the
> underlying public keys for you, and it would also be relatively easy to
> add a "second-look" authentication layer through the HTTPS-Everywhere
> SSL Observatory, similar to what exists now for SSL public keys.
>
> In fact, if we can agree on a solid consensus-based naming scheme as a
> valid transition step, I think it is worth my time to let the rest of
> the browser burn while I implement some kind of backup authentication +
> UI for this. After all, memorable hidden service naming would be a
> usability improvement.
>
>
> Should we try it?
>
> The major downside I am seeing is PR fallout from the hidden services
> that choose to use it. They might be an unrepresentative subset of what
> people actually need hidden services for. I think the real win for
> hidden services is that we can turn them into arbitrary private
> communication endpoints, to allow people to communicate in ways that do
> not reveal their message contents *or* their social network. There are
> probably other uses whose promise would be lost in the noise generated
> by this scheme as well...
>
>
>
> 1. https://en.wikipedia.org/wiki/Namecoin
>
> We don't have to choose Namecoin, though. Another alternative is for the
> dirauths to add a URI for an "official" naming directory file as a
> parameter in the consensus, and also provide its SHA256/SHA-3 digest. A
> flatfile might be less efficient than Namecoin in terms of storage and
> bandwidth requirements, though. It's probably also easier to censor
> (unless it is something like a magnet link).
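
(Client-side, checking such a flatfile against the consensus would be nearly
trivial; a sketch, where the URL and the digest parameter are of course
invented placeholders:)

    import hashlib
    import urllib.request

    NAMING_DIR_URL = "https://example.org/naming-dir.txt"   # hypothetical
    NAMING_DIR_SHA256 = "0" * 64  # hex digest published by the dirauths

    def fetch_naming_directory():
        data = urllib.request.urlopen(NAMING_DIR_URL).read()
        if hashlib.sha256(data).hexdigest() != NAMING_DIR_SHA256:
            raise ValueError("naming directory does not match consensus digest")
        return data
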
>
> For all you Zooko's Triangle[2] fans: the Namecoin mechanism attempts to
> "square" the triangle with a first-come, first-serve distributed
> consensus on the pet-names document, but still falls back to
> "Secure+Global" at the expense of "Memorable". The interesting bit is
> that in this case, the browser UI can help you on the "Memorable" end,
> should the consensus mechanism fail behind your back.
>
> 2. https://en.wikipedia.org/wiki/Zooko%27s_triangle
>
>
(I forked the thread, since this is hopefully orthogonal to HS identity
key migration.)

Chuff chuff! Train of thought coming up, since this is a problem I've also
been thinking about lately...

Mike, I like the simplicity and implementability of your idea. Giving
signed (<name> to <onion address>) mappings to the directory authorities
(in a FIFO fashion) and then publishing them as a directory document is
effective and easy-ish both to implement and to understand.
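
To make that concrete, the statement each service signs could be as small as
a (name, onion, expiry) triple. The sketch below is only meant to pin down
the shape of the thing; the field names are invented, and I'm using Ed25519
via PyNaCl just to keep it short, not because that's what the real onion
keys would be:

    import json
    import time
    from nacl.signing import SigningKey

    def make_registration(signing_key, name, onion_address, months=6):
        """Build a signed <name> -> <onion address> statement with an
        expiration time. Field names are invented for illustration."""
        statement = json.dumps({
            "name": name,
            "onion": onion_address,
            "expires": int(time.time()) + months * 30 * 24 * 3600,
        }, sort_keys=True).encode("utf-8")
        return signing_key.sign(statement)  # signature + message in one blob

    def verify_registration(verify_key, signed_statement):
        """Dirauth-side check: the signature must verify under the key that
        owns the onion address, and the statement must not be expired."""
        statement = json.loads(verify_key.verify(signed_statement))
        if statement["expires"] < time.time():
            raise ValueError("registration expired")
        return statement["name"], statement["onion"]

    # e.g.:
    #   sk = SigningKey.generate()
    #   signed = make_registration(sk, "torproject", "idnxcnkne4qt76tg.onion")
    #   verify_registration(sk.verify_key, signed)
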
That said, I wonder what's actually going to happen if we implement and
deploy this. I imagine that scammers will try to win the race condition
against the legitimate hidden services, and they will flood the directory
with false mappings. For example, scammers might claim all memorable names
for the Tor website hidden service, like "torproject", "torproject1", etc.
(and I doubt that anyone can win a race against a well-equipped
scammer...). In the end, many legit hidden services might need to register
names like "t0rproj3ct1" and "123duckduckg0", which will be lost in the
noise of that directory document. Then people might think that searching
for "torproject" in the TBB petname tool ought to return the official
torproject website, but instead the first results will be scammer
websites.

Of course, the current situation, where people get their onions from
pastebins and the "Hidden Wiki" (lulz), is not any better. Although I hope
that, when all URLs look random, people don't consider one URL to be more
official than another (whereas in the above idea "torproject" might look a
bit more official than "t0rpoj3ct"). Still, even in the current situation,
a shallot-generated "torpr0jectkakqn.onion" might look more official than
"idnxcnkne4qt76tg.onion" (which is the actual onion address of the
torproject website)... I really don't know what the best way to proceed
here is; it's tradeoffs all the way down.
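
(For anyone who hasn't looked at shallot: it is nothing fancier than a
brute-force loop over candidate keys, and the work only grows by a factor of
~32 per extra prefix character, so short "official-looking" prefixes are
cheap. A toy sketch of the search, with random bytes standing in for real
RSA key generation:)

    import base64
    import hashlib
    import os

    def onion_address(pubkey_der):
        """An address is the base32 of the first 80 bits of SHA-1 over the
        DER-encoded public identity key."""
        return base64.b32encode(hashlib.sha1(pubkey_der).digest()[:10]).decode().lower()

    def find_vanity_prefix(prefix):
        """Grind candidate keys until the address starts with `prefix`
        (expected cost ~32**len(prefix) tries). Shallot uses real RSA keys
        and exponent tweaking; os.urandom is just a stand-in here."""
        while True:
            candidate = os.urandom(140)  # stand-in for a DER-encoded key
            address = onion_address(candidate)
            if address.startswith(prefix):
                return candidate, address

    # e.g. find_vanity_prefix("torpro")
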
If I could automagically generate secure technologies on a whim, I would
say that some kind of decentralized, reputation-based, fair search engine
for hidden services might squarify our Zooko's triangle a bit:
"decentralized" so that no single entity has complete control over the
results, "reputation-based" so that legit hidden services rise to the top,
and "fair" so that no scammers with lots of boxes can game the system.
Unfortunately, "fair" and "reputation-based" usually contradict each
other.

In any case, Mike, your idea is definitely worth considering, but before
designing and implementing it we should think about how to mitigate the
easiest attacks against it.

Also, thankfully, the idea in its basic form is orthogonal to the identity
key migration project.