Obfuscated URLs?
mogulguy at yahoo.com
Wed Jul 1 20:27:03 UTC 2009
--- On Tue, 6/30/09, Freemor <freemor at gmail.com> wrote:
> Martin Fick <mogulguy at yahoo.com> wrote:
>
> > In my scenario, the point of hard coding the path is
> > to obfuscate the final URL, ...
> ... But hidden services provide this functionality
> already.
They obfuscate a service that you must run, not a URL;
that is not the same functionality. It's like the
difference between an email address and an email
server: related, but very different.
> I do understand the potential difficulties in setting up a
> hidden service. But I think it would be easier to automate
> this aspect of Tor than to write a new protocol. (some
> more thoughts on this below)
Well, I do not think that you are really giving
hidden services proper consideration. While setup
is certainly part of the problem, the real kicker,
of course, just as with running your own web server,
is actually running it!
Maintenance and availability are the real issues
(hosting). Even many people with the skills to set up
their own servers (email, http...) still pay
someone to host them (even though they might
personally admin them remotely). So the biggest
barrier is hosting, not setup. Hosting requires the
publisher to be permanently active, to have a PC
always connected to the internet, and to be
personally linked to the content. That last part is
not just an issue of anonymity: suppose my identity
is already known; how could I ever host content that
someone wants to censor without the risk of being
raided (and the content thus censored), given that
my identity is already known?
The alternative, hosting remotely but anonymously, is
hard. As soon as you add payment to the deal, you add
a whole new level of risk of blowing your or your
host's anonymity, again making censorship more
likely.
Risks aside, if you look at non-tor usage, how
many individuals are willing to run their own email
server or web server? Very few. Why would adding an
extra layer (tor) to the mix, even if setup became
much easier, suddenly make people more likely to host
their own server? Of course, it wouldn't. This
means you are starting with too small a pool
of potential personal hidden service hosts in the
first place.
However, how many people have email accounts
(virtually everybody)? How many are subscribed to
some forum, internet group, or blogging site
with some mechanism to publish to free accounts?
Not quite as many as have email accounts, but
quite a large number nevertheless. These are the
people this targets as a potential publishing base,
and their number is much higher than the number who
run personal web servers. This is an entirely
different "demographic" from those running hidden
services.
Hidden services are a genius hack: they provide
interactivity, something all the other censorship-free
systems do not. But interactivity is not always
needed. Why force a mechanism designed for dynamic
content (and thus more difficult and riskier to
implement) onto static content when there is an
easier (*1) and potentially more robust solution
that tor could implement for static content?
*1 If indeed it is easy; of course, the devil is
in the details. :(
> > As for use cases, I envision that as a simple whistle
> > blower or reporter, I would post my content on various
>
> [snip]
>
> OK I now have a clearer idea of what you are wanting to
> do:
>
> 1). Simple anonymous publishing
> 2). Remove the single point of failure that a hidden
> service may represent
> 3). Plausable deniability by not having the information
> hosting tied to you.
Those are some of the things that I am suggesting, yes.
But, I am not limiting it to those. If you limit your
thinking to that, then, of course, systems that only
implement that will look appealing. :)
> 1). Someone sets up a hidden service that automatically
> re-directs to the content hosted on non-Hidden sites ...
I think you are effectively talking about a hidden
remapping (not redirecting) proxy. Yes, this would
work, but it is not a "good" solution. Again, it
suffers from the same hosting requirement and does
not reduce risk much. Additionally, it would
require each implementer to come up with their own
implementation, making bugs and disclosure that much
more likely (not peer reviewed...).
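To make the distinction concrete, such a remapping proxy
might look like the sketch below. This is only an
illustration of the idea under discussion, not anyone's
actual implementation; the path "/doc1" and the
example.org URL are made up, and in practice the server
would sit behind a Tor hidden service.

```python
# Hypothetical sketch of a "hidden remapping proxy": a tiny HTTP handler
# that maps obfuscated paths to real external URLs and relays the fetched
# bytes, so clients only ever see the obfuscated path.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Obfuscated path -> real location (a made-up mapping for illustration)
REMAP = {
    "/doc1": "https://example.org/real/location/file1.pgp",
}

class RemapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REMAP.get(self.path)
        if target is None:
            self.send_error(404)
            return
        # Fetch from the real (non-hidden) host and relay the content.
        data = urlopen(target).read()
        self.send_response(200)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

# To run (blocks forever, listening locally):
# HTTPServer(("127.0.0.1", 8080), RemapHandler).serve_forever()
```

Note the hosting objection still applies: whoever runs this
proxy is exactly as exposed as any other hidden service
operator, which is the point being made above.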
> 2.) GnuNet may be much better suited to what you are
> looking to do. It already has a lot of these features
> (see http://gnunet.org ) Once you inserted the
> information into GnuNet you could share the hash for
> it in as many open sites as you wanted.
Thanks for the link, I will investigate more. But,
again, this looks like a distributed storage
mechanism, not a URL obfuscator. It is useful and
likely better at doing some of the things that
Obfuscated URLs could do, but not all of them.
What I am proposing does not push the content into
the tor system as most distributed storage systems
do. It is an access method, not a storage method.
And, of course, I am proposing using tor for my
purposes, I do not want to add another system to
my arsenal. As soon as you do, it becomes very
complicated to work with 2, 3... systems
simultaneously.
As a resource contributor, do I contribute to
tor, to freenet, to gnunet, to offsystem...?
Each one of these systems that I contribute to
takes bandwidth from the other. This is less
than ideal. I believe in many of the objectives
of many of these systems and dream of a unified
way to achieve and contribute to them all.
Fragmentation has its advantages, particularly
during the research phases. tor is more mature
than most of these technologies, but eventually
many of the other systems will implement more and
more common functionality. If they can do more
than tor, even if they are not quite as good
at tor's specialty, they might eventually
obsolete tor, and everyone will suffer since we
will be using subpar implementations. Tor
needs to continue getting better at what it is
good at, but if it can also expand its
capabilities, it will likely help expand its
user, contributor, and developer bases as well.
> As for making the content password protected GnuPG
> would work wonders for this (prior to insertion of
> course)
Yes, in fact, I would suggest that documents be stored
as .pgp files at the real location. But to be clear, the
intent of the encryption I was talking about was
to make the obfuscation better, not to control access.
In other words, it prevents someone from guessing the
location behind the obfuscated URL and checking whether
their guess is correct by comparing the data retrieved
via the obfuscated URL with the data retrieved directly
from their guessed location. Obviously they can still
compare file sizes...
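The guess-confirmation point can be shown with a toy
randomized cipher. This is an assumption-laden sketch,
not the proposal's actual scheme (which would presumably
use PGP): any encryption with a fresh random nonce makes
two encryptions of the same file differ byte-for-byte, so
a guessed location cannot be confirmed by comparing
content, though, as noted, the size still leaks.

```python
# Toy randomized stream cipher (SHA-256 keystream + random nonce) used
# only to illustrate why encrypted storage defeats content comparison.
# Do not use this construction for real cryptography.
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key || nonce || counter blocks."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)  # fresh nonce -> different ciphertext each time
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

key = os.urandom(32)
doc = b"some static content to publish"
c1, c2 = encrypt(key, doc), encrypt(key, doc)
# c1 != c2: byte-for-byte comparison cannot confirm a guessed location.
# But len(c1) - 16 == len(doc): the file size still leaks.
```

The design point is the random nonce: without it, identical
plaintexts would encrypt identically and the comparison
attack would work again.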
One additional reason that storing documents as .pgp
files would be nice is that the documents then do not
appear to be tor-specific documents; they do not
shout: "here is an obfuscated URL document!" :)
I hope that some of this makes more sense, cheers,
-Martin