[tor-dev] Improving Private Browsing Mode/Tor Browser
Mike Perry
mikeperry at fscked.org
Thu Jun 23 01:34:42 UTC 2011
Thus spake Georg Koppen (g.koppen at jondos.de):
> Thus, first I am not sure about the relationship between the improved
> private browsing and the anon mode. It seems like the former is kind of
> a precondition of the latter, and the latter adds some special anon
> features (or just layout stuff??): "We would love to be able to ship a
> vastly simplified browser extension that contains only a compiled Tor
> binary and some minimal addon code that simply "upgrades" the user's
> private browsing mode into a fully functional anonymous mode." What
> shall the upgrade do exactly?
What the upgrade does depends on how good the private browsing mode
is. Historically, browser makers have been very conservative, and are
reluctant to implement new features if there is any possibility of
site breakage.
Additionally, we expect that fingerprinting resistance will be an
ongoing battle: as new browser features are added, new fingerprinting
defenses will be needed. Furthermore, we'll likely be inclined to
deploy unproven but better-than-nothing fingerprinting defenses (so
long as they don't break much), whereas the browser vendors may be
more conservative on this front, too.
> And why have add-ons again that can probably be toggled on/off and
> are thus more error-prone than just having a, say, Tor anon mode?
> Or is this already included in the Tor anon mode but only separated
> in the blog post for explanatory purposes?
If we operate by upgrading private browsing mode, we'll effectively
have the "toggle" in a place where users have already been trained by
the UI to go for privacy. Torbutton would become an addon that is only
active in private browsing mode. We expect that the browser vendors
will perform usability studies to determine the best way to provide
users with the UI to enter private browsing mode easily.
We also expect that if browser vendors become serious enough about
privacy, they will be the ones who deal with all the linkability
issues between the private and non-private states, not us.
> Sticking to the blog post, one of its central ideas seems to be to
> isolate the identifiers and state to the top-level domain in the URL bar
> as "activity in Tor Browser on one site should not trivially
> de-anonymize their activity [i.e. the activity of Tor users, G.K.] on
> another site to ad networks and exits". I am wondering whether this idea
> really helps here, at least regarding exit mixes. If one user requests
> google.com, mail.google.com and other Google services within the
> 10-minute interval (I am simplifying here a bit) without deploying TLS,
> the exit is still able to connect the whole activity and "sees" which
> services that particular user is requesting/using. Even worse, if the
> browser session is quite long, there is a chance of recognizing that
> user again if she happens to get the same exit mix more than once.
> Thus, I do not see how that helps avoid linkability for users who
> need/want strong anonymity while surfing the web. It would be good to
> get that explained in some detail. Or maybe I am missing a point here.
We also hope to provide a "New Identity" functionality to address the
persistent state issue, but perhaps this also should be an explicit
responsibility of the mode rather than the addon.
I hear that Google has actually done some studies of Incognito mode,
and users do expect that they have to close the Incognito mode windows
to clear the Incognito cookies and state from memory. They may only
expect this because it's clear that they're not entirely exiting the
browser via this action, though...
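To make the isolation idea from your quoted paragraph a bit more concrete,
here is a minimal sketch of keying per-site state to the URL bar domain.
This is only my illustration, not code from the blog post, and the
firstParty() helper is a deliberately naive stand-in for a proper public
suffix lookup:

    // Hypothetical sketch (TypeScript): isolate identifiers by the URL bar
    // ("first party") domain. A tracker embedded on site A gets a different
    // cookie jar than the same tracker embedded on site B.

    type CookieJar = Map<string, string>;        // cookie name -> value

    const jarsByFirstParty = new Map<string, CookieJar>();

    // Simplified first-party extraction; real code would use the public
    // suffix list (eTLD+1) rather than the last two host labels.
    function firstParty(urlBarUrl: string): string {
      return new URL(urlBarUrl).hostname.split(".").slice(-2).join(".");
    }

    function jarFor(urlBarUrl: string): CookieJar {
      const key = firstParty(urlBarUrl);
      let jar = jarsByFirstParty.get(key);
      if (!jar) {
        jar = new Map();
        jarsByFirstParty.set(key, jar);
      }
      return jar;
    }

    // The same ad network's identifier, set under two different URL bar
    // domains, lands in two unrelated jars:
    jarFor("https://news.example/article").set("adnet_id", "abc");
    jarFor("https://mail.google.com/").get("adnet_id");   // undefined

Note that this only partitions state across first parties. As you point
out, requests that share the same circuit within the ten-minute window are
still linkable at the exit; that is part of what New Identity is meant to
address.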
> Now something about the proposed behavior of the referer and window.name.
> It is said that they should adhere to the "same-origin policy where
> sites from different origins get either no referer, or a referer that is
> truncated to the top-level domain". Assuming I understood TorButton's
> Smart-Spoofing option properly: Why is it not applied to the
> referer/window.name anymore? In other words: Why is the referer (and
> window.name) not kept if the user surfs within one domain (let's say
> from example.com to foo.example.com and then to foo.bar.example.com)?
I don't really understand this question. The referer should be kept in
these cases.
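To spell out the policy I have in mind, here is a rough sketch (mine, not
normative; baseDomain() is again a naive stand-in for eTLD+1):

    // Hypothetical referer policy sketch (TypeScript): keep the referer
    // untouched within one site, truncate it to the top-level domain when
    // crossing to a different site (or return null to send none at all).

    function baseDomain(url: string): string {
      // Simplified; real code would consult the public suffix list.
      return new URL(url).hostname.split(".").slice(-2).join(".");
    }

    function refererFor(fromUrl: string, toUrl: string): string | null {
      if (baseDomain(fromUrl) === baseDomain(toUrl)) {
        // Same site, e.g. example.com -> foo.example.com -> foo.bar.example.com:
        // the referer is kept as-is.
        return fromUrl;
      }
      // Cross-site: truncate to the top-level domain.
      return "https://" + baseDomain(fromUrl) + "/";
    }

    refererFor("https://foo.example.com/page", "https://foo.bar.example.com/x");
    // -> "https://foo.example.com/page" (kept)
    refererFor("https://example.com/private?id=1", "https://ads.example.net/pixel");
    // -> "https://example.com/" (truncated)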
> Before we implemented almost the same algorithm as Torbutton's
> smart-spoof algo in our own extension a while ago, we had some
> compatibility issues (I remember yahoo.com, which needed to have the
> referer untouched while surfing within the domain) that got fixed by
> it and never popped up again. Why step back? The idea "sites should
> also be allowed to request an exemption to this rule on a per-site basis
> using an html attribute, which could trigger a chrome permissions
> request, or simply be granted automatically (on the assumption that they
> could just URL smuggle the data)" seems rather awkward and not a good
> solution to a problem that is not really one.
>
> One final point, leaving my previous section aside: Why are the referer
> and window.name not treated in the same way as cookies and others in
> the proposal? Why have two different policies for identifiers? I am not
> sure whether some attacks could emerge out of that distinction, but
> my feeling tells me that there should ideally be just one policy that
> governs all those identifiers. At least it would probably be easier to
> implement and to audit them.
Neither of these properties is really an identifier (yes yes,
window.name can store identifiers, but it is more than that). Both are
more like cross-page information channels.
Hence it doesn't make sense to "clear" them like cookies. Instead, it
makes more sense to prohibit information transmission through them in
certain cases. I believe the cases where you want to prohibit the
information transmission end up being the same for both of these
information channels.
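As an illustration of that distinction (again just a sketch of mine; the
hook and its shape are made up), the natural treatment is to cut the
channel on cross-site navigation rather than to expire anything on a
schedule, and the condition ends up identical for the referer and
window.name:

    // Hypothetical navigation hook (TypeScript): window.name and the referer
    // survive a navigation only if it stays within the same first party.
    // Nothing is "cleared" on a timer the way cookies are.

    function baseDomain(url: string): string {
      return new URL(url).hostname.split(".").slice(-2).join(".");  // naive eTLD+1
    }

    interface NavigationState {
      windowName: string;
      referer: string | null;
    }

    function onNavigate(fromUrl: string, toUrl: string,
                        state: NavigationState): NavigationState {
      const sameSite = baseDomain(fromUrl) === baseDomain(toUrl);
      return {
        windowName: sameSite ? state.windowName : "",  // cut the channel cross-site
        referer:    sameSite ? fromUrl : null,         // same condition for both
      };
    }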
To respond to your previous paragraph, it is debatable exactly how
strict a policy we want here, but my guess is that for Tor, we have
enough IP unlinkability such that the answer can be "not very", in
favor of not breaking sites that use these information channels
legitimately.
The fact is that other information channels exist for sites to
communicate information about visitors to their 3rd party content. If
you consider what you actually *can* restrict in terms of information
transmission between sites and their 3rd party elements, the answer is
"not much".
So in my mind, it becomes a question of "What would you actually be
preventing by *completely disabling* referers (and window.name)?"
It seems to me that the answer to this question is "You only prevent
accidental leakage", because bad actors can use URL params as an
information channel to their 3rd party elements just fine, and
tracking and ad-targeting will continue. In a world without referers,
sites would actually be incentivized to do this information passing,
because ad networks will be able to serve better ads and pay them more
money.
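To illustrate the "accidental leakage" point: with referers gone, a site
that wants its ad network to know which page you are on can simply put
that information into the URL of the third-party element it embeds (the
names here are hypothetical):

    // Hypothetical first-party page script (TypeScript): pass referer-style
    // information to a 3rd party via URL parameters instead of the Referer
    // header. Disabling referers does nothing to stop this.

    function embedAd(adNetworkBase: string): void {
      const params = new URLSearchParams({
        page: location.href,           // what the referer would have said
        visitor: "site-local-id-123",  // plus anything else the site knows
      });
      const iframe = document.createElement("iframe");
      iframe.src = adNetworkBase + "?" + params.toString();
      document.body.appendChild(iframe);
    }

    embedAd("https://ads.example.net/serve");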
If someone did a crawl of the top 10k sites and found that none of
them would break by disabling or restricting referers, I might change
my mind for Torbutton, because it is unlikely that sites will adapt
just for Torbutton users. However, you still have the property that if
the browser vendors decided to disable referers, sites would build
mechanisms to transmit referer-style information anyway. Hence, when
talking to browser makers, it doesn't make sense to recommend that
they disable referer information. They should instead simply allow
sites to have better privacy controls over them if they wish.
Does this reasoning make sense? I suppose it is somewhat abstract, and
very conditional.
--
Mike Perry
Mad Computer Scientist
fscked.org evil labs