How I set up my websites with Tor and Nginx

I was recently asked how I set up my websites to:

  1. Redirect HTTP to HTTPS when not accessed via an onion service.
  2. Serve the website over HTTPS when not accessed via an onion service.
  3. Serve the website over HTTP when accessed via an onion service.

I will further explain:

  • How the .onion available button is obtained in my setup.
  • How to add an onion Alt-Svc that works.

I have a very simple setup. I have a tor daemon running on the same machine as nginx. As most of my websites are static, nginx serves their files directly in most cases. There is no software between tor and nginx; if there is for you, that drastically changes things and this post may be of little use to you. If you have extra software "behind" nginx (e.g. a python app generating a dynamic website), most likely this post will still be useful to you. For example, instead of telling nginx this, like I do:

location / {
        try_files $uri $uri/ =404;
}

You might be telling nginx this:

location / {
        include proxy_params;
        proxy_pass http://unix:/path/to/some/app.sock;
}
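
If that's you, the app on the other end of that socket might be as simple as a minimal WSGI application. This is just an illustrative sketch, not part of my setup: the module name app.py and the gunicorn invocation are assumptions.

```python
# app.py - a minimal WSGI app of the kind nginx might proxy to over a
# unix socket. Illustrative sketch only; run it with a WSGI server, e.g.:
#   gunicorn --bind unix:/path/to/some/app.sock app:application
def application(environ, start_response):
    # Echo the requested path back as plain text.
    body = ("Hello from " + environ.get("PATH_INFO", "/")).encode("utf-8")
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```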

I use Certbot (Let's Encrypt) for my CA, and it automatically generates some of the nginx config you will see below.

All of the nginx config blocks are in one file in /etc/nginx/sites-available/. As is standard with nginx on Debian, there's a symlink to that file in /etc/nginx/sites-enabled/, and /etc/nginx/nginx.conf was already set to load files in /etc/nginx/sites-enabled/.

This post uses flashflow.pastly.xyz and its onion address as an example. Whenever you see flashflow.pastly.xyz or its onion address, mentally replace the domains with your own.

Redirect HTTP to HTTPS when not accessed via an onion service.

This is entirely handled by nginx and uses a server {} block automatically generated by Certbot. It is this:

server {
    if ($host = flashflow.pastly.xyz) {
        return 301 https://$host$request_uri;
    } # managed by Certbot
    listen 80;
    listen [::]:80;
    server_name flashflow.pastly.xyz;
    return 404; # managed by Certbot
}

All this block does is redirect to HTTPS. It is used when the user is visiting flashflow.pastly.xyz on port 80, as indicated by the server_name and listen lines.

Serve the website over HTTPS when not accessed via an onion service.

This is entirely handled by nginx. Again, as the server_name and listen lines indicate, this block is used when the user is visiting flashflow.pastly.xyz on port 443 (using TLS). It too is overwhelmingly generated by Certbot.

I slightly simplified this block as presented here. We will edit this block later in this post to add Onion-Location and Alt-Svc headers.

server {
    server_name flashflow.pastly.xyz;
    root /var/www/flashflow.pastly.xyz;
    index index.html;
    location / {
        try_files $uri $uri/ =404;
    }
    listen [::]:443 ssl; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/flashflow.pastly.xyz/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/flashflow.pastly.xyz/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}

Serve the website over HTTP when accessed via an onion service.

This is the nginx config block. It is a simplified version of the previous one, as it is also actually serving the website, but over plain HTTP and when the user is visiting the onion service, not flashflow.pastly.xyz:

server {
    listen 80;
    listen [::]:80;
    server_name jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion;
    root /var/www/flashflow.pastly.xyz;
    index index.html;
    location / {
        try_files $uri $uri/ =404;
    }
}

These are the relevant lines from tor's torrc. We will edit them later in this post to add Alt-Svc support.

HiddenServiceDir /var/lib/tor/flashflow.pastly.xyz_service
HiddenServicePort 80

In this post I've shared two server {} blocks that tell nginx to listen on port 80. Nginx knows to use this block for onion service connections because the server_name (the hostname that the user's browser is telling nginx it wants to visit) is the onion service. Nginx uses the other server {} block with port 80 when the user's browser tells nginx that it wants to visit flashflow.pastly.xyz.

After adding those lines to the torrc, I reloaded tor (restart not required). Then I could learn what the onion address is:

$ cat /var/lib/tor/flashflow.pastly.xyz_service/hostname
jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion

And from there knew what to put on the server_name line.

Whenever I edited nginx's config, I reloaded nginx when done (systemctl reload nginx) and verified it didn't say there was an error.

Whenever I edited tor's config, I reloaded tor when done (systemctl reload tor@default) and verified by checking tor's logs that there was no error (journalctl -eu tor@default) and that tor is still running (systemctl status tor@default).

How the .onion available button is obtained in my setup.

Verify that the preceding steps are working:

  1. Visiting http://flashflow.pastly.xyz redirects to https://flashflow.pastly.xyz to serve the website.
  2. Visiting http://jsd33qlp6[...]d.onion serves the website.

This button advertises the fact that the website is also available at an onion service, which improves users' security and may even improve their performance. Further, if they've configured Tor Browser to do so, Tor Browser can automatically redirect to the onion service instead of presenting a button for the user to maybe click.

Find the 2nd server {} block you added, the one that listens on port 443. We are now going to add a single line to it that instructs nginx to add an HTTP header in its responses.

server {
    [... lines omitted ...]
    location / {
        try_files $uri $uri/ =404;
        add_header Onion-Location http://jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion$request_uri;
    }
    listen [::]:443 ssl; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    [... lines omitted ...]
}

Reload nginx and verify it didn't say there was an error.

Visiting https://flashflow.pastly.xyz should now result in a purple .onion available button appearing in the URL bar when the page is done loading. Clicking it will take the user from https://flashflow.pastly.xyz/foo/bar to http://jsd33qlp6[...]d.onion/foo/bar.

How to add an onion Alt-Svc that works.

Verify that the preceding steps are working:

  1. Visiting http://flashflow.pastly.xyz redirects to https://flashflow.pastly.xyz to serve the website.
  2. Visiting http://jsd33qlp6[...]d.onion serves the website.
  3. (Optional) Visiting https://flashflow.pastly.xyz results in a purple .onion available button in the URL bar.

This is another HTTP header that tells the browser there is another way to fetch the given resource that it should consider using in the future instead. The Alt-Svc header is used in contexts entirely outside of Tor, but it can also be used to tell Tor Browser to consider secretly fetching content from this host from an onion service in the future.

Common gotcha: The onion service must also support HTTPS. The onion service does not need a TLS certificate that is valid for the onion address: it should just use the same certificate as the regular web service, even though it is invalid for the onion service. The browser verifies that the certificate it gets from jsd33qlp6[...]d.onion is valid for flashflow.pastly.xyz when using the .onion as an Alt-Svc for the .xyz.

Add the HiddenServicePort 443 line to the torrc, so the stanza becomes:

HiddenServiceDir /var/lib/tor/flashflow.pastly.xyz_service
HiddenServicePort 80
HiddenServicePort 443

Reload tor when done (systemctl reload tor@default) and verify by checking tor's logs that there was no error (journalctl -eu tor@default) and that tor is still running (systemctl status tor@default).

Find the 2nd server {} block you added, the one that listens on port 443. We are now going to add a single line to it that instructs nginx to add an HTTP header in its responses, and edit the server_name line to list the onion service.

server {
    server_name flashflow.pastly.xyz jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion;
    [... lines omitted ...]
    location / {
        try_files $uri $uri/ =404;
        add_header Alt-Svc 'h2="jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion:443"; ma=86400;';
    }
    listen [::]:443 ssl; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    [... lines omitted ...]
}

Reload nginx and verify it didn't say there was an error.

You can verify the Alt-Svc header is being sent by, well, inspecting the headers that nginx sends when you request either https://flashflow.pastly.xyz or https://jsd33qlp6[...]d.onion.

$ curl --head https://flashflow.pastly.xyz
HTTP/2 200 
server: nginx/1.14.2
[... lines omitted ...]
onion-location: http://jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion/
alt-svc: h2="jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion:443"; ma=86400;
[... lines omitted ...]

# the --insecure flag tells curl to keep going even though it will see a
# cert that isn't valid for the onion service. This is expected, as
# explained previously.
$ torsocks curl --insecure --head https://jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion
HTTP/2 200 
server: nginx/1.14.2
[... lines omitted ...]
onion-location: http://jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion/
alt-svc: h2="jsd33qlp6p2t3snyw4prmwdh2sukssefbpjy6katca5imn4zz4pepdid.onion:443"; ma=86400;
[... lines omitted ...]

Verifying that Tor Browser actually uses the headers is harder and beyond the scope of this post. The basic idea is to abuse Alt-Svc to serve something different up via the onion service and check that you get the different content after a couple of page refreshes.

Enough about Hacker Factor's '0days'

Last summer Dr. Neal Krawetz AKA "Hacker Factor" made a series of posts on his blog about Tor "0days." This post is a summary of Tor Project's response to one of his posts. Neither this post nor Tor Project's tweet serves as a perfect point-by-point rebuttal of everything he claims in the post, nor of all of his "0day" posts. The things he says that are skipped over here are not automatically valid just because they are skipped. The theme of the responses holds for just about everything he ever says about Tor. As they say, it's easier to spread bullshit than it is to refute it.

Okay wait. Many of the things he says aren't bullshit. He has some valid points. He just can't express those points in a productive manner anymore. His Tor posts are riddled with phrases that instantly put Tor people on the defensive, so it's a masochistic exercise to review them again every time someone asks "hey, what's your thoughts on this HF guy's post from last year?"

The Tor Project tweet is a level-headed response (that I helped write); again, this is just a summary of that response, and I'm taking the opportunity to vent while writing it. I will take no questions or comments, nor read emails about this post. I'm freely using inflammatory, emotionally charged language because, unlike HF, I do not expect, or want, a conversation to come out of this. This is a crass cathartic exercise for me.

The title of the HF blog post this post deals with is "Tor 0day: Burning Bridges." You can find it with your favorite search engine; I'm not going to help drive traffic to his site.

Here we go. The actual content of this "short" post I'm writing for my own reference. Links to additional anti-HF texts on the Internet are at the end of this post.

Use of the word 0day

HF knows exactly what he's doing when he uses the term "0day." He's not stupid. He knows what people immediately think when they hear that term. He knows 0day sounds scary and gets people excited about a dangerous new discovery. He knows he'd get media attention.

He hides behind "well technically I'm correct because one of the little-used definitions of 0day includes things that aren't fixed and exist in the wild." You're technically and pointlessly correct, HF. And every time someone calls you out on this inflammatory word choice, you get the free rebuttal of "you're not even addressing the real issues! You just don't like my (perfectly valid!!!1!) word choice. You clearly have nothing." No. Fuck you. Use this excuse again if/when you see this, then fuck right off.

Scrollbar width

This is (and was) a publicly documented information leak. There are many ways the user's OS can be leaked (one of them is even on purpose!), and fixing just one of them without fixing many of the others is pointless.

People should report bugs like this so they can be documented and fixed in batches. People should not throw a hissy fit when the bug isn't fixed right away in order to validate their sense of self-importance.

Tor's TLS fingerprint

The way Tor uses TLS between relays and between a client and their relay is (and always has been) fingerprintable. This has been publicly known since 2007. Before HF, it was brought up in 2018 in a much more slanderous and make-a-name-for-yourself-at-the-cost-of-others tone (archive copy).

HF's proposed solution is the wrong one. Tor Project has decided on a better one: bridges with pluggable transports.

Obfs4 is identifiable

Perhaps surprisingly, this is known. It's also an important problem. It's being worked on at a pace slower than HF finds acceptable.

But HF presents variations on known attacks without evidence that they work at a large scale. Two possible issues: too much state to keep track of, or too many false positives such that the adversary is unwilling to deploy it. Luckily for HF, the bar for publishing "science" in a blog post is on the ground. He can say things confidently and non-experts believe him. Shame on you, HF.

He further shows that he barely looked into this before putting pen to paper (or fingers to keyboard?) by

  • admitting to not knowing of any prior work (in response Tor Project points him to some),

  • citing a paper to support the claim that the Great Firewall can detect obfs4 when the paper says the opposite,

  • citing a blog post about obfs4 bridges being blocked in China, then ignoring that the issue discussed therein is about bridge distribution. Remember HF, in this section you were talking about fingerprintable network activity.

Additional links

Tor is not 'TOR' nor is it 'The Onion Router'

Warning: pedantry. I'm writing this down once so I have something to refer to in the future when I want to find this PDF again.

Dr. Paul Syverson is "the father of onion routing." He and his colleagues at NRL created onion routing 20 years ago, and he, along with Nick Mathewson and Roger Dingledine, wrote the original tor code (adapted from code Matej Pfajfar wrote) in the early 2000s.

In short: Dr. Syverson is an authority figure in this space and knows what he's talking about. He was there and he is a primary source.

In 2011 he gave a keynote at ACSAC about the history of onion routing. The PDF is located here. The paragraphs before section 4.1 on page 129 explain how

  1. It's not "TOR" and never was.

    It was also [Roger's] decision that it should be written ‘Tor’ not ‘TOR’. Making it more of an ordinary word in this way also emphasizes the overlap of meaning with the German word ‘Tor’, which is gate (as in a city gate).

  2. It does not stand for "The Onion Router."

    Thus, when [Roger] told people he was working on onion routing, they would ask him which one. He would respond that it was the onion routing, the original program of projects from NRL. It was Rachel Greenstadt who noted to him that this was a nice acronym and gave Tor its name. Roger then observed that it also works well as a recursive acronym, ‘Tor’s onion routing’.

If you must consider "Tor" to be an acronym, it stands for "the onion routing," not "the onion router." To stay internally consistent, I begrudgingly admit that Tor is an acronym, since this post is framed as pedantic and I'm-more-technically-correct-than-you.

My personal opinion: the """war""" on what the acronym stands for has been lost. The spelling """war""" is still worth fighting.

Debunking 'OSINT Analysis of the TOR Foundation' and a few words about Tor's directory authorities

The following post was not written by me. It was written by Julien Voisin and posted on his blog in October 2018. I am sharing it here, unedited except as noted below, according to the CC BY-SA license of the post.

Edits made:

  • Add table of contents.
  • Change local links to point to my copies of the paper and its figures, not Julien Voisin's copies.

The paper it talks about is old news at this point (from 2018), but I see someone stumble upon it every few months ... instances that are just spread out enough I can never remember where this amazing post is on the web. Now I can't lose it.

Title: Debunking "OSINT Analysis of the TOR Foundation" and a few words about Tor's directory authorities
Date: 2018-10-04 15:00

I have spent years on Tails' IRC channel answering questions from various users, amassing a pile of personal notes about the internals of both Tor and Tails in the process.

A friend of mine linked me an "interesting" paper (local mirror) entitled OSINT Analysis of the TOR Foundation, and was wondering how much trust to put in it. I read it, and decided that it was so hilariously bad that it deserved a blogpost. It's also a nice opportunity to explain a few things about the directory authorities (dirauth).

The post is in two parts: first, a rough explanation of what the dirauth are and how resilient the tor network is with regard to them, then a complete review of the paper.

Tor and the dirauth

The Tor network is mainly composed of relays run by volunteers, with various attributes: exit, fast, guard, hsdir, running, stable, valid, badexit, v2dir, … but also authority.

Tor 0.0.2, released in 2004, introduced Directory Authorities, servers that served (duh.) cryptographically signed directory documents, containing a list of all relays along with their associated metadata (capacity, version, uptime, …) and status.

But the first version of the directory protocol didn't prevent a lying authority from providing a distorted view to some clients. This is why the second iteration implemented cryptographic signatures, allowing the client to trust only directory documents signed by strictly more than half of all the dirauth.

The third version (which happens to be the current one) supports offline storage of the dirauth's critical cryptographic material, so that keys no longer have to be stored in plain text on the machines. Furthermore, to fight partitioning attacks, it introduced a single, nicely constructed consensus from the dirauth, instead of asking the client to aggregate all the separate data.

It's possible to take a look at what the consensus looks like here.

How are new relays registered?

When a new relay comes online, it uploads its relay descriptor to a dirauth to register itself with the tor network. Each dirauth takes its view of the network and every hour gossips its 'vote' on that view (essentially all the relay descriptors it is aware of, plus the bwauth measurement results) with the other directory authorities; these votes are merged into one 'consensus document' that is the global view of the network for a certain duration. That consensus document is fed to the fallback authorities, which are the front line for clients coming online and needing to load the current state of the network.
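
The majority rule at the heart of that merge can be sketched in a few lines of Python. This is a deliberately toy version of my own, not the actual algorithm: the real v3 directory protocol also negotiates flags, bandwidth weights, and network parameters.

```python
from collections import Counter

def merge_votes(votes):
    """Toy consensus: include a relay only if it appears in strictly
    more than half of the authorities' votes.
    votes: list of sets of relay fingerprints, one set per dirauth."""
    counts = Counter(fp for vote in votes for fp in vote)
    threshold = len(votes) // 2  # need strictly more than half
    return {fp for fp, n in counts.items() if n > threshold}

# Three authorities vote; only relay "A" is seen by a majority.
votes = [{"A", "B"}, {"A"}, {"A", "C"}]
print(merge_votes(votes))  # {'A'}
```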

Who controls the dirauth?

There are currently 10 relays with this flag:

Additionally, there is a bridge authority, that isn't a v3 directory one, listed here only for completeness' sake:

All of them are either in North America or in Europe. I'm not in the business of doxing people, but it's pretty easy to find the social graph and nationality of all the admins in the list, their relationship to the Tor Project, and even to have a beer with some of them :)

What happens to the network if the dirauth go down?

If the authorities were all shut down, clients would still be able to download the list of relays: your client doesn't actually get the relay documents directly from the authorities, but from caches on Tor nodes with the V2Dir flag. Your tor client has a local cache anyway.

As for compromised directory authorities: starting from version 2 of the directory protocol, on top of downloading the actual relay documents, your client also gets hashes of the relay documents signed by the other authorities; relay documents are only trusted if they are signed by more than half of the authorities. If one or two or three authorities were compromised, they would not be able to force clients to accept a distorted version of the consensus.
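
That signed-by-enough-authorities rule can be sketched the same way. Again, a toy illustration of mine rather than real code: actual clients verify cryptographic signatures on the consensus document, not set membership.

```python
def accept_consensus(valid_signers, known_authorities):
    """Toy version of the client rule: trust a consensus only if it is
    validly signed by strictly more than half of the authorities the
    client knows about."""
    good = set(valid_signers) & set(known_authorities)
    return len(good) > len(known_authorities) / 2

# With 9 known authorities, 5 valid signatures suffice; 4 do not,
# so a handful of compromised authorities can't forge a consensus.
auths = {f"auth{i}" for i in range(9)}
print(accept_consensus({"auth0", "auth1", "auth2", "auth3", "auth4"}, auths))  # True
print(accept_consensus({"auth0", "auth1", "auth2", "auth3"}, auths))           # False
```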

Can't we replace the authorities with something more distributed?

It's a non-trivial problem.

Attacks on the dirauth

I discussed the dirauth a bit with nextgens and others, and it's actually not that trivial to influence them. The go-to way would be to simply pop them, but I do trust their respective maintainers to have deployed a bunch of fancy mitigations and monitoring.

Another way would be to influence them by taking control of the pipes of the majority of the dirauth, to skew their measurements. Fortunately, the dirauth aren't really doing measurements on their own: bandwidth authorities (bwauth) are, and those transmit their calculations to the dirauth they have a pre-established relationship with. Some bandwidth authorities are run by dirauths, but most of them are not run on the dirauth machine itself; they are 'hidden' elsewhere on the network.

The paper

Author and context

The paper was written by Maxence Delong, Eric Filiol, Clément Coddet, Olivier Fatou and Clément Suhard, from the ESIEA in Laval, more specifically from its Operational Cryptology and Virology Laboratory (C + V)O. At the time, everyone but Eric Filiol was a student.

Eric Filiol is known for pretending to have broken AES in 2002 (he didn't), and in 2003 (he still didn't), and Tor in 2011 (he didn't either), and for being the architect and designer of DAVFI, a French "new generation anti-malware solution" known for being a phenomenal (and extraordinarily expensive) source of fun.

The paper was presented at the 13th International Conference on Cyber Warfare and Security (ICCWS 2018), and apparently underwent a "double-blind peer review process". The conference is organised by Academic Conferences and Publishing International Limited, organizers of a bunch of conferences.

The blog of the Operational Cryptology and Virology Laboratory (C + V)O published a blogpost entitled "OSINT on the TOR Foundation (Update)", by Eric Filiol, containing two exaggerations (amongst, as usual, various typos):

As we shown on our paper “OSINT Analysis of the TORFoundation”, we worked on the funds and proved that the US government is deeply involve with arpproximatly 85% of the funds in 2015.

The paper states the following:

As we can see, at least 58.20% of the total funds are coming from different departments of the US government. The status of RFA (Radio Free Asia) Contract is unclear and there are persistent allegations and testimonies (Prados, 2017; Levine, 2015) or even suggestions that it could be strongly connected to the CIA more than expected (Levine, 2015). Would this suspicion be true, the rate of funds from US government-related entities would grow up to 85.24%.

There is a difference between a suspicion and the blogpost's affirmation, especially when it inflates a number from 58.20% to 85.24%.

Secondly, we had some reasons to believe that the US government has strong links with The TOR Project Inc. via Roger Dingledine who made an internship in NSA and with some presentations in front of high authorities like the White House and the FBI.

I don't think that doing a Summer internship at the NSA qualifies as a "strong link". As for the presentations, it's well known that R. Dingledine does a lot of them for law enforcement entities, to improve their view of the network and, more broadly, the Tor ecosystem. A lot of his bios for various conferences end with this:

In addition to all the hats he wears for Tor, Roger organizes academic conferences on anonymity, speaks at a wide variety of industry and hacker conferences, and also does tutorials on anonymity for national and foreign law enforcement.

I don't think that this could be viewed as a credible connection to the US government.

Form and sources

It's worth noting that while all the figures used in the paper are unreadable, it's possible to extract them with pdfimages (or to check the sources) to see that they are in pretty high-resolution, and actually readable: Figure 1., Figure 2., Figure 3. and Figure 4..

Figure 3 doesn't come with any legend regarding the currency used, but since its point is to show a ratio, it doesn't matter much.

Despite a second revision to improve the English and remove the typos, the paper is still full of typos, frenchisms, and oddly worded sentences. Amusingly, this is the diff between the second and the third (and final at this time) revision of the paper:

-\author{Maxence Delong$^{1}$, Eric Filiol$^{2}$, Clément Coddet$^{3}$, Olivier Fatou$^{4}$, Clément Suhard$^{5}$}% <-this % stops a space
+\author{Maxence Delong, Eric Filiol\thanks{Contact author: \url{}}, Clément Coddet, Olivier Fatou, Clément Suhard\\
+        ESIEA Laval, Operational Cryptology and Virology Laboratory $(C + V)^O$ \\ 38 rue des Drs Calmette et Gu\'erin 53000 Laval France}% <-this % stops a space

E. Filiol is the only one with an email address, and apparently the main author of the paper.

About the paper's sources, almost a third (3/10) of them are from "Filiol et al."


The paper makes several baseless or inflated insinuations, also known as loaded questions, a classic fallacy technique.

Officially, this foundation has no link with US government (any other one) and is independent (Dingledine, 2017). There is a growing feeling that this may not be the case.

Recurrent questions arise that put this apparent independency into question: what if the US government was behind the TOR network and somehow controls it?

In fact, the TOR project is an implementation of a concept born in the US Naval Research Laboratory (Goldschlag et al., 1996; Syverson et al., 1997). Paul Syverson is the designer of the routing protocol and was part of the original development team of the TOR network. Hence the TOR infancy was clearly linked with the US government and still is.

Furthermore, Roger Dingledine spent a summer in internship in the NSA, so we can suppose that he has kept a few contacts in there

The owner is Roger Dingledine, one of the three creators of the TOR Project (and a former NSA employee).

Roger only did a Summer internship at the NSA, I wouldn't call him a "former NSA employee".

Sloppy research

The title of the paper is "OSINT Analysis of the TOR Foundation", and it refers to the "TOR foundation" or "the foundation" at least 40 times, as well as to a company and a firm, but no such things exist: The Tor Project, Inc. is a "Massachusetts-based 501(c)(3) research-education nonprofit organization". Moreover, the proper capitalization is Tor to refer to the project, and tor to refer to the client or the network.

The authors didn't do a proper job of finding the current Tor specification:

In this part, we will talk about the directory authorities (see tor-0_2_1_4_alpha/doc/spec/dir-spec.txt for details).

The canonical link for it is the current dir-spec published by the Tor Project. The linked Tor version was released on 2008-08-04, ten years before the publication of the paper.

The paper's authors don't understand the concept of pseudonymity:

It is a real problem for the network: do users can trust people they do not know? Where do these people come from? What is their background?

The Tails developers are all pseudonymous; that doesn't prevent the project from being used and trusted by thousands of people around the world, and endorsed by many.

Some famous projects have (or used to have) pseudonymous contributors: Bitcoin, Truecrypt, DOTA, … most of Wikipedia's contributors are too, and all of those projects are used and trusted.

I'm way more comfortable knowing that the directory authorities aren't all managed by Tor employees. Moreover, only a single authority (two when the paper was written) is managed by well known collectives/pseudonymous people.

All of them are well established entities, known and trusted by many. Saying that they are unknown and with a mysterious background is a pretty bold statement. Moreover, "where do these people come from" is a pretty irrelevant question.

Peter Palfrader was the owner of tor26 (the first directory authority which does not belong to Roger Dingledine). Released in the version tor- in October 2004, the directory authority is not working anymore.

This is a plain lie: tor26 has been working continuously for at least 5 years.

If a few people need to be on the Core People page, it will be the founder of the TOR Foundation and the people running a directory authority. With this disappearance, the customers have less information about the people who actually handle the network.

Although Paul Syverson worked with Roger Dingledine and Nick Mathewson, he was never part of The Tor Project, Inc. He's still doing research on Tor, anonymity, and onion routing though.

On a side note, using the term "customers" instead of "users" is interesting: Tor has nothing to sell, everyone can use the tor network for free.

There are at least 25 research papers coming from Paul Syverson for the TOR network. The last example in date was the 18th of September 2017 for the version tor- which was implemented by following a paper wrote by Paul Syverson and his team from the US NRL only.

Saying that there are "at least 25" papers without naming a single one of them is not a correct way to provide sources. Referring to a paper by its date of publication isn't either. The paper in question is likely Never Been KIST: Tor's Congestion Management Blossoms with Kernel-Informed Socket Transport by Rob Jansen, John Geddes, Chris Wacek, Micah Sherr and Paul Syverson, followed by Tor's Been KIST: A Case Study of Transitioning Tor Research to Practice by Rob Jansen and Matthew Traudt.

The first paper wasn't written by "Paul Syverson and his team from the US NRL only": only Syverson and Jansen are from the U.S. Naval Research Laboratory; Geddes is from the University of Minnesota, while Wacek and Sherr are from Georgetown University.

Officially, TOR is not developed anymore by the US government but a major part of changes was designed and developed by Paul Syverson through the US NRL and some people have work closely for the US government (not only among founders).

This is a bold statement without any kind of proof. The Tor Project has a lot of code split across different projects, but a simple git shortlog on tor's source code shows that this is completely wrong:

$ git show | grep '^Date'
Date:   Fri Sep 21 09:54:22 2018 -0400
$ git shortlog -s | sort -nr | head -n 25
 16963  Nick Mathewson
  6245  Roger Dingledine
   715  Peter Palfrader
   678  David Goulet
   546  George Kadianakis
   502  Sebastian Hahn
   492  teor
   417  Andrea Shepard
   362  Karsten Loesing
   322  Mike Perry
   300  Andrew Lewman
   268  teor (Tim Wilson-Brown)
   234  Robert Ransom
   221  rl1987
   150  Alexander Færøy
   145  Isis Lovecruft
   137  cypherpunks
   111  Linus Nordberg
    87  Steven Murdoch
    83  Taylor Yu
    77  Yawning Angel
    73  Cristian Toader
    47  Neel Chauhan
    47  Jacob Appelbaum
    46  Paul Syverson

In this list, only Paul Syverson has (public) affiliations with the US government.

We note that the Core People page is not containing information about a few important people in the TOR Foundation. This page is not sufficient to have an idea of who are the true leaders of the foundation. We have explained who are the leaders of the network (directory authorities) but not those of the foundation.

The board of directors of the Tor Project is public, and apparently the authors of the paper forgot to check the Past Contributors page, because it documents the role of every single significant past contributor to the Tor Project.

Some contractors were hired, Pearl Crescent for example (a developer), and were “hidden” by the foundation. The TOR foundation asks indirectly a blind trust on the source code (due to the huge amount of line) and they give the development to people we do not even know.

Pearl Crescent isn't a developer at all; it's a company, referred to as Pearl Crescent, LLC in the report. Its activity was thoroughly documented on the tor-reports mailing list, and its patches were publicly reviewed (like any other ones).

We discover a few names that are not on the Core People page. Rob Thomas, Meredith Dunn, Andrew Lewman, Mike Perry and Andrea Shepard are still unknown.

Rob Thomas is the founder and CEO of Team Cymru.

Meredith Hoban Dunn is an accountant, advisor, and banker. She's the one who signed the financial audit reports, and she is designated in them as the treasurer of The Tor Project, Inc.

Andrew Lewman, as indicated on the Past Contributors page, is the former Executive Director. He managed the business operations of The Tor Project, Inc, and played roles in finance, advocacy, project management, strategy, press, law enforcement liaison, and domestic violence advocacy. He was (likely; I don't have many details) fired, and is now running a shady company that does darknet-related-intelligence-magic-stuff.

A quick glance at the Which PGP keys sign which packages page shows that Mike Perry is/used to be the Tor Browser's lead developer. The financial report indicates that he's a developer, and a quick glance at the commit history of tor quickly confirms this. He was my mentor during my Google Summer of Code, in 2011, when I wrote the first iteration of MAT. I'm not surprised that he doesn't want to appear on the "Core People" page: he's a very private person.

Andrea Shepard was a Tor developer, as shown by a quick git shortlog, and as indicated in the 2015 financial report. She was brought to the fore during the Jacob Appelbaum events.

The TOR Foundation is regularly claiming that the US government is not funding anymore the TOR Project (Dingledine, 2017)

This is a plain lie: the document cited to source this claim is Roger's DEFCON 25 presentation, which shows that Dingledine actually debunked the following "myths" during his talk, along with several other ones listed in the paper:

  • “I heard the Navy wrote Tor originally, so how can I trust it?”
  • “I heard the NSA runs half the relays.”
  • “I heard Tor gets most of its money from the US government.”
  • “I heard 80% of Tor is bad people.”

Table 2 is right (notwithstanding the typos), but since it's mostly data copy/pasted from the financial report, this isn't surprising.

We will not develop most of the technical aspects that could suggest or confirm that somehow the TOR network has been designed or is managed in such a way that a few “facilities” are possible and would enable to take control over it. As a consequence, taking the control of a reduced number of TOR relays (from 450 to 1400 only) would enable to reduce the TOR traffic of at least 50 % and would greatly ease correlation attacks (about 35 % of the traffic) or eavesdropping (about 10 % of the traffic).

Yet another loaded question, and references to other papers from Filiol; I might publish my lecture notes about them at some point in the future too.

As far as the relay bridges management is concerned, it has been possible to extract slightly more than 2,500 such bridges thus compromising the alleged ability to bypass censorship.

This has been debunked several times.

During our study, in September 2017, we were contacted by a user of a custom TOR library. This library is the “node- Tor” written in JavaScript and allows the user to create and run a node or connect to the TOR network. Further exchanges with this person have shown a lot of inconsistency and irregularities.

The person here is actually Aymeric Vitte. I sent him an email, and he felt that Filiol's paper deserved a public response on tor-talk. I do recommend reading it ;)

At first, we talk about the way that his node was added to the network. For this custom library, the user asked the TOR foundation to add a node with this library and after an exchange of a few mails the node was accepted and run. The library is very different from the original source code. To compare very simply those two codes, we just compared the number of code lines. We know that the number of code lines does not really reflect the effect of the code but between the original source code (several hundreds of thousands code lines) and the library (only fifteen hundred code lines), we can assure that it is very likely that a number of options or securities are missing.

They are comparing the number of lines in a minimal JavaScript (a high-level language) implementation of Tor with the official full-blown implementation, written in C (a low-level language): this comparison metric doesn't make any sense.

Moreover, implying that node-Tor is only 1500 lines of code is a ludicrous claim, given how much it does.

Anyone can add a node to the network; there is no such thing as "ask[ing] the TOR foundation (sic)" to add one.

It is not the designer of this code who is responsible but rather the TOR foundation for accepting a node on the network with this kind of library. The first problem is that no one is warned that this node is special and is not running the official source code. This node owned by a user is not controlled by the TOR foundation. So if the user is malicious, he could modify his node and make every change he wants. If a government wants to include this kind of node to log the traffic and gather it, he can do it very simply and without triggering any alert.

Having several implementations of a tor relay running on the network is actually a great idea: it improves the security of the network (a bug found in one implementation might not be present in another one) and helps to find bugs or specification issues, which is a great thing in my opinion.

For example, CVE-2018-17144 was likely found thanks to implementation disparities between different bitcoin clients.

The tor network doesn't put much trust in relays themselves: any entity is free to run whatever nodes it wants; this is how the network is designed to work. Abuses might happen, though, and this is why there are several documented countermeasures and monitoring projects: volunteers run continuous checks to measure the integrity and trustworthiness of exit nodes, looking for traffic tampering or active analysis. Malicious nodes are flagged and blacklisted from the network on a continuous basis.

If the security of the network is ensured by the fact that all the nodes run the same source code, with the same security level, the same options and so on. . . this fact proves that the TOR network is not so secure.

It's absolutely not the case, as explained in the previous paragraph. It seems that Filiol et al. have no idea about either the threat model or the implementation of the Tor network.

We have discovered that with only few exchanges with the TOR foundation, we can add a custom node (possibly malicious). As for every node, no systematic control is possible by the TOR foundation, once accepted within in the network, we can do what we want with this node, log the traffic, insert biases in the creation of circuits etc. . . In summary, we think that the TOR project should not accept custom codes in order to respect the uniformity of the network that ensures “security”.

As previously explained, the only way to add a node to the network is to register it with the authorities; there is no such thing as "few exchanges with the TOR foundation", since the network isn't managed by it, nor by anyone, except the authorities.

The very fact that anyone can run a relay ensures the security and anonymity of the network: imagine if a single entity could approve or reject who joins tor…

As far as confidence is concerned, nobody (except state organization) has the courage/time to read the source code and no one is paying attention to the designer of the changes on the TOR source code

A quick glance at the git log gives a rough estimate (there might be duplicates) of the number of committers:

$ git log --format='%aN' | sort -u | wc -l

This is a conservative estimate of the number of people who not only bothered to read the code, but even contributed to it.
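About those possible duplicates: the same person can appear under several spellings of their name. A throwaway repository (purely illustrative, not tor's actual history) shows the effect by comparing distinct author names against distinct author emails:

```shell
# Build a scratch repo where one person commits under two name spellings,
# to show why counting distinct author names can overestimate committers.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name='Jane Doe' -c user.email='jane@example.org' \
    commit -q --allow-empty -m 'first'
git -c user.name='jane doe' -c user.email='jane@example.org' \
    commit -q --allow-empty -m 'second'
names=$(git log --format='%aN' | sort -u | wc -l)    # 2 distinct names...
emails=$(git log --format='%aE' | sort -u | wc -l)   # ...but 1 distinct email
echo "$names $emails"
```

When a project ships a .mailmap file, git folds such aliases together, and `git shortlog -sne` makes the remaining duplicates easy to spot.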

As a comparison, this is the same command run on GnuPG's git repository, the library that everyone uses to encrypt emails and sign software in the Linux world:

$ git log --format='%aN' | sort -u | wc -l 

Another indicator of the attention that Tor is getting might be the activity on Tor's bugtracker timeline, where it's not uncommon to have more than 100 different actions per day, by a lot of different people.

Paul Syverson (from the US NRL) is the original designer (not developer) of most of implementation. The last version of TOR is the perfect example: all major changes are coming from the US NRL.

We already debunked this by looking at the git commit history.

No official statement revels that the US government is helping the TOR network but all the information gathered during our study seems to confirm that the US government is still deeply involved in the TOR project

The sponsors page is public, and lists every major sponsor. The fact that the US government is giving grants to researchers to study anonymity and resilience is pretty healthy for the Tor Project, and doesn't mean, at all, that the US government is "deeply involved" in the project. At least not significantly more than the other major donors like the EFF, Human Rights Watch, Google, the Freedom of the Press Foundation, Reddit, …

This study is not claiming breaking the TOR network or affirms that the US government is the real organization behind the TOR project.

This blogpost is not claiming that E. Filiol is a clown, nor affirms that he hasn't done any worthy contribution to computer science in years.

However favoring such a network would be a clear violation of the Wassenaar Agreement unless some sort of control is in place in a way or another (Filiol, 2013).

The paper cited here (Filiol, 2013) is "The Control of Technology by Nation States – Past, Present and Future – The Case of cryptology and Information Security", Journal in Information Warfare, vol. 12, issue 3, pp. 1–10, October 2013, published behind a paywall. Fortunately, it's possible to access it via Google Books. In this paper, Filiol speaks mostly about France, while The Tor Project, Inc. is an American entity, but this doesn't matter much in our case.

Since I'm not a lawyer, I asked a good friend of mine, who happens to be a legal advisor specialised in international and French business law, to help me with this part.

The List of Dual-Use Goods and Technologies and Munitions List states that "Controls do not apply to "technology" "in the public domain", to "basic scientific research" or to the minimum necessary information for patent applications.".

A quick look at the definitions part of the document shows the following: "In the public domain": This means "technology" or "software" which has been made available without restrictions upon its further dissemination. Note: Copyright restrictions do not remove "technology" or "software" from being "in the public domain".

This is the case of Tor, and of other Free (as in freedom) software, which are thus not subject to the Wassenaar Agreement at all. A quick glance at the comprehensive FAQ from Rapid7 about the Wassenaar Arrangement, or the small blog post from GNU, confirms our interpretation.

This study aims at informing TOR users and to make them aware of network like the TOR network and the possible reality behind. Customers need to be informed before using any network who claims to protect your privacy and anonymity.

This blogpost aims at informing the public and to make it aware of charlatans like E. Filiol and the possible reality behind. People need to be informed before citing any work from this person, inviting him to conferences, or asking his opinion.


This is a botched paper in broken English, filled with approximations and sheer inventions about Tor.

Tracking Tor's network-wide V3 onion service outages

Major update 28 Jan 2021 (UTC): It's happening again, but this time the large amount of directory traffic is coming from exits. We've missed three consensuses, so v3 onions will be going down. Dirauths are already discussing and trading patches to mitigate the issue in the short term. The long-term solution for not allowing people to use exits to do this is tracked here. Read the main body of this post for more information on, e.g., what a "consensus" is and how not having them affects onion services.

It is January 13th, 2021 as I finish writing these initial words. Major updates may get a date stamp next to them.

Bottom line up front

  • Someone is sending the directory authorities (and fallback dirs) lots of traffic.
  • This causes the dirauths to no longer be able to reliably communicate.
  • This means consensuses are no longer reliably produced every hour.
  • No new consensus three hours in a row means new connections to v3 onion services stop working because of a bug. Existing connections survive, and no other part of Tor breaks at the three hour mark.
  • There is an alpha release for experts who know what they are doing. It is making its way into all supported stable Tor versions.

Please keep these facts in mind:

  • It is unknown if the traffic hitting the dirauths is maliciously motivated. People keep calling it an attack. I don't think we have the evidence to back that up at this time.

  • There is no evidence that the traffic overload is actively trying to hurt v3 onions. A similar situation existed last year and onions didn't go down then. Claims that it is "the" government or rival drug markets are not backed up with any evidence that I've seen.

If you have evidence of who is behind this traffic, please let someone know: the Tor Project, or me (via a blog comment, an email (address listed on About Me), or an IRC message).

While I currently work on Tor-related stuff for my job, nothing contained in this post has anything to do with my work. Everything contained in this post is public unclassified knowledge. Opinions expressed, if any, are my own.

Traffic starts hitting dirauths (again)

Roger points out on January 6th that "the overload is back".

It's not OMGWTFBBQ levels of traffic. It's not from one IP nor is it from IPs all over the Internet. One dirauth says it seems to be a poorly written custom Tor client requesting directory information too often.

Three missed consensuses

On January 9th, 10th, and 11th, there are repeated instances of 3 or more consensuses in a row that are not generated.

This is the trigger for v3 onions no longer working. Consensuses are generated every hour and are valid for 3 hours. Most parts of Tor (v2 onions, general circuit building, etc.) do not require a live (currently valid) consensus, but can get by just fine with a recently valid consensus (expired less than 24 hours ago).
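Those freshness rules can be sketched as a small helper (my own illustration, not code from tor; GNU date is assumed for the -d option):

```shell
# Classify a consensus by its valid-after timestamp, per the rules above:
# "live" for 3 hours after valid-after; most of Tor can still use it for
# up to 24 hours after it expires (so 27 hours after valid-after in total).
consensus_state() {
    valid_after=$(date -u -d "$1" +%s)
    age=$(( $(date -u +%s) - valid_after ))
    if   [ "$age" -lt $(( 3  * 3600 )) ]; then echo live
    elif [ "$age" -lt $(( 27 * 3600 )) ]; then echo recently-valid
    else                                       echo too-old
    fi
}

consensus_state "$(date -u -d '2 hours ago'  +'%Y-%m-%dT%H:%M:%S')"   # live
consensus_state "$(date -u -d '10 hours ago' +'%Y-%m-%dT%H:%M:%S')"  # recently-valid
```

The bug described below is exactly that the v3 onion code demanded "live" where "recently-valid" would have been enough.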

January 12th saw a few missed consensuses, but never 3 in a row. No consensus has been missed so far on the 13th.

The bug and its fix

The v3 onion service code was written to require a live consensus, and it didn't need to be (devs are verifying this). The fix for this bug changes the requirement to just a recently valid consensus. It's getting tested as I write these words on January 11th. The fix, or something very similar to it, will be merged and backported in the coming days, at which point it's up to the packagers for your OS or your tor-derived software (e.g. Tor Browser) to notice the update and distribute it to you. I would expect Tor Browser to be updated very quickly. Debian will probably take a day or two. Other distros, I have no idea.

If you're watching tor's logs, the current date includes "January" and "2021", and you see the following message, then you have most likely hit the bug.

Tried for 120 seconds to get a connection to [scrubbed]:6697. Giving up. (waiting for rendezvous desc)

The 6697 is not important. The "waiting for rendezvous desc" is important.

Status of the fix making it into Tor

Primary sources for this section: bug #40237.

  • Jan 12th: the fix is merged into [blog post]

Upcoming events:

  • backports to other supported versions of Tor
  • packaging

Fallback dirs getting hit too

On the 11th we notice the fallback dirs are also failing. This is major evidence in my opinion that this is not a purposeful attack on the dirauths. I, and at least two dirauths, think it is most likely a bad custom Tor client implementation that requests directory information too often.

We see the same failing of fallback dirs on the 12th.

This graph shows how the entire network is fielding more directory requests these days. This graph shows more context for where load usually is. The 1.5 Gbit/s the dirauths saw for ~half of 2020 is talked about in this ticket.

An actual attack could disguise itself like this, but if this were an attack, I would expect it to be more consistently effective at preventing consensus generation.

It might be over now

(Last updated 28 Jan 2021)

The 14th saw no missing consensuses. If you had trouble reaching v3 onion services on the 14th, your issue is unrelated to the topic of this blog post.

The 15th saw no missing consensuses.

The 16th probably didn't miss any consensuses (I waited too long to check. Go look in the archive if you care enough).

The 17th saw none missing.

The 18th saw none missing.

So far the 19th saw none missing.



The 28th of January: lots of directory traffic at dirauths again. It is unknown if fallback dirs see it too.


What Tor relays/clients need to be updated?

No relays need to be updated. Tor clients hosting v3 onion services and Tor clients wishing to visit v3 onion services will need to be updated when the fix is released.

How do I update my Tor?

(Last updated 13 Jan 2021)

You don't yet, unless you're willing to compile and use a version of Tor that isn't considered stable yet. If you're willing, then see

Should we temporarily downgrade to v2 while waiting for a fix?

If absolutely necessary, sure. Please keep in mind v2's issues (briefly described below in the glossary) and be aware that "temporary" probably means ~1 week (crossing my fingers). I personally will just suffer and wait for the fix to be released.

Are v3 onion services fundamentally broken?
No. This is a client-side bug, not a flaw in the v3 design.


Is this really major?

Eh ... yes because of the impact, but no because the fix is easy and will be out quickly.

Who was it?

One possibility is people using some 3rd party Tor client called "torpy." See the 5+ messages in this tor-dev@ thread, especially Sebastian and Roger's responses.

Glossary and preemptive rebuttals

Directory authorities / dirauths

There are 9 relays operated by highly trusted individuals that decide the state of the network. They decide what relays are a part of the network and what certain network parameters should be set to.

In a way they are a "single" point of failure and make the Tor network "centralized." Decentralizing their role to 100s, 1000s, or every relay would:

  1. require massive fundamental changes to how Tor works. By itself this probably is not a convincing reason to not do it.

  2. open Tor up to new attacks it currently isn't vulnerable to. This should be a bit more convincing. The keyword to Google for most of them is "Sybil".

Having a "single" high quality root of trust is a valuable property that "decentralize all the things!" people do not generally appreciate enough, in my opinion.

You: This v3 onion fix doesn't actually address the root problem: the dirauths weren't able to communicate and create consensuses.

Yes! You are absolutely right that something about how the dirauths work should change so that they can continue communicating with each other even in the presence of malicious (purposefully malicious or not) traffic! This ticket is one idea, and a good place to start if you want to research what is being done yourself. They might also update this ticket.

V3 onion service

The new type of onion service. Names are 56 characters long, like tv54samlti22655ohq3oaswm64cwf7ulp6wzkjcvdla2hagqcu7uokid.onion. V2 onions are 16 characters long, like 3g2upl4pq6kufc4m.onion.

V2 onion services use old crypto and old protocols that are obsolete and dangerous now or will be soon. The code for v2 is messy and the protocol is not extensible. V2 onions are vulnerable to harvesting by malicious HSDirs, and these malicious HSDirs exist today (and are removed as soon as they are detected). Support for v2 onion services will be removed from Tor soon. V3 onions are the future despite the current events.

Fallback directory mirror / fallback dir

Tor clients don't usually fetch consensus information directly from a dirauth anymore: there are too many people using Tor and only 9 dirauths. Instead, a large number of high-quality relays have opted in to be hardcoded into Tor's source code, so clients can usually fetch consensus data from them on first run. After the first run and a successful connection to Tor, clients get this data from their guard instead.


Consensus

The dirauths vote on what relays are currently in the network and what certain network parameters should be set to. The "average" of their votes becomes the consensus. They make a consensus every hour and each is valid for three hours. Clients typically fetch a new consensus every two hours.
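As a toy illustration of the "average of votes" idea (the real voting algorithm is far more involved), a relay could be said to enter the consensus when a majority of the 9 dirauth votes list it:

```shell
# Toy majority vote, NOT the real dirauth algorithm: relayA appears in
# 6 of the 9 votes, relayB in 2, relayC in 1; only relayA clears the
# majority threshold of more than 9/2 votes.
votes='relayA relayB relayA relayA relayC relayA relayB relayA relayA'
consensus=$(printf '%s\n' $votes | sort | uniq -c | awk '$1 > 9/2 { print $2 }')
echo "$consensus"   # relayA
```

This is also why losing a few dirauths (or their ability to talk to each other) matters so much: without enough signed votes, no consensus can be produced at all.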

You can see information from the current consensus here. Recent consensuses are archived here and older ones archived here.

This blog is powered by ikiwiki and uses CSS based on this.