[tor-bugs] #13616 [Onionoo]: define jmeter testcase(s) and ant task(s)
Tor Bug Tracker & Wiki
blackhole at torproject.org
Sat Apr 18 10:05:07 UTC 2015
#13616: define jmeter testcase(s) and ant task(s)
-----------------------------+-------------------------------
Reporter: iwakeh | Owner: iwakeh
Type: enhancement | Status: needs_information
Priority: major | Milestone:
Component: Onionoo | Version:
Resolution: | Keywords:
Actual Points: | Parent ID: #13080
Points: |
-----------------------------+-------------------------------
Comment (by karsten):
I finally started working on this, because I want to have some baseline
for considering switching to a database (#11573). I briefly looked at
JMeter, but found it too heavy-weight for our purposes. I decided to
instead use `httperf` and a simple shell script around it:
{{{
#!/bin/bash
URIS=(
"/summary?limit=1&"
"/summary?limit=1&type=relay"
"/summary?limit=1&type=bridge"
"/summary?limit=1&running=true"
"/summary?limit=1&running=false"
"/summary?limit=1&search=moria1"
"/summary?limit=1&search=ria"
"/summary?limit=1&search=a"
"/summary?limit=1&search=9695DFC35FFEB861329B9F1AB04C46397020CE31"
"/summary?limit=1&search=9695DFC3"
"/summary?limit=1&search=969"
"/summary?limit=1&search=DD51A2029FED0276866332EACC6459E1D015E349"
"/summary?limit=1&search=DD51A202"
"/summary?limit=1&search=DD5"
"/summary?limit=1&search=lpXfw1/+uGEym58asExGOXAgzjE"
"/summary?limit=1&search=lpX"
"/summary?limit=1&search=128.31.0.34"
"/summary?limit=1&search=128.31.0"
"/summary?limit=1&search=128.31"
"/summary?limit=1&search=128"
"/summary?limit=1&lookup=9695DFC35FFEB861329B9F1AB04C46397020CE31"
"/summary?limit=1&lookup=DD51A2029FED0276866332EACC6459E1D015E349"
"/summary?limit=1&country=us"
"/summary?limit=1&as=3"
"/summary?limit=1&flag=Running"
"/summary?limit=1&flag=Authority"
"/summary?limit=1&first_seen_days=0-2"
"/summary?limit=1&first_seen_days=0-3"
"/summary?limit=1&first_seen_days=3"
"/summary?limit=1&contact=arma"
"/summary?limit=1&contact=arm"
"/summary?limit=1&contact=a"
"/summary?limit=1&order=consensus_weight"
"/summary?limit=1&order=-consensus_weight"
"/summary?limit=1&family=9695DFC35FFEB861329B9F1AB04C46397020CE31"
"/details?limit=100"
"/details?limit=100&offset=500"
"/details?limit=100&fields=fingerprint"
"/bandwidth?limit=100"
)
# Quote expansions so '?' and '&' in the URIs survive word splitting
# and globbing. (The original `[ ! -d ${URIS[$i]} ]` guard tested for a
# directory named after the URI, which never exists, so it did nothing.)
for uri in "${URIS[@]}"; do
  httperf --server=onionoo.thecthulhu.com --uri="$uri" \
    --port=443 --ssl --num-calls=10 --verbose >> perf.log
done
}}}
The output is a verbose log containing lines like this:
{{{
Reply time [ms]: response 48.0 transfer 0.1
}}}
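A ranking like the ones below can be pulled out of `perf.log` with standard
tools. This is a sketch, assuming each `httperf` run's output in the log
contains its `--uri=...` argument and exactly one `Reply time` summary line:

```shell
# Sketch: rank URIs in an httperf log by mean response time, highest
# first. Assumes each run's output contains its --uri=... argument and
# exactly one "Reply time" line, in the same order.
rank_uris() {
  paste \
    <(grep 'Reply time' "$1" | awk '{print $5}') \
    <(grep -o -- '--uri=[^ ]*' "$1" | cut -d= -f2-) |
  sort -rn
}
```

Running `rank_uris perf.log` then prints one `<time> <uri>` line per run,
sorted by descending response time.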
Here are the results, starting with the highest response times:
{{{
121.9 /summary?limit=1&order=-consensus_weight
121.4 /summary?limit=1&order=consensus_weight
}}}
Looks like ordering results is a really expensive operation, which is
something I didn't expect. But I do expect that a database would be
better at this.
{{{
106.8 /summary?limit=1&search=9695DFC35FFEB861329B9F1AB04C46397020CE31
}}}
Searching by such a long string (a full hex fingerprint) shouldn't be as
expensive.
{{{
105.8 /details?limit=100&fields=fingerprint
}}}
This one is expensive, because we need to parse 100 JSON documents and
produce 100 new JSON documents on-the-fly. This won't get any faster when
moving the search index to a database, unless we use a database that can
store and process JSON documents. Though I'm not certain whether that
will be faster and worth the effort.
{{{
105.3 /summary?limit=1&search=DD51A2029FED0276866332EACC6459E1D015E349
89.1 /summary?limit=1&search=lpXfw1/+uGEym58asExGOXAgzjE
88.5 /summary?limit=1&search=9695DFC3
85.4 /summary?limit=1&search=969
72.7 /summary?limit=1&search=DD51A202
71.3 /summary?limit=1&search=lpX
70.2 /summary?limit=1&search=moria1
69.9 /summary?limit=1&search=DD5
68.7 /summary?limit=1&search=ria
68.6 /summary?limit=1&search=128.31.0.34
67.2 /summary?limit=1&search=128.31.0
66.9 /summary?limit=1&search=128.31
65.0 /summary?limit=1&search=128
61.3 /summary?limit=1&search=a
}}}
All these searches should be faster. What's interesting is that longer
search terms take longer than short ones, even though shorter terms
produce many more intermediate results (and we don't stop after the first
result, even though we could). My hope is that a database will make all
these searches faster, though I'm slightly concerned that substring
searches with very few characters (like `ria` in `moria1` or even `a` in
`arma`) might not be as fast. I could imagine changing the protocol to
require at least three characters in these searches to make use of trigram
matching. After all, what do people expect when they search for `a`?
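To illustrate why a three-character minimum helps, here is a simplified
sketch of trigram extraction (real trigram indexes, such as PostgreSQL's
`pg_trgm`, additionally pad terms with spaces, but the point is the same):

```shell
# Simplified sketch of trigram extraction: list the three-character
# substrings a trigram index could match on. Terms shorter than three
# characters yield no trigrams at all, so they can't use such an index
# and would fall back to a full scan.
trigrams() {
  local term=$1
  local i
  for (( i = 0; i + 3 <= ${#term}; i++ )); do
    echo "${term:$i:3}"
  done
}
```

For example, `trigrams moria1` yields `mor`, `ori`, `ria`, `ia1`, so a
search for `ria` can hit the index, while `trigrams a` yields nothing.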
{{{
54.4 /summary?limit=1&first_seen_days=0-3
54.2 /summary?limit=1&first_seen_days=0-2
53.9 /summary?limit=1&first_seen_days=3
52.8 /summary?limit=1&running=false
}}}
These look okay.
{{{
52.2 /details?limit=100&offset=500
}}}
This includes transferring 100 details documents, so it should be fine.
{{{
51.9 /summary?limit=1&flag=Running
51.0 /summary?limit=1&running=true
50.2 /summary?limit=1&contact=arma
}}}
These look fine.
{{{
49.9 /details?limit=100
}}}
This is a tiny bit faster than the search above with offset 500, which
seems reasonable.
{{{
49.4 /summary?limit=1&contact=arm
49.1 /bandwidth?limit=100
48.0 /summary?limit=1&
47.5 /summary?limit=1&as=3
47.2 /summary?limit=1&country=us
47.1 /summary?limit=1&flag=Authority
46.9 /summary?limit=1&contact=a
46.5 /summary?limit=1&family=9695DFC35FFEB861329B9F1AB04C46397020CE31
45.8 /summary?limit=1&type=relay
44.2 /summary?limit=1&type=bridge
42.2 /summary?limit=1&lookup=DD51A2029FED0276866332EACC6459E1D015E349
42.2 /summary?limit=1&lookup=9695DFC35FFEB861329B9F1AB04C46397020CE31
}}}
No surprises here, these all look good.
So, what other searches did I miss?
--
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/13616#comment:6>